Data Architect

Chicago, Illinois

Experis
Role: Data Architect
Location: Chicago, IL (Downtown)
Term: 18-Month Contract to Hire
Compensation: W2, negotiable (Corp-to-Corp not available)

Summary:
We are seeking a Data Architect with expertise in data modeling, data profiling, Python scripting, and SQL query development to design and optimize our data architecture. The ideal candidate will play a key role in defining and implementing scalable data solutions that support business intelligence, analytics, and operational data needs.

Key Responsibilities:
  • Design and implement data models (conceptual, logical, and physical) to support business and analytical needs.
  • Perform data profiling and analysis to ensure data quality, consistency, and integrity.
  • Develop and optimize SQL queries for data extraction, transformation, and reporting.
  • Create Python scripts for data processing, automation, and integration with other data services.
  • Collaborate with cross-functional teams, including Data Engineers, Analysts, and Business Stakeholders, to align data architecture with business objectives.
  • Define and enforce data governance, security, and compliance best practices.
  • Support data migration and integration initiatives across different databases and platforms.
  • Work with modern cloud-based data platforms (AWS, Azure, or Google Cloud) to design scalable data solutions.
  • Provide technical leadership and mentorship to junior data professionals.
Qualifications:
  • 8+ years of experience in data architecture, data modeling, and data engineering roles.
  • Strong experience in designing relational and non-relational databases (e.g., SQL Server, PostgreSQL, Oracle, MySQL, NoSQL databases).
  • Deep understanding of data modeling techniques (e.g., star schema, snowflake schema, normalized forms).
  • Proficiency in SQL development, query optimization, and performance tuning.
  • Experience with Python for data processing, scripting, and automation.
  • Knowledge of ETL/ELT processes and tools such as Apache Airflow, Talend, Informatica, or dbt.
  • Familiarity with big data technologies (e.g., Spark, Hadoop) is a plus.
  • Strong problem-solving skills and ability to communicate technical concepts effectively to both technical and non-technical stakeholders.
Preferred Qualifications:
  • Experience in data cataloging, metadata management, and data lineage tracking.
  • Familiarity with machine learning pipelines and AI-driven analytics.
  • Knowledge of DevOps and Infrastructure as Code (Terraform, Ansible) for data infrastructure management.
  • Experience with Kafka or other event-driven architectures for real-time data processing.
Date Posted: 02 April 2025