Job Title: GCP Data Engineer
Contract: 6+ Months
Location: Chicago, IL
Onsite: 3 days per week
Interview Process: 2 video interviews
Contract Type: W2
Top Skills Needed:
- Core programming skills: Hands-on experience with Python and Spark/PySpark.
- Advanced SQL: Strong command of SQL for querying and manipulating structured data.
- GCP Fundamentals: Deep understanding of GCP concepts such as regions, zones, projects, and the resource hierarchy.
- Data Processing: Experience with Dataflow for data pipelines, Dataproc for big data processing, and BigQuery for data warehousing.
- Machine Learning: Familiarity with AI Platform and AutoML for building and deploying machine learning models.
- ETL: Expertise in extract, transform, and load (ETL) processes, data modeling, and data warehousing.
- Data Visualization: Proficiency with data visualization tools such as Looker or Power BI to create insightful reports and dashboards.
- Streaming Data: Experience with streaming data tools such as Kafka, Flink, or RabbitMQ.
- Certification in GCP