Job title: Azure Data Engineer
Location & commitments:
- Nearshore or offshore location would be ideal; our expectations would be in line with the location.
- Contract role with Staples; full-time hire to BC Turkey or BC India. Flexible hours with US EST overlap.
- Full-time
- $25-27 per hour.
About the role: This role will be part of a team that helps migrate data workloads (data engineering / ML) from Azure Databricks into Snowflake (specifically Snowpark). This person will work on a team of 5-10 at Staples, including one of our current teammates on that team.
About the project: The project will begin as a POC in which we prove out 2-3 workloads (data engineering / Azure ML) and demonstrate why it would be faster and cheaper to house these workloads in Snowflake instead of Databricks. Once proven, the intent is to bring this teammate onboard at Staples full time and have them start moving 20-30 other workloads into Snowpark/Snowflake. They will receive other work from Staples as well.
Qualifications / must-have candidate requirements:
- Knowledge of Airflow administration and DAG development is good to have.
- Experience with SQL, Python, and object-oriented programming.
- Experience migrating data from one system to another (e.g., from Cloudera).
- Hands-on professional Azure Cloud experience.
- Understanding of Databricks, including both its positive and negative features.
- Knowledge of Azure DevOps, Azure Data Factory, Azure Databricks, and Snowflake.
- Knowledge of AI/ML / Azure ML is preferred.
Good-to-have skills:
- Knowledge of data warehousing, database concepts, and ETL tools (Informatica, DataStage, Pentaho, etc.) is preferable.
- Knowledge of DBT.
- Knowledge of HDFS and Hive queries.