Job Title: AWS Databricks Engineer / Architect
Duration: 6+ Months
Location: New York, NY - Hybrid (3 days onsite)
End Client: Insurance Client
Exp. Level: 10-12+ years
Job Summary
Our client is seeking an experienced AWS Databricks Engineer to lead the design, implementation, and optimization of large-scale data processing systems using AWS cloud infrastructure and Databricks. The ideal candidate will have deep expertise in both AWS services and Databricks, with a strong background in data engineering, cloud architecture, and big data technologies. This role involves collaborating with cross-functional teams to deliver scalable, robust, and efficient data solutions that meet business needs.
Responsibilities
- Design and develop scalable data architectures using AWS services (e.g., S3, EMR, Redshift, Lambda, Glue) and Databricks.
- Define and implement best practices for data engineering and data management on Databricks and AWS platforms.
- Architect end-to-end data pipelines to process and analyze large datasets.
- Develop ETL processes and data pipelines using Databricks, Apache Spark, and other relevant technologies (an illustrative sketch follows this list).
- Optimize Databricks clusters and jobs for cost and performance efficiency.
- Implement data security, governance, and compliance frameworks in line with organizational policies and industry standards.
- Work closely with data scientists, data engineers, and business stakeholders to understand data requirements and translate them into technical solutions.
- Provide technical leadership and mentoring to the data engineering team.
- Lead discussions and decisions on cloud architecture, data strategy, and best practices.
- Monitor, troubleshoot, and optimize data pipelines and infrastructure for high performance and reliability.
- Conduct regular reviews of existing architectures to ensure they meet evolving business needs.
- Document architecture designs, implementation steps, and operational procedures.
- Create and maintain dashboards and reports to monitor data pipeline health and performance.
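As a rough illustration only, the following is a minimal PySpark sketch of the kind of Databricks ETL pipeline described in the responsibilities above, assuming a hypothetical S3 landing bucket, a Delta Lake target, and a claims dataset; all bucket names, paths, and columns are placeholders, not client specifics.
```python
# Minimal PySpark ETL sketch for a Databricks job (illustrative only).
# All paths, bucket names, and column names below are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("claims_etl_example").getOrCreate()

# Extract: read raw CSV files landed in an S3 bucket.
raw = (
    spark.read
    .option("header", "true")
    .option("inferSchema", "true")
    .csv("s3://example-raw-bucket/claims/")
)

# Transform: basic cleansing plus a load date for partitioning.
cleaned = (
    raw.dropDuplicates(["claim_id"])
    .filter(F.col("claim_amount").isNotNull())
    .withColumn("load_date", F.current_date())
)

# Load: append to a Delta table, partitioned for downstream analytics.
(
    cleaned.write
    .format("delta")
    .mode("append")
    .partitionBy("load_date")
    .save("s3://example-curated-bucket/delta/claims/")
)
```
In practice a pipeline like this would run as a scheduled Databricks job, with cluster sizing, autoscaling, and auto-termination tuned toward the cost and performance goals called out above.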
Qualifications
Education: Bachelor's or Master's degree in Computer Science, Engineering, Information Technology, or related field.
Experience
- 7+ years of experience in data engineering, cloud architecture, or related fields.
- 3+ years of hands-on experience with Databricks and AWS services.
Technical Skills
- Expertise in AWS services such as S3, EC2, Lambda, Glue, Redshift, and IAM (see the brief sketch after this list).
- Proficient in Databricks, Apache Spark, and related big data technologies.
- Strong programming skills in Python, Scala, or Java.
- Experience with SQL and NoSQL databases.
- Knowledge of data warehousing, ETL processes, and data lake architecture.
- Familiarity with data governance, security, and compliance practices.
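As a brief, purely illustrative sketch of the AWS-side skills listed above, the snippet below uses boto3 to list newly landed S3 objects and trigger a Glue crawler; the bucket, prefix, and crawler names are hypothetical placeholders.
```python
# Illustrative boto3 sketch of S3 plus Glue automation (names are placeholders).
import boto3

s3 = boto3.client("s3")
glue = boto3.client("glue")

# List newly landed files in a hypothetical raw bucket/prefix.
response = s3.list_objects_v2(Bucket="example-raw-bucket", Prefix="claims/")
for obj in response.get("Contents", []):
    print(obj["Key"], obj["Size"])

# Start a (hypothetical) Glue crawler so the new data is catalogued.
glue.start_crawler(Name="example-claims-crawler")
```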
Soft Skills
- Strong problem-solving and analytical skills.
- Excellent communication and collaboration abilities.
- Ability to work independently and as part of a team in a fast-paced environment.
Preferred Qualifications
- AWS Certified Solutions Architect or similar certification.
- Experience with other big data platforms (e.g., Hadoop, Kafka).
- Experience with machine learning workflows and integration with Databricks.
- Familiarity with CI/CD pipelines and DevOps practices.