Responsibilities
- Understand the different aspects of the Careignition product and strategy to inform product, infrastructure, and pipeline design
- Keep data quality and performance top of mind while designing key engineering products
- Own core data pipelines and products that power our industry-leading insights platform
- Design, build, and maintain robust ETL/ELT pipelines, reusable components, frameworks, and libraries to process data from a variety of sources, ensuring data quality and consistency
Qualifications
- Strong programming and data engineering skills, with proficiency in distributed data processing
- Experience building data models and data pipelines on top of large datasets (on the order of 500 TB to petabytes)
- Extensive experience with cloud-based data services (e.g., RDS, DynamoDB), containerized infrastructure (e.g., EKS, ECS, Docker), and data orchestration systems (e.g., Airflow, Argo, Step Functions)
- Experience working with performant data warehouses, data lakes, and lakehouses (e.g., Iceberg, Redshift, Snowflake, Databricks)
- Experience with CI/CD pipelines, version control (Git), and DevOps practices in a data engineering context
- A background in backend development (Python/FastAPI/Django) is a big plus
Date Posted: 24 April 2025