Senior Data Engineer

We are looking for an experienced and highly skilled Senior Data Engineer to join our Data Engineering Team. In this role, you will design, develop, and optimize data solutions leveraging Microsoft Azure data services. You will collaborate closely with a team of data professionals, business stakeholders, and data analysts to deliver robust, efficient, and secure data solutions.
Key Responsibilities:
- Implement robust ETL and data integration processes to ingest, transform, and securely store data from diverse sources.
- Collaborate with data and BI analysts and key stakeholders to understand business requirements and deliver effective, data-driven solutions.
- Optimize data storage, retrieval, and processing for high performance and cost-efficiency.
- Monitor and maintain data pipelines, swiftly addressing issues and ensuring consistent data availability.
- Enforce best practices around data security, compliance, governance, and data quality standards.
- Provide technical guidance and mentorship to junior data engineers and team members.
- Stay current with advancements in Azure data technologies, proactively recommending enhancements and innovations.
- Create and manage detailed data transformation pipelines within Azure.
- Perform source data profiling and analysis, and ensure accurate data mapping from source to target systems.
- Design, implement, and manage complex data modeling, data tagging, transformation processes, and automated quality validation routines.
- Develop and maintain shell scripts and utilize orchestration tools such as Apache Airflow to automate workflows.
- Actively participate in agile ceremonies including sprint planning, retrospectives, code reviews, and technical reviews.
- Support business stakeholders throughout requirements gathering, scoping, and testing phases.
- Provide reliable on-call support for production data systems, managing incidents, code fixes, updates, and patches.
Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- Minimum of 5 years of experience in Data Engineering, with significant hands-on experience in Microsoft Azure.
Required Skills:
- Expert-level experience with Azure Data Services, including Azure Data Factory, Azure Databricks, Azure SQL Database, and Azure Data Lake Storage.
- Proficiency in SQL, Python, Shell scripting, and PowerShell.
- Extensive experience with data integration, ETL processes, data warehousing, and data modeling concepts.
- Strong knowledge of data governance, data security, and best practices.
- Proven capability in developing and maintaining high-performance data ingestion pipelines.
- Familiarity with various integration connectors (APIs, file systems, SFTP/FTPS, streaming data, database connectors, SOAP, REST).
- Demonstrated experience in securing communications between data endpoints.
- Experience with orchestration tools, especially Apache Airflow.
- Experience with Agile methodologies and software development lifecycle (SDLC).
Preferred Skills:
- Experience using BI tools and reporting solutions.
- Advanced knowledge of performance tuning and optimization of data services.
Personal Attributes:
- Exceptional problem-solving skills with the ability to work independently and collaboratively within diverse teams.
- Excellent communication skills, capable of clearly articulating complex technical concepts to non-technical stakeholders.
- Strong organizational and time management abilities, ensuring timely delivery of high-quality solutions.
- Passion for continuous learning and professional growth in data engineering and related fields.
Why Join Us at APCO Holdings

At APCO Holdings, we are a trusted leader in the automotive industry, providing F&I products and services through our renowned brands, including EasyCare, GWC Warranty, and MemberCare. With over 35 years of experience, we've protected more than 11 million drivers and paid out over $3.5 billion in claims, underscoring our commitment to excellence and customer satisfaction.
This role will be critical in driving the successful execution of complex data engineering initiatives, supporting scalable and secure data pipelines, and helping build a modern data ecosystem for one of the nation's leading automotive F&I products and services companies.