Job Description and Requirements
Department:
The MetLife Corporate Systems Technology (CST) organization is evolving to enable MetLife's New Frontier strategy. With a strong vision in place, we are a global function focused on driving digital technology strategies for key corporate functions within MetLife, including Finance, Actuarial, Legal, Human Resources, Employee Experience, Risk, Treasury, Audit, and Compliance. In partnership with our business leaders, we develop and deliver seamless technology experiences to our employees across the entire employee lifecycle. Our vision and mission are to create innovative, transformative, and contemporary technology solutions that empower our leaders and employees so they can focus on what matters most: our customers. We are technologists with strong business acumen, focused on developing our talent to continually transform and innovate.
About the Role:
The Control Functions Technology area within Corporate Systems Technology is focused on enabling technology solutions for the Audit, Compliance, Risk, and Legal business functions. We are seeking an experienced engineering lead with hands-on experience in designing, developing, and maintaining modern data pipelines and integrations. The ideal candidate will possess a strong background and proven expertise in cloud technologies, DevOps practices, and agile delivery methodologies.
Primary Responsibilities:
- Lead the design, development, and maintenance of scalable data architectures and pipelines within Azure Cloud environments.
- Develop comprehensive strategies for the integration of both structured and unstructured risk data sources to enable advanced analytics.
- Build and optimize data models and ETL/ELT processes using Databricks, Scala, and Spark.
- Develop API-based and event-driven data integration solutions to ensure seamless data exchange.
- Implement and adhere to a robust data governance process to ensure data security and quality.
- Utilize DevOps and CI/CD principles to automate the deployment, testing, and monitoring of data pipelines.
- Troubleshoot and optimize data flow performance, ensuring high availability and resilience.
- Lead and mentor junior data engineers, providing technical guidance and support.
- Stay current with emerging technologies in the data ecosystem and drive innovation.
- Enhance solution capabilities by identifying high-impact AI/ML use cases, building robust data infrastructure to support AI solutions, and driving the continuous evolution and expansion of AI-powered capabilities.
- Prepare and maintain software architecture documentation and obtain approval for new capabilities, solutions, tools, and technologies from the Global Architecture Review Board.
- Ensure disaster recovery, privacy, and security measures are aligned to enable application/platform stability, including technology currency management.
- Establish technical standards, best practices, and quality metrics; implement comprehensive technology controls in the managed portfolio to proactively manage technology risks.
- Collaborate with cross-functional teams to understand business requirements and deliver robust data solutions.
Required Technical Skills:
- Proven experience (8 years) in data engineering, big data processing, and distributed systems.
- Expertise in designing scalable data pipelines, data lakes, and Lakehouse architectures using Databricks.
- Proficiency in Azure services (Data Factory, Azure Synapse, Azure Data Lake, or Event Hubs).
- Strong hands-on experience with Databricks and Spark for big data processing.
- Deep understanding of ETL/ELT processes, data modeling, and data warehousing principles.
- Familiarity with cloud-native architectures and CI/CD pipelines.
- Experience with data governance, security, and compliance practices.
- Excellent problem-solving and communication skills.
- Strong leadership and mentoring abilities.
- Proven track record of leading successful technology transformations in the insurance or financial services industry.
Preferred Skills:
- Knowledge of additional programming languages like Python and SQL.
- Expertise in Scala programming for data processing workflows.
- Familiarity with risk management and risk assessment platforms and tools such as OpenPages, Archer, and OneTrust.
- Proficiency in data visualization tools like Power BI.
Equal Employment Opportunity/Disability/Veterans
If you need an accommodation due to a disability, please email us at . This information will be held in confidence and used only to determine an appropriate accommodation for the application process.
MetLife maintains a drug-free workplace.