Senior Data Engineer
Special Instructions
Title: Senior Data Engineer
Location: Westlake, TX; other sites will not be considered.
Profile: Data Engineer with Snowflake, AWS (S3, Glue, Lambda, Aurora Postgres)
Language needs: Python and SQL; must have strong querying skills with relational databases, including complex SQL statements.
Orchestration tool: Control-M
Scope of work: This role includes modeling data systems, creating data pipelines, automating batch processes, and building multi-dimensional data systems (Snowflake).
High Priority:
1) Snowflake
1b) AWS
1c) Python, SQL
2) Control-M
Process: Two rounds of interviews. The first round is 30 minutes with the hiring manager; the second round is one hour with the tech panel and the Squad Lead.
Description:
The Role
We are seeking an experienced Senior Data Engineer with a passion for delivering high-impact operational and analytical data solutions that integrate across a large organization. In this role, you will build and modernize the database, business rules, and API layers of our Customer Relationship Management (CRM) platform. This platform equips all Fidelity customer-facing associates with simple, integrated, and modern tools to provide superb customer and client service.
You will apply a variety of cloud-native (AWS and Snowflake) technologies to develop innovative solutions to sophisticated problems. This position is critical to delivering Fidelity's promise of crafting the best customer experiences in financial services.
The Expertise and Skills You Bring
- Bachelor's degree in Computer Science or a closely related field.
- Proven expertise in Data Modeling, Data Profiling, Data Analysis, Data Quality, Data Governance and Data Lineage.
- Experience migrating databases from on premises to the AWS Cloud.
- Hands-on experience building highly resilient, scalable, and efficient solutions using AWS services such as Lambda, Glue, and Step Functions is a must.
- Hands-on experience with Snowflake (building pipelines).
- SQL specialist; an ETL background and experience with job schedulers are a must.
- Proven experience developing, debugging and tuning complex SQL statements, PL/SQL packages and procedures.
- Hands-on experience with Aurora Postgres.
- Experience with DevOps or CI/CD pipelines using Maven, Jenkins, Terraform, GitHub, Ansible, etc.
- Strong in managing API-to-database connections using different relational database drivers (Oracle, PostgreSQL, etc.).
- Knowledge of Messaging Technologies (Kafka, Kinesis, SNS, SQS).
- Desire and ability to learn and implement new technologies.
- Knowledge of how to develop highly scalable distributed systems using open-source technologies.
- Experience in Cloud-based (AWS) Architecture.