Data Integration Engineer

Bethesda, Maryland

Sunayu, LLC
Job Expired
Location: Bethesda, MD
Category: Data Management
Travel Required: None
Remote Type: Hybrid Remote
Clearance: Top Secret/SCI

Description

Sunayu LLC has an opening for you as our next TS/SCI-cleared Data Integration Engineer supporting the National Media Exploitation Center (NMEC) under our 10-year DOMEX Technology Platform (DTP) contract. Have impact as part of a mission-focused, solutions-oriented, and adaptive team that values innovation, collaboration, and professional development. Your job will be to design, implement, maintain, and monitor data pipelines, both for R&D prototypes and for production. You succeed through effective cross-functional collaboration with teams including, but not limited to, development, product, and QA in a dynamic, fast-paced environment. While most work is conducted on-site at our client location in Bethesda, MD, we offer a flexible schedule, and some unclassified development tasks may be performed remotely. The percentage of remote work will vary based on client requirements and deliverables.

As a senior member of the team, you bring deep expertise in data engineering and will work closely with other infrastructure and network engineers, data scientists, and system engineers on the following key tasks:

Perform database builds, installs, configuration, administration, and troubleshooting of database systems (e.g., MariaDB/MySQL, PostgreSQL, Elasticsearch, Qdrant, Milvus)
Ensure data integrity by employing data engineering best practices
Maintain database and data pipeline documentation, data dictionaries, and system diagrams

You demonstrate clear devotion to data engineering best practices and meet the following qualifications (required):

Bachelor's degree and 12+ years of prior relevant experience, or Master's degree with 10+ years of prior relevant experience
Experience with SQL, NoSQL, and vector databases such as MSSQL, MySQL, PostgreSQL, Redis, FAISS, Milvus, Qdrant, etc.
Experience designing and maintaining ETL and ELT pipelines with technologies such as Spark, Airflow, Dagster, Prefect, Argo CD, Metaflow, Kubeflow, etc.
Experience with DevOps / MLOps, using CI/CD methodology with data pipelines, and cloud-native deployment paradigms
Must possess an active Secret clearance and the ability to obtain and maintain a TS/SCI with Polygraph
Experience with database design, implementation, maintenance, monitoring, performance tuning, and optimization
Expertise in data profiling techniques and understanding the content from both a data quality and business perspective
Experience with data quality and accuracy evaluation techniques
Experience with Agile practices
Development experience with Python
Experience with data model development, modification, migration, and maintenance
Strong verbal and written communication skills
Enthusiastic team player and a self-starter who can work independently

You will wow us even more if you have some of these skills:

An active TS/SCI clearance
Experience supporting data teams and data scientists
Experience on production/enterprise systems
Experience in air-gapped environments
Application development and deployment in an AWS environment
AWS certifications
Date Posted: 16 May 2024