Data Engineer

San Antonio, Texas

IAMUS
Apply for this Job
Description

We are seeking a talented Data Engineer to support the ingestion of mission-critical and mission-support data sets into a big data environment. The ideal candidate will have a background in supporting cyber and/or network-related missions within military spaces as a developer, analyst, or engineer. This role also includes responsibility for leading the department, requiring advanced expertise and leadership capabilities.

We are proud to be an equal opportunity employer and do not discriminate based on race, ethnicity, gender identity, sexual orientation, age, disability, religion, or any other characteristic that makes each of us unique. We are dedicated to promoting a professional culture where everyone can thrive, and we welcome anyone who is willing to contribute positively to the success of our clients and their missions.

Requirements

Essential Job Responsibilities

Work with big data systems and complex structured and unstructured data sets.
Support government data acquisition, analysis, and sharing efforts.
Focus on software infrastructure for big data scaled ingestion, including understanding and ingesting new data sources, modifying existing data sources, implementing testing, establishing monitoring, and troubleshooting.
Lead the department in achieving strategic goals, ensuring data integrity, and fostering a collaborative and innovative work environment.

Minimum Qualifications

Security Clearance: current active TS, SCI eligible.
At least 9 years of experience in software development/engineering, data engineering, database engineering, or a related field (an additional 4 years of relevant experience may substitute for a BS degree).
Experience with Unit and Integration testing.
Fluency with data extraction, custom translation development, and loading including data prep and labeling to enable data analytics.
Familiarity with various log formats such as JSON, XML, and others.
Experience with data flow, management, and storage solutions (e.g., Kafka, NiFi, AWS S3, and SQS).
Ability to decompose technical problems and troubleshoot both system and dataflow issues.
Must be certified DoD IAT Level II or higher.
Willingness to complete a programming challenge during the interview process.
Experience with Zoom/Teams/Meet-style team meetings.

Languages:

Java: Experience with Java, including unit and integration testing.
Python: Experience with Python is desired.
SQL: Familiarity with SQL schemas and statements.

Tools and Technologies:

Data Flow Solutions: Experience with Kafka, NiFi, AWS S3, and SQS.
Version Control and Build Tools: Proficiency with Maven and GitLab.
Data Formats: Familiarity with JSON, XML, SQL, and compressed file formats.
Configuration Files: Experience using YAML files for data model and schema configuration.
Apache NiFi: Significant experience with NiFi administration and building/troubleshooting data flows.
AWS S3: Bucket administration.
IDE: VSCode, IntelliJ/PyCharm, or another suitable IDE.

Technical Expertise:

ETL creation and processing expertise.
Experience with code debugging concepts.
Expertise in data modeling design, troubleshooting, and analysis from ingest to visualization.
Experience conducting data model reviews covering data meaning, needs, storage, creation, ingest, and processing.
Experience developing custom logic for data modification from various sources.
Database and Software Admin & Troubleshooting: Experience with databases, ODBC drivers, Linux OS (Bash), and basic server administration.
Expertise in data model analysis and issue troubleshooting.
Experience analyzing and solving complex technical problems.

Desired Skills:

Prior experience in cyber/network security operations.
Familiarity with Agile environments.
Good communication skills.
Experience developing documentation and training in areas of expertise.
Amazon S3, SQS/SNS admin experience.
Apache Airflow workloads via UI or CLI a plus.
Experience with Mage AI a plus
Kubernetes, Docker
Date Posted: 01 May 2025