Senior Data Engineer

Atlanta, Georgia

Apply for this Job
PAR Government is excited to welcome a Senior Data Engineer to our Intelligence and Readiness Operations team. As a Senior Data Engineer, you will support a large-scale intelligence processing system focused on digital document exploitation.

You will establish data engineering processes that support the understanding of information needs, maximizing the use and value of data and information assets for consumers. You will manage information consistently across the enterprise and align data management efforts and technologies with business needs. Key task areas on the program include forensic image processing, machine learning model production, knowledge graph construction and reasoning, agile development, system security, technology transition, and system operations.

Responsibilities and Duties:
  • Analyze, design, build, test, deploy, operate, and maintain solutions, capabilities, and services to meet data needs.
  • Plan and lead major technology assignments, evaluate performance results, and recommend major changes affecting short-term project growth and success.
  • Conduct requirements analysis.
  • Conduct requirements design.
  • Implement solutions.
  • Maintain databases and related solution components.
Required Skills
  • Must have an active or in-scope US Top Secret clearance with SCI eligibility.
  • Expertise in analyzing, designing, building, testing, deploying, operating, and maintaining solutions, capabilities, and services to meet data needs, including requirements analysis and design, and the implementation and maintenance of databases and related solution components.
  • Bachelor's degree or equivalent with a minimum of 10 years of experience as a Data Engineer.
  • Experience with defining and recording data requirements and delivering data requirements specifications.
  • Experience with developing and maintaining conceptual data models and delivering conceptual data model diagrams, logical data models, physical data models, and physical databases.
  • Experience with managing data model versions and integrating and delivering data model libraries.
  • Expertise in designing data integration services and delivering source-to-target maps, data extract-transform-load (ETL) design specifications, and data conversion designs.
  • Expertise in writing software code and scripts to distribute the processing of information extraction tasks that identify entities, events, and relationships from large corpora of structured and unstructured data and multimedia stored in a distributed file system or object store.
  • Experience with applying data cleansing, transformation, and augmentation methods to measure and improve data quality.
  • Experience building, testing, and delivering data integration services.
  • Experience with establishing Golden Records and delivering reliable reference and master data.
  • Experience defining, delivering, and maintaining hierarchies and affiliations that define the meaning of data within the context of its interrelationships with other data.
  • Experience with importing and exporting data between an external RDBMS and a Hadoop cluster, including the ability to import specific subsets, change the delimiter and file format of imported data during ingest, and alter the data access pattern or privileges.
  • Experience ingesting real-time and near-real-time (NRT) streaming data into the Hadoop Distributed File System (HDFS), including the ability to distribute to multiple data sources and convert data on ingest from one format to another.
  • Expertise in loading data into and out of the Hadoop Distributed File System (HDFS) using the HDFS command-line interface; converting sets of data values in a given format stored in HDFS into new data values and/or a new data format and writing them into HDFS or Hive/HCatalog.
  • Expertise in filtering, sorting, joining, aggregating, and transforming one or more data sets in a given format (e.g., Parquet, Avro, JSON, delimited text, and natural language text) stored in the Hadoop Distributed File System (HDFS).
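Several of the skills above come down to transforming data between formats on ingest: filtering rows, converting delimited text into structured records, and writing out a new format. As a minimal, engine-agnostic sketch (the field names and sample data are hypothetical; on this program the equivalent logic would run in a distributed engine against HDFS or an object store rather than in-memory), a delimited-text-to-JSON conversion with a quality filter might look like:

```python
import csv
import io
import json

# Hypothetical sample of delimited text, standing in for a file in HDFS.
raw = """event_id,entity,score
1,acme,0.9
2,acme,0.4
3,globex,0.7
"""

def csv_to_json_records(text, min_score):
    """Convert delimited text to JSON records, keeping rows at or above min_score.

    Illustrates the transform-on-ingest pattern: parse one format,
    apply a data-quality filter, and emit a new format.
    """
    rows = csv.DictReader(io.StringIO(text))
    kept = [r for r in rows if float(r["score"]) >= min_score]
    return [json.dumps(r, sort_keys=True) for r in kept]

records = csv_to_json_records(raw, 0.5)
print(records)
```

In practice the same parse-filter-serialize steps would be expressed as distributed transformations (e.g., Spark DataFrame operations reading and writing Parquet or Avro), but the shape of the work is the same.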
Date Posted: 22 April 2025