Data Integration Specialist with Security Clearance

Gwynn Oak, Maryland

KOAR Cloud Solutions
Data Integration Specialist Key Required Skills:
Worked with advanced technical principles, theories, and concepts; well versed in technical products; able to work on complex technical problems and provide innovative solutions; and able to work with highly experienced and technical resources.

Position Description:

• Demonstrates ability to communicate technical concepts to non-technical audiences both in written and verbal form.

• Assembles large, complex data sets to meet business requirements.

• Works in tandem with Data Architects to align with the data architecture requirements they provide.

• Creates and maintains optimal data pipeline architecture.

• Identifies, designs, and implements internal process improvements, such as automating manual processes and optimizing data delivery.

• Implements big data and NoSQL solutions by developing scalable data processing platforms that deliver high-value insights to the organization.

• Supports development of Data Dictionaries and Data Taxonomy for product solutions.

• Demonstrates strong understanding of coding and programming concepts used to build data pipelines (e.g., data transformation, data quality, data integration, etc.).

• Builds data models with Data Architect and develops data pipelines to store data in defined data models and structures.

• Demonstrates strong understanding of data integration techniques and tools (e.g., Extract, Transform, Load (ETL) / Extract, Load, Transform (ELT) tools) and database architecture.

• Demonstrates strong understanding of database storage concepts (data lake, relational databases, NoSQL, Graph, data warehousing).

• Identifies ways to improve data reliability, efficiency, and quality of data management.

• Conducts ad-hoc data retrieval for business reports and dashboards.

• Assesses the integrity of data from multiple sources.

• Manages database configuration including installing and upgrading software and maintaining relevant documentation.

• Monitors database activity and resource usage.

• Performs peer reviews of other Data Engineers' work.

• Assists with development, building, monitoring, maintaining, performance tuning, troubleshooting, and capacity estimation.

• Sources data from the operational systems.

• Prepares the database-loadable file(s) for the Data Warehouse.

• Manages deployment of the data acquisition tool(s).

• Monitors and maintains Data Warehouse/ELT.

• Monitors, reports, and resolves data quality issues.

• Works closely with all involved parties to ensure system stability and longevity.

• Supports and maintains Business Intelligence functionality.

• Evaluates, understands, and implements patches to the Data Warehouse environment.

• Applies data loading best practices and designs multidimensional schemas.

• Attends all customer technical discussion, design, and development meetings and provides technical input to further enhance code quality and processes.

• Provides guidance and support to junior and mid-level developers.

• Impacts functional strategy by developing new solutions, processes, standards, or operational plans that position Leidos competitively in the marketplace.

• All other duties as assigned or directed.
Skills Requirements: The selected candidate must reside within two (2) hours of SSA Headquarters in Woodlawn, MD.
FOUNDATION FOR SUCCESS (Basic Qualifications)
This experience is the foundation a candidate needs to be successful in this position:

• Bachelor's degree in computer science, IT, or a related field, or equivalent experience

• Must be able to obtain and maintain a Public Trust clearance (contract requirement).

• 7 years of experience with Data Engineering, working with large-scale data processing and ETL pipelines.

• 5 years of hands-on experience with data modeling, architecture, and management.

• 5 years of experience with Relational Database Systems, Data Design, RDBMS Concepts, and ETL.

• 5 years of experience working with data in cloud environments such as AWS (preferred), Azure, or GCP.

• 3 years of experience in T-SQL, SQL, and ELT/ETL performance tuning.
FACTORS TO HELP YOU SHINE (Required Skills)
These skills will help you succeed in this position:

• Programming experience in Snowflake, Hadoop, or other Data Warehouse technologies; Snowflake preferred.

• Experience in Microsoft SQL Server 2008 R2 or newer.

• Experience with SSIS or equivalent ETL tool.

• Experience in SQL/Stored Procedure development.
HOW TO STAND OUT FROM THE CROWD (Desired Skills)
Showcase your knowledge of modern development through the following experience or skills:

• Understanding of semi-structured / unstructured data using JSON, AVRO, Parquet, CSV, etc.

• Experience with complex multi-server environments and high availability environments.

• Experience using Azure Data Factory, Fivetran, Spark, or other similar data integration tools.

• Experience with system monitoring, log management, and error notification.

• Fundamental network and IT infrastructure knowledge.

• Highly motivated; able to work independently, multi-task, respond to changing priorities, and take initiative to own specific tasks.

• Strong problem determination, troubleshooting, and resolution skills.

• Excellent written and oral communication skills and customer service skills.

• Ability to work analytically to solve both tactical and strategic problems.
Education:
• Bachelor's degree in computer science, IT, or a related field, and 7+ years of experience.

• Must be able to obtain and maintain a Public Trust clearance (contract requirement).
Date Posted: 25 March 2024