Application Developer

Naperville, Illinois

SRK Systems Inc
SRK Systems, Inc. is an IT firm engaged in the business of providing data processing consulting services to major corporations, government agencies and other business concerns throughout the United States.

SRK Systems Inc. has an opening for the position of Application Developer.

Educational Requirement: Bachelor's degree in computer science, computer information systems, information technology, or a combination of education and experience equating to the U.S. equivalent of a Bachelor's degree in one of the aforementioned subjects.

Job Responsibilities:
Develop large-scale data structures and pipelines to organize, collect, and standardize data that generates insights and addresses reporting needs.
Implement data pipelines using frameworks developed in shell scripting and Python to fetch data from various sources (delimited flat files, fixed-length files, Oracle, MySQL, DB2, Teradata, etc.), transform it, and load it into Hive.
Develop, build, and maintain the Enterprise Data Platform (EDP) for multiple projects, making data from multiple external sources available in formats such as Parquet, Avro, ORC, and JSON according to business requirements.
Collaborate across teams, transform data, and integrate algorithms and models into automated processes.
Work with Hadoop architecture and HDFS commands; design and optimize queries against data in the HDFS environment.
Tune the performance of Hive queries by parallelizing them and using operations such as the CLUSTER BY and DISTRIBUTE BY clauses; tune the performance of Spark- and Python-based jobs by choosing the right file format for data transformations.
Validate data and conduct system performance testing on developed modules using Python and Pytest; fine-tune jobs and processes for higher performance and fix critical or complex jobs.
Write user-defined functions (UDFs) in Python and Java and publish them in a version control system such as Git so they can be reused to encrypt data before loading it into the EDP.
Work with Python and PySpark scripts to handle data extraction, transformation, and migration from on-premises systems to Google Cloud.
Set up SLOs and SLIs for various cloud services (BigQuery, Dataproc, GKE, Cloud Storage, etc.) with alerting and monitoring using Cloud Monitoring in GCP.

Travel to client locations throughout the USA is required. Please mail resumes to 1811 W. Diehl Rd, Suite 400, Naperville, IL 60563 or email to (link removed). No phone calls or walk-ins, please.
Date Posted: 14 June 2024