Apply for this Job
Tekfortune is a fast-growing consulting firm specializing in permanent, contract, and project-based staffing services for the world's leading organizations across a broad range of industries. In this quickly changing economic landscape, virtual recruiting and remote work are critical for the future of work. To support active project demands and close skills gaps, our staffing experts can help you find the job that best fits you.
Title: ETL IICS Developer
Location: McLean, VA (Onsite 5 days a week)
Duration: Long-Term Contract
Job Description:
Mandatory skills (years of experience required):
1) ETL : yrs exp
2) IICS : yrs exp
3) SQL : yrs exp
4) Python / PySpark : yrs
5) Shell scripting : yrs
6) AWS : yrs
Job Summary
We are seeking a skilled ETL Developer to design, develop, and maintain robust data pipelines and integration solutions.
The ideal candidate will have a strong background in ETL/ELT processes using Informatica Intelligent Cloud Services (IICS), Python, and PySpark, coupled with proficiency in shell scripting, AWS services, and CI/CD practices.
This role involves collaborating with cross-functional teams to ensure efficient data flow and integration across various platforms.
Key Responsibilities
ETL/ELT Development: Design and implement scalable ETL/ELT pipelines using IICS, Python, and PySpark to process and transform large datasets from diverse sources.
Data Integration: Integrate data from multiple systems, ensuring data quality, consistency, and reliability throughout the data lifecycle.
Shell Scripting: Develop and maintain shell scripts to automate routine data processing tasks and support ETL workflows.
AWS Services: Utilize AWS services such as S3, Glue, Lambda, Redshift, and EMR to build and manage cloud-based data solutions.
CI/CD Implementation: Implement and manage CI/CD pipelines using tools like Jenkins or GitHub Actions to automate testing, deployment, and monitoring of data pipelines.
Collaboration: Work closely with data architects, analysts, and other stakeholders to gather requirements and deliver data solutions that meet business needs.
Performance Optimization: Monitor and optimize the performance of data pipelines and ETL processes to ensure efficiency and scalability.
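To illustrate the extract-transform-load pattern the responsibilities above refer to, here is a minimal sketch in plain Python with stdlib only. This is purely illustrative: the role uses IICS and PySpark, and the sample data, function names, and in-memory "load" target are all hypothetical stand-ins.

```python
import csv
import io

# Hypothetical sample data standing in for a source system extract.
RAW_CSV = """id,amount,region
1,100.5,east
2,,west
3,42.0,east
"""

def extract(text):
    """Extract: parse rows from a CSV source into dictionaries."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Transform: apply a basic data-quality filter and cast types."""
    out = []
    for row in rows:
        if not row["amount"]:
            continue  # drop rows with missing amounts
        out.append({"id": int(row["id"]),
                    "amount": float(row["amount"]),
                    "region": row["region"]})
    return out

def load(rows):
    """Load: aggregate amounts per region into an in-memory target."""
    totals = {}
    for row in rows:
        totals[row["region"]] = totals.get(row["region"], 0.0) + row["amount"]
    return totals

totals = load(transform(extract(RAW_CSV)))
print(totals)  # → {'east': 142.5}
```

In a production pipeline the same three stages would read from source systems via IICS connectors or Spark readers, run transforms as PySpark DataFrame operations, and write to a warehouse target such as Redshift.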
Required Qualifications
Education: Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
Experience:
Proven experience in designing and developing ETL/ELT pipelines using IICS, Python, and PySpark.
Strong proficiency in shell scripting for automation of data processing tasks.
Hands-on experience with AWS services, including but not limited to S3, Glue, Lambda, Redshift, and EMR.
Familiarity with CI/CD tools such as Jenkins or GitHub Actions for automating data pipeline deployments.
Preferred Qualifications
Experience with additional AWS services like Athena, Step Functions, and CloudWatch.
Knowledge of containerization and orchestration tools such as Docker and Kubernetes.
Familiarity with data security and privacy best practices.
For more information and other available jobs, please contact our recruitment team at . To view all jobs available in the USA and Asia, please visit our website at .
Date Posted: 01 May 2025