Job description:
The ideal candidate will design, develop, and document Talend ETL processes, technical architecture, data pipelines, and performance scaling, using Talend to integrate data and ensure data quality in a big data environment.
Remote, 3+ month contract
What's needed for success:
- Strong data warehousing experience with ETL tools: Talend Data Integration / Talend Big Data platform, Informatica, Ab Initio
- Experience designing and delivering complex, large-volume data warehouse applications
- Senior-level Talend ETL development (4+ years of hands-on Talend experience)
- Teradata experience, including MultiLoad, FastLoad, TPump, and preferably TPT
- Experience designing and building data integration jobs, both Standard and Big Data/Hadoop jobs, in Talend
- Experience with AWS services such as EMR, Redshift, RDS, Lambda, and S3
- Experience with Talend DQ and Profiling
- Able to work in Agile sprint teams
- Experience setting up CI/CD pipelines using Bamboo or Jenkins
- Experience with Talend DI, Talend Big Data edition, AWS (S3, EMR, Redshift), and Spark
- Strong SQL programming experience
It's a plus if you have:
- Talend Certification
- AWS Certifications