Apply for this Job
Title: Data Engineer
Duration: 11 months
Location: Phoenix, AZ 85027 - Hybrid
Manager Notes:
- Hybrid role; this person will need to start on site, 2-3 days/week in the Phoenix, AZ location
- Strong preference for local Arizona candidates
- Candidates need specific engineering experience within GCP (Google Cloud) and must do design work
- Python, PySpark, and BigQuery: this person must have experience with at least 2 of those 3
- The team will work within Lumi (Client internal GCP platform) and utilize the suite of Google services
- Looking for 5-7+ years of experience
Job Description:
As a Data Engineer, you will be responsible for designing, developing, and maintaining robust and scalable frameworks, services, applications, and pipelines for processing huge volumes of data. You will work closely with cross-functional teams to deliver high-quality software solutions that meet our organizational needs.
Key Responsibilities:
- Design and develop solutions using Big Data tools and technologies such as MapReduce, Hive, and Spark.
- Extensive hands-on experience in object-oriented programming using Python, PySpark APIs, etc.
- Experience in building data pipelines for huge volumes of data.
- Experience in designing, implementing, and managing various ETL job execution flows.
- Experience in implementing and maintaining Data Ingestion process.
- Hands-on experience in writing basic to advanced optimized queries using HQL, SQL, and Spark.
- Hands-on experience in designing, implementing, and maintaining data transformation jobs using the most efficient tools/technologies.
- Ensure the performance, quality, and responsiveness of solutions.
- Participate in code reviews to maintain code quality.
- Should be able to write shell scripts.
- Utilize Git for source version control.
- Set up and maintain CI/CD pipelines.
- Troubleshoot, debug, and upgrade existing applications and ETL job chains.
Required Skills and Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- Proven experience as a Data Engineer or in a similar role.
- Strong proficiency in object-oriented programming using Python.
- Experience with ETL job design principles.
- Solid understanding of HQL, SQL, and data modelling.
- Knowledge of Unix/Linux and shell scripting principles.
- Familiarity with Git and version control systems.
- Experience with Jenkins and CI/CD pipelines.
- Knowledge of software development best practices and design patterns.
- Excellent problem-solving skills and attention to detail.
- Strong communication and collaboration skills.
- Experience with cloud platforms such as Google Cloud.
Date Posted: 20 March 2025