Apply for this Job
Required Skills:
- Strong SQL and Python skills, 8+ years' experience.
- Experience with big data frameworks (e.g. Hadoop, Spark), 5+ years' experience.
- Experience building automated data pipelines, 5+ years' experience.
- Experience performing data analysis and data exploration, 5+ years' experience.
- Experience working in an agile delivery environment.
- Strong critical thinking, communication, and problem-solving skills.
- Experience with Google Cloud Platform.
- Experience with GCP services (Cloud Storage buckets, Cloud Functions, Dataproc, Dataflow, Pub/Sub).
- Experience working in a multi-developer environment using version control (e.g. Git).
- Experience orchestrating pipelines with tools such as Airflow or Azure Data Factory.
- Experience with real-time and streaming technologies (e.g. Azure Event Hubs, Azure Functions, Kafka, Spark Streaming).
- Experience building APIs.
- Exposure to/understanding of DevOps best practices and CI/CD (e.g. Jenkins).
- Exposure to/understanding of containerization (e.g. Docker, Kubernetes).
Date Posted: 26 March 2025