Data Engineer

Bentonville, Arkansas

Compunnel
Job Summary

The Senior Data Engineer will be responsible for designing, developing, and maintaining big data applications using the latest open-source technologies. The role involves creating scalable data pipelines, automating workflows, and ensuring data integrity across cloud platforms, particularly GCP. The candidate will work in an Agile environment, leading standups, mentoring junior engineers, and collaborating with stakeholders to deliver efficient data solutions.

Key Responsibilities

• Design and develop big data applications using open-source technologies.

• Work in an offshore model with a managed-outcome approach.

• Develop logical and physical data models for big data platforms.

• Automate workflows using Apache Airflow (see the pipeline sketch after this list).

• Create data pipelines with Apache Hive, Apache Spark, and Apache Kafka.

• Provide ongoing maintenance and enhancements to existing systems.

• Participate in rotational on-call support.

• Learn the business domain and technology infrastructure, and actively share knowledge with the team.

• Mentor junior engineers and lead daily standups and design reviews.

• Groom and prioritize the backlog using JIRA.

• Act as the point of contact for assigned business domains.

• Ensure adherence to engineering standards, coding practices, and project timelines.

• Optimize application performance and database efficiency.

• Manage defect resolution, root cause analysis, and quality improvements.

• Work closely with stakeholders to clarify requirements and provide technical guidance.

• Interface with customer architects to finalize design solutions.

• Manage complex user stories and support project execution.
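
For illustration only, the sketch below shows a minimal Apache Airflow DAG of the kind this role would build and automate. It is a hypothetical example, not part of the job requirements: the DAG name, schedule, owner, and shell commands are placeholders, and it assumes Airflow 2.4+ with the standard BashOperator.

```python
# Hypothetical workflow-automation sketch: a simple extract -> transform -> load DAG.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash import BashOperator

default_args = {
    "owner": "data-engineering",      # placeholder team name
    "retries": 2,
    "retry_delay": timedelta(minutes=5),
}

with DAG(
    dag_id="example_daily_ingest",    # hypothetical DAG name
    start_date=datetime(2025, 1, 1),
    schedule="@daily",                # assumes Airflow 2.4+; older versions use schedule_interval
    default_args=default_args,
    catchup=False,
) as dag:
    # Placeholder shell steps; a real pipeline would typically use Dataproc/Spark,
    # Hive, or BigQuery operators instead of echo commands.
    extract = BashOperator(task_id="extract", bash_command="echo 'extract source data'")
    transform = BashOperator(task_id="transform", bash_command="echo 'run Spark transform'")
    load = BashOperator(task_id="load", bash_command="echo 'load into BigQuery'")

    extract >> transform >> load
```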

Required Qualifications

• 10+ years of hands-on experience developing data warehouse solutions and data products.

• 6+ years of experience with distributed data processing platforms (Hadoop, Hive, Spark, Airflow).

• 5+ years of experience in data modeling and schema design for data lakes or RDBMS platforms.

• 4+ years of recent experience with Google Cloud Platform (GCP).

• Experience building data pipelines in GCP using Dataproc, GCS, and BigQuery (see the sketch after this list).

• Proficiency in programming languages: Python, Java, Scala.

• Experience with scripting languages: Perl, Shell.

• Hands-on experience with large-scale data processing (multi-TB/PB scale).

• Exposure to test-driven development and automated testing frameworks.

• Background in Agile/Scrum methodologies.

• Strong analytical and problem-solving skills.

• Excellent verbal and written communication skills.

• Bachelor's degree in Computer Science or equivalent experience.
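
As a rough illustration of the GCP pipeline work listed above, the sketch below loads Parquet files from GCS into BigQuery with the google-cloud-bigquery Python client. It is a hypothetical example: the project, bucket, and table names are placeholders, and it assumes application-default credentials are configured.

```python
# Hypothetical GCP pipeline step: append Parquet files from GCS into a BigQuery table.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # placeholder project; uses default credentials

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.PARQUET,
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
)

load_job = client.load_table_from_uri(
    "gs://example-bucket/events/dt=2025-04-13/*.parquet",  # placeholder GCS path
    "example-project.analytics.events",                    # placeholder destination table
    job_config=job_config,
)
load_job.result()  # block until the load job completes
print(f"Loaded {load_job.output_rows} rows")
```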

Preferred Qualifications

• Experience with Gitflow and Atlassian tools (Bitbucket, JIRA, Confluence).

• Experience with CI/CD tools such as Bamboo, Jenkins, or TFS.

Location: Bentonville, AR

Education: Bachelor's Degree
Date Posted: 13 April 2025