Dear Partner,

Good morning, and greetings from Nukasani Group Inc. We have the below urgent, long-term contract project immediately available for a GCP Data Engineer in Bentonville, AR (onsite), and we need submissions right away. Please review the role below; if you are available, could you please send me your updated Word resume along with the candidate submission details listed below, immediately. If you are not available, any referrals would be greatly appreciated. Interviews are in progress, so an urgent response is appreciated. Looking forward to your immediate response and to working with you.

Candidate Submission Format - needed from you
- Full Legal Name
- Personal Cell No (not a Google phone number)
- Email Id
- Skype Id
- Interview Availability
- Availability to start, if selected
- Current Location
- Open to Relocate
- Work Authorization
- Total Relevant Experience
- Education / Year of Graduation
- University Name, Location
- Last 4 Digits of SSN
- Country of Birth
- Contractor Type : mm/dd
- Home Zip Code

Assigned Job Details
Job Title: GCP Data Engineer
Location: Bentonville, AR (Onsite)
Rate: Best competitive rate

Company Overview
We are seeking an experienced GCP Data Engineer to join our team on a long-term contract assignment with a leading client in Bentonville, AR. This is a hands-on role for a data engineering expert who thrives in a fast-paced, collaborative environment and is passionate about leveraging cutting-edge technologies to build scalable data solutions.

Key Responsibilities
As a Senior Data Engineer, you will:
- Design and develop big data applications using modern, open-source tools and cloud-native technologies.
- Develop and maintain logical and physical data models for big data platforms.
- Build and maintain scalable, efficient, and reliable data pipelines using Apache Spark, Hive, Kafka, and Airflow.
- Implement solutions using GCP services, including Dataproc, BigQuery, and Google Cloud Storage (GCS).
- Automate workflows and orchestrate jobs using Apache Airflow.
- Provide ongoing support, maintenance, and enhancements to existing systems.
- Lead and participate in daily standups, design reviews, backlog grooming, and sprint planning using Agile methodologies (Scrum).
- Mentor junior team members and contribute to knowledge sharing across the team.
- Serve as a technical liaison for your assigned business domain, ensuring clear communication with stakeholders.

Required Skills and Experience

Core Requirements
- 12+ years of hands-on experience in data engineering and data warehouse development.
- 8+ years of experience with Scala and data engineering best practices.
- 6+ years of experience with Apache Spark.
- 5+ years of experience with data modeling and schema design for data lakes or relational databases.
- 4+ years of recent experience working with Google Cloud Platform (GCP), specifically Dataproc, BigQuery, and GCS (Google Cloud Storage).
- Strong experience with distributed processing tools such as Hadoop, Hive, Airflow, and similar technologies.
- Proficiency in programming languages such as Python, Java, and Scala.
- Scripting experience in Shell, Perl, or similar.
- Experience managing and processing large-scale datasets (multi-terabyte/petabyte scale).
- Familiarity with GitFlow and CI/CD tools (e.g., Jenkins, Bamboo, TFS).
- Solid understanding of Agile development processes, particularly Scrum.

Preferred Experience
- Experience working in a global delivery/offshore model.
- Exposure to test-driven development (TDD) and automated testing frameworks.
- Familiarity with Atlassian tools including Bitbucket, JIRA, and Confluence.
- Strong problem-solving abilities and a self-starter mindset.
- Excellent verbal and written communication skills.

Educational Requirements
Bachelor's degree in Computer Science, Engineering, or a related field, or equivalent industry experience.
Date Posted: 23 April 2025