Data Engineer with GCP

Atlanta, Georgia

A2C
  • Job Summary: As a Data Engineer, you will have the opportunity to design and execute vital projects such as re-platforming our data services in the cloud and on-prem and delivering real-time streaming capabilities to our business applications.
  • This position will bring a clear point of view on data processing optimization, data modeling, data pipeline architecture, and data SLA management.
  • The Data Engineer holds accountability for the quality, usability, and performance of the solutions.
  • Our mission is to design and implement a data and analytics platform and infrastructure that enables a future-state analytics lifecycle, data monetization opportunities, data acquisition, analysis and feature engineering, model training, impact analysis, reporting, predictive and quantitative analysis, and monitoring.
  • What You Get to Do: An ideal candidate is intellectually curious, has a solution-oriented attitude, and enjoys learning new tools and techniques.
  • Leads design sessions and code reviews to elevate the quality of engineering across the organization.
  • Design and develop the data foundation on a cloud data platform using GCP tools and techniques, e.g. Pub/Sub, BigQuery, Cloud SQL, Bigtable, BigLake, Dataform, Dataflow, Datastream, Google Cloud Storage, Cloud Composer/DAGs, Cloud Run, Cloud REST APIs, Azure DevOps (ADO) Git repos, CI/CD pipelines, Secret Manager, Cloud IAM, and Terraform/YAML.
  • Build ETL pipelines and scalable solutions using Python (see the illustrative sketch after this list).
  • Multi-level Data Curation and modeling.
  • Data design and architecture.
  • Hands-on experience creating and maintaining complete CI/CD pipelines using Azure DevOps and Terraform/Terragrunt.
  • Increase the efficiency and speed of complicated data processing systems.
  • Collaborate with our Architecture group to recommend and ensure the optimal data architecture.
  • Analyze data gathered during tests to identify strengths and weaknesses of ML models.
  • Collaborate across all functional areas to translate complex business problems into optimal data modeling and analytical solutions that drive business value.
  • Lead the improvements and advancement of reporting and data capabilities across the company, including analytics skills, data literacy, visualization, and storytelling.
  • Develop a certified vs. self-service analytics framework for the organization.
  • Highly skilled in RDBMS (Oracle, SQL Server), NoSQL databases, and messaging (publish/subscribe) systems.
  • Extensive Python knowledge and coding skills, including an understanding of data modeling and data engineering.
  • What You Bring to the Table: Bachelor's degree in Computer Science, Engineering, Mathematics, Sciences, or a related field of study from an accredited college or university; will consider a combination of experience and/or education.
  • Ideally 3+ years of experience developing data and analytics solutions and approximately 4+ years of data modeling and architecture experience.
  • Expertise in programming languages including Python and SQL.
  • Familiarity with software development methodologies such as Agile or Scrum.
  • Critical thinking.
  • Leveraging cloud-native services for data processing and storage.
  • Storage - BigQuery, GCS, Cloud SQL, Bigtable, BigLake
  • Event processing - Pub/Sub, Eventarc
  • Data pipeline and analytics - Dataflow, Dataform, Cloud Run, Cloud Run functions, Datastream, Cloud Scheduler, Workflows, Composer, Dataplex, Azure DevOps (ADO) Git repos, CI/CD pipelines, Terraform/YAML
  • Security - Secret Manager, Cloud IAM
  • Others - Artifact Registry, Cloud Logging, Cloud Monitoring
  • Work with distributed data processing frameworks like Spark (see the PySpark sketch after this list).
  • Strong knowledge of database systems and data modeling techniques.
  • Ability to adapt to evolving technologies and business requirements.
  • Ability to explain technical concepts to nontechnical business leaders.
  • Monitor system performance and troubleshoot issues.
  • Ensure data security.
  • Proficiency with the relevant cloud tools and technologies.
  • Got Extra to Bring? GCP Professional Data Engineer Certification.
  • Ideally 2+ years in the Energy Sector.
  • Document all steps in the development process.
  • Manage the data collection process, providing interpretation and recommendations to management.
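
The responsibilities above name Python-based ETL pipelines alongside Pub/Sub and BigQuery. The following is a minimal sketch of that hand-off, not the employer's actual pipeline: it pulls a small batch of messages from a Pub/Sub subscription and appends them to a BigQuery table. The project, subscription, table, and field names are hypothetical, and the snippet assumes the google-cloud-pubsub and google-cloud-bigquery client libraries are installed and authenticated.

    import json

    from google.cloud import bigquery, pubsub_v1

    PROJECT = "example-project"                 # hypothetical project ID
    SUBSCRIPTION = "orders-sub"                 # hypothetical Pub/Sub subscription
    TABLE = "example-project.staging.orders"    # hypothetical BigQuery table


    def pull_and_load(max_messages: int = 100) -> int:
        """Pull up to max_messages JSON events and append them as BigQuery rows."""
        subscriber = pubsub_v1.SubscriberClient()
        sub_path = subscriber.subscription_path(PROJECT, SUBSCRIPTION)

        response = subscriber.pull(
            request={"subscription": sub_path, "max_messages": max_messages}
        )
        if not response.received_messages:
            return 0

        rows = [json.loads(m.message.data) for m in response.received_messages]

        bq = bigquery.Client(project=PROJECT)
        errors = bq.insert_rows_json(TABLE, rows)  # streaming insert
        if errors:
            raise RuntimeError(f"BigQuery insert errors: {errors}")

        # Acknowledge only after a successful load so failed batches are redelivered.
        subscriber.acknowledge(
            request={
                "subscription": sub_path,
                "ack_ids": [m.ack_id for m in response.received_messages],
            }
        )
        return len(rows)


    if __name__ == "__main__":
        print(f"loaded {pull_and_load()} rows")

In practice a batch pull like this would typically be scheduled (for example via Cloud Composer or Cloud Scheduler) or replaced by a streaming Dataflow job; the sketch only illustrates the service hand-off.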
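
The qualifications also call for distributed processing with Spark. Below is a small PySpark sketch under the same caveat: the bucket paths, column names, and energy-usage schema are invented for illustration, and reading gs:// paths assumes a cluster with the GCS connector available (as on Dataproc).

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("daily-usage-rollup").getOrCreate()

    # Read raw meter readings (hypothetical layout) from Cloud Storage.
    raw = spark.read.csv(
        "gs://example-bucket/raw/usage/*.csv", header=True, inferSchema=True
    )

    # Aggregate to one row per meter per day.
    daily = (
        raw.withColumn("usage_date", F.to_date("event_ts"))
           .groupBy("usage_date", "meter_id")
           .agg(F.sum("kwh").alias("total_kwh"))
    )

    # Write curated output partitioned by day for downstream analytics.
    daily.write.mode("overwrite").partitionBy("usage_date").parquet(
        "gs://example-bucket/curated/daily_usage/"
    )

    spark.stop()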
Date Posted: 29 April 2025