Senior Data Platform Engineer

Boston, Massachusetts

Global Partners
The Senior Data Platform Engineer is a crucial member of our Data Team, responsible for building and managing advanced data architectures that drive our data and analytics initiatives throughout the organization. You will spearhead the development of cutting-edge big data platforms that are essential to the Global Partners data ecosystem, supporting a wide array of data-driven innovations.

Your expertise with platforms such as AWS, Snowflake, Dagster, and dbt, along with your proficiency in deployment tools like Kubernetes, Docker, and Terraform, will empower you to lead our data excellence efforts. Your experience ranges from best practices in data storage to the continual assessment and integration of new technologies, ensuring Global Partners remains at the forefront of innovation. As a problem solver and automation enthusiast, you will design robust solutions for efficient data engineering and platform management.

You are not just an engineer; you have a passion for the artistry of orchestrating data. By collaborating with various teams, providing strategic direction, and promoting best practices, you will help shape the future of our data analytics landscape. If you are excited about leading technological evolution and merging strategy with data, we invite you to join us. Global Partners fosters a collaborative environment and invests in cultivating a culture of data-driven excellence.

For more than 90 years, Global Partners LP has been essential in providing the energy that fuels community growth and movement. From Alltown Fresh with its culinary innovations to our extensive network of liquid energy terminals across the eastern seaboard and beyond, we are dedicated to delivering value to our guests and customers. As we embrace the future, we are investing in energy transition initiatives like GlobalGLO and supporting the communities where we operate through charitable efforts.

We are eager to explore the next 90 years at Global Partners, driven by innovative ideas for our guests and customers. We seek passionate individuals who are motivated by progress and ready to take their careers to the next level.

The Qualities You Bring
  • Strong written and verbal communication skills.
  • A self-motivated approach and a proactive mindset.
  • A collaborative spirit and a positive attitude.
  • Effective time management abilities.
Key Responsibilities
  • Design and implement scalable, cloud-native data platforms, leveraging technologies such as AWS, GCP, or Azure, along with Python, Docker, and Kubernetes.
  • Automate deployment (CI/CD) pipelines for data infrastructure and applications, utilizing tools like Jenkins, GitLab CI, or GitHub Actions to facilitate swift and reliable deployments.
  • Apply Infrastructure as Code (IaC) practices with tools like Terraform or CloudFormation to manage and version control cloud resources efficiently.
  • Develop and sustain effective data orchestration workflows using modern tools such as Apache Airflow, Dagster, or Prefect, ensuring smooth data processing and transformation (see the illustrative sketch after this list).
  • Create automated solutions and self-service platforms to streamline the setup, configuration, and monitoring of data environments for developers.
  • Optimize data storage and processing systems, including data lakes and data warehouses (e.g., Snowflake, BigQuery, Redshift), with a focus on cost-effectiveness and high performance.
  • Implement observability and monitoring solutions for data pipelines and infrastructure with tools like Prometheus, Grafana, or DataDog to enhance system reliability.
  • Lead the promotion of DataOps methodologies, encouraging collaboration among data engineering, data science, and operations teams to improve the entire data lifecycle.
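
For a concrete sense of the orchestration work described above, a minimal Dagster sketch appears below. The asset names (raw_orders, cleaned_orders) and their contents are purely illustrative assumptions, not part of this role's actual pipelines; the sketch simply assumes Dagster's software-defined assets API.

```python
from dagster import Definitions, asset


@asset
def raw_orders():
    # Illustrative source extract; a real asset would read from S3, an API, or a warehouse stage.
    return [{"order_id": 1, "amount": 42.0}, {"order_id": 2, "amount": -3.0}]


@asset
def cleaned_orders(raw_orders):
    # Downstream asset: Dagster infers the dependency on raw_orders from the parameter name.
    return [order for order in raw_orders if order["amount"] > 0]


# Registering the assets lets Dagster schedule, materialize, and monitor them.
defs = Definitions(assets=[raw_orders, cleaned_orders])
```

In practice, similar asset graphs would feed dbt transformations and warehouse tables such as Snowflake, but the specifics here are hypothetical.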
What We Offer
  • Competitive Compensation: We're committed to offering competitive salaries and growth opportunities, supported by an excellent Talent Development Team that fosters professional development.
  • Comprehensive Health Benefits: Enjoy medical, dental, vision, and life insurance, along with additional wellness resources.
  • Retirement Savings: We provide a 401k plan with matching contributions.
  • Professional Growth Support: Benefit from tuition reimbursement after six months of service.
  • Community Engagement: We believe in giving back, offering paid volunteer time off for you to contribute to causes that matter.
The Interview Process
  • If you're interested, please submit your application.
  • A member of our talent acquisition team will review your resume in collaboration with the hiring manager. If your background aligns with our needs, a recruiter will reach out to you.
  • We'll conduct both in-person and virtual interviews.
Qualifications
  • A Bachelor's or Master's degree in Computer Science, Engineering, Mathematics, or a related field (or equivalent experience), plus at least 5-7 years of experience in Data Engineering, DataOps, MLOps, or Software Engineering.
  • Strong expertise in crafting and deploying scalable, cloud-native (containerized) data platforms utilizing Infrastructure as Code (e.g., Terraform, Docker, Kubernetes).
  • Advanced programming skills in Python suitable for data-intensive applications. Proficient in SQL and experienced with cloud data warehouses (e.g., Snowflake, BigQuery).
  • Demonstrated experience in establishing CI/CD pipelines for data infrastructure and applications, using tools like Jenkins, GitLab CI, or GitHub Actions.
  • In-depth understanding of big data technologies (e.g., Apache Spark, Kafka) and data orchestration tools (e.g., Apache Airflow, Dagster), alongside knowledge in data transformation frameworks such as dbt and ETL/ELT processes in cloud settings.
  • Solid foundation in data security, governance, and metadata management. Experience with IAM/RBAC policies, encryption, and data access controls in cloud environments is essential.
  • Ability to implement monitoring, logging, and alerting solutions for data infrastructure (e.g., Prometheus, Grafana, ELK stack); an illustrative sketch follows this list.
  • Skill in creating automated tools and self-service platforms that enable data scientists and analysts to manage data environments efficiently.
  • Experience in optimizing data storage and processing systems for responsiveness and cost-efficiency. Familiarity with MLOps and integrating ML models into production is a plus.
  • Strong team player with excellent communication skills, ability to collaborate with cross-functional teams, and willingness to mentor others.
  • Proficient in modern Agile development methodologies, with strong problem-solving skills and a metrics-driven approach.
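
As a small illustration of the monitoring and alerting expectation above, the sketch below instruments a hypothetical batch job with the prometheus_client library. The metric names, the pipeline logic, and the Pushgateway address (pushgateway.internal:9091) are assumptions made for the example only.

```python
from prometheus_client import CollectorRegistry, Counter, Gauge, push_to_gateway

registry = CollectorRegistry()

# Counter for total rows handled by the job; Gauge for the time of the last successful run.
rows_processed = Counter(
    "pipeline_rows_processed_total",
    "Rows processed by the nightly load",
    registry=registry,
)
last_success = Gauge(
    "pipeline_last_success_unixtime",
    "Unix timestamp of the last successful nightly load",
    registry=registry,
)


def run_nightly_load(rows):
    # Stand-in for the real extract/transform step.
    for _ in rows:
        rows_processed.inc()
    last_success.set_to_current_time()
    # Batch jobs push metrics to a Pushgateway, which Prometheus then scrapes.
    push_to_gateway("pushgateway.internal:9091", job="nightly_load", registry=registry)
```

Grafana dashboards and alert rules would then be built on top of these metrics; those configurations are outside the scope of this sketch.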
We value passion and potential. If you feel excited about a role and believe you can make a significant contribution, we encourage you to apply, even if you don't meet every qualification. We celebrate diverse perspectives, experiences, and backgrounds at Global Partners LP, an equal opportunity employer fostering a culture of inclusion where every idea counts. We respect applicants' diversity and do not discriminate based on race, color, religion, sex, age, national origin, sexual orientation, gender identity, disability, or any other legally protected status. If you require an accommodation to apply due to a disability, please reach out to our recruiting department.

Disclaimer: At Global Partners, we do not use lie detector tests for any employment decisions. In Massachusetts, it is illegal to require or administer a lie detector test as a condition for employment. Any violations of this law could result in criminal penalties and civil liability.

Date Posted: 05 April 2025
Apply for this Job