DBT/Snowflake

Pleasanton, California

SysMind Tech
Will be the onshore lead for two teams: Supply Chain and Merchandising

12-15 years of experience

Architect or senior engineer, but they will be doing development work

Hands-on experience; not looking for a manager

Cloud data warehousing

Snowflake

Azure

AWS/GCP are secondary; Snowflake is primary

Python, SQL

DBT Experience

Previous retail experience (prior Albertsons experience would be great)

Communication is key; the role involves speaking to stakeholders

Will need to present to leadership

Python/SQL technical round

Candidates will be asked to explain how their code works and how they would approach the problem

Snowflake

Strong communication skills: able to explain implementations to other teams

Everyone's resume says every cloud, so screen for genuine hands-on Snowflake experience

Relocation

Interview Process:

45 min with Manager

1-hour technical round with a senior data engineer

Leadership round with Manager and VP

Work is functionally split within the retail analytics side

This person would work in a hub-and-spoke model covering supply chain and merchandising

Two domains, supply chain and store sites; lead engineers review work for others on the team

GENERAL PURPOSE:

The Data Engineer III plays a critical role in engineering data solutions that support Ross reporting and analytics needs. As a key member of the Data Engineering team, this role will work with diverse data technologies such as StreamSets, dbt, DataOps tooling, and others to build insightful, scalable, and robust data pipelines that feed our various analytics platforms.
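
As an illustrative sketch only (not part of the posting): a minimal dbt staging model of the kind this role would build against Snowflake. All file, model, source, and column names below are hypothetical, and the source() reference assumes a matching sources.yml entry exists in the project.

    -- models/staging/stg_supply_chain_orders.sql (hypothetical names throughout)
    -- Materialized as a view in Snowflake; dbt compiles the Jinja references at build time.
    {{ config(materialized='view') }}

    select
        order_id,                                  -- natural key from the source system
        store_id,
        cast(order_date as date) as order_date,    -- normalize to DATE for downstream models
        quantity
    from {{ source('supply_chain', 'raw_orders') }}
    where order_date is not null                   -- filter malformed rows at the staging layer

A model like this would be built and tested with standard dbt commands, e.g. dbt run --select stg_supply_chain_orders.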

ESSENTIAL FUNCTIONS:

  • Design and model data engineering pipelines that support Ross reporting and analytics needs
  • Engineer efficient, adaptable, and scalable data pipelines for moving data from different sources into our cloud lakehouse
  • Understand and analyze business requirements and translate them into well-architected solutions that showcase the modern BI & analytics platform
  • Take part in data modernization projects, providing direction on matters of overall design and technical direction, and act as the primary driver in establishing guidelines and approaches
  • Develop and deploy performance optimization methodologies
  • Drive timely and proactive issue identification, escalation, and resolution
  • Collaborate effectively with Data Technology and Business Information teams to design and build optimized data flows from source to data visualization
QUALIFICATIONS AND SPECIAL SKILLS REQUIRED:

  • 12+ years of in-depth data engineering experience, including execution of data pipelines, DataOps, scripting, and SQL queries
  • 5+ years of proven data architecture experience - must have demonstrable data architecture experience, accountability for data standards, and experience designing data models for data warehousing and modern analytics use cases (e.g., from operational data stores to semantic models)
  • At least 3 years of experience with modern data architectures that support advanced analytics, including Snowflake, Azure, etc.; experience with Snowflake and other cloud data warehousing / data lake platforms preferred
  • Expert in engineering data pipelines using various data technologies - ETL/ELT and big data technologies (Hive, Spark) on large-scale data sets - demonstrated through years of experience
  • 5+ years of hands-on data warehouse design, development, and data modeling following best practices for modern data architectures
  • Highly proficient in at least one of these programming languages: Java, Python
  • Experience with modern data modeling tools and data preparation tools
  • Experience adding data lineage and technical glossary entries from data pipelines to data catalog tools
  • Highly proficient in data analysis - analyzing SQL, Python scripts, and ETL/ELT transformation scripts
  • Highly skilled in data orchestration, with experience in tools like Control-M and Apache Airflow; hands-on DevOps/DataOps experience required
  • Knowledge of or working experience with reporting tools such as MicroStrategy or Power BI would be a plus
  • Self-driven individual with the ability to work independently or as part of a project team
  • Experience working in an Agile environment preferred; familiarity with the retail domain preferred
  • Experience with StreamSets and dbt preferred
  • Strong communication skills are required, with the ability to give and receive information, explain complex information in simple terms, and maintain a strong customer service approach with all users
  • Bachelor's degree in Computer Science, Information Systems, Engineering, Business Analytics, or Business Management required
Date Posted: 03 April 2025