Data Engineer

Sunnyvale, California

Lorven Technologies
Hi,

Our client is looking for a Data Engineer in Sunnyvale, CA. Below are the detailed requirements.

Job Title: Data Engineer

Location: Sunnyvale, CA

Duration: Full-time

Note: Coding experience required.

Job Description:

Responsibilities:
  • Design, develop, and maintain scalable data pipelines and ETL processes using Snowflake and Python.
  • Build and optimize data architectures to support business intelligence, analytics, and machine learning initiatives.
  • Collaborate with data analysts, data scientists, and stakeholders to understand data requirements and ensure smooth data flows.
  • Manage and administer Snowflake data warehouses, including schema design, performance tuning, and data security.
  • Write efficient, reusable, and maintainable code for data processing and transformation tasks using Python.
  • Implement data quality checks and validation processes to maintain data integrity.
  • Automate data workflows and improve data reliability and performance.
  • Troubleshoot and resolve data-related issues in a timely manner.
  • Maintain and document data engineering solutions, best practices, and coding standards.
Requirements:
  • Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
  • Proven experience as a Data Engineer with hands-on expertise in Snowflake and Python.
  • Strong proficiency in SQL for querying and manipulating data within Snowflake.
  • Knowledge of Snowflake architecture, data sharing, cloning, and security features.
  • Experience in developing and managing ETL pipelines and workflows.
  • Familiarity with cloud platforms (AWS, Azure, or GCP) and data storage solutions.
  • Proficiency in data modeling, data warehousing concepts, and database optimization techniques.
  • Experience with version control systems (e.g., Git) and CI/CD pipelines.
  • Strong problem-solving and debugging skills with attention to detail.
Preferred Skills:
  • Experience with data orchestration tools like Airflow or Prefect.
  • Knowledge of other big data technologies such as Databricks, Spark, or Kafka.
  • Familiarity with REST APIs and data integration from external sources.
  • Exposure to machine learning pipelines and AI workflows is a plus.
Date Posted: 05 May 2025