We are seeking a Senior Database Developer/Engineer with a minimum of 10 years of experience in designing, developing, and optimizing enterprise data solutions. The ideal candidate will have strong expertise in big data platforms (Snowflake, Databricks, BigQuery, Synapse) and a deep understanding of data modeling concepts, including star schema, snowflake model, and denormalized structures. This role requires proficiency in conceptual, logical, and physical data modeling, as well as expertise in data governance, data catalog tools, and security implementations (such as row- and column-level security). Hands-on experience with GCP, and with BigQuery in particular, is essential for this role.
PRIMARY RESPONSIBILITIES
Lead the design, development, and optimization of data models on GCP.
Drive the team's adoption of best practices in data management and coding standards.
Act as a subject matter expert in BigQuery and SQL scripting.
Provide technical guidance and support to the team, ensuring project timelines and deliverables are met.
Collaborate with cross-functional teams to integrate data solutions into applications.
Identify skill gaps within the team and provide training and mentorship.
Implement star schema, snowflake model, and denormalized structures based on business requirements.
Develop and maintain conceptual, logical, and physical data models, ensuring scalability, performance, and maintainability.
Define and enforce data governance best practices, including metadata management, data lineage, and data quality controls.
Work with data catalog tools to improve data discoverability, classification, and accessibility.
Implement row- and column-level security to ensure compliance with regulatory and business requirements.
Collaborate with data engineers, analysts, and business stakeholders to understand and optimize data storage and retrieval strategies.
Ensure best practices for query performance tuning, partitioning, clustering, and cost optimization in BigQuery (an illustrative sketch follows this list).
Evaluate and recommend new technologies, frameworks, and best practices to enhance data architecture and governance.
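For illustration only, a minimal sketch of the partitioning and clustering work referenced above, written in BigQuery standard SQL; the dataset, table, and column names are hypothetical and not part of any existing system:

-- Hypothetical fact table partitioned by order date and clustered on common filter columns
CREATE TABLE mydataset.fact_orders (
  order_id STRING,
  customer_id STRING,
  region STRING,
  order_ts TIMESTAMP,
  amount NUMERIC
)
PARTITION BY DATE(order_ts)      -- queries that filter on order date scan only matching partitions
CLUSTER BY customer_id, region   -- co-locates rows so selective filters read fewer bytes
OPTIONS (partition_expiration_days = 730);

Filtering on the partitioning column prunes partitions, which is the main lever for both query performance and on-demand cost in BigQuery.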
We are a company committed to creating inclusive environments where people can bring their full, authentic selves to work every day. We are an equal opportunity employer that believes everyone matters. Qualified candidates will receive consideration for employment opportunities without regard to race, religion, sex, age, marital status, national origin, sexual orientation, citizenship status, disability, or any other status or characteristic protected by applicable laws, regulations, and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application or recruiting process, please submit a request via the Human Resources Request Form. The EEOC "Know Your Rights" Poster is available here.
To learn more about how we collect, keep, and process your private information, please review Insight Global's Workforce Privacy Policy.
REQUIRED KNOWLEDGE/SKILLS/ABILITIES
10+ years of experience as a Database Developer, with strong expertise in BigQuery and cloud-based data warehousing.
Deep understanding of data modeling techniques, including dimensional modeling (star schema, snowflake model) and denormalized structures.
Strong knowledge of conceptual, logical, and physical data models and experience translating business requirements into scalable data structures.
Hands-on experience with data governance frameworks, metadata management, and data catalog tools (e.g., Google Data Catalog, Collibra, Alation).
Experience implementing data security policies, including row- and column-level security (see the sketch after this list).
Strong proficiency in SQL optimization and query performance tuning in BigQuery.
Experience with ETL/ELT pipelines and data ingestion frameworks.
Familiarity with data orchestration tools like Apache Airflow, Google Cloud Composer, or similar.
Strong problem-solving skills and ability to work independently in a fast-paced, evolving environment.
Excellent communication and collaboration skills to work effectively with cross-functional teams.
Experience with the broader Google Cloud Platform (GCP) data ecosystem (Cloud Storage, Dataflow, Pub/Sub, Looker) is a plus.
Knowledge of big data processing techniques and distributed computing frameworks.
Familiarity with machine learning and AI-driven analytics in Google Cloud.
Experience in cost management and optimization strategies for cloud-based data storage and processing.
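As a hedged illustration of the row-level security work referenced above, a minimal BigQuery sketch; the policy, table, group, and filter column are hypothetical:

-- Hypothetical policy: members of the US sales group can only read US rows
CREATE ROW ACCESS POLICY us_rows_only
ON mydataset.sales
GRANT TO ("group:us-sales@example.com")
FILTER USING (region = "US");

Column-level security in BigQuery is typically applied through policy tags on column schemas rather than SQL DDL.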
Benefit packages for this role will start on the 31st day of employment and include medical, dental, and vision insurance, as well as HSA, FSA, and DCFSA account options, and 401(k) retirement account access with employer matching. Employees in this role are also entitled to paid sick leave and/or other paid time off as provided by applicable law.
Date Posted: 18 May 2025