Full-Stack Engineer

Dearborn, Michigan

Dechen Consulting
About Dechen Consulting Group (DCG)

Dechen Consulting Group (DCG) is a rapidly expanding, innovative IT Professional Services and Management Consulting company with a track record of more than twenty-five years in delivering skilled professionals to our clients across diverse sectors.

Opportunity Overview

We are currently seeking a Full-Stack Engineer for a W2 contract opportunity in Dearborn, MI. The role has the potential to extend over multiple years, with the chance to transition to a direct-hire position with our client. We provide healthcare, vacation, relocation assistance, and visa sponsorship/transfer. This is a W2 position, not C2C. THIRD PARTIES NEED NOT APPLY. This role offers excellent prospects for career progression.

Position Description

Within the Global Data Insight and Analytics (GDIA) Data Platform, an exciting opportunity awaits you on a team responsible for supporting some of the client's most visible initiatives. The GDIA Data Platform is the business owner of the enterprise data lake for all global embedded modem, Smart Mobility experiment, Manufacturing, and other enterprise data. The engineer will ensure that the platform delivers capabilities for ingesting, classifying, and cataloguing data, and for maintaining data quality, governance, and availability in the data lake. Join us and challenge your software engineering expertise and analytical skills to help create a smart future.

Skills Required
  • Technical Skills: Proficient in Java, Angular, or any JavaScript technology with experience in designing and deploying cloud-based data pipelines and microservices using GCP tools like BigQuery, Dataflow, and Dataproc.
  • Service-Oriented Architecture and Microservices: Strong understanding of SOA, microservices, and their application within a cloud data platform context. Develop robust, scalable services using Java Spring Boot, Python, Angular, and GCP technologies.
  • Full-Stack Development: Knowledge of front-end and back-end technologies, enabling collaboration on data access and visualization layers (e.g., React, Node.js).
  • Database Management: Experience with relational (e.g., PostgreSQL, MySQL) and NoSQL databases, as well as columnar databases like BigQuery.
  • Data Governance and Security: Understanding of data governance frameworks and implementing RBAC, encryption, and data masking in cloud environments.
  • CI/CD and Automation: Familiarity with CI/CD pipelines, Infrastructure as Code (IaC) tools like Terraform, and automation frameworks.
  • Problem-Solving: Strong analytical skills with the ability to troubleshoot complex data platform and microservices issues.
  • Certifications (Preferred): GCP Data Engineer, GCP Professional Cloud
Responsibilities
  • Design and Build Data Pipelines: Architect, develop, and maintain scalable data pipelines and microservices that support real-time and batch processing on GCP.
  • Service-Oriented Architecture (SOA) and Microservices: Design and implement SOA and microservices-based architectures to ensure modular, flexible, and maintainable data solutions.
  • Full-Stack Integration: Leverage your full-stack expertise to contribute to the seamless integration of front-end and back-end components, ensuring robust data access and UI-driven data exploration.
  • Data Ingestion and Integration: Lead the ingestion and integration of data from various sources into the data platform, ensuring data is standardized and optimized for analytics.
  • GCP Data Solutions: Utilize GCP services (BigQuery, Dataflow, Pub/Sub, Cloud Functions, etc.) to build and manage data platforms that meet business needs.
  • Data Governance and Security: Implement and manage data governance, access controls, and security best practices while leveraging GCP's native row- and column-level security features.
  • Performance Optimization: Continuously monitor and improve the performance, scalability, and efficiency of data pipelines and storage solutions.
  • Collaboration and Best Practices: Work closely with data architects, software engineers, and cross-functional teams to define best practices, design patterns, and frameworks for cloud data engineering.
  • Automation and Reliability: Automate data platform processes to enhance reliability, reduce manual intervention, and improve operational efficiency.
Experience Required

Minimum of 5 years of experience as a Software Engineer.

Education Required

Bachelor's degree in Computer Science, Data Engineering, Information Systems, or a related field. Master's degree or equivalent experience preferred.

Additional Information

Hybrid position: currently 2 days per week on-site, which may increase in the future.

We are a people-focused company with a deep emphasis on family values, and we look forward to working with you.

Contact Manager

Anna Mastrogiovanni
Date Posted: 07 April 2025