Job Description: Data Architect - Azure & Databricks
What You'll Do
- Architect scalable, high-performing data solutions to meet both functional and non-functional requirements.
- Own the architecture, design, and optimization of Databricks platforms within cloud-based ecosystems (Azure preferred).
- Lead hands-on implementation of data pipelines and infrastructure using best practices for DevOps and infrastructure-as-code.
- Partner with data engineering teams to ensure platform performance, reliability, and maintainability.
- Integrate Databricks solutions across systems, aligning with enterprise architecture standards and delivery best practices.
- Create and review architecture and solution design documents for large-scale data initiatives.
- Evangelize reuse and modular design through shared services and common data models.
- Enforce architectural standards and guide teams on patterns, tools, and delivery methods.
- Mentor and provide technical guidance to engineers during development and delivery.
- Contribute to risk management through proactive identification and mitigation of technical delivery risks.
- Operate at varying levels of abstraction: shape high-level architecture while diving into implementation details when needed.
What You'll Bring
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- 8+ years of experience in data architecture, cloud data warehousing, or large-scale data integration.
- 3+ years of hands-on experience delivering production-ready Databricks solutions.
- Strong proficiency in Python, SQL, data modeling, and Azure services (Data Factory, Event Hubs, Synapse, DevOps).
- Experience implementing infrastructure using Terraform, ARM templates, or similar IaC tools.
- Comfort working in Agile/Scrum environments and supporting CI/CD pipelines.
- Excellent communication skills: you can explain complex ideas to tech teams and business stakeholders alike.
- Bonus: Familiarity with ML lifecycle concepts and MLOps tools, even if you're not a model builder yourself.