Job Expired - Click here to search for similar jobs
- Role: Big Data Architect
- Location: Denver, CO (Onsite)
- Experience: 10+ Years
Key Responsibilities
- Design and Implement Data Architectures: Develop end-to-end data solutions on AWS, including data lakes, data warehouses, and analytics platforms.
- Develop Scalable Data Pipelines: Create efficient ETL (Extract, Transform, Load) processes to handle large volumes of data using AWS services like Glue, Lambda, and EMR.
- Collaborate with Cross-Functional Teams: Work closely with data scientists, analysts, and engineers to understand data requirements and translate them into technical solutions.
- Ensure Data Security and Compliance: Implement best practices for data governance, security, and compliance in alignment with industry standards and regulations.
- Optimize Data Storage and Processing: Enhance performance and cost-efficiency of data systems by leveraging AWS services such as S3, Redshift, and Athena.
- Automate Deployment and Monitoring: Utilize AWS tools to automate deployment processes and monitor the health of data systems.
- Provide Technical Leadership: Guide and mentor junior team members, ensuring adherence to best practices and fostering continuous learning.
Essential Skills & Tools
- Programming Languages: Proficiency in Python, Java, Scala, and SQL.
- AWS Services: Experience with AWS Glue, Lambda, EMR, Redshift, Athena, S3, DynamoDB, and Kinesis.
- Big Data Technologies: Familiarity with Hadoop, Spark, Kafka, and Databricks.
- Data Modeling: Expertise in designing logical and physical data models for large-scale data environments.
- Data Governance: Knowledge of data cataloging, lineage, and compliance standards.
- Automation and Monitoring: Experience with AWS CloudFormation, CloudWatch, and other automation tools.
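As a rough illustration of the logical data modeling called for above, here is a minimal star-schema sketch: one additive fact table keyed by surrogate keys into two dimensions. All table and column names are hypothetical, not tied to any particular employer schema.

```python
from dataclasses import dataclass
from datetime import date

@dataclass(frozen=True)
class DimCustomer:
    customer_key: int   # surrogate key, not the source-system customer ID
    name: str
    region: str

@dataclass(frozen=True)
class DimDate:
    date_key: int       # e.g. 20250601; a common physical-model convention
    calendar_date: date

@dataclass(frozen=True)
class FactSale:
    customer_key: int   # foreign key into DimCustomer
    date_key: int       # foreign key into DimDate
    amount: float       # additive measure, summable across any dimension

# Wire one fact row to its dimension rows via surrogate keys.
customer = DimCustomer(customer_key=101, name="Acme Corp", region="US-West")
day = DimDate(date_key=20250601, calendar_date=date(2025, 6, 1))
sale = FactSale(customer_key=customer.customer_key,
                date_key=day.date_key,
                amount=250.0)
print(sale.amount)  # 250.0
```

In a physical model on Redshift, the dimension tables would typically carry a `DISTSTYLE` and sort keys chosen for join performance; the logical structure above stays the same.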
Date Posted: 01 June 2025