Technical Architect-Big Data

- Education: Bachelor's or Master's degree in Computer Science, Information Technology, Engineering, or a related field.
- 15+ years of overall IT experience with a demonstrated track record of major solution design/architecture contributions in a large enterprise, preferably in the supply chain domain.
- Minimum of 5 years in customer-facing roles with a strong focus on data/integration operations.
- Expertise in ETL (Extract, Transform, Load) tools such as Informatica PowerCenter, IBM DataStage, Talend, Microsoft SSIS (SQL Server Integration Services), or Apache NiFi/Airflow, and the ability to design and implement complex transformations in Python, Spark, Microsoft SQL Server, Spark SQL, and JOLT; a minimal PySpark batch example appears after this list.
- Proficiency in data modeling techniques, including dimensional modeling, star schema, snowflake schema, and Data Vault modeling; a star schema sketch appears after this list.
- Deep understanding of distributed computing principles, data management concepts, and big data ecosystem components (e.g., Hadoop, Spark, Kafka, Delta Lake).
- Strong understanding of batch integration methodologies using various data formats (CSV, XML, JSON, and Parquet), and hands-on experience with REST APIs; a small ingestion sketch appears after this list.
- Hands-on experience with cloud platforms (e.g., AWS, Azure, Google Cloud Platform), containerization technologies (e.g., Docker, Kubernetes), and code management tools and practices (e.g., Git, CI/CD) for building and managing big data solutions.
- Strong analytical and problem-solving skills, with the ability to translate business requirements into technical solutions.
- Excellent communication and interpersonal skills, with the ability to collaborate effectively across teams and influence decision-making.
- Certifications in relevant technologies (e.g., AWS Certified Solutions Architect, Cloudera Certified Professional) are a plus.
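
To illustrate the kind of batch transformation work this role involves, here is a minimal PySpark sketch of a CSV-to-Parquet pipeline. The paths, column names, and business rule are hypothetical, not taken from any actual requirement of the role.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-batch-etl").getOrCreate()

# Extract: read a raw CSV feed (path and schema inference are assumptions).
orders = (
    spark.read
    .option("header", True)
    .option("inferSchema", True)
    .csv("s3://raw-zone/orders/*.csv")
)

# Transform: standardize the date, derive a line total, drop bad rows.
cleaned = (
    orders
    .withColumn("order_date", F.to_date("order_date", "yyyy-MM-dd"))
    .withColumn("line_total", F.col("quantity") * F.col("unit_price"))
    .filter(F.col("quantity") > 0)
)

# Load: write partitioned Parquet to the curated zone.
(
    cleaned.write
    .mode("overwrite")
    .partitionBy("order_date")
    .parquet("s3://curated-zone/orders/")
)
```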
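Likewise, the star schema from the dimensional modeling bullet can be sketched directly in Spark SQL. The table names and columns below are illustrative only, and the `USING DELTA` clause assumes a Spark session with Delta Lake configured.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("star-schema-sketch").getOrCreate()

# One dimension table: descriptive attributes keyed by a surrogate key.
spark.sql("""
    CREATE TABLE IF NOT EXISTS dim_product (
        product_key  BIGINT,
        product_name STRING,
        category     STRING
    ) USING DELTA
""")

# The fact table: measures plus foreign keys pointing at the dimensions.
spark.sql("""
    CREATE TABLE IF NOT EXISTS fact_sales (
        date_key     INT,
        product_key  BIGINT,
        quantity     INT,
        sales_amount DECIMAL(18, 2)
    ) USING DELTA
""")

# A typical star-schema query: join fact to dimension, aggregate a measure.
spark.sql("""
    SELECT d.category, SUM(f.sales_amount) AS revenue
    FROM fact_sales f
    JOIN dim_product d ON f.product_key = d.product_key
    GROUP BY d.category
""").show()
```

A snowflake schema would further normalize `dim_product` (e.g., splitting `category` into its own table), trading extra joins for reduced redundancy.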
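Finally, the REST API requirement can be illustrated with a small Python ingestion script. The endpoint, pagination scheme, and response layout are assumptions, and writing Parquet with pandas requires `pyarrow` to be installed.

```python
import requests
import pandas as pd

BASE_URL = "https://api.example.com/v1/shipments"  # hypothetical endpoint

def fetch_all(page_size: int = 100) -> list[dict]:
    """Follow simple page-number pagination until an empty page is returned."""
    records, page = [], 1
    while True:
        resp = requests.get(
            BASE_URL, params={"page": page, "size": page_size}, timeout=30
        )
        resp.raise_for_status()
        batch = resp.json()  # assumed to be a JSON array of records
        if not batch:
            break
        records.extend(batch)
        page += 1
    return records

# Land the API payload as a Parquet file for downstream batch processing.
shipments = pd.DataFrame(fetch_all())
shipments.to_parquet("shipments.parquet", index=False)
```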