AWS Cloud Engineer V

Cincinnati, Ohio

Indus Valley
Req no: 1387
Title: AWS Cloud Engineer V
Location: REMOTE (must work/support EST business hours; a PST-based candidate would start at 10 a.m. EST)
Duration: 12 month contract possible extensions
Interviews: MS Teams
Visa: Must be W-2 eligible (no H-1B)

Job Description:

TECHNICAL SKILLS
Must Have
  • AWS services: Bedrock, SageMaker, ECS, and Lambda
  • Demonstrated contributions to open-source AI/ML/Cloud projects
  • Demonstrated proficiency in the Python and Golang programming languages
  • Experience implementing RAG architectures using frameworks and ML tooling such as Transformers, PyTorch, TensorFlow, and LangChain
  • LLM
  • Ph.D. in AI/ML/Data Science
Nice To Have
  • Demonstrated experience with AWS organizations and policy guardrails (SCP, AWS Config)
  • FinOps
GENERAL FUNCTION:

We are hiring a Distinguished Cloud AI Software Engineer who has actually built AI/ML applications, not just read about them. You will operate as a trusted advisor in a hands-on capacity for the development of retrieval-augmented generation (RAG) systems, fine-tuning of LLMs, and AWS-native microservices that drive automation, insight, and governance in an enterprise environment.

You'll design and deliver scalable, secure services that bring large language models into real operational use, connecting them to live infrastructure data, internal documentation, and system telemetry.

You'll be part of a high-impact team pushing the boundaries of cloud-native AI in a real-world enterprise setting. This is not a prompt-engineering sandbox or a resume keyword trap. If you've merely dabbled in SageMaker, mentioned RAG on LinkedIn, or read about vector search, this isn't the right fit. We're looking for candidates who have architected, developed, and supported AI/ML services in production environments.

This is a builder's role within our Public Cloud AWS Engineering team. We aren't hiring buzzword lists or conference attendees. If you've built something you're proud of, especially if it involved real infrastructure, real data, and real users, we'd love to talk. If you're still learning, that's great too, but this isn't an entry-level role or a theory-only position.

DUTIES AND RESPONSIBILITIES:
  • Design, develop, and maintain modular AI services on AWS using Lambda, SageMaker, Bedrock, S3, and related components, built for scale, governance, and cost-efficiency.
  • Lead the end-to-end development of RAG pipelines that connect internal datasets (e.g., logs, S3 docs, structured records) to inference endpoints using vector embeddings (see the illustrative sketch after this list).
  • Design and fine-tune LLM-based applications, including Retrieval-Augmented Generation (RAG) using LangChain and other frameworks.
  • Tune retrieval performance using semantic search techniques, proper metadata handling, and prompt injection patterns.
  • Collaborate with internal stakeholders to understand business goals and translate them into secure, scalable AI systems.
  • Own the software release lifecycle, including CI/CD pipelines, GitHub-based SDLC, and infrastructure as code (Terraform).
  • Support the development and evolution of reusable platform components for AI/ML operations.
  • Create and maintain technical documentation for the team to reference and share with our internal customers.
  • Excellent verbal and written communication skills in English.
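
As a concrete illustration of the RAG pipeline duties above, the following is a minimal sketch assuming a Bedrock-hosted embedding model and chat model. The model IDs, request bodies, and response field names are assumptions to verify against the Bedrock documentation for your account and region, and a production pipeline would use a managed vector store (e.g., OpenSearch) rather than the in-memory cosine similarity shown here.

    # Illustrative RAG sketch: embed documents with a Bedrock embedding model,
    # retrieve the closest matches by cosine similarity, and inject that context
    # into a prompt for a Bedrock-hosted LLM.
    import json
    import math

    import boto3

    bedrock = boto3.client("bedrock-runtime")  # assumes AWS credentials/region are configured

    EMBED_MODEL = "amazon.titan-embed-text-v1"              # assumed embedding model ID
    CHAT_MODEL = "anthropic.claude-3-sonnet-20240229-v1:0"  # assumed chat model ID

    def embed(text: str) -> list[float]:
        """Return an embedding vector for the given text via Bedrock."""
        resp = bedrock.invoke_model(modelId=EMBED_MODEL, body=json.dumps({"inputText": text}))
        return json.loads(resp["body"].read())["embedding"]

    def cosine(a: list[float], b: list[float]) -> float:
        """Cosine similarity between two vectors."""
        dot = sum(x * y for x, y in zip(a, b))
        return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b)))

    def answer(question: str, documents: list[str], top_k: int = 3) -> str:
        """Retrieve the top_k most relevant documents and ask the LLM with that context."""
        q_vec = embed(question)
        context = sorted(documents, key=lambda d: cosine(q_vec, embed(d)), reverse=True)[:top_k]
        prompt = "Answer using only this context:\n" + "\n---\n".join(context) + f"\n\nQuestion: {question}"
        body = {
            "anthropic_version": "bedrock-2023-05-31",
            "max_tokens": 512,
            "messages": [{"role": "user", "content": prompt}],
        }
        resp = bedrock.invoke_model(modelId=CHAT_MODEL, body=json.dumps(body))
        return json.loads(resp["body"].read())["content"][0]["text"]
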
REQUIRED KNOWLEDGE, SKILLS, AND ABILITIES:
  • 10+ years of proven software engineering experience with a strong focus on Python, Golang, and/or Node.js.
  • Demonstrated contributions to open-source AI/ML/Cloud projects, with either merged pull requests or public repos showing real usage (forks, stars, or clones).
  • Direct, hands-on development of RAG, semantic search, or LLM-augmented applications, using frameworks and ML tooling like Transformers, PyTorch, TensorFlow, and LangChain, not just experimentation in a notebook.
  • Ph.D. in AI/ML/Data Science and/or named inventor on pending or granted patents in machine learning or artificial intelligence.
  • Deep expertise with AWS services, especially Bedrock, SageMaker, ECS, and Lambda.
  • Proven experience fine-tuning large language models, building datasets, and deploying ML models to production.
  • Demonstrated success delivering production-ready software with release pipeline integration.
NICE-TO-HAVES:
  • Policy as Code development (e.g., Terraform Sentinel) to manage and automate cloud policies, ensuring compliance
  • Experience optimizing cost-performance in AI systems (FinOps mindset).
  • Awareness of data privacy and compliance best practices (e.g., PII handling, secure model deployment).
  • Demonstrated experience with AWS organizations and policy guardrails (SCP, AWS Config).
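
As a small illustration of the organization-level guardrails mentioned above, the following sketch creates a region-deny service control policy (SCP) with the AWS Organizations API via boto3. The policy content, name, and region list are illustrative assumptions; real region-deny SCPs typically exempt global services such as IAM.

    # Illustrative sketch: create an SCP that denies actions outside approved regions.
    import json

    import boto3

    org = boto3.client("organizations")  # must run in the organization's management account

    region_guardrail = {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Sid": "DenyOutsideApprovedRegions",
                "Effect": "Deny",
                "Action": "*",
                "Resource": "*",
                "Condition": {"StringNotEquals": {"aws:RequestedRegion": ["us-east-1", "us-east-2"]}},
            }
        ],
    }

    response = org.create_policy(
        Name="deny-unapproved-regions",  # hypothetical policy name
        Description="Deny API calls outside approved regions",
        Type="SERVICE_CONTROL_POLICY",
        Content=json.dumps(region_guardrail),
    )
    # The returned policy ID can then be attached to an OU or account with attach_policy().
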
Skills: AWS/Bedrock/SageMaker/ECS/Lambda, AI/ML, Python/Golang, RAG, Transformers/PyTorch/TensorFlow/LangChain, LLM, Ph.D., SCP, AWS Config, FinOps.
Date Posted: 17 May 2025