Data Engineer

Plano, Texas

eTeam
Responsibilities:
  • Extensive experience in designing, configuring, deploying, managing, and automating AWS core services such as S3, IAM, EC2, Route 53, SNS, SQS, ELB, CloudWatch, Lambda, and VPC.
  • Experience in automating cloud deployments using Terraform and Python.
  • Experience in DevOps: administering the deployment, management, and monitoring of applications deployed on AWS via CI/CD.
  • Maintain Jenkins pipelines and perform code promotions through change management.
  • Experience in AWS data platform services such as Redshift, DynamoDB, Databricks, Glue, MLOps, and Athena.
  • Ensure compliance of data science operations on AWS.
  • Monitor usage and cost, and implement optimizations across a variety of AWS resources.
  • Provision DynamoDB tables with encryption and grant access using IAM policies (see the first sketch after this list).
  • Deploy and manage AWS serverless applications running on API Gateway and Lambda.
  • Deploy Redshift clusters into a VPC with encryption, enable cross-region snapshots, configure subnet groups, set up monitoring, and resize clusters using the elastic and classic methods (see the second sketch after this list).
  • Manage Denodo VDBs, address performance issues, and support onboarding of new applications and datasets.
  • Manage Denodo stored procedures, health checks, and high-availability (HA) setup, and address connectivity issues.
  • Provide detailed capacity assessments on a regular basis for Denodo, Redshift, and Databricks.
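To illustrate the kind of DynamoDB provisioning described above, here is a minimal Python (boto3) sketch; the table name, key schema, account ID, and role name are illustrative assumptions rather than details from this posting, and in practice the same resources would typically be declared in Terraform.

import json

import boto3

dynamodb = boto3.client("dynamodb", region_name="us-east-1")
iam = boto3.client("iam")

# Create a table with server-side encryption backed by a KMS key.
dynamodb.create_table(
    TableName="orders",  # hypothetical table name
    AttributeDefinitions=[{"AttributeName": "order_id", "AttributeType": "S"}],
    KeySchema=[{"AttributeName": "order_id", "KeyType": "HASH"}],
    BillingMode="PAY_PER_REQUEST",
    SSESpecification={"Enabled": True, "SSEType": "KMS"},
)

# Grant read access to the table through an inline IAM policy on an application role.
policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Action": ["dynamodb:GetItem", "dynamodb:Query"],
        "Resource": "arn:aws:dynamodb:us-east-1:123456789012:table/orders",
    }],
}
iam.put_role_policy(
    RoleName="orders-app-role",  # hypothetical role name
    PolicyName="orders-table-read",
    PolicyDocument=json.dumps(policy),
)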
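In the same spirit, here is a rough boto3 sketch of the Redshift responsibilities (encrypted cluster in a VPC subnet group, cross-region snapshot copy, and an elastic resize); all identifiers, subnet IDs, node types, and regions are assumptions for illustration only.

import boto3

redshift = boto3.client("redshift", region_name="us-east-1")

# Subnet group placing the cluster inside private VPC subnets (subnet IDs are hypothetical).
redshift.create_cluster_subnet_group(
    ClusterSubnetGroupName="analytics-private",
    Description="Private subnets for the analytics cluster",
    SubnetIds=["subnet-0aaa1111", "subnet-0bbb2222"],
)

# Encrypted cluster launched into the VPC through the subnet group.
redshift.create_cluster(
    ClusterIdentifier="analytics-cluster",
    NodeType="ra3.xlplus",
    NumberOfNodes=2,
    MasterUsername="admin",
    MasterUserPassword="ChangeMe-Example-1",  # placeholder; store real credentials in Secrets Manager
    DBName="analytics",
    ClusterSubnetGroupName="analytics-private",
    VpcSecurityGroupIds=["sg-0ccc3333"],
    Encrypted=True,
)

# The calls below assume the cluster has finished creating and is "available".
# Copy automated snapshots to a second region; a KMS-encrypted cluster also needs a snapshot copy grant.
redshift.enable_snapshot_copy(
    ClusterIdentifier="analytics-cluster",
    DestinationRegion="us-west-2",
    RetentionPeriod=7,
    SnapshotCopyGrantName="analytics-copy-grant",  # hypothetical grant, created beforehand
)

# Classic=False requests an elastic resize; Classic=True forces a classic resize.
redshift.resize_cluster(
    ClusterIdentifier="analytics-cluster",
    NumberOfNodes=4,
    Classic=False,
)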
Required Skills:
  • AWS (experience mandatory): S3, IAM, EC2, Route 53, SNS, SQS, ELB, CloudWatch, Lambda, and VPC.
  • Automation (experience mandatory): Terraform and Python.
  • Big data (experience with at least 2): Denodo, Redshift, DynamoDB, Databricks, Glue, and Athena.
  • DevOps (mandatory): GitHub Actions, Python/shell scripting.
Date Posted: 22 April 2025
Apply for this Job