Job Expired
In this age of disruption, organizations need to navigate the future with confidence by tapping into the power of data analytics, robotics, and cognitive technologies such as Artificial Intelligence (AI). Our Strategy & Analytics portfolio helps clients leverage rigorous analytical capabilities and a pragmatic mindset to solve the most complex of problems. By joining our team, you will play a key role in helping our clients uncover hidden relationships in vast troves of data and transform the Government and Public Services marketplace.

Work you'll do

As a Data Engineer, you will interpret business needs, select appropriate technologies, and bring experience implementing data governance for shared and/or master sets of data. You will work with key business stakeholders, IT experts, and subject-matter experts to plan and deliver optimal data solutions. You will create, maintain, and optimize data pipelines as workloads move from development to production, ensuring seamless data flow for each use case. You will perform technical and non-technical analyses on project issues and help ensure technical implementations follow quality assurance metrics. You will analyze data and systems architecture, create designs, and implement information systems solutions.

The team

Deloitte's Government and Public Services (GPS) practice - our people, ideas, technology, and outcomes - is designed for impact. Serving federal, state, and local government clients as well as public higher education institutions, our team of more than 15,000 professionals brings fresh perspective to help clients anticipate disruption, reimagine the possible, and fulfill their mission promise. The GPS AI & Data Operations offering is responsible for developing advanced analytics products and applying data visualization and statistical programming tools to enterprise data in order to advance and enable key mission outcomes for our clients.
Our team supports all phases of analytic work product development, from identifying key business questions through data collection and ETL, to performing analyses with a wide range of statistical, machine learning, and applied mathematical techniques and delivering insights to decision-makers. Our practitioners give special attention to the interplay between data and the business processes that produce it, and to the decision-makers who consume insights.

Qualifications

Required:
- Bachelor's degree in a relevant field
- Must live in (or be willing to relocate to) the DC Metro area and be willing to report to the client site 3-4x a week
- 4+ years of experience with cloud architectures and enabling tools and technologies, such as AWS Cloud (Gov Cloud/CS2) and ETL pipelining
- 4+ years of experience with datastores such as PostgreSQL, S3, Redshift, MongoDB/DynamoDB, Redis, Elasticsearch/OpenSearch, and SQL
- 4+ years of experience in Python with key libraries such as pandas and PySpark, and with NiFi, Airflow, AWS Lambda, or similar technologies
- 4+ years of experience with software platforms and services such as Docker, Kubernetes, JMS/SQS, SNS, and Kafka
- Must have an active TS/SCI or above government security clearance
- Must be legally authorized to work in the United States without the need for employer sponsorship, now or at any time in the future

Preferred:
- 2+ years of prior professional services or federal consulting experience, preferably in an Agile environment
- 2+ years of experience with DevOps environments
- Creativity and innovation - a desire to learn and apply new technologies, products, and libraries
- Strong written and verbal communication skills
- Strong organizational skills

Information for applicants with a need for accommodation:
Date Posted: 21 December 2024