Title: Data Scientist (Big Data Engineer)
Location: Austin, TX
Duration: Long Term
You can apply here or through the Arytic AI Hiring Platform.

Job Description:
Perform complex data science work. Duties include:
- Analyze information in TWC's Enterprise Data Warehouse; design, develop, and test processes for the data warehouse.
- Develop and document processes to extract data from the data warehouse for accurate reporting.
- Conduct complex data analysis and visualization tasks, including gathering data from various sources, determining data quality/profiles, cleansing data, and preparing it for use.
- Perform other analysis and documentation tasks as required to facilitate and support the Enterprise Data Warehouse.
- Define and document the technical architecture.
- Provide a written report of assessment methodology, findings, gap analysis, conclusions, and recommendations for a more robust method of reporting data.
- Communicate with project stakeholders, management, and other relevant parties to discuss methodology, approach, and schedule.
- Perform other duties as assigned to maintain operations.
TWC seeks expert services to:
- Assess TWC's Workforce Innovation and Opportunities Act (WIOA) customer data contained in TWC's Virtual One Stop case management system for potential data quality issues impacting TWC's ability to produce WIOA performance data that is consistent, accurate, and reliable, including production of U.S. Department of Labor Participant Individual Record Layout (PIRL) Reports for submission in the Workforce Integrated Performance System (WIPS).
- Assess and evaluate the current analytics model used to create data (files/detail tables), merge data, process and calculate data, and produce outcomes data that reliably supports tracking performance outcomes for WIOA services provided by the State and its twenty-eight (28) Local Workforce Development Boards.
- Provide a written report containing the assessment methodology, findings, conclusions, and recommendations for corrective actions. The report will also include a proposed resolution plan with specific, actionable tasks to implement in order to effectively resolve identified findings.
Required Skills: - 8 years of experience designing and developing Extraction, Transformation, and Load (ETL) processes for an enterprise data warehouse using Informatica, with deep hands-on expertise in Informatica ETL development, data management, data profiling, data flows, data relationships, and data quality standards and processes.
- 8 years of experience with data warehouse development and testing (relational and dimensional), with demonstrated implementation experience in databases such as Oracle and SQL Server (Oracle preferred).
- 8 years of experience in Data Science.
- 8 years of experience with data modeling preferably in a data warehouse environment.
- 8 years of experience with the Informatica product suite in Informatica Cloud.
- 4-7 years of experience with a test tool, such as Application Lifecycle Management (ALM) Octane (or earlier versions of ALM).
- 8 years of experience devising and utilizing algorithms and models to mine data, performing data and error analysis to improve models, and cleaning and validating data for uniformity and accuracy.
- 3 years of experience detecting and resolving mismatched data in case management systems.
Preferred Skills: - 5 years of experience in more than one type of RDBMS (DB2, SQL Server, Oracle).
- 5 years of experience in Erwin or other equivalent data modeling tools.
- 3 years of experience in Amazon Web Services.
- 2 years of experience detecting and resolving mismatched data in case management systems.
Note: To access Esolvit jobs and open roles, you can visit our website or our AI Hiring Platform, Arytic Inc., where you can create a user ID and complete the signup process to explore more jobs and positions.