Data Architect
Location: Chicago; hybrid, onsite 3x a week (local candidates only)
Interview Mode: Virtual; two rounds, one hour each. Round 1: technical. Round 2: questions based on previous project details.
Type: Contract
Key Skills: - Ability to design and develop (must be hands-on).
- Python (expert level): ability to create your own scripts for dependency injection into Airflow (scheduling, workflows); see the sketch after this list.
- Airflow: strong familiarity.
- Snowflake: primary database.
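For illustration, a minimal sketch of the kind of Airflow scripting described above, assuming Airflow 2.4+ with the TaskFlow API; the DAG, task, and record names are hypothetical placeholders, not taken from this posting:

from datetime import datetime

from airflow.decorators import dag, task

@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def nightly_snowflake_refresh():  # hypothetical DAG name
    @task
    def extract() -> list[dict]:
        # Placeholder: pull source records (e.g., from an API or file drop).
        return [{"id": 1, "value": "a"}]

    @task
    def load(records: list[dict]) -> None:
        # Placeholder: write the records to Snowflake.
        print(f"loading {len(records)} records")

    # Task dependency wired by passing data (XCom): load runs after extract.
    load(extract())

nightly_snowflake_refresh()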
This role will operate as an individual contributor.
Description: - The ideal contractor will be responsible for designing, developing, testing, and deploying software solutions for Hedge Fund Services.
- Propose new designs and modify existing ones to continuously improve performance, functionality, and stability of the system.
- Partner with business leaders and business unit partners to define priorities and deliver custom solutions to solve business problems or address business needs.
- Must be competent to work at the highest technical level of all phases of system design and implementation.
- Provide comprehensive consultation to Business Unit and IT management and staff at the highest technical level on all phases of the project development cycle.
- Act as the principal designer for major systems and their subsystems, utilizing a thorough understanding of available technology, tools, and existing designs.
- Design and develop high-performance programming language components used by trading applications.
- Provide technical expertise to support and enhance core-trading applications.
- Provide leadership and guidance to staff, fostering an environment that encourages employee participation, teamwork, and communication.
- Seasoned multi-disciplinary expert with extensive technical and/or business knowledge and functional expertise, working at the highest technical level across all phases of system design and implementation.
- The focus of this role is executing the strategic direction of business function activities.
- Carry out complex initiatives involving multiple disciplines and/or ambiguous issues.
- Display a balanced, cross-functional perspective, liaising with the business to improve efficiency, effectiveness, and productivity.
Qualifications: - A BS degree in Computer Science, Mathematics, Computer Engineering, or a related science or engineering curriculum is required.
- Strong programming skills in Snowflake, Python, Airflow, dbt, and Linux.
- Strong server-side programming experience with automation and backend support.
- Experience with Snowflake.
- Experience with agile project methodology and collaboration.
- Excellent communication skills, analytical ability, strong judgment and management skills, and the ability to work effectively with client and IT management and staff are required.
- Strong skills working with open-source technologies, database technologies, microservice architecture, cloud-native development, and continuous build, integration, and deployment (CI/CD).
- Ability to work effectively with end users to define requirements.
- Leadership and organizational skills are required to determine the Business Unit's goals, resources needed, and to assess and develop the skills of staff.
- Experience designing and building cloud-native applications using microservices architecture.
- Hands-on experience with Kafka and its use in developing an event-driven architecture model.
- Experience in Domain-Driven Design.
- Experience with continuous integration and collaboration tools such as Jira, Bitbucket, GitHub, and Confluence.
- Experience building data pipelines to Snowflake; see the sketch after this list.
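For illustration, a minimal sketch of an event-driven pipeline feeding Snowflake, assuming the confluent-kafka and snowflake-connector-python packages; the broker, topic, credentials, and table names are hypothetical placeholders:

import json
import os

import snowflake.connector
from confluent_kafka import Consumer

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",  # placeholder broker
    "group.id": "snowflake-loader",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["trades"])  # hypothetical topic

conn = snowflake.connector.connect(
    account="my_account",  # placeholder connection details
    user="my_user",
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="LOAD_WH", database="HFS", schema="RAW",
)

batch = []
try:
    while True:
        msg = consumer.poll(timeout=1.0)
        if msg is None or msg.error():
            continue
        batch.append(json.loads(msg.value()))
        if len(batch) >= 500:  # micro-batch to cut per-insert overhead
            conn.cursor().executemany(
                "INSERT INTO raw_trades (payload) VALUES (%s)",
                [(json.dumps(r),) for r in batch],
            )
            batch.clear()
finally:
    consumer.close()
    conn.close()

Micro-batching is one common design choice here; a production version would more likely use the Snowflake Kafka connector or Snowpipe rather than row inserts.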
Specific Technical Responsibilities: - Overall (applies to all technology platforms listed below):
- Provide production support for several data analytics solutions used daily
- Ability to perform as a technical lead in addition to being a contributing developer
- Review code from other team members
- Create and enhance data architecture models
- Ability to troubleshoot and identify root causes for a variety of production and data issues
Snowflake: - Data transformation (ETL)
- Write Snowflake SQL, including stored procedures and complex queries involving CTEs and temp tables; see the sketch after this list
- Help design data models for new data to be ingested
- Snowflake SQL performance tuning
- Help complete the migration of the existing SQL Server-based Data Vault into Snowflake
- Continue to support and work on future enhancements for the Snowflake Data Vault
- Data ingestion (familiarity with Python and Kafka connectors is a nice-to-have but not required)
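For illustration, a minimal sketch of the kind of Snowflake SQL described above (a CTE feeding an aggregate), run through snowflake-connector-python; the table, column, and connection names are hypothetical placeholders:

import os

import snowflake.connector

QUERY = """
WITH daily_positions AS (
    -- CTE: roll trades up to one exposure figure per fund per day
    SELECT trade_date, fund_id, SUM(quantity * price) AS exposure
    FROM raw_trades
    GROUP BY trade_date, fund_id
)
SELECT fund_id, AVG(exposure) AS avg_exposure
FROM daily_positions
GROUP BY fund_id
ORDER BY avg_exposure DESC
"""

conn = snowflake.connector.connect(
    account="my_account",  # placeholder connection details
    user="my_user",
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="ANALYTICS_WH", database="HFS", schema="CURATED",
)
try:
    # execute() returns the cursor, which yields result rows as tuples
    for fund_id, avg_exposure in conn.cursor().execute(QUERY):
        print(fund_id, avg_exposure)
finally:
    conn.close()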
Python/Linux:
Nice to Have: - An MS degree is preferred. Experience with multi-threaded application design and development, including testing and deployment phases.