Responsibilities include but are not limited to:
- Drive Valley's Enterprise Integration delivery, including gathering business requirements, defining the product roadmap, overseeing design, and managing the delivery and implementation of ETL interfaces.
- Align integration designs with the technical design authority and architecture review board, ensuring solutions are implemented as intended and within Valley guidelines and policies.
- Deliver via MVPs and iterative, agile-style development cycles.
- Promote an evolutionary architectural approach, allowing designs to evolve as new requirements emerge.
- Manage transparency and the flow of information related to project delivery, and consider operational challenges when building solutions.
- Facilitate business analysis, data acquisition, data storage, query optimization, archiving and recovery strategy, data ingress and egress, security and change management at the enterprise level.
- Leverage cloud-based application architectures for development, integration, distributed data management, and application testing.
- Balance short-term tactical decisions with longer-term aspirations to keep technical debt at an acceptable level.
- Work across teams to support the creation of non-functional requirements, including but not limited to performance thresholds and security considerations.
- Translate strategic IT requirements and parallel data initiatives into the integration framework, which may include reference data management, data protection, RBAC, data quality and release management.
- Develop internal and external checks and controls to ensure proper governance, security, and quality of data assets.
- Interface across several business areas, acting as a leader and team member to proactively assist in defining the direction for future projects.
- Communicate complex technical information to all areas of Valley Business and Technology Leadership and to project stakeholders, including those without a technical background, in order to capture non-functional requirements.
Required Skills:
- Demonstrated technology and personal leadership experience in architecting, designing and building highly scalable cloud-based data solutions.
- Enterprise-scale expertise in data management best practices such as data integration, data security, data warehousing, data analytics, metadata management and data quality.
- Extensive knowledge and experience in architecting modern data integration frameworks, highly scalable distributed systems and emerging data architecture designs/patterns.
- Experience with Azure Data Factory, Azure Databricks and Snowflake (similar cloud technology experience will also be considered).
- Expert ability to evaluate, prototype and recommend open-source and vendor technologies and platforms.
- Proven experience with relational, NoSQL, graph and in-memory databases, as well as microservices and ELT technologies.
- Hands-on development experience in one or more of the following: PL/SQL, Java, Python, ETL tools, MQ/Kafka messaging, MuleSoft.
- Knowledge of the software development life cycle as well as Agile methodologies.
- Familiarity with regulations applicable to a financial institution, including consumer privacy laws and the management of sensitive and confidential PII/PCI datasets.
Required Experience:
- Bachelor's degree in a quantitative/relevant field such as Statistics, Computer Science, Economics, Mathematics or Data Science.
- Minimum of 5 years of experience with large and complex systems in a financial institution with a minimum of 2 years managing technology teams.
- Hands-on experience delivering technology programs.
- Experience as a systems engineer, solution architect, data scientist and ETL developer.
- Expertise in data warehousing, data integration and development design patterns.