Data Developer
Key responsibilities
• Process complex financial data - Handle large-scale financial datasets from multiple sources, including banks, payment gateways and processors, ensuring data integrity across different formats and structures.
• Build and maintain complex pipelines - Develop and optimise pipelines that apply intricate business rules, financial calculations and transformations for accurate transaction processing.
• Deliver time-bound, event-driven processing - Design event-driven architectures to meet strict SLAs for transaction processing, settlement and reconciliation with banks and payment partners.
• Enable reporting and AI insights - Structure and prepare data for advanced analytics, reporting and AI-driven insights to improve payment success rates, detect fraud and optimise transaction flows.
What you'll bring
• Programming: Strong proficiency in Python and object-oriented programming (OOP) concepts, with the ability to write modular, well-structured code, and excellent problem-solving skills.
• Big data technologies: Hands-on experience with frameworks such as Spark (PySpark), Kafka, Apache Hudi, Apache Iceberg or Apache Flink for distributed data processing and real-time streaming.
• Cloud platforms: Familiarity with cloud platforms like AWS, Google Cloud Platform (GCP), or Microsoft Azure for building and managing data infrastructure.
• Data warehousing and modelling: Strong understanding of data warehousing concepts and data modelling principles.
• ETL frameworks: Experience with workflow orchestration and ETL tools such as Apache Airflow, or comparable data transformation frameworks.
• Data lakes and storage: Proficiency in working with data lakes and cloud-based storage solutions like Amazon S3, Google Cloud Storage, or Azure Blob Storage.
• Version control: Expertise in Git for version control and collaborative coding.
• Complex systems: Expertise in working with and building complex systems from scratch, with strong exposure to solving difficult engineering problems.
Experience and requirements
• Bachelor's degree in Computer Science, Information Technology, or equivalent experience.
• 1-5 years of experience in data engineering, ETL development, or database management.
• Prior experience in cloud-based environments (e.g., AWS, GCP, Azure) is highly desirable.
• Proven experience working with complex systems and building them from scratch, with a focus on performance tuning and optimisation.