Hybrid, 3 days onsite, 2 days remote
We are unable to sponsor, as this is a permanent, full-time role
A prestigious company is looking for a Director, Java Software Engineering. This Director will lead a software development team working with Java, Python, Flink, Spark, Kafka, big data processing, DevOps tools, data warehousing/management, and related technologies.
Responsibilities:
- Manage, lead, build, and mentor a software development team
- Serve as technical product owner, fleshing out detailed business, architectural, and design requirements
- Develop solutions to complex technical challenges while coding, testing, troubleshooting, and documenting the systems you and your team develop
- Recommend architectural changes and new technologies and tools that improve the efficiency and quality of company systems and development processes
- Lead efforts to optimize application performance and resilience through analysis, code refactoring, and systems tuning
Qualifications:
- BS degree in Computer Science, a similar technical field, or equivalent practical experience. Master's degree preferred
- 8-10 years of experience building high-performance, large-scale data solutions
- Hands-on development experience with multiple programming languages such as Python and Java
- Experience with distributed message brokers such as Kafka and stream processing frameworks such as Flink, Spark, and Kafka Streams
- Experience with Agile development processes for enterprise software solutions
- Experience with software testing methodologies and automated testing frameworks
- Experience with Big Data processing technologies and frameworks such as Presto, Hadoop, MapReduce, and Spark
- Hands-on experience designing and implementing RESTful APIs
- Knowledge and understanding of DevOps tools and technologies such as Terraform, Git, Jenkins, Docker, Harness, Nexus/Artifactory, and CI/CD pipelines
- Knowledge of SQL, data warehousing design concepts, data management systems (structured and semi-structured), and integration with relational and NoSQL database technologies
- Experience working with Cloud ecosystems (AWS, Azure, GCP)
- Experience with stream processing technologies and frameworks such as Kafka, Spark Streaming, Flink
- Experience with cloud technologies and migrations on a public cloud vendor, preferably using foundational AWS services such as VPCs, security groups, EC2, RDS, S3 ACLs, KMS, the AWS CLI, and IAM
- Experience with high-speed distributed computing frameworks such as AWS EMR, Hadoop, HDFS, S3, MapReduce, Apache Spark, Apache Hive, Kafka Streams, and Apache Flink
- Experience working with various types of databases, including relational, NoSQL, object-based, and graph
- Working knowledge of DevOps tools, e.g., Terraform, Ansible, Jenkins, Kubernetes, Helm, and CI/CD pipelines
- Familiarity with monitoring tools and frameworks such as Splunk, Elasticsearch, Prometheus, and AppDynamics