MUST HAVES:
Please confirm with candidates whether they have hands-on experience with the following tools, with a particular focus on data security practices:
- Snowflake: Ask about their experience with cloud data warehousing and any specific security protocols they've implemented or worked with.
- Tableau: Inquire if they've worked with data visualization tools and how they handle data security within those platforms, such as user access controls or secure data sources.
- Fivetran: Check if they've used data integration tools and how they ensure the security of data during the transfer process.
- Immuta: Ask if they have experience using data access control platforms like Immuta for managing security and privacy policies, especially in a data pipeline context.
- Confirm with candidates if they have experience owning data security end-to-end. This includes being responsible for the entire data security lifecycle, from data creation and ingestion to storage and access, while ensuring compliance with internal frameworks.
- Tech stack:
- Airflow: Ask if they've used Airflow for orchestrating data pipelines. Inquire about their experience creating, scheduling, and managing workflows (a minimal DAG sketch follows this list).
- Spark: Check if they have experience with Apache Spark, particularly for big data processing and distributed computing. Focus on their familiarity with both batch and streaming data processing.
- Databricks: Confirm if they've worked with Databricks for building scalable data solutions using Apache Spark. Ask about any real-world projects involving Databricks.
- Snowflake: Inquire about their experience with Snowflake as a cloud data platform, especially in terms of data warehousing, security, and scalability.
- Delta Lake: Ask if they have experience using Delta Lake on top of a data lake for ensuring ACID transactions and managing large volumes of data with high consistency.
- Kafka: Confirm their experience with Kafka, particularly for building data streaming solutions. Ask about handling high-throughput, real-time data streams.
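For screeners who want a concrete reference point for the Airflow item above, here is a minimal sketch of a two-task DAG. The dag_id, schedule, and task logic are hypothetical placeholders, not part of this role's actual pipelines, and it assumes Airflow 2.4+ for the `schedule` argument:

```python
# Minimal sketch of an Airflow DAG; names and logic are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Placeholder: pull raw records from a source system.
    # The return value is pushed to XCom automatically.
    return [{"id": 1, "value": 42}]


def load(ti):
    # Placeholder: read the upstream task's output via XCom and load it.
    rows = ti.xcom_pull(task_ids="extract")
    print(f"Loading {len(rows)} rows")


with DAG(
    dag_id="example_daily_etl",      # hypothetical DAG name
    start_date=datetime(2025, 1, 1),
    schedule="@daily",               # cron-style scheduling
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> load_task        # dependency: extract runs before load
```

A candidate with real orchestration experience should be able to walk through scheduling (`schedule`, `catchup`), task dependencies (`>>`), and passing data between tasks (XCom) in a sketch like this.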
JOB DESCRIPTION:
- 5 years of experience in information security and/or the following areas: security architecture, security engineering, production or network storage engineering, cybersecurity incident investigations, and experience with cloud technologies
- Advanced knowledge of cloud security and infrastructure environments for popular cloud providers (AWS, Azure, GCP)
- Broad technology expertise with application, system integration, data, and/or infrastructure knowledge
- Storage solutions (e.g., SAN, NAS, encrypted storage devices, cloud cache, and storage buckets)
- Endpoint protection and Data Loss Prevention solutions
- Strong understanding of secure network principles
- Working knowledge of configuring and maintaining firewalls and network switching/routing devices (e.g., Palo Alto, SonicWall, Fortinet, Brocade, Cisco)
- LAN, WAN, TCP/IP connectivity and security protocols (Point-to-Point, MPLS, VPN)
- Network architecture and Layer 2/Layer 3 routing principles
- Network authentication standards
- Strong understanding of Infrastructure as a Service (IaaS) and Infrastructure as Code (IaC)
- Expert knowledge of cloud security auditing tools
- Working knowledge of Virtual Private Cloud (VPC) network access control lists (a short boto3 sketch appears after this list)
- CISSP, CISA/CISM, or CEH designations
- 7+ years of data engineering experience working with large data pipelines
- Proficiency in at least one major programming language (e.g., Python, Java, Kotlin)
- Strong SQL skills and ability to create queries to analyze and extract complex datasets
- Experience with data pipeline orchestration systems such as Airflow for creating and maintaining data pipelines
- Deep understanding of AWS or other cloud providers, as well as infrastructure as code
- Excellent written and verbal communication
- Advanced understanding of OLTP vs. OLAP environments
- Willingness and ability to learn and pick up new skill sets
- Self-starting problem solver with an eye for detail and excellent analytical and communication skills
- Strong background in at least one of the following: distributed data processing, software engineering of data services, or data modeling
- Real-time event streaming experience is a plus (a short PySpark streaming sketch appears after this list)
- Familiarity with Scrum and Agile methodologies
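For non-specialist screeners, here is a minimal sketch of what working with VPC network access control lists looks like in practice, using boto3. The region, ACL ID, and rule values are hypothetical placeholders, and it assumes AWS credentials are already configured in the environment:

```python
# Minimal sketch of managing a VPC network ACL rule with boto3.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")  # hypothetical region

# Deny inbound SSH from anywhere as rule 100 on a placeholder network ACL.
ec2.create_network_acl_entry(
    NetworkAclId="acl-0123456789abcdef0",  # hypothetical ACL ID
    RuleNumber=100,                        # rules are evaluated lowest-first
    Protocol="6",                          # 6 = TCP
    RuleAction="deny",
    Egress=False,                          # inbound rule
    CidrBlock="0.0.0.0/0",
    PortRange={"From": 22, "To": 22},
)
```

A strong candidate should be able to contrast stateless NACL rules like this one with stateful security groups, and explain why rule ordering matters.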
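Likewise, for the real-time event streaming item, here is a minimal sketch of a Spark Structured Streaming job consuming a Kafka topic. The broker address, topic name, and console sink are illustrative assumptions, and running it requires the spark-sql-kafka connector package on the Spark classpath:

```python
# Minimal sketch of a Spark Structured Streaming job reading from Kafka.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("example-kafka-stream").getOrCreate()

# Read a real-time event stream from a Kafka topic (placeholder names).
events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "localhost:9092")  # hypothetical broker
    .option("subscribe", "events")                        # hypothetical topic
    .load()
)

# Kafka delivers keys and values as binary; cast to strings for inspection.
decoded = events.select(
    col("key").cast("string"),
    col("value").cast("string"),
    col("timestamp"),
)

# Write the stream to the console; a production job would target a durable
# sink such as Delta Lake or Snowflake instead.
query = decoded.writeStream.outputMode("append").format("console").start()
query.awaitTermination()
```

Candidates with streaming experience should be able to discuss the choice of sink, output mode, and checkpointing for failure recovery in a job like this.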
Date Posted: 11 April 2025
Apply for this Job