Sr Data Security Engineer

Tate, Georgia

RIT Solutions, Inc.
MUST HAVES:
  1. Please confirm with candidates that they have hands-on experience with the following tools, with a particular focus on data security practices:
     1. Snowflake: Ask about their experience with cloud data warehousing and any specific security protocols they've implemented or worked with.
     2. Tableau: Inquire whether they've worked with data visualization tools and how they handle data security within those platforms, such as user access controls or secure data sources.
     3. Fivetran: Check whether they've used data integration tools and how they ensure the security of data during the transfer process.
     4. Immuta: Ask if they have experience using data access control platforms like Immuta for managing security and privacy policies, especially in a data pipeline context.
  2. Confirm that candidates have experience owning data security end-to-end. This includes responsibility for the entire data security lifecycle, from data creation and ingestion to storage and access, while ensuring compliance with internal frameworks.
  3. Tech stack:
     1. Airflow: Ask if they've used Airflow for orchestrating data pipelines. Inquire about their experience creating, scheduling, and managing workflows.
     2. Spark: Check whether they have experience with Apache Spark, particularly for big data processing and distributed computing. Focus on their familiarity with both batch and streaming data processing.
     3. Databricks: Confirm whether they've worked with Databricks for building scalable data solutions using Apache Spark. Ask about any real-world projects involving Databricks.
     4. Snowflake: Inquire about their experience with Snowflake as a cloud data platform, especially in terms of data warehousing, security, and scalability.
     5. Delta Lake: Ask if they have experience using Delta Lake on top of a data lake for ensuring ACID transactions and managing large volumes of data with high consistency.
     6. Kafka: Confirm their experience with Kafka, particularly for building data streaming solutions. Ask about handling high-throughput, real-time data streams.
JOB DESCRIPTION:
  1. 5+ years of experience in information security and/or the following areas: security architecture, security engineering, production or network storage engineering, cybersecurity incident investigations, and cloud technologies
  2. Advanced knowledge of cloud security and infrastructure environments for popular cloud providers (AWS, Azure, GCP)
  3. Broad technology expertise with application, system integration, data, and/or infrastructure knowledge
  4. Storage solutions (e.g., SAN, NAS, encrypted storage devices, cloud cache, and storage buckets)
  5. Endpoint protection and Data Loss Prevention solutions
  6. Strong understanding of secure network principles
  7. Working knowledge of configuring and maintaining firewalls and network switching/routing devices (e.g., Palo Alto, SonicWall, Fortinet, Brocade, Cisco, Client)
  8. LAN, WAN, TCP/IP connectivity and security protocols (Point-to-Point, MPLS, VPN)
  9. Network architecture and layer 2 and Layer 3 routing principles
  10. Network authentication standards
  11. Strong understanding of Infrastructure as a Service (IaaS) and Infrastructure as Code (IaC)
  12. Expert knowledge in cloud security auditing tools
  13. Working knowledge of Virtual Private Cloud (VPC) network access control lists
  14. CISSP, CISA/CISM, or CEH designations
  15. 7+ years of data engineering experience working with large data pipelines
  16. Proficiency in at least one major programming language (e.g., Python, Java, Kotlin)
  17. Strong SQL skills and ability to create queries to analyze and extract complex datasets
  18. Experience with data pipeline orchestration systems such as Airflow for creating and maintaining data pipelines
  19. Deep understanding of AWS or other cloud providers, as well as Infrastructure as Code
  20. Excellent written and verbal communication
  21. Advanced understanding of OLTP vs OLAP environments
  22. Willingness and ability to learn and pick up new skill sets
  23. Self-starting problem solver with an eye for detail and excellent analytical and communication skills
  24. Strong background in at least one of the following: distributed data processing, software engineering of data services, or data modeling
  25. Real-time event streaming experience a plus
  26. Familiarity with Scrum and Agile methodologies
Date Posted: 11 April 2025