Data Engineer
ONLY W2
Location: Dallas, TX (onsite)
Duration: Long Term
Rate: $50-$53/hr on W2
Job Summary: We are seeking a skilled Data Engineer to join our team. The ideal candidate will design, build, and maintain scalable data pipelines and architectures for efficient data collection, processing, and storage. You will work closely with data analysts, data scientists, and business stakeholders to ensure data availability, quality, and accessibility to meet business needs.
Responsibilities:
- Design, develop, and maintain robust and scalable data pipelines.
- Build and optimize data architectures (data lakes, data warehouses).
- Integrate data from various sources (internal and external APIs, databases, cloud platforms).
- Ensure data quality, consistency, and reliability across systems.
- Develop and maintain ETL (Extract, Transform, Load) processes.
- Collaborate with cross-functional teams to understand data requirements and deliver solutions.
- Monitor and troubleshoot production data pipelines and workflows.
- Implement data security and compliance best practices.
- Document data flow processes, architecture, and standards.
- Optimize query performance and support analytics initiatives.
Required Qualifications:
- Bachelor's degree in Computer Science, Information Systems, Engineering, or a related field.
- Proven experience as a Data Engineer or in a similar role.
- Proficiency in SQL and experience with relational and non-relational databases (e.g., PostgreSQL, MySQL, MongoDB).
- Experience with data pipeline and workflow management tools (e.g., Apache Airflow, Luigi).
- Knowledge of big data technologies (e.g., Hadoop, Spark).
- Experience with cloud services (AWS, Azure, or GCP) and cloud-based data solutions.
- Strong programming skills in Python, Scala, or Java.
- Understanding of data governance, data security, and related best practices.
Preferred Qualifications:
- Experience with Snowflake, Redshift, BigQuery, or similar data warehouse solutions.
- Familiarity with containerization (Docker, Kubernetes).
- Hands-on experience with real-time data processing (e.g., Kafka, Flink).
- Knowledge of BI tools (Tableau, Power BI) is a plus.
- Strong problem-solving and communication skills.