TechnoGen
"Data Engineer"
TechnoGen, Dallas, Texas, United States, 75215
Job Title: Data Engineer
Job Location: Remote
Mandatory Skills:
• Strong hands-on experience with GCP
• Strong experience with Python development
• Experience with GCP, SQL, PySpark, Python, and Airflow
Job Description:
We are seeking a talented Data Engineer with strong hands-on experience with Snowflake, expertise in Python development, and familiarity with GCP, SQL, PySpark, and Airflow. The ideal candidate will be responsible for designing, developing, and maintaining scalable data pipelines and analytics solutions that leverage Snowflake as the core data platform. You will work closely with cross-functional teams to understand data requirements, implement data models, and ensure data integrity and reliability throughout the data lifecycle.
Key Responsibilities:
• Design, develop, and maintain data pipelines and ETL processes using Snowflake, Python, SQL, PySpark, and Airflow (a minimal sketch of this pattern follows below).
• Collaborate with data scientists, analysts, and business stakeholders to understand data requirements and translate them into technical solutions.
• Optimize and tune data pipelines for performance, scalability, and reliability.
• Implement data quality checks and monitoring to ensure data integrity and reliability.
• Contribute to the design and development of data warehouses, data lakes, and other data infrastructure components on GCP.
• Stay updated with the latest technologies and best practices in data engineering and contribute to continuous improvement initiatives.
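For illustration only (not part of the posting's requirements): a minimal sketch of the kind of pipeline this role describes, assuming Airflow 2.4+ with the Snowflake provider installed, a configured Snowflake connection named "snowflake_default", and hypothetical bucket paths, stage, and table names.

from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator
from airflow.providers.snowflake.operators.snowflake import SnowflakeOperator


def aggregate_orders():
    # Hypothetical PySpark step: read raw order events, roll them up by day,
    # and stage the result as Parquet for loading into Snowflake.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("daily_orders_rollup").getOrCreate()
    orders = spark.read.parquet("gs://example-bucket/raw/orders/")  # placeholder path
    daily = orders.groupBy("order_date").sum("amount")
    daily.write.mode("overwrite").parquet("gs://example-bucket/staging/daily_orders/")
    spark.stop()


with DAG(
    dag_id="daily_orders_pipeline",  # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",               # Airflow >= 2.4; older versions use schedule_interval
    catchup=False,
) as dag:
    transform = PythonOperator(
        task_id="aggregate_orders",
        python_callable=aggregate_orders,
    )

    load_to_snowflake = SnowflakeOperator(
        task_id="load_daily_orders",
        snowflake_conn_id="snowflake_default",  # assumed connection id
        sql="COPY INTO analytics.daily_orders FROM @staging_stage/daily_orders/ "
            "FILE_FORMAT = (TYPE = PARQUET) MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE;",
    )

    # Run the PySpark transformation first, then load the staged output into Snowflake.
    transform >> load_to_snowflake

The stage, schema, and table names above are placeholders; in practice the load step would follow the team's actual Snowflake staging layout and naming conventions.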