Technogen International Company
Snowflake Data Engineer
Technogen International Company, Fort Mill, SC
Job Role : AWS Snowflake Data Engineer
Location : Fort Mill, SC (Onsite)
Mode of interview : Skype/Telephonic
Job Roles/Responsibilities:
We are seeking a highly skilled and experienced Data Engineer to join our team. The ideal candidate will have expertise in the following areas:
1. Snowflake, Snowpark: The candidate should have a deep understanding of the Snowflake data warehousing platform and be proficient in using Snowpark for data processing and analytics.
2. AWS services (Airflow): The candidate should have hands-on experience with AWS services, particularly Apache Airflow for orchestrating complex data workflows and pipelines.
3. AWS services (Lambda): Proficiency in AWS Lambda for serverless computing and event-driven architecture is essential for this role.
4. AWS services (Glue): The candidate should be well-versed in AWS Glue for ETL (Extract, Transform, Load) processes and data integration.
5. Python: Strong programming skills in Python are required for developing data pipelines, data transformations, and automation tasks.
6. DBT: Experience with DBT (Data Build Tool) for modeling data and creating data transformation pipelines is a plus.
Responsibilities:
- Design, develop, and maintain data pipelines and ETL processes using Snowflake, AWS services, Python, and DBT.
- Collaborate with data scientists and analysts to understand data requirements and implement solutions.
- Optimize data workflows for performance, scalability, and reliability.
- Troubleshoot and resolve data-related issues in a timely manner.
- Stay updated on the latest technologies and best practices in data engineering.
Requirements:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- Proven experience in data engineering roles with a focus on Snowflake, AWS services, Python, and DBT.
- Strong analytical and problem-solving skills.
- Excellent communication and teamwork abilities.
- AWS certifications (e.g., AWS Certified Data Analytics - Specialty) are a plus.