JobRialto
Data Engineer
JobRialto, St. Louis, Missouri, United States
Job Summary
We are seeking a Data Engineer to support the migration of approximately 300 reports and the reworking of SSIS packages into Snowflake, with a completion timeline extending to March 2026. The role involves designing and maintaining data pipelines, developing ETL processes, and collaborating with stakeholders to deliver optimized data solutions.
Key Responsibilities
• Migrate and rework SSIS packages into Snowflake.
• Design, develop, and maintain scalable data pipelines and ETL processes.
• Perform data manipulation, transformation, and performance tuning using SQL and T-SQL.
• Implement data warehousing solutions in Snowflake with an emphasis on optimization and reporting.
• Develop Python scripts for data processing, including zipping and encrypting locally stored files (a brief sketch follows this list).
• Collaborate with stakeholders to gather requirements and create data solutions.
• Manage code and deployments using GitLab across Dev, Test, and Prod environments.
• Document data processes to ensure data quality and integrity.
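As a rough illustration of the file-handling responsibility above, the following is a minimal sketch only. It assumes the standard-library zipfile module and Fernet symmetric encryption from the third-party cryptography package; the paths, function name, and key handling are placeholders for illustration, not the team's actual process.

import zipfile
from pathlib import Path
from cryptography.fernet import Fernet

def zip_and_encrypt(source_dir: str, archive_path: str, key: bytes) -> str:
    """Zip every file under source_dir, then encrypt the archive.

    Returns the path of the encrypted output file. The key must be a
    Fernet key, e.g. one produced by Fernet.generate_key().
    """
    # Bundle the locally stored files into a single zip archive.
    with zipfile.ZipFile(archive_path, "w", zipfile.ZIP_DEFLATED) as zf:
        for path in Path(source_dir).rglob("*"):
            if path.is_file():
                zf.write(path, arcname=path.relative_to(source_dir))

    # Encrypt the archive bytes and write them to a sibling ".enc" file.
    encrypted_path = archive_path + ".enc"
    token = Fernet(key).encrypt(Path(archive_path).read_bytes())
    Path(encrypted_path).write_bytes(token)
    return encrypted_path

# Example usage (placeholder paths; in practice the key would come from a secrets store):
# key = Fernet.generate_key()
# zip_and_encrypt("./extracts", "./extracts.zip", key)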
Required Qualifications
• 1.5+ years of recent experience with Snowflake.
• Proficiency in SQL, capable of writing complex queries and stored procedures.
• Ability to read and modify Python code for data processing tasks.
• Prior experience in the healthcare domain.
Preferred Qualifications
• Experience with data migrations and SSIS packages.
• Familiarity with GitLab for version control and deployments.
Education:
Bachelor's Degree