Saxon Global

Data Engineer - Snowflake

Saxon Global, Somerville, MA, United States


Consultant - Data Engineer Job Description

Principal Duties and Responsibilities
  • Perform EDW ETL/ELT ingestions and integrations to support data and analytics needs.
  • Build and enhance standard solutions that provide efficient and scalable ETL/ELT across multiple data sources.
  • Ensure the quality of data assets and robustness of data engineering processes.
  • Apply change control, release management, and other ITIL processes.
  • Participate in building out EDW on Snowflake, expanding and optimizing the data ecosystem, as well as optimizing data engineering processes.
  • Support BI Developers, Data Architects, Data Analysts, and Data Scientists on data initiatives, and ensure the optimal data delivery architecture is consistent across ongoing projects.
  • Conduct ETL solution, code, and pre-production reviews to keep the data delivery architecture consistent across ongoing projects.
  • Perform troubleshooting on ETLs and related components.
  • Identify application bottlenecks and opportunities to optimize performance.
Qualifications
  • 7+ years of experience designing and building data ingestion and data integration solutions for enterprise data and analytics solutions
  • 3+ years of experience developing data pipelines using Snowflake features (Snowpipe, SnowSQL, Snowsight, Streams)
  • 2+ years of experience developing models in dbt
  • Demonstrated knowledge of Scrum and Agile principles
  • Experience with Azure DevOps, Git, and CI/CD data pipeline integrations.
  • Demonstrated knowledge of cloud computing platforms such as Azure
  • Experience supporting and enhancing an existing Enterprise Data Warehouse or Data Lake
  • Ability to design systems that are highly reliable, self-recovering, and require little or no manual support
  • Familiarity with change control, release management, and other ITIL methodologies
  • Deep experience in both traditional and cloud data warehousing
  • Mastery of SQL, especially within cloud-based data warehouses such as Snowflake (see the brief illustrative sketch after this list)
  • Experience with logical and physical data modeling
  • Healthcare experience, particularly with clinical data, Epic, payer data, and reference data, is a plus but not required.
  • Ability to clearly and concisely communicate complex technical concepts to both technical and non-technical audiences
  • Proven verbal communication and presentation skills
  • Proven ability to work independently.
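
For context on the stack above, here is a minimal, illustrative sketch of the kind of Snowflake task the role involves, using the publicly documented snowflake-connector-python package. The account settings, warehouse, database, schema, and table names are placeholders only and do not refer to the client's environment.

    import os

    import snowflake.connector


    def staging_row_count(table: str) -> int:
        """Count rows in a staging table (for example, one loaded continuously via Snowpipe)."""
        # Credentials come from environment variables; every object name below
        # (warehouse, database, schema, table) is an illustrative placeholder.
        conn = snowflake.connector.connect(
            account=os.environ["SNOWFLAKE_ACCOUNT"],
            user=os.environ["SNOWFLAKE_USER"],
            password=os.environ["SNOWFLAKE_PASSWORD"],
            warehouse="ANALYTICS_WH",
            database="EDW_DEV",
            schema="STAGING",
        )
        try:
            cur = conn.cursor()
            cur.execute(f"SELECT COUNT(*) FROM {table}")
            return cur.fetchone()[0]
        finally:
            conn.close()


    if __name__ == "__main__":
        print(staging_row_count("RAW_ENCOUNTERS"))  # placeholder table name

The dbt models mentioned in the qualifications are, in essence, version-controlled SQL SELECT statements that dbt compiles and runs against Snowflake, which is why strong SQL is listed alongside dbt experience.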


Required Skills : Snowflake; dbt (Data Build Tool) - ideally 1+ years; SQL; a scheduling tool - ideally Tidal, but an alternative such as Airflow is acceptable
Basic Qualification : Snowflake development experience; dbt for ETL; strong SQL experience
Additional Skills : Snowflake development experience; dbt for ETL; strong SQL experience; remote
Background Check :Yes
Drug Screen :Yes
Notes :
Selling points for candidate :remote
Project Verification Info :
Candidate must be your W2 Employee :Yes
Exclusive to Apex :Yes
Face to face interview required :No
Candidate must be local :No
Candidate must be authorized to work without sponsorship :No
Interview times set :Yes
Type of project :0012595 | Mass General Brigham - SOW
Master Job Title :BUS
Branch Code :