JobRialto

Snowflake Data Engineer

JobRialto, Boston, Massachusetts, us, 02298


Position Summary

The client is looking for an experienced Snowflake Data Engineer to lead the implementation of Division of Housing Stabilization data in our new Snowflake data warehouse. The Snowflake Data Engineer plays a critical role in designing, building, and supporting the components of the agency's data warehouse. This position involves working with various stakeholders to ensure accurate and efficient data management, reporting, and analytics. As a Snowflake Data Engineer, you will collaborate closely with the technology group, the agency data team, agency program staff, and external organizations that assist in the delivery of EOHLC programs. The Data Engineer will report to the EOHLC Director of Information Technology Development.

Description of Duties

- Design, develop, and maintain complex Snowflake data warehouses, including data modeling, ETL processes, and data quality control.
- Ensure data integrity, quality, and security across all data warehousing activities.
- Develop and maintain data pipelines to integrate data from various internal and external sources, using tools for data transformation, testing, and deployment.
- Ensure data pipelines are scalable, reliable, and efficient.
- Write Python code to automate data processing, transformation, and loading tasks, including data ingestion, data quality control, and data visualization (a minimal sketch follows this list).
- Collaborate with data analysts and business stakeholders to understand data requirements and develop data solutions that meet business needs.
- Develop and maintain data visualizations and reports using Snowflake's built-in visualization tools or third-party tools such as Tableau.
- Ensure data quality, integrity, and security by developing data validation rules, data cleansing processes, and data access controls.
- Optimize Snowflake data warehouse performance, scalability, and reliability, including monitoring and troubleshooting data issues.
- Develop and maintain technical documentation, including the data dictionary, data flow diagrams, and code comments.
- Participate in code reviews and provide feedback to ensure high-quality code and adherence to coding standards.
- Work closely with the agency's data team and the overall state technology team to ensure alignment on data strategy and technology standards.
- Communicate complex data findings to non-technical stakeholders in a clear and concise manner.
- Provide technical guidance and mentorship to junior analysts and other team members.
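
For illustration only, the sketch below shows the kind of automated ingestion-with-quality-check task described above, assuming a recent snowflake-connector-python and pandas. The account details, source file, column, and table names are placeholders, not details from this posting.

    # Minimal sketch: load a source extract into a Snowflake staging table with a basic quality check.
    import pandas as pd
    import snowflake.connector
    from snowflake.connector.pandas_tools import write_pandas

    conn = snowflake.connector.connect(
        account="example_account",      # placeholder credentials
        user="example_user",
        password="example_password",
        warehouse="LOAD_WH",
        database="HOUSING_DW",
        schema="STAGING",
    )

    # Ingest the extract and apply a simple data quality rule before loading.
    df = pd.read_csv("program_extract.csv")
    df = df.dropna(subset=["case_id"])          # reject rows missing a key identifier
    if df.empty:
        raise ValueError("No valid rows to load")

    # Bulk-load the cleaned frame into a staging table, creating it if needed.
    write_pandas(conn, df, table_name="PROGRAM_EXTRACT_STG", auto_create_table=True)
    conn.close()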

Qualifications:

- 8+ years of experience in Snowflake data warehouse development and management and in the broader data engineering field.
- Strong understanding of Snowflake architecture, data modeling, and data warehousing concepts.
- Proficiency with SQL, including Snowflake's SQL dialect and complex query writing for performance optimization.
- Experience with Snowflake features, including cloning, Time Travel, data sharing, and micro-partitioning (illustrated in the sketch after this list).
- Strong understanding of Snowflake's architecture, including virtual warehouses and the separation of storage and compute.
- Expertise in Snowflake Role-Based Access Control (RBAC), data masking, and encryption.
- Proficiency with data integration tools and ETL/ELT processes.
- Experience with Python or similar scripting languages for automation and data processing.
- Knowledge of data security and compliance standards as they relate to Snowflake.
- Experience with cloud platforms such as AWS and Azure.
- SnowPro Advanced certification is a plus.
- Knowledge of data governance frameworks and tools.
- Strong experience integrating Snowflake with Tableau, Salesforce, third-party APIs, etc.
- Strong analytical, problem-solving, and communication skills.
- Ability to work collaboratively in a cross-functional team environment.
- Excellent verbal and written communication skills with project teams and business teams.
- Team-oriented attitude and a proven ability to collaborate at all levels of an organization.
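
As an illustration of two of the Snowflake features named above, the sketch below runs a zero-copy clone and a Time Travel query through the Python connector. The account details and object names are placeholders.

    # Illustrative use of zero-copy cloning and Time Travel via snowflake-connector-python.
    import snowflake.connector

    conn = snowflake.connector.connect(
        account="example_account", user="example_user", password="example_password",
        warehouse="LOAD_WH", database="HOUSING_DW",
    )
    cur = conn.cursor()

    # Zero-copy clone of a schema, e.g. to test a pipeline change against production-shaped data.
    cur.execute("CREATE SCHEMA HOUSING_DW.STAGING_CLONE CLONE HOUSING_DW.STAGING")

    # Time Travel: read a table as it looked one hour ago, e.g. to diff a suspect load.
    cur.execute(
        "SELECT COUNT(*) FROM HOUSING_DW.STAGING.PROGRAM_EXTRACT_STG AT(OFFSET => -3600)"
    )
    print(cur.fetchone()[0])
    conn.close()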

Desired Skills/Experience:

- SnowPro Advanced Architect or SnowPro Data Engineering certification is preferred.
- Experience building complex data pipelines in a multi-cloud environment.
- Experience implementing data masking and aggregation to reduce privacy risks (a brief masking sketch follows this list).
- Experience integrating data from Salesforce solutions.
- Knowledge of CI/CD practices and tools.
- Familiarity with JIRA or other project management tools.
- Experience using GitHub.
- Experience working on projects in the affordable housing sector or other public benefit programs.
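
For reference, the sketch below shows one common way to reduce privacy risk in Snowflake: a dynamic data masking policy. The role, table, and column names are placeholders for this sketch only.

    # Illustrative dynamic data masking policy applied to a sensitive column.
    import snowflake.connector

    conn = snowflake.connector.connect(
        account="example_account", user="example_user", password="example_password",
        warehouse="LOAD_WH", database="HOUSING_DW", schema="STAGING",
    )
    cur = conn.cursor()

    # Full values are visible only to an authorized role; all other roles see a fixed masked value.
    cur.execute("""
        CREATE MASKING POLICY IF NOT EXISTS ssn_mask AS (val STRING) RETURNS STRING ->
          CASE WHEN CURRENT_ROLE() IN ('PII_FULL_ACCESS') THEN val ELSE 'XXX-XX-XXXX' END
    """)

    # Attach the policy to a sensitive column on the staging table.
    cur.execute("ALTER TABLE PROGRAM_EXTRACT_STG MODIFY COLUMN ssn SET MASKING POLICY ssn_mask")
    conn.close()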

Education:

Bachelor's Degree