Infometry, Inc.
DBT with Snowflake Architect (10+ Years Experience)
Infometry, Inc., California, Missouri, United States, 65018
Job Overview:
We are seeking an experienced DBT with Snowflake Architect to join our data engineering team. The ideal candidate will be responsible for designing, developing, and implementing data transformation workflows using the DBT framework on the Snowflake data warehouse. This role requires a strong understanding of data modeling, SQL, and ETL processes, and the ability to optimize and scale data pipelines for efficient analytics.
Responsibilities:
Design and develop data transformation pipelines using the DBT framework in conjunction with the Snowflake data warehouse (illustrated in the sketch after this list).
Collaborate with data engineers, analysts, and stakeholders to understand data requirements and translate them into DBT models.
Write complex SQL queries to transform, aggregate, and cleanse raw data into meaningful insights.
Create and maintain data models, including logical, physical, and semantic layers.
Develop and enforce best practices for version control, testing, and documentation of DBT models.
Build and manage data quality checks and validation processes within the DBT pipelines.
Optimize data pipelines for performance and scalability, taking advantage of Snowflake’s features like clustering keys and materialized views.
Participate in code reviews and provide constructive feedback to team members.
Stay up-to-date with industry trends and advancements in data engineering, Snowflake, and DBT.
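As a rough illustration of the kind of work described above, the sketch below shows a minimal DBT model targeting Snowflake, using an incremental materialization and a clustering key, together with a singular data test. All model, column, and file names (stg_orders, fct_orders, order_total, etc.) are hypothetical placeholders and not part of this posting.

```sql
-- models/marts/fct_orders.sql  (hypothetical model name)
-- Incremental DBT model on Snowflake: clusters on order_date so date-filtered
-- queries can prune micro-partitions, and only reprocesses new rows on each run.
{{
  config(
    materialized = 'incremental',
    unique_key   = 'order_id',
    cluster_by   = ['order_date']
  )
}}

with source as (

    select * from {{ ref('stg_orders') }}

),

cleaned as (

    select
        order_id,
        customer_id,
        cast(order_date as date)    as order_date,
        coalesce(order_total, 0)    as order_total
    from source
    where order_id is not null

)

select * from cleaned

{% if is_incremental() %}
  -- On incremental runs, process only rows newer than what is already loaded.
  where order_date >= (select max(order_date) from {{ this }})
{% endif %}
```

A singular data test supporting the data quality responsibility might look like the following; DBT treats any returned rows as a test failure.

```sql
-- tests/assert_order_totals_non_negative.sql  (hypothetical test name)
-- Fails the run if any order has a negative total.
select *
from {{ ref('fct_orders') }}
where order_total < 0
```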
Qualifications:
Bachelor’s degree in Computer Science, Engineering, or a related field; Master’s degree is a plus.
Proven experience working with Snowflake and DBT in a data engineering or analytics role.
Strong expertise in SQL and data modeling concepts.
Experience designing and implementing ETL processes and data pipelines.
Familiarity with version control systems (e.g., Git) and collaborative development practices.
Ability to optimize SQL queries for performance and efficiency.
Excellent communication skills to collaborate with cross-functional teams and stakeholders.
Strong problem-solving skills and a proactive attitude towards challenges.
Experience with data warehousing concepts, cloud platforms, and analytics tools is a plus.
Knowledge of Python or other scripting languages is advantageous.