Saxon Global
Azure Data Engineer
Saxon Global, Milwaukee, Wisconsin, United States, 53244
Must have experience with R, Python, SQL
Must have Azure and Azure Data Factory experience
Should have solid tenure (not moving to a new role every year)
Any TPV candidates: please submit with a LinkedIn profile
Job Description:
ABC Supply, the nation's largest distributor of select exterior and interior building products, is currently seeking a Data Engineer to deliver analytics solutions for the enterprise. ABC has built a world-class platform in Azure using Data Lake, Databricks, and Snowflake. You will be working on the leading edge of data engineering. As a member of the Enterprise Analytics team at ABC, you will deliver high-impact solutions driven by our executive leadership. ABC Supply has received the Gallup Great Workplace Award for 12 consecutive years and is proud to be an employee-first company.
The Data Engineer reports to the BI/DW Supervisor and partners with BI and Software Engineers, Analysts, business stakeholders, and Enterprise Analytics leadership. This individual will build the data pipelines and data structures that create and support our cloud data warehouse. These datasets are heavily used by our business analysts, managers, and data scientists, and they serve as the foundation for self-service BI and Advanced Analytics.
Responsibilities:
Work closely with business and technical teams to deliver enterprise-grade datasets that are reliable, flexible, and scalable and provide a low cost of ownership.
Understand common analytical data models such as Kimball; ensure physical data models align with best practices and requirements.
Build and maintain raw data pipelines from varied sources.
Build and maintain the data warehouse pipelines.
Update and create Azure pipelines to support our continuous deployment model.
Recommend ways to improve data reliability, efficiency, and quality.
Analyze and estimate the feasibility, costs, time, and resources needed to develop and implement enterprise datasets as needed.
Research opportunities for data acquisition and new uses for existing data.
Collaborate with Enterprise Architecture to publish and contribute to architecture standards and roadmaps.
Achieve and maintain relevant technical competencies, and help foster an environment of continued growth and learning among colleagues on existing and emerging technologies.
Qualifications:
A Bachelor's Degree in Computer Science or a related field is required. A high school diploma and/or an equivalent combination of education and work experience may be substituted.
A minimum of 5 years of relevant development experience using integration platforms.
Recent experience using cloud data engineering toolsets is preferred.
Recent experience in Azure using Azure Data Factory, Azure Databricks, and Snowflake is highly preferred.
A minimum of 2 years of experience building database tables and models.
Must be able to write T-SQL fluently for DDL and DML operations.
Strong understanding of enterprise integration patterns (EIP) and data warehouse modeling.
Experience with development and data warehouse requirements gathering, analysis, and design.
Strong business acumen and consistent forward thinking.