Saxon Global

Snowflake Data Engineer

Saxon Global, Charlotte, North Carolina, United States, 28245


This position will support Canteen, a sector of Compass Group focused on unattended retail solutions for clients and consumers across the US. The role will primarily focus on providing expertise in the development and management of our analytical data and reporting platforms.

The Lead Data Engineer role is responsible for the following:
1. Lead the effort to design, implement, and optimize large-scale data and analytics solutions on the Snowflake Cloud data warehouse.
2. Manage and elevate other engineers (full-time, contractor, and/or third-party resources) while remaining hands-on.
3. Maintain and build on our data warehouse and analytics environment utilizing Python, Snowflake, and AWS.
4. Collaborate with the data services team and data architect to develop the strategy for long-term data platform architecture.
5. Assist application teams with the collection of transactional and master data from source systems.
6. Design and implement data movement and transformation pipelines (e.g. AWS Glue, Apache Airflow, dbt, Snowflake).
7. Design processes and algorithms to enhance data quality and reliability.
8. Provide availability and access to consume analytical data through a wide range of Business Intelligence and reporting toolsets (e.g. Microsoft Power BI, Google Looker).
9. Implement proactive monitoring and alerting to ensure operational stability and supportability.
10. Collaborate with data analysts, data scientists, security engineers, and architects to achieve the best possible technical solutions that address our business needs.
11. Perform the analysis and critical thinking required to troubleshoot data-related issues and assist in their resolution.
12. Coordinate change and release management across technical teams inside and outside the department.
13. Minimize operational impact while achieving the best possible performance and system health.
14. Assist with production issues in data warehouses, such as reloading data and transformations.

Required Skills:
- Expertise and excellent proficiency with Snowflake internals and integration of Snowflake with other technologies for data processing and reporting.
- 5+ years in Data Warehousing, Big Data, or ETL development (preferably using AWS S3, Glue, Apache Airflow).
- 5+ years of experience with a programming language (preferably Python).
- Experience designing, building, and maintaining data processing and transformation pipelines (AWS Glue, Apache Airflow, dbt, Snowflake).
- Technical expertise with data models, data mining, and data segmentation techniques.
- Knowledge of CI/CD deployment practices.
- Strong skills with Python and SQL, with the ability to write efficient queries.

Basic Qualification:
Additional Skills:
Background Check: Yes
Drug Screen: Yes
Notes:
Selling points for candidate:
Project Verification Info:
Candidate must be your W2 Employee: Yes
Exclusive to Apex: No
Face to face interview required: No
Candidate must be local: No
Candidate must be authorized to work without sponsorship: Yes
Interview times set: No
Type of project: Development/Engineering
Master Job Title: VMS Access Entry
Branch Code: Charlotte