Saxon Global

Senior Data Engineer

Saxon Global, Newport Beach, California, US 92659


Chipotle

Senior Data Engineer

Newport Beach, CA (Remote)

12+ month contract (Potential for extension)

Key Responsibilities:

- Design, develop, and maintain scalable data pipelines
- Develop data ingestion and integration processes (REST, SOAP, SFTP, MQ, etc.); see the ingestion sketch after this list
- Take ownership of building data pipelines
- Actively engage in technology discovery and implementation, both on-prem and in the cloud (e.g., Azure or AWS), to build solutions for future systems
- Develop high-performance scripts in SQL, Python, etc. to meet enterprise data, BI, and analytics needs
- Incorporate standards and best practices into engineering solutions
- Manage code versions in source control and coordinate changes across the team
- Participate in architecture design and discussions
- Provide logical and physical data design and database modeling
- Be part of the Agile team to collaborate and help shape requirements
- Solve complex data issues around data integration, unusable data elements, unstructured data sets, and other data processing incidents
- Support the development and design of the internal data integration framework
- Work with system owners to resolve source data issues and refine transformation rules
- Partner with enterprise teams, data scientists, and architects to define requirements and solutions
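For context on the ingestion responsibilities above, here is a minimal, illustrative Python sketch of a REST ingestion step that lands raw JSON files for downstream pipeline stages. The endpoint URL, token parameter, and landing path are hypothetical placeholders rather than details from this posting.

```python
# Illustrative only: pull records from a REST endpoint and land them as
# timestamped JSON files that a downstream pipeline step can pick up.
# API_URL, the bearer-token auth, and LANDING_DIR are assumptions.
import json
import pathlib
from datetime import datetime, timezone

import requests

API_URL = "https://api.example.com/v1/orders"        # hypothetical source endpoint
LANDING_DIR = pathlib.Path("/data/landing/orders")   # hypothetical landing zone


def ingest_orders(api_token: str) -> pathlib.Path:
    """Fetch one batch of source records and write them to a timestamped file."""
    response = requests.get(
        API_URL,
        headers={"Authorization": f"Bearer {api_token}"},
        timeout=30,
    )
    response.raise_for_status()
    records = response.json()

    LANDING_DIR.mkdir(parents=True, exist_ok=True)
    stamp = datetime.now(timezone.utc).strftime("%Y%m%dT%H%M%SZ")
    out_path = LANDING_DIR / f"orders_{stamp}.json"
    out_path.write_text(json.dumps(records))
    return out_path
```

Landing the raw payload unchanged and deferring transformation to later steps is a common ELT pattern for pipelines like the one described here; the actual integration could just as well run over SOAP, SFTP, or MQ.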

Key Qualifications:

- Have a B.A./B.S. and 5-8 years of relevant work experience, or an equivalent combination of education and experience
- Must have excellent experience with Snowflake
- Hands-on experience with the Microsoft stack (SSIS, SQL, etc.)
- Possess strong analytical skills with the ability to analyze raw data, draw conclusions, and develop actionable recommendations
- Experience with the Agile development process preferred
- Proven track record of excellence, having consistently delivered past projects successfully
- Hands-on experience with Azure Data Factory V2, Azure Databricks, SQL DW or Snowflake, Azure Analysis Services, and Cosmos DB
- Experience with Python or Scala
- Understanding of continuous integration and continuous deployment on Azure
- Experience with large-scale data lake or warehouse implementations on any of the public clouds (AWS, Azure, GCP)
- Have excellent interpersonal and written/verbal communication skills
- Manage financial information in a confidential and professional manner
- Be highly motivated and flexible
- Effectively handle multiple projects simultaneously and pay close attention to detail
- Have experience in a multi-dimensional data environment

Required Skills:

Chipotle needs a resource who understands the end-to-end process of moving data from on-prem to the cloud. The team will be using Azure Data Factory, moving SSIS packages to the cloud, and ultimately landing data in Snowflake. This Data Engineer may also be developing integrations on the Snowflake data warehouse. To be successful in this role, a candidate needs to be strong with SQL, Azure, and Snowflake. dbt Cloud experience or knowledge would be a huge additional benefit.
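To illustrate the final leg of that on-prem-to-Snowflake flow, here is a small, hedged Python sketch using the snowflake-connector-python package: it PUTs a local extract file to an internal stage and COPYs it into a raw table. The warehouse, database, schema, stage, and table names are assumptions for the example, and in practice a step like this would typically be orchestrated by Azure Data Factory or a similar tool.

```python
# Illustrative only: stage a local extract file into Snowflake and COPY it
# into a raw table. Connection parameters and object names (LOAD_WH, RAW,
# ORDERS, ORDERS_STAGE, ORDERS_RAW) are placeholders, not values from the job.
import snowflake.connector


def load_to_snowflake(local_file: str, account: str, user: str, password: str) -> None:
    """PUT a local JSON file to an internal stage, then COPY it into a raw table."""
    conn = snowflake.connector.connect(
        account=account,
        user=user,
        password=password,
        warehouse="LOAD_WH",   # hypothetical warehouse
        database="RAW",        # hypothetical database
        schema="ORDERS",       # hypothetical schema
    )
    try:
        cur = conn.cursor()
        # Internal named stage; the target table ORDERS_RAW is assumed to have
        # a single VARIANT column so the JSON payload can be copied as-is.
        cur.execute("CREATE STAGE IF NOT EXISTS ORDERS_STAGE")
        cur.execute(f"PUT file://{local_file} @ORDERS_STAGE AUTO_COMPRESS=TRUE")
        cur.execute(
            "COPY INTO ORDERS_RAW "
            "FROM @ORDERS_STAGE "
            "FILE_FORMAT = (TYPE = 'JSON') "
            "ON_ERROR = 'ABORT_STATEMENT'"
        )
    finally:
        conn.close()
```

Because Snowflake's COPY INTO tracks which staged files have already been loaded, re-running a step like this does not normally duplicate data, which makes it a reasonable building block for retry-friendly pipelines; transformations downstream of the raw table are where dbt would come in.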

Basic Qualifications:

SQL, ETL, Azure Data Factory, SSIS, and Snowflake

Additional Skills:

SQL, ETL, Azure Data Factory, SSIS, and Snowflake

Background Check: Yes
Notes:
Selling points for candidate: We have placed 4 contractors on this team, all on long-term engagements; this role provides cutting-edge tech in a stable environment for those who come in and perform.
Project Verification Info:
Candidate must be your W2 Employee: No
Exclusive to Apex: No
Face to face interview required: No
Candidate must be local: No
Candidate must be authorized to work without sponsorship: No
Interview times set: Yes
Type of project: Development/Engineering
Master Job Title: Dev: SQL Server Database
Branch Code: Orange County