Saxon Global

Data Engineer

Saxon Global, Jacksonville, Florida, United States, 32290


Data Engineer

Location: Jacksonville, FL (candidates must be open to working onsite) - local or open to relocation

Client: Automotive industry
Rate: $65/hr on C2C
Duration: Long-term contract

Hybrid work model from the beginning of the contract

Candidates must have very good communication skills

Any visa status is acceptable (except OPT and CPT)

Candidates must have an active LinkedIn profile

Please note this is a senior-level role.

Job Description: We are seeking a senior-level Data Engineer in Jacksonville, FL with excellent communication, analysis, and development skills to join our digital transformation product team. The Sr. Data Engineer will be responsible for developing curated data (data models, data ETL pipelines, views) in a cloud-based environment to support business needs for analytics and reporting. The candidate will provide guidance and support for the setup of new environments, along with best practices and recommendations based on industry and hands-on experience as a data engineer. In addition, the candidate should have experience standardizing data modeling and design frameworks for analysis, design, building, testing, and maintenance. Cloud-based data engineering experience is required, and experience with the Azure stack is highly preferred. Candidates must possess the technical capabilities to extract, retrieve, and analyze data with SQL-based tools. The engineer should possess strong interpersonal communication skills to bridge the gap between business and technical users, assess requirements, and deliver data-driven solutions. The role also involves adhering to data governance standards and practices, and requires experience developing in an agile environment using Azure Boards or a similar tool such as Jira.

Responsibilities:
•Implement the retrieval, cleansing, mapping, and transformation of data for optimized storage and use according to business and technical requirements.
•Implement complex integrations of business-critical systems using the Azure data platform.
•Create source-to-target mappings for data being retrieved, transformed, and stored.
•Validate the accuracy and quality of data based on the business domain and requirements.
•Collaborate with the product team to define data, validate quality, and consult on test plans, strategies, and use cases as they relate to the requirements and data.
•Consult and participate in solution design and development using MS Azure and other tools.
•Implement automation for tasks and deploy production-standard code (unit testing, continuous integration, versioning, etc.).
•Work with the product team on incident management, troubleshooting, and enhancements to the product.
•Build Azure ADF pipelines to bring together data from various sources (see the sketch below).
•Implement proper alerting and monitoring solutions for ADF data pipelines.
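To illustrate the kind of pipeline orchestration and monitoring work described above, here is a minimal sketch of triggering an Azure Data Factory pipeline run and polling its status with the azure-mgmt-datafactory Python SDK. The subscription, resource group, factory, pipeline, and parameter names are illustrative placeholders, not details from this posting.

    # Minimal sketch: trigger an ADF pipeline run and poll it until completion.
    # Requires the azure-identity and azure-mgmt-datafactory packages.
    import time

    from azure.identity import DefaultAzureCredential
    from azure.mgmt.datafactory import DataFactoryManagementClient

    SUBSCRIPTION_ID = "<subscription-id>"    # placeholder
    RESOURCE_GROUP = "rg-data-platform"      # placeholder
    FACTORY_NAME = "adf-curated-data"        # placeholder
    PIPELINE_NAME = "pl_load_curated_sales"  # placeholder

    # Authenticate with whatever credential is available (CLI login, managed identity, etc.).
    credential = DefaultAzureCredential()
    adf_client = DataFactoryManagementClient(credential, SUBSCRIPTION_ID)

    # Kick off the pipeline; the parameter name here is hypothetical.
    run_response = adf_client.pipelines.create_run(
        RESOURCE_GROUP, FACTORY_NAME, PIPELINE_NAME, parameters={"load_date": "2024-01-01"}
    )

    # Poll until the run reaches a terminal state.
    while True:
        run = adf_client.pipeline_runs.get(RESOURCE_GROUP, FACTORY_NAME, run_response.run_id)
        if run.status in ("Succeeded", "Failed", "Cancelled"):
            print(f"Pipeline finished with status: {run.status}")
            break
        time.sleep(30)

A production setup would typically run this from an automation or CI/CD job and raise an alert on failure rather than simply printing the status.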

Required Skills/Experience:
•Minimum of 5-7 years of data warehousing, data lifecycle management, and computer programming experience
•5+ years of experience developing complex system integrations on the Azure cloud platform
•5+ years of DW/BI development experience on the Azure cloud platform
•Experience with Azure: ADLS, Stream Analytics, SQL DW, Cosmos DB, Analysis Services, Azure Functions, serverless architecture, ARM/Bicep templates
•Experience in database design and data modeling
•Hands-on experience with Azure Data Factory, building pipelines and orchestrating pipeline execution
•Strength in data engineering/development, writing complex SQL, stored procedures, and functions in Azure SQL DB
•Hands-on experience in data engineering/development with NoSQL databases such as Cosmos DB
•Strong learning orientation and curiosity; comfortable learning new systems and software applications
•Strong analytical and problem-solving skills
•Strong written and oral communication skills
•Ability to work in a fast-paced, cross-functional, agile product team
•Detail oriented
•Good time management and multitasking skills
•Hands-on experience writing Azure Functions (see the sketch below)
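As an illustration of the Azure Functions and Azure SQL skills listed above, here is a minimal sketch of a timer-triggered Azure Function in Python (v1 programming model; the function.json timer binding is omitted) that calls a hypothetical stored procedure in Azure SQL DB via pyodbc. All server, database, and procedure names are assumed placeholders.

    # Minimal sketch: timer-triggered Azure Function that refreshes a curated table
    # by calling a stored procedure in Azure SQL DB. Connection settings come from
    # app settings; the stored procedure name is hypothetical.
    import logging
    import os

    import azure.functions as func
    import pyodbc


    def main(mytimer: func.TimerRequest) -> None:
        conn_str = (
            "DRIVER={ODBC Driver 18 for SQL Server};"
            f"SERVER={os.environ['SQL_SERVER']};"
            f"DATABASE={os.environ['SQL_DATABASE']};"
            f"UID={os.environ['SQL_USER']};"
            f"PWD={os.environ['SQL_PASSWORD']}"
        )
        conn = pyodbc.connect(conn_str)
        try:
            cursor = conn.cursor()
            # Hypothetical stored procedure that rebuilds a curated reporting table.
            cursor.execute("EXEC dbo.usp_refresh_curated_sales")
            conn.commit()
        finally:
            conn.close()
        logging.info("Curated data refresh completed.")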

Job Requirements:
•Minimum of a Bachelor's degree in Computer Science, Management Information Systems, or a related field required
•Minimum 5 years of data engineering experience required
•Experience with the Azure data platform required
•Strong technical background required
•Certification in Azure Fundamentals or higher a plus
•Experience with CI/CD pipelines a plus
•Experience with Azure DevOps a plus
•ETL experience a plus
•This role is hybrid.