esrhealthcare
Data Engineer - New York, NY
esrhealthcare, New York, New York, United States, 10292
Experience level: Mid-senior
Experience required: 5 years
Education level: Bachelor's degree
Job function: Information Technology
Industry: Insurance
Pay rate: $70 per hour
Total positions: 1
Relocation assistance: No
Visa sponsorship eligibility: No
Notes from HM: We are seeking a highly skilled Data Engineer with previous hands-on experience in the Financial or Banking industry, capable of hitting the ground running and quickly building scalable data solutions. The ideal candidate will have a strong background in Azure Cloud and will work within a cross-functional Agile team of 15 people. This role requires someone who can collaborate closely with stakeholders to understand business requirements, particularly for capturing revenue data for brokers. The engineer will also work with compliance and security teams to ensure all systems meet regulatory standards. They will be responsible for building a flexible data capture system while maintaining BAU (Business as Usual) functionality. This is a dynamic, fast-paced environment where the ability to deliver high-quality solutions quickly is critical.
Job Details
Below are the requirements for a Data Engineer position with expertise in Python ETL and hands-on Azure experience. We are seeking a highly skilled and motivated individual who can contribute to our team and help drive our data engineering initiatives.
Job Requirements
1. Python ETL: The ideal candidate should have strong proficiency in the Python programming language and experience with Extract, Transform, Load (ETL) processes. They should be able to design and develop efficient ETL workflows that extract data from various sources, transform it to meet business requirements, and load it into target systems (a minimal Python sketch follows this list).
2. Azure Experience: The candidate should have hands-on experience with the Microsoft Azure cloud platform, be familiar with Azure services and tools, and have a good understanding of Azure architecture and best practices.
3. Azure Data Factory, Databricks, Azure Storage, and Azure VM: The candidate should have practical experience with Azure Data Factory, Databricks, Azure Storage, and Azure Virtual Machines. They should be able to design and implement data pipelines using Azure Data Factory, perform data transformations and analytics using Databricks (see the PySpark sketch after this list), and manage data storage and virtual machines in Azure.
4. Data Governance, Data Quality, and Controls: The candidate should have a strong understanding of data governance principles, data quality management, and data controls. They should be able to implement data governance frameworks, establish data quality standards (basic checks are illustrated in the PySpark sketch after this list), and ensure compliance with data regulations and policies.
5. Implementing alerts and notifications for batch jobs: The candidate should have experience setting up alerts and notifications for batch jobs. They should be able to configure monitoring and alerting mechanisms to ensure timely identification and resolution of issues in batch job execution (a simple failure-alert sketch follows below).
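
For illustration only, a minimal sketch of the kind of Python ETL workflow described in requirement 1 is shown below. The source file, column names (broker_id, revenue, period), and target connection string (a local SQLite database standing in for something like Azure SQL) are assumptions for the example, not details from this posting.

```python
import pandas as pd
from sqlalchemy import create_engine

# Placeholder locations; in practice these would come from configuration.
SOURCE_CSV = "broker_revenue.csv"        # hypothetical source extract
TARGET_DB = "sqlite:///warehouse.db"     # hypothetical target (e.g. Azure SQL in production)

def extract(path: str) -> pd.DataFrame:
    """Extract: read raw records from a CSV source."""
    return pd.read_csv(path)

def transform(df: pd.DataFrame) -> pd.DataFrame:
    """Transform: basic cleanup reflecting assumed business rules."""
    df = df.dropna(subset=["broker_id", "revenue"])            # drop incomplete rows
    df["revenue"] = df["revenue"].astype(float)                # normalize types
    df = df.drop_duplicates(subset=["broker_id", "period"])    # one row per broker and period
    return df

def load(df: pd.DataFrame, target: str) -> None:
    """Load: append the cleaned data to the target table."""
    engine = create_engine(target)
    df.to_sql("broker_revenue", engine, if_exists="append", index=False)

if __name__ == "__main__":
    load(transform(extract(SOURCE_CSV)), TARGET_DB)
```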
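
As a hedged illustration of requirements 3 and 4, the sketch below shows a Databricks-style PySpark job that reads a raw dataset, applies simple data quality controls, and writes a curated output. The mount paths, column names, and the 5% rejection threshold are invented for the example; in practice such a job would typically be triggered from an Azure Data Factory pipeline and read from Azure Storage.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# On Databricks a SparkSession is already provided; getOrCreate() also works locally.
spark = SparkSession.builder.appName("broker_revenue_quality").getOrCreate()

# Hypothetical input/output locations (e.g. mounted Azure Storage containers).
INPUT_PATH = "/mnt/raw/broker_revenue/"
OUTPUT_PATH = "/mnt/curated/broker_revenue/"

df = spark.read.option("header", True).csv(INPUT_PATH)

# Data quality control: reject the batch if too many rows fail basic checks.
total_rows = df.count()
bad_rows = df.filter(F.col("broker_id").isNull() | F.col("revenue").isNull()).count()
if total_rows == 0 or bad_rows / total_rows > 0.05:  # 5% threshold is an assumption
    raise ValueError(f"Data quality check failed: {bad_rows}/{total_rows} invalid rows")

# Transformations: drop incomplete rows, normalize types, deduplicate.
clean = (
    df.dropna(subset=["broker_id", "revenue"])
      .withColumn("revenue", F.col("revenue").cast("double"))
      .dropDuplicates(["broker_id", "period"])
)

clean.write.mode("overwrite").parquet(OUTPUT_PATH)
```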
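
For requirement 5, one common pattern is to wrap the batch entry point so that failures raise an alert, for example by posting to a monitoring webhook. The sketch below is a generic Python illustration; the webhook URL and job function are placeholders, and in an Azure setup the same signal would more often be raised through Azure Monitor alerts on pipeline or job failures.

```python
import logging
import requests

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("batch_alerts")

# Hypothetical notification endpoint (e.g. a Teams or Slack incoming webhook).
ALERT_WEBHOOK_URL = "https://example.com/hooks/data-engineering-alerts"

def notify(message: str) -> None:
    """Send a simple alert payload; failures to notify are logged, not raised."""
    try:
        requests.post(ALERT_WEBHOOK_URL, json={"text": message}, timeout=10)
    except requests.RequestException:
        log.exception("Could not deliver alert")

def run_batch_job() -> None:
    """Placeholder for the real batch workload (an ETL run, reconciliation, etc.)."""
    ...

if __name__ == "__main__":
    try:
        run_batch_job()
        notify("Batch job completed successfully")
    except Exception as exc:
        log.exception("Batch job failed")
        notify(f"Batch job FAILED: {exc}")
        raise
```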