Mukuru
Senior Data Engineer
Mukuru, Hartford, Connecticut, United States
Are you ready to make a significant impact in the world of data engineering? We're actively seeking a Senior Data Engineer to join our incredible team. In this role, you'll take the lead in developing and optimizing our data engineering solutions, with a primary focus on modernizing our data warehousing and processing strategies. Your expertise will guide our transition toward more flexible and scalable data processing patterns, including ELT and the strategic use of Operational Data Stores (ODS), ensuring our ability to meet evolving data requirements in a dynamic technological landscape.

Duties and Responsibilities (Includes But Is Not Limited To)
- Lead the architectural design and implementation of scalable data engineering solutions, leveraging advanced cloud data warehouse technologies (e.g., Snowflake, AWS Redshift, Google BigQuery, Databricks, or Azure Synapse Analytics). This includes promoting the adoption of ELT patterns over traditional ETL processes to enhance data agility and efficiency.
- Champion the development and evaluation of proof-of-concept (POC) initiatives for the adoption of an Operational Data Store (ODS) and other modern data processing frameworks, such as the Medallion Architecture, ensuring our approach remains technology-agnostic and aligned with best practices.
- Oversee the optimization of data flows, utilizing ELT processes to streamline data loading and transformation in cloud data warehouses, ensuring high data quality and accessibility.
- Direct and refine CI/CD processes for seamless data pipeline deployments, incorporating best practices in version control with Git.
- Collaborate with cross-functional teams to capture and address comprehensive data requirements, ensuring robust support for business analytics and decision-making.
- Uphold rigorous data security and compliance standards, aligning with financial industry regulations and evolving data privacy best practices.

Key Requirements
- Experience: Minimum of 5 years in data engineering, including 2+ years in a senior or leadership role, with a preference for experience in the financial services sector.
- Technical Expertise: Proficiency in at least one major cloud data warehouse solution (e.g., Snowflake, AWS Redshift, Google BigQuery, Databricks, Azure Synapse Analytics), with a strong emphasis on implementing ELT patterns and familiarity with modern data architecture frameworks such as the Medallion Architecture.
- Leadership and Innovation: Demonstrated leadership in driving the adoption of modern data processing strategies, with the ability to manage complex projects and innovate within the data engineering space.
- Programming Skills: Strong proficiency in programming languages such as Python or Java, plus demonstrable advanced knowledge of SQL on a cloud data warehouse solution, essential for developing and managing ELT processes.
- Certifications: A cloud platform certification (e.g., AWS Solutions Architect, Google Cloud Professional Data Engineer, Snowflake SnowPro) is highly desirable.
- Communication: Excellent verbal and written communication skills, essential for effective collaboration across teams and with stakeholders.

Minimum Qualifying Attributes
- Hands-on experience with CDC-based data ingestion tools and methodologies.
- Comprehensive understanding of data modeling, ETL/ELT processes, and data security and privacy, especially within the financial industry.

If you do not receive any response within two weeks, please consider your application unsuccessful.

NB: ALL STAFF APPOINTMENTS WILL BE MADE WITH DUE CONSIDERATION OF THE COMPANY'S DIVERSITY AND INCLUSION PLANS.