New York Life Insurance Company
Corporate Vice President - Technical Data Lead
New York Life Insurance Company, Lebanon, New Jersey, US 08833
We are seeking an energetic and passionate Technical Data Lead to design and execute our company’s modern data management strategy. As a key member of the Enterprise Data Platform team, you will work with talented engineers, architects, data scientists, and end users to reimagine, design, and build the next generation of insurance solutions in the cloud, leveraging emerging and modern technologies and platforms including AWS and Azure cloud, APIs, and AI. In this role, you will be responsible for the design, development, and deployment of optimal data integration pipelines, taking advantage of leading-edge technologies through experimentation and iterative refinement. You will assist other team members in building and executing automation that effectively and repeatably ensures the quality, security, integrity, and maintainability of our solutions. You will also evaluate the existing data integration architecture, evolve and enhance its foundation, and participate in product ideation, technology stack evaluation, and recommendations.

What You’ll Do:
- Lead data solution design: architect, design, and implement modern data solutions in AWS Cloud, leveraging ETL/ELT processes and data integration tools such as IDMC and DBT.
- Manage the integration of data from various internal and external sources using modern data platforms, ensuring clean, reliable, and scalable data pipelines.
- Work closely with business stakeholders, business analysts, and data architects to review business requirements and understand business processes, data entities, data producers and consumers, and the data dependencies between departments.
- Ensure that the data architecture supports the organization’s data governance framework, promoting data quality, integrity, and regulatory compliance.
- Provide technical leadership and mentoring to the data engineering team, setting best practices for the development, testing, and documentation of data workflows.
- Review data models and ETL/ELT mappings, designs, and workflows.
- Provide solutions to the ETL team and deliver efficient, scalable data warehouse solutions in a high-data-growth environment.
- Investigate production data issues and, acting as an SME, provide workarounds and fixes.
- Function as a bridge between technical and business audiences during solution planning, development, and deployment.
- Understand onsite and offshore delivery models and lead the BA, DM, ETL, and QA teams.
- Own end-to-end technical delivery of all efforts and projects.
- Create reliable solution plans, including cost estimates and optimized delivery approaches, by working with the business and multi-disciplinary teams.

What You’ll Bring:
- Five or more years of experience in enterprise-level delivery of IT solutions, including stakeholder management.
- Five or more years of experience managing multiple work streams of IT technical delivery through close coordination with business leads and product managers.
- Five or more years of experience with the SAFe Agile framework and processes, including estimation, scheduling, financial planning, and budgeting.
- Strong experience managing staffing and identifying resource gaps and contention issues across internal employees and vendor resources/consultants.
- Proven experience creating data lineage documents, source-to-target mapping (STM) documents, and low-level technical specification documents.
- Deep understanding of modern data architecture, including experience with data lakes, data warehouses, data marts, relational and dimensional modeling, data quality, and master data management.
- Strong background in operational data stores, dimensional modeling, and supporting application data architecture.
- Experience with Oracle, Postgres, and MySQL, and an understanding of best-practice architectural concepts for relational data models.
- 10+ years of designing and implementing ETL/ELT frameworks for complex warehouses and data marts.
- Proven experience with ETL/ELT tools such as DataStage, IDMC, DBT, and AWS Glue.
- Strong knowledge of AWS Cloud technologies; proficient in cloud architecture and best practices, particularly AWS tools such as Amazon RDS, Redshift, AWS DMS, AWS Glue, Amazon S3, and AWS Lambda.
- Hands-on development mentality, with a willingness to troubleshoot and solve complex problems.
- Strong analytical and problem-solving skills and the ability to adapt to a fast-paced environment.
- Full life-cycle software development experience.
- Keen ability to prioritize and handle multiple assignments.
- Solid understanding of the insurance, annuities, LTC, disability, wealth management, or financial services industry.
- Bachelor’s degree in computer science or an engineering discipline.
- Certification in data management (structured and unstructured) and/or cloud architecture or engineering is a plus.
- Certification in delivery methodologies such as Scaled Agile is a plus.
- 10+ years of experience in data engineering, data integration, and cloud technologies developing production-grade data components and solutions, and 15+ years of overall IT experience with a focus on data.