HexaQuEST Global, Inc.

Product Owner with EDA

HexaQuEST Global, Inc., Boston, Massachusetts, 02298


Required skills: EDA (Event-Driven Architecture) architect experience is a must. The candidate must have experience as a Product Owner in the healthcare payer (US health insurance) domain, experience implementing EDA, and excellent communication skills. The candidate must be a senior, experienced resource who will help champion EDA.

We are seeking an experienced and detail-oriented Business Analyst to join our team and take a leading role in driving the implementation and optimization of data pipelines within our AWS data lake and data warehouse. As an AWS Data Pipeline Business Analyst, you will collaborate with business partners, technical teams, and vendors to gather requirements, design efficient data workflows, and ensure successful project completion.

Responsibilities:
- Work closely with business partners to understand their data requirements and objectives, and translate them into well-defined technical specifications for data pipeline implementation.
- Collaborate with technical teams to design and implement data ingestion, transformation, and processing workflows within our AWS data lake and data warehouse.
- Evaluate and recommend appropriate AWS services and technologies for building scalable and reliable data pipelines, such as AWS Glue, AWS Data Pipeline, Amazon Redshift, or AWS Lambda.
- Gather and document data mapping, transformation rules, and data quality requirements to ensure accurate and consistent data flows across the pipelines.
- Collaborate with the Enterprise Data Service team to ensure compliance with data governance, security, and privacy policies throughout the data pipeline lifecycle.
- Facilitate the integration of data from various sources into the data lake and data warehouse, ensuring data integrity, consistency, and optimization.
- Conduct thorough testing and validation of data pipelines to ensure functionality, performance, and reliability, and coordinate with stakeholders to resolve any issues or defects.
- Provide training and support to end users, helping them leverage the full capabilities of the data pipelines and AWS data services.
- Monitor the performance and effectiveness of data pipelines, identify opportunities for optimization and efficiency improvements, and propose recommendations for enhancements.
- Stay up to date with the latest AWS data services, features, and industry best practices, and actively contribute to the continuous improvement of data pipeline processes.

Requirements:
- Proven experience as a Business Analyst implementing and optimizing data pipelines within AWS environments, including AWS data lake and data warehouse solutions.
- Strong understanding of AWS data services and technologies, such as AWS Glue, AWS Data Pipeline, Amazon Redshift, AWS Lambda, and related data storage and processing services.
- In-depth knowledge of data ingestion, transformation, and ETL (Extract, Transform, Load) processes, as well as data integration and data quality principles.
- Experience with data governance, data security, and data privacy considerations in an AWS environment.
- Proficiency in SQL, data modeling, and data visualization tools, as well as scripting languages like Python or PySpark.
- Excellent analytical and problem-solving skills, with the ability to gather and translate complex business requirements into effective data pipeline solutions.
- Strong communication and collaboration skills to engage with business partners, technical teams, and vendors.
- Experience with project management methodologies and tools to drive project execution and ensure successful project completion.
- Bachelor's degree in Computer Science, Information Systems, or a related field is preferred.
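For candidates unfamiliar with the Event-Driven Architecture this role champions, the core idea is that producers emit events and consumers react to them without direct coupling. The following is a minimal, generic publish/subscribe sketch in Python; the event names and payload fields (e.g. "claim.submitted", claim_id) are hypothetical illustrations, not a description of HexaQuEST's actual systems.

```python
from collections import defaultdict

class EventBus:
    """Minimal in-process event bus: producers publish events by type;
    subscribed handlers react independently, with no direct coupling."""

    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, event_type, handler):
        """Register a handler to be called for events of event_type."""
        self._subscribers[event_type].append(handler)

    def publish(self, event_type, payload):
        """Deliver the payload to every handler subscribed to event_type."""
        for handler in self._subscribers[event_type]:
            handler(payload)

# Hypothetical healthcare-payer example: one claim event fans out to
# independent downstream consumers (audit trail, adjudication queue).
bus = EventBus()
audit_log = []
adjudication_queue = []
bus.subscribe("claim.submitted", lambda e: audit_log.append(e["claim_id"]))
bus.subscribe("claim.submitted", lambda e: adjudication_queue.append(e))
bus.publish("claim.submitted", {"claim_id": "C-1001", "member": "M-42"})
```

In a production AWS deployment the in-process bus would typically be replaced by a managed broker, with Lambda functions or Glue jobs as the consumers; the decoupling principle is the same.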
If you are a dedicated Business Analyst with a strong background in AWS data services and data pipeline implementation, and a passion for optimizing data workflows within an AWS data lake and data warehouse, we invite you to join our team. Contribute to the success of our organization by leveraging your expertise to drive efficient and scalable data pipelines, enabling effective data analysis, insights, and decision-making.