Becker Health, a division of MedHQ
Data Engineer II
Becker Health, a division of MedHQ, Philadelphia, Pennsylvania, United States, 19117
Term to Perm. Fully remote - can be based anywhere in the US. A Tableau certification and a strong portfolio on Tableau Public are highly preferred, along with a background in research administration or experience working at a research institute.

Job Summary:
As a Data Engineer at CHOP, you will be responsible for driving the analytics product life cycle, expanding our analytics products while optimizing our data architecture, and developing best practices and governance for administrative reporting, data visualization, and data flow for cross-functional teams. We are looking for a candidate who is experienced in all aspects of data, from development to implementation.

As the Data Engineer II, you will support various stakeholders and ensure consistently optimal product delivery throughout ongoing projects. You will also support non-technical colleagues in collecting and appropriately using administrative data. The ideal candidate is self-directed, comfortable supporting the data needs of multiple teams, systems, and products, and excited about re-designing CHOP's data architecture to support our next generation of products and data initiatives.

Job Responsibilities:
- Conduct data modeling by evaluating structured and unstructured data and determining the most appropriate schema for new fact tables, data marts, etc.
- Collaborate with colleagues across the enterprise to scope requests, extract data from various sources, validate results, create relevant data visualizations, and share them with the requester in Tableau. Develop dashboards and automate refreshes as appropriate in Tableau Server.
- Adhere and contribute to data governance standards; educate and support colleagues in best practices to ensure that data is used appropriately.
- Act as the voice of the customer, offering concrete feedback and project requests, and advocate for analytics from within the business units themselves.
- Assemble large, complex data sets that meet functional and non-functional business requirements.
- Identify, design, and implement internal process improvements, such as automating manual processes, optimizing data delivery, and re-designing infrastructure for greater scalability.
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of sources (including on-premises, hybrid cloud, and cloud) using SQL and various programming technologies.
- Develop analytics tools that use data resources to provide actionable insights, operational efficiency, and other key business performance metrics.
- Work with stakeholders, including the Executive, Administrative, and Analyst teams, to assist with data-related technical issues and support their data infrastructure needs.
- Develop optimized tools for analytics and data science team members that help them build and refine projects into industry leaders.

Minimum Requirements:
- Experience in data analysis, design, and development using Tableau.
- Strong understanding of data modeling, data warehousing, and data integration.
- Proficiency in SQL for data retrieval, manipulation, and analysis.
- Strong communication and collaboration skills; able to work effectively in a team environment.
- Self-motivated and able to work independently.
- Experience integrating predictive and prescriptive models into applications and processes.
- Ability to develop processes supporting data transformation, data structures, metadata, dependency, and workload management.
- Ability to perform root cause analysis on internal and external data and processes to identify opportunities for improvement.

Skills:
- Strong analytic skills for working with structured and unstructured datasets.
- Critical thinking and creative problem-solving skills, with the ability to communicate with stakeholders.
- Project management and organizational skills.
- Experience with relational SQL and NoSQL databases such as IBM PDA (Netezza), MS SQL Server, and HBase.
- Experience with data integration tools such as Informatica, MS Integration Services, and Sqoop.
- Experience consuming and building APIs.
- Knowledge of object-oriented programming languages such as Python, Java, C++, and Scala.
- Familiarity with statistical data analysis tools such as R, SAS, and SPSS.
- Proficiency in visual analytics tools including QlikView, Tableau, and Power BI.
- Familiarity with Agile development methodology.