Revive Staffing Solutions Inc
Senior Data Engineer - 5+ Roles to Be Filled Immediately
Revive Staffing Solutions Inc, Erlanger, Kentucky, United States, 41018
Job Description:
1. Design, build, refactor, and maintain data pipelines using Microsoft Azure, SQL, Azure Data Factory, Azure Synapse, Databricks, Python, and PySpark to meet business requirements for reporting, analysis, and data science.
2. Teach, adhere to, and contribute to DataOps and MLOps standards and best practices to accelerate and continuously improve data system performance.
3. Design and integrate fault tolerance and enhancements into data pipelines to improve quality and performance.
4. Lead and perform root cause analysis, solving problems with analytical and technical skills to optimize data delivery and reduce costs.
5. Engage business end users and share responsibility for leading a delivery team.
6. Mentor Data Engineers at all levels of experience.
How you will do it
• Advanced experience with Microsoft Azure, SQL, Azure Data Factory, Azure Synapse, Databricks, Python, PySpark, SAP Datasphere, Power BI, or other cloud-based data systems
• Advanced experience with Azure DevOps, GitHub, and CI/CD
• Advanced experience with database storage systems such as cloud, relational, mainframe, data lake, and data warehouse
• Advanced experience building cloud ETL pipelines in code or on ETL platforms, using techniques that extract value from large, disconnected datasets
• Experience presenting conceptual and technical improvements to influence decisions
• Continuous learning to upskill data engineering techniques and business acumen
What we look for
• Bachelor's or Master's degree in computer science, software engineering, information technology, or an equivalent combination of data engineering professional experience and education
• 7+ years of proven Data Engineering experience in a complex agile environment, working with database connections, APIs, or file-based sources
• Advanced experience with data warehousing concepts and agile methodology
• Advanced experience designing and coding data manipulations and processing