Mavensoft Technologies

ETL Data Warehouse Engineer

Mavensoft Technologies, Denver, Colorado, United States, 80285


Job Title: ETL Data Warehouse Engineer
Duration: 13 months (contract)
Location: Denver, CO (hybrid, 1 day/week on site; candidate must be local)

Key Skills: Python ETL Development, SQL Proficiency (MS SQL Server/Google BigQuery), Data Modeling.

Overview: We are seeking an experienced Data Specialist with a strong background in SQL (MS SQL Server/Google BigQuery), data warehousing, and Python-based ETL processes. The ideal candidate will play a critical role in refining our data models, improving existing ETL code, and collaborating with business teams to define data reporting needs.

Primary Responsibilities:

- Refine Data Warehouse Model: Enhance and optimize existing data warehouse structures, ensuring data integrity and consistency.
- Improve Python ETL Code: Review, optimize, and expand upon existing Python-based ETL pipelines to ensure efficient and accurate data processing.
- Collaborate with Business Teams: Work with business and analytics teams to understand data reporting and analysis requirements, ensuring solutions align with business goals.
- Write and Maintain ETL Processes: Design, develop, and maintain robust ETL processes using Python, with a focus on scalability, reliability, and performance (see the sketch after this list).
- Data Integration and Transformation: Ensure smooth integration and transformation of data from various sources into the data warehouse, with accurate mapping and application of business logic.
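For a sense of the day-to-day work, here is a minimal sketch of the kind of Pandas/SQLAlchemy ETL step this role would write and maintain. It is illustrative only: the connection strings, table names, and business rules are hypothetical placeholders, not our actual pipeline.

```python
# Illustrative sketch only: DSNs, tables, and transform logic are hypothetical.
import pandas as pd
from sqlalchemy import create_engine

# Source: an MS SQL Server operational database (hypothetical DSN).
source = create_engine(
    "mssql+pyodbc://user:password@source-server/ops_db"
    "?driver=ODBC+Driver+17+for+SQL+Server"
)
# Target: the data warehouse (hypothetical DSN).
warehouse = create_engine(
    "mssql+pyodbc://user:password@dw-server/warehouse"
    "?driver=ODBC+Driver+17+for+SQL+Server"
)

def run_etl() -> None:
    # Extract: pull rows from the source system.
    orders = pd.read_sql(
        "SELECT order_id, customer_id, amount, order_date FROM dbo.orders",
        source,
    )

    # Transform: clean types, apply business logic, aggregate to daily grain.
    orders["order_date"] = pd.to_datetime(orders["order_date"])
    orders["amount"] = orders["amount"].fillna(0.0)
    orders["order_day"] = orders["order_date"].dt.date
    daily = (
        orders.groupby(["customer_id", "order_day"])
        .agg(total_amount=("amount", "sum"), order_count=("order_id", "count"))
        .reset_index()
    )

    # Load: append into a warehouse fact table.
    daily.to_sql("fact_daily_orders", warehouse, schema="dw",
                 if_exists="append", index=False)

if __name__ == "__main__":
    run_etl()
```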

Required Experience:

- Data Modeling: Strong experience with data modeling concepts and techniques.
- SQL Proficiency: Hands-on experience with SQL, particularly MS SQL Server and Google BigQuery.
- Python ETL Coding: Practical experience writing and optimizing ETL code in Python.
- Python Libraries: Familiarity with libraries such as Pandas, SQLAlchemy, and other related tools for data manipulation and integration.
- ETL Tools: Experience using ETL tools like Apache Airflow, NiFi, dbt, or similar for automating and orchestrating data pipelines (see the DAG sketch after this list).
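As an illustration of the orchestration experience we look for, here is a minimal Apache Airflow DAG sketch that schedules a daily load. The DAG id, task id, and the imported etl module are hypothetical, not part of an existing codebase.

```python
# Illustrative sketch only: dag_id, task_id, and the imported module are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

from my_pipelines.orders_etl import run_etl  # hypothetical module holding the ETL function

with DAG(
    dag_id="warehouse_daily_load",   # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",      # run once per day
    catchup=False,                   # do not backfill past runs
) as dag:
    load_orders = PythonOperator(
        task_id="load_orders",
        python_callable=run_etl,     # the ETL function from the earlier sketch
    )
```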

Desired Experience:

- Cloud Platforms: Experience working with Google Cloud Platform (GCP) services, such as BigQuery, Dataflow, or Cloud Storage (see the BigQuery sketch after this list).
- Data Warehouse Methodologies: Familiarity with data warehousing methodologies like Inmon, Kimball, or Data Vault.
- CI/CD Tools: Experience with CI/CD tools and practices, including Git, Terraform, and related technologies for automated deployment and version control.
- Reporting Tools: Proficiency with data visualization and reporting tools such as Tableau, Power BI, or R Shiny.
- Public Health Data: Experience working with public health data is a plus.
- Code Quality & Assurance: A strong focus on writing clean, efficient, and well-documented code; experience with automated testing and code review processes is beneficial.
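For context on the GCP side, here is a minimal sketch of querying BigQuery from Python with the google-cloud-bigquery client. The project, dataset, and table names are hypothetical, and the sketch assumes Application Default Credentials are configured.

```python
# Illustrative sketch only: project, dataset, and table names are hypothetical.
# Assumes Application Default Credentials (e.g., `gcloud auth application-default login`).
from google.cloud import bigquery

client = bigquery.Client(project="example-project")

query = """
    SELECT customer_id, SUM(total_amount) AS lifetime_value
    FROM `example-project.dw.fact_daily_orders`
    GROUP BY customer_id
    ORDER BY lifetime_value DESC
    LIMIT 10
"""

# Run the query and pull the results into a Pandas DataFrame for analysis.
df = client.query(query).to_dataframe()
print(df.head())
```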

Additional Details:

- Contract Duration: This is a long-term contract position with potential for extension based on project needs and performance.
- Communication: The ideal candidate will possess excellent communication skills, both technical and non-technical, and be comfortable collaborating with cross-functional teams.
- Proactive Learning: We value candidates who demonstrate a strong interest in continuous learning and staying current with new technologies and industry best practices.

Email your resume to: usjobs@mavensoft.com. To learn more about Mavensoft, visit us online at http://www.mavensoft.com/