CCS IT
Cloud Solution Analyst - Hybrid, Denver, CO - Need locals
CCS IT, Denver, Colorado, United States, 80285
Job Title: Cloud Solution Analyst
Overview: We are looking for a highly experienced Data Specialist with hands-on experience in SQL (ideally MS SQL Server and Google BigQuery), data warehousing, and designing and writing Python code for ETL processes.
Primary Responsibilities:
- Refine the data warehouse model.
- Improve existing Python ETL (Extract, Transform, Load) code.
- Work with business teams to identify data reporting requirements.
Required Experience:
- Data modeling.
- Proficiency in SQL, ideally with experience in MS SQL Server and Google BigQuery.
- Hands-on experience in Python ETL coding (see the sketch after this list).
- Familiarity with Python libraries such as Pandas and SQLAlchemy.
- Experience with ETL tools such as Airflow, NiFi, dbt, or similar.
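For context, the following is a minimal, illustrative sketch of the kind of Python ETL step described above, using Pandas and SQLAlchemy to move one day of data from a SQL Server source into a warehouse staging table. The connection strings, table names, and the run_daily_load helper are hypothetical placeholders, not details of this role.

import pandas as pd
from sqlalchemy import create_engine, text

# Hypothetical connection strings; real DSNs, projects, and table names
# would come from the actual environment, not from this posting.
source_engine = create_engine("mssql+pyodbc://user:password@source_dsn")
# The BigQuery dialect assumes the sqlalchemy-bigquery package is installed.
target_engine = create_engine("bigquery://example-project/analytics")


def run_daily_load(run_date: str) -> None:
    """Extract one day of rows, clean them in Pandas, and append to staging."""
    # Extract: pull the day's raw events from the SQL Server source.
    raw = pd.read_sql(
        text(
            "SELECT event_id, event_ts, amount "
            "FROM dbo.events "
            "WHERE CAST(event_ts AS date) = :run_date"
        ),
        source_engine,
        params={"run_date": run_date},
    )

    # Transform: basic type cleanup and a derived load_date column.
    raw["event_ts"] = pd.to_datetime(raw["event_ts"])
    raw["amount"] = raw["amount"].fillna(0.0)
    daily = raw.assign(load_date=run_date)

    # Load: append into a warehouse staging table.
    daily.to_sql("stg_events", target_engine, if_exists="append", index=False)


if __name__ == "__main__":
    run_daily_load("2024-01-01")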
Desired Experience:
- Knowledge of Google Cloud Platform (GCP) services.
- Familiarity with data warehouse methodologies such as Inmon, Kimball, or Data Vault.
- Experience with Continuous Integration/Continuous Deployment (CI/CD) tools such as Git, Terraform, etc.
- Proficiency in reporting tools like Tableau or R Shiny.
- Experience working with public health data is a plus.
- A keen interest in code and quality assurance.
Additional Details:
This is a long-term contract position. The ideal candidate should have strong problem-solving skills and the ability to communicate effectively with both technical and non-technical stakeholders. The work may involve collaboration with cross-functional teams and requires the ability to adapt to evolving project requirements. The candidate should demonstrate a proactive approach to learning and to keeping up with advancements in technology and industry best practices.
Work Environment:
Remote work. Flexibility in working hours may be required to accommodate different time zones or project deadlines.