Ciber

Data Engineer General

Ciber, Dearborn, Michigan, United States, 48120


HTC Global Services wants you. Come build new things with us and advance your career. At HTC Global you'll collaborate with experts. You'll join successful teams contributing to our clients' success. You'll work side by side with our clients and have long-term opportunities to advance your career with the latest emerging technologies.

At HTC Global Services our consultants have access to a comprehensive benefits package. Benefits can include Paid Time Off, Paid Holidays, 401K matching, Life and Accidental Death Insurance, Short- and Long-Term Disability Insurance, and a variety of other perks.

Description:

Key Responsibilities:

Data Pipeline Development:

Design, build, and maintain scalable and robust data pipelines on GCP using tools such as Apache Airflow, Cloud Composer, and Cloud Dataflow.
Implement data integration solutions to ingest data from various sources, including cloud storage and third-party APIs.

Data Warehousing:

Develop and optimize data warehouse solutions using BigQuery and other GCP services.
Ensure data accuracy, consistency, and security within the data warehouse environment.
Monitor and troubleshoot data pipeline and warehouse issues to maintain system reliability.

Cloud Platform Expertise:

Utilize GCP services such as Cloud Storage, Cloud Run, and Cloud Functions to build scalable and cost-effective data solutions.
Implement best practices for cloud infrastructure management, including resource provisioning, monitoring, and cost optimization.

Collaboration and Communication:

Work closely with data scientists, analysts, and business stakeholders to understand data requirements and deliver high-quality data solutions.
Collaborate with cross-functional teams to design and implement data models, ETL processes, and reporting solutions.

Automation and Optimization:

Develop automated workflows using Apache Airflow and Astronomer to streamline data processing and improve efficiency.
Continuously optimize data pipelines for performance, scalability, and cost-effectiveness.

Documentation and Training:

Create and maintain comprehensive documentation for data pipelines, data models, and infrastructure components.
Provide training and support to team members and stakeholders on data engineering best practices and GCP services.

Skills Required:

Technical Skills:

Proficiency in data pipeline tools and frameworks such as Apache Airflow, Cloud Composer, and Cloud Dataflow.
Strong knowledge of GCP services, including BigQuery, Cloud Storage, Cloud Run, and Cloud Functions.
Experience with SQL, Python, and other programming languages commonly used in data engineering.
Familiarity with data modeling, ETL processes, and data integration techniques.

Soft Skills:

Excellent problem-solving and analytical skills.
Strong communication and collaboration abilities.
Ability to work independently and as part of a team in a fast-paced, dynamic environment.

Experience Required:

5+ years in data warehousing and 2+ years in GCP.

Education Required:

Bachelor's degree in Science.

Education Preferred:

Master's degree in Science.

Our success as a company is built on practicing inclusion and embracing diversity. HTC Global Services is committed to providing a work environment free from discrimination and harassment, where all employees are treated with respect and dignity. Together we work to create and maintain an environment where everyone feels valued, included, and respected. At HTC Global Services, our differences are embraced and celebrated. HTC is an Equal Opportunity Employer. We respect and seek to empower each individual and support the diverse cultures, perspectives, skills, and experiences within our workforce. HTC is proud to be recognized as a National Minority Supplier.