Collabera
GCP Data Engineer
Collabera, Phoenix, Arizona, United States, 85003
Description
GCP Data Engineer
Contract: Phoenix, Arizona, US
Salary: $60.00 Per Hour
Job Code: 355936
End Date: 2024-12-15
Client's Domain: Financial
Job Title: Data Engineer
Location: Phoenix, AZ 85054 (Hybrid)
Duration: 6+ months (potential contract to hire)
Pay Rate: $55-$60 per hour
Note: Minimum 7 years of experience.

About the Role:
We are seeking a talented Data Engineer to join our team and help us unlock the power of data. In this role, you will be responsible for designing, developing, and maintaining robust data pipelines to support critical business initiatives. You will work closely with data scientists, analysts, and business stakeholders to ensure data quality, accessibility, and security. This is a hybrid role based in Phoenix, offering a flexible work arrangement. You will have the opportunity to work on cutting-edge projects, collaborate with talented colleagues, and contribute to the growth of our organization.

Key Responsibilities:
- Data Pipeline Development: Build and maintain efficient data pipelines using Python and SQL to ingest, transform, and load data from various sources into our data warehouse.
- GCP Expertise: Leverage GCP services such as Dataflow, Dataproc, and BigQuery to implement scalable and cost-effective data solutions.
- Bitquery Mastery: Use Bitquery to extract valuable insights from blockchain data and integrate them into our data pipelines.
- Infrastructure Development: Design and implement robust data infrastructure, including data lakes, data warehouses, and data marts, to support our growing data needs.
- AI/ML Integration: Collaborate with AI/ML teams to integrate machine learning models into our data pipelines and applications.
- Data Governance and Security: Ensure data quality, security, and compliance with industry standards and regulations.
- Performance Optimization: Continuously monitor and optimize data pipelines for performance and efficiency.
- Cross-functional Collaboration: Work closely with teams across the organization to understand business requirements and translate them into technical solutions.

Required Skills and Experience:
- Strong proficiency in SQL
- Working experience with GCP; knowledge of blockchain technology and Bitquery
- Expertise in building and maintaining data pipelines using Python
- Familiarity with data warehousing and data lake concepts

Preferred Skills and Experience:
- Experience with AI/ML techniques and tools
- Prior experience working with a financial or banking client

Job Requirement: Python, SQL, GCP, AI/ML

Recruiter Contact: Mradul Khampariya, mradul.khampariya@collabera.com, (813) 937-1148