RIT Solutions, Inc.
GCP Data Engineer
RIT Solutions, Inc., Tampa, Florida, US, 33646
Urgent fill - We need someone who is currently hands-on with building end-to-end data pipelines using GCP services such as Dataflow, Spark+Python (Dataproc), Composer, GCS, BQ, etc. This is a more hands-on developer role.
Would love someone in Chicago, but would take someone remote. 10+ years of experience. Waiting on rate information.
Client - JOB TITLE:
GCP Data Engineer
POSITION SUMMARY:
The GCP Data Engineer is responsible for analysis, design, development, testing, and deployment support for new frameworks, and for enhancing the existing frameworks that enable the data platform's data pipelines. This position requires working independently in a highly dynamic, fast-paced environment. The engineer will work alongside architects, engineers, analysts, and PMs to deliver scalable, robust, innovative technical solutions. The position plays a key role in building real-time and batch data ingestion and egress frameworks and a streaming analytics framework, and in supporting the AI platform. Prior experience in a similar role is required.
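For illustration only, here is a minimal Apache Beam (Dataflow) sketch of the real-time ingestion path the summary describes: reading messages from Pub/Sub and appending them to BigQuery. The project, topic, and table names are hypothetical placeholders, not the client's actual resources.

import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

# Streaming pipeline options; "my-project" and the region are hypothetical.
options = PipelineOptions(
    streaming=True,
    project="my-project",
    region="us-central1",
    runner="DataflowRunner",
)

with beam.Pipeline(options=options) as p:
    (
        p
        # Hypothetical topic; messages arrive as UTF-8 JSON bytes.
        | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
            topic="projects/my-project/topics/events")
        | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
        # Append parsed rows to a pre-created BigQuery table (hypothetical name).
        | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
            "my-project:analytics.events",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
        )
    )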
RESPONSIBILITIES
- Build frameworks for large-scale data processing, evaluating appropriate emerging technologies and approaches that will power data-driven capabilities across the enterprise.
- Develop data solutions on Google Cloud Platform leveraging Dataflow, Dataproc, Composer, Pub/Sub, BigQuery, GCS, and Cloud Functions; define workflows and scheduled and event-driven workloads to ingest data from internal/external partners, data distribution channels, etc.
- Build features using Python in Spark, leveraging GCP's Spark engine (Dataproc), and SQL/GSQL on BigQuery (a sketch follows this list).
- Build and maintain scalable data pipelines to handle high-volume data (e.g., 150 million rows).
- Collaborate with cross-functional technologists across the organization to gather requirements, solve new problems, and deliver quality results.
- Coordinate with offshore engineers to complete projects and tasks.
- Develop and execute test plans to validate the implementation and performance of frameworks, and recommend performance improvements.
- Support the operations of deployed solutions, investigate complex issues, and assist with the resolution and implementation of preventive measures.
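As referenced above, a minimal PySpark sketch of the kind of batch pipeline this role covers: reading raw files from GCS, applying a simple transformation, and writing to BigQuery via the spark-bigquery connector (assumed to be available on the Dataproc cluster). The bucket, project, and table names are hypothetical.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("gcs-to-bq-example").getOrCreate()

# Read raw partner files landed in GCS (hypothetical path).
raw = spark.read.json("gs://my-bucket/landing/orders/*.json")

# Example transformation: de-duplicate on a key and stamp the load time.
cleaned = (
    raw.dropDuplicates(["order_id"])
       .withColumn("load_ts", F.current_timestamp())
)

# Append to BigQuery through the spark-bigquery connector;
# the connector stages data in a temporary GCS bucket.
(
    cleaned.write.format("bigquery")
    .option("table", "my-project.analytics.orders")   # hypothetical table
    .option("temporaryGcsBucket", "my-bucket-tmp")    # hypothetical staging bucket
    .mode("append")
    .save()
)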
REQUIREMENTS FOR CONSIDERATION:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 8+ years of experience in data engineering, with a focus on GCP technologies.
- Proficiency in Google Cloud Platform, leveraging Dataflow, Dataproc, Composer, Pub/Sub, BigQuery, GCS, and Cloud Functions to define workflows and scheduled and event-driven workload processing (a Composer sketch follows this list).
- Hands-on experience and solid knowledge of building and maintaining end-to-end data pipelines using Python and Spark (GCP Dataproc) or other GCP services.
- Strong experience with BigQuery, including SQL and stored procedures.
- Experience with high-volume data processing (GB-level).
- Previous experience working in a similar role with GCP services is a must.
- Experience with data quality and continuous integration build and deployment processes using GitHub, Jenkins, and Unix/Linux shell scripts.
- Proactive; able to catch issues before they become failures.
- Strong work ethic; takes pride in producing a quality product; a strong team player.
- Works with production support and project consultants in an onshore/offshore model.
- Supports off-hours platform issues and code deployments as needed.
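As noted in the Composer requirement above, a hedged sketch of how such workloads might be scheduled in Cloud Composer (Airflow 2.4+): a DAG that submits the PySpark job above to an existing Dataproc cluster, then runs a validation query in BigQuery. The operators are real Airflow Google-provider operators; the project, cluster, bucket, and table names are hypothetical placeholders.

from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.dataproc import DataprocSubmitJobOperator
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="daily_orders_pipeline",
    schedule="@daily",
    start_date=datetime(2024, 1, 1),
    catchup=False,
) as dag:
    # Submit the PySpark job to an existing Dataproc cluster (hypothetical names).
    spark_job = DataprocSubmitJobOperator(
        task_id="gcs_to_bq_spark",
        project_id="my-project",
        region="us-central1",
        job={
            "placement": {"cluster_name": "etl-cluster"},
            "pyspark_job": {"main_python_file_uri": "gs://my-bucket/jobs/gcs_to_bq.py"},
        },
    )

    # Post-load data-quality check: count rows in the target table.
    validate = BigQueryInsertJobOperator(
        task_id="validate_row_count",
        configuration={
            "query": {
                "query": "SELECT COUNT(*) FROM `my-project.analytics.orders`",
                "useLegacySql": False,
            }
        },
    )

    spark_job >> validate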