Tech Tammina
Senior GCP Data Engineers
Tech Tammina, New York, New York, US, 10261
Summary

The ideal candidate should have extensive experience in IT and data engineering, particularly with GCP in the financial sector. They should be proficient in managing and optimizing data storage, ensuring data security, and using Python and SQL for data manipulation. They must also have a strong background in distributed data processing frameworks, along with the collaboration skills to work with various teams.
Project Type

The project involves designing, developing, and maintaining data pipelines on Google Cloud Platform (GCP). The focus is on transforming raw data into valuable insights. The candidate will work on data ingestion, transformation, and storage optimization, with an emphasis on data security and access control.

Experience:
- 12+ years of IT experience.
- 4+ years of recent experience with a major bank or brokerage house in the US.
- 8+ years as a Data Engineer.
- 5+ years with GCP.
- Hands-on experience with GCP services such as Dataflow, Dataproc, Pub/Sub, BigQuery, and Cloud Storage (GCS).
- Data storage management and optimization with BigQuery, Cloud Storage, and Cloud SQL.
- Implementation of data security and access controls using GCP IAM and Cloud Security Command Center.
- Data manipulation and querying using Python and SQL.
- Distributed data processing with Apache Beam and Apache Spark.

Skills:
- GCP services: Dataflow, Dataproc, Pub/Sub, BigQuery, Cloud Storage (GCS), and Cloud SQL.
- Data security and access control principles.
- Python and SQL for data manipulation and querying.
- Apache Beam and Apache Spark.
- Monitoring and troubleshooting with GCP's Cloud Monitoring tools (formerly Stackdriver).
- Automation of data processing tasks using scripting languages such as Python.
- Strong collaboration skills to work with data experts, analysts, and product teams.
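Several of the requirements above center on using Python and SQL together for data manipulation and querying. As a minimal, self-contained sketch of that combination, the example below uses the standard-library sqlite3 module as a stand-in for BigQuery (a real pipeline would use the BigQuery client with GCP credentials); the table, columns, and function name are hypothetical.

```python
import sqlite3

def load_and_aggregate(rows):
    """Load raw rows with SQL, then aggregate per symbol.

    Illustrative only: sqlite3 stands in for BigQuery so the example
    runs anywhere; the schema is hypothetical.
    """
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE trades (symbol TEXT, qty INTEGER, price REAL)")
    conn.executemany("INSERT INTO trades VALUES (?, ?, ?)", rows)
    # Aggregate notional value per symbol, as a transformation step might.
    cur = conn.execute(
        "SELECT symbol, SUM(qty * price) AS notional "
        "FROM trades GROUP BY symbol ORDER BY symbol"
    )
    result = cur.fetchall()
    conn.close()
    return result

rows = [("AAPL", 10, 100.0), ("AAPL", 5, 110.0), ("MSFT", 2, 300.0)]
print(load_and_aggregate(rows))  # [('AAPL', 1550.0), ('MSFT', 600.0)]
```

The same shape (stage raw records, then aggregate with SQL) carries over to BigQuery, where the query would run against a dataset table instead of an in-memory database.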