Resource Informatics Group
GCP Data Engineer
Resource Informatics Group, Irving, Texas, United States, 75084
Job Title: GCP Data Engineer
Location: Atlanta, GA
Duration: Long-term
Work Mode: Onsite
Main Skills:
• 10+ years of application development experience required.
• 4+ years of experience as a GCP Data Engineer; BigQuery, Dataflow, and Cloud Composer are some of the key GCP components.
• Telecom domain experience required.
Responsibilities:
• Build data systems and pipelines on cloud providers (GCP preferred).
• Build algorithms and prototypes (geospatial models are a plus).
• Implement tasks for Apache Airflow (see the sketch after this list).
• Support and organize data in a data warehouse (Snowflake/BigQuery).
• Develop efficient ETL/ELT pipelines.
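
To illustrate the Airflow work described above, here is a minimal sketch of a DAG that loads files from Google Cloud Storage into BigQuery, assuming Airflow 2.x with the apache-airflow-providers-google package installed; the DAG, bucket, and table names are hypothetical, not taken from this posting:

from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator

with DAG(
    dag_id="daily_usage_load",       # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",      # Airflow 2.x style scheduling
    catchup=False,
) as dag:
    # Load the day's CSV files from a GCS landing bucket into a BigQuery table.
    load_usage = GCSToBigQueryOperator(
        task_id="load_usage_to_bq",
        bucket="example-telecom-landing",         # assumed bucket
        source_objects=["usage/{{ ds }}/*.csv"],  # partitioned by run date
        destination_project_dataset_table="analytics.daily_usage",  # assumed table
        source_format="CSV",
        write_disposition="WRITE_TRUNCATE",
        autodetect=True,
    )

Cloud Composer, named in the Main Skills above, runs DAGs of exactly this shape as a managed Airflow service.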
Experience required:
• 10+ years of application development experience.
• 4+ years of GCP experience, including GCP-based big data deployments (batch/real-time) leveraging BigQuery, Bigtable, Google Cloud Storage, Pub/Sub, Data Fusion, Dataflow, Dataproc, and Airflow (a streaming sketch follows this list).
• 2+ years of coding experience in Java/Python.
• Work with the data team to analyze data, build models, and integrate massive datasets from multiple data sources for data modeling.
• Extract, load, transform, clean, and validate data; design pipelines and architectures for data processing.
• Architect and implement next-generation data and analytics platforms on GCP.
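
For the real-time deployments mentioned above, a minimal sketch using the Apache Beam Python SDK, as a Dataflow pipeline would: it reads events from Pub/Sub and writes them to BigQuery. The topic, table, and schema are illustrative assumptions, not from this posting:

import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

def run():
    # streaming=True marks this as an unbounded (real-time) pipeline.
    options = PipelineOptions(streaming=True)
    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadEvents" >> beam.io.ReadFromPubSub(
                topic="projects/example-project/topics/cdr-events")  # assumed topic
            | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
            | "WriteToBQ" >> beam.io.WriteToBigQuery(
                table="example-project:telecom.call_events",  # assumed table
                schema="caller:STRING,callee:STRING,duration_sec:INTEGER",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            )
        )

if __name__ == "__main__":
    run()

The same code runs locally under the default DirectRunner for testing and on Dataflow when launched with --runner=DataflowRunner plus the usual project and region flags.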