Prodapt

GCP Data Engineer - Java

Prodapt, Irving, Texas, United States, 75084


Overview

Prodapt is the largest and fastest-growing specialized player in the Connectedness industry, recognized by Gartner as a Large, Telecom-Native, Regional IT Service Provider across North America, Europe and Latin America. With its singular focus on the domain, Prodapt has built deep expertise in the most transformative technologies that connect our world. Prodapt is a trusted partner for enterprises across all layers of the Connectedness vertical. Prodapt designs, configures, and operates solutions across their digital landscape, network infrastructure, and business operations, and crafts experiences that delight their customers. Today, Prodapt's clients connect 1.1 billion people and 5.4 billion devices, and are among the largest telecom, media, and internet firms in the world. Prodapt works with Google, Amazon, Verizon, Vodafone, Liberty Global, Liberty Latin America, Claro, Lumen, Windstream, Rogers, Telus, KPN, Virgin Media, British Telecom, Deutsche Telekom, Adtran, Samsung, and many more. A "Great Place To Work® Certified™" company, Prodapt employs over 6,000 technology and domain experts in 30+ countries across North America, Latin America, Europe, Africa, and Asia. Prodapt is part of the 130-year-old business conglomerate The Jhaver Group, which employs over 30,000 people across 80+ locations globally.

The GCP Data Engineer will be responsible for developing and supporting database applications to drive automated data collection, storage, visualization and transformation as per business needs. The candidate will uphold Prodapt's winning values and work in a way that contributes to the Company's vision.
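To give a concrete flavor of the transformation-and-aggregation work described above, here is a minimal Java sketch. The class and record names (`UsageAggregator`, `UsageRecord`) and the region/usage fields are illustrative assumptions, not part of the role description; in production, logic like this would typically run inside a Dataflow (Apache Beam) job, but the sketch uses only the JDK so it stays self-contained.

```java
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

// Illustrative batch-ETL step: aggregate raw usage records by region,
// the kind of transform/aggregate logic a data pipeline performs before
// loading results into a warehouse table.
public class UsageAggregator {

    // One raw record from an upstream source (field names are hypothetical).
    public record UsageRecord(String region, String device, double gbUsed) {}

    // Group records by region and sum usage, dropping malformed rows first.
    public static Map<String, Double> totalUsageByRegion(List<UsageRecord> records) {
        return records.stream()
                .filter(r -> r.region() != null && r.gbUsed() >= 0) // basic data-quality gate
                .collect(Collectors.groupingBy(
                        UsageRecord::region,
                        Collectors.summingDouble(UsageRecord::gbUsed)));
    }

    public static void main(String[] args) {
        List<UsageRecord> batch = List.of(
                new UsageRecord("east", "modem", 1.5),
                new UsageRecord("east", "router", 2.5),
                new UsageRecord("west", "modem", 3.0),
                new UsageRecord(null, "modem", 9.9)); // malformed row: filtered out

        // Prints the per-region totals, e.g. east=4.0 and west=3.0.
        System.out.println(totalUsageByRegion(batch));
    }
}
```

The same filter-then-aggregate shape maps directly onto Beam's `Filter`/`GroupByKey`/`Combine` transforms when the job is lifted onto Dataflow.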

Responsibilities

- Experience in building GCP data pipelines supporting both batch and real-time streams to enable data collection, storage, processing, transformation and aggregation.
- Analyze, design, code, and test complex ETL processes for data warehouses and operational data stores.
- Familiarity with the technology stack available in the industry for data management, data ingestion, capture, processing and curation.
- ETL development experience with a strong SQL background, analyzing huge data sets, trends and issues, and creating structured outputs.
- Experience in building high-performing data processing frameworks leveraging Google Cloud Platform and Teradata.
- Database design, data modelling and mining.
- Consolidate data across multiple sources and databases to make it easier to locate and access.
- Implement automated data collection and data storage systems.
- Provide database support by coding utilities; respond to and resolve user problems.
- Develop and deploy applications at the direction of leads, including large-scale data processing, computationally intensive statistical modeling, and advanced analytics.
- Data visualization and presentation.

Requirements

- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- 7-10 years of work experience.
- Experience in utilizing GCP services such as BigQuery, Cloud Composer, Dataflow, Pub/Sub and Cloud Monitoring.
- Experience in performing ETL and data engineering work by leveraging multiple Google Cloud components, including Dataflow, Dataproc and BigQuery.
- Experience with scheduling tools such as Airflow and Cloud Composer.
- Understanding of ETL application design, data sources, data targets, relationships, and business rules.
- Experience in JIRA or other project management tools.
- Experience with CI/CD automation pipelines facilitating automated deployment and testing.
- Experience in Bash shell scripts, UNIX utilities and UNIX commands.
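The scheduling experience asked for above (Airflow, Cloud Composer) is about orchestrating recurring jobs. As a purely illustrative, JDK-only sketch of that core idea (not Airflow's API; `MiniScheduler` and `runOnSchedule` are invented names), this shows a task executed on a fixed period until a target number of runs completes:

```java
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

// Illustrative only: Airflow / Cloud Composer schedule whole DAGs of tasks
// with dependencies, retries and backfill. This sketch shows just the bare
// idea of a recurring job, using ScheduledExecutorService from the JDK.
public class MiniScheduler {

    // Run `task` every `periodMillis` ms until `runs` executions complete,
    // then shut the scheduler down. Returns the number of completed runs.
    public static int runOnSchedule(Runnable task, long periodMillis, int runs) {
        ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor();
        CountDownLatch done = new CountDownLatch(runs);
        scheduler.scheduleAtFixedRate(() -> {
            if (done.getCount() > 0) {   // guard against extra ticks after the last run
                task.run();
                done.countDown();
            }
        }, 0, periodMillis, TimeUnit.MILLISECONDS);
        try {
            done.await();                // block until all scheduled runs finish
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        scheduler.shutdownNow();
        return runs - (int) done.getCount();
    }

    public static void main(String[] args) {
        int completed = runOnSchedule(
                () -> System.out.println("extract-transform-load tick"), 50, 3);
        System.out.println("completed runs: " + completed); // completed runs: 3
    }
}
```

In a real deployment the `task` body would trigger an ETL step (a Dataflow job launch, a BigQuery load, a Bash preflight check), and the period, retry and dependency logic would be delegated to the orchestrator rather than hand-rolled.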