Futran Tech Solutions Pvt. Ltd.
GCP Data Architect
Futran Tech Solutions Pvt. Ltd., Warren, New Jersey, US 07059
Primary Skills
Expert in the following:
GCP Data Architecture and designing streaming & batch pipelines
Data processing services: Dataproc (PySpark) & Dataflow (Apache Beam)
Cloud databases: Spanner, Cloud SQL, Memorystore, BigQuery
Looker Studio & Operations Suite (Cloud Monitoring and Logging)
Google Cloud Certified Professional Data Engineer
Secondary Skills
Python, Flask / FastAPI development
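The streaming-pipeline skill above can be illustrated with a minimal sketch of the fixed-window grouping that Dataflow (Apache Beam) applies to event streams. This is a pure-Python illustration of the concept only, not Beam's actual API, and the event data is hypothetical:

```python
from collections import defaultdict

def fixed_windows(events, window_size=60):
    """Group (timestamp, value) events into fixed, non-overlapping windows,
    the way a Beam FixedWindows transform buckets a stream by event time."""
    windows = defaultdict(list)
    for ts, value in events:
        window_start = (ts // window_size) * window_size  # start of this bucket
        windows[window_start].append(value)
    return dict(windows)

# Hypothetical events: (timestamp_seconds, value) pairs.
events = [(5, "a"), (42, "b"), (61, "c"), (130, "d")]
print(fixed_windows(events))
# {0: ['a', 'b'], 60: ['c'], 120: ['d']}
```

In a real Dataflow job the same grouping would be expressed with Beam's `WindowInto(FixedWindows(60))` transform; the batch case is the degenerate one where all events fall in a single bounded window.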
Role Description:
Develop data solutions in distributed microservices and full-stack systems
Responsible and accountable for requirements gathering and for creating architecture, high-level, and detailed technical designs using a GCP data-focused reference architecture
Participate in the analysis of new technologies and propose the optimal solutions that best satisfy current requirements and simplify future modifications
Provide design expertise in Master Data Management, Data Quality, and Metadata Management
Design systems that provide complete observability and support continuous improvement in DevOps automation
Design appropriate data models for use in transactional and big data environments as input into Machine Learning processing
Design and build the infrastructure required for optimal ETL from a variety of data sources onto GCP services
Experience with modern SQL and NoSQL data stores
Ensure clarity on non-functional requirements (NFRs) and implement them
Develop data and semantic interoperability specifications
Build relationships with client stakeholders to establish a high level of rapport and confidence; work with clients, local teams, and offshore resources to deliver modern data products
Collaborate with external vendors to support data acquisition
Knowledge of the SAFe delivery model; groom requirements for the PI backlog and sprint backlogs, and identify technical dependencies between teams
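The ETL responsibility above can be sketched as a minimal extract-transform-load flow in pure Python. The field names and data-quality rules are hypothetical stand-ins for what a GCP pipeline would do; the output is newline-delimited JSON, the format BigQuery batch loads accept:

```python
import csv
import io
import json

# Hypothetical raw input, as it might arrive from a source system.
RAW_CSV = """order_id,amount,currency
1001,25.50,USD
1002,,USD
1003,99.99,EUR
"""

def extract(text):
    """Extract: parse CSV text into dict records."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(records):
    """Transform: apply a data-quality rule (drop rows with no amount)
    and cast fields to their target types."""
    out = []
    for r in records:
        if not r["amount"]:
            continue  # skip incomplete rows
        out.append({"order_id": int(r["order_id"]),
                    "amount": float(r["amount"]),
                    "currency": r["currency"]})
    return out

def load(records):
    """Load: serialize to newline-delimited JSON, ready for a BigQuery
    batch load (or a Cloud Storage staging bucket)."""
    return "\n".join(json.dumps(r) for r in records)

print(load(transform(extract(RAW_CSV))))
```

In production these three steps would typically run as Dataflow or Dataproc transforms rather than plain functions, but the shape of the flow is the same.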