Abode Techzone LLC
Lead GCP Data Engineer
Abode Techzone LLC, Dallas, TX, United States
Job Title: Lead GCP Data Engineer - Airflow
Location: Dallas, TX
Project Duration: 12 months Contract
Interview: Three rounds of Video interview
Job Description :
Primary Skills: Airflow, BigQuery, Dataproc, Spark, Hadoop; team leadership; end-to-end (E2E) delivery
Secondary Skills: Java, Unix shell scripting, Oozie, Google Cloud BigQuery & Bigtable, Google Dataproc, Google Dataflow, Google Cloud Storage
Should have 2+ years of experience, preferably in the Telecom domain
Should have worked as a Java developer early in their career
Big data expert with 10+ years of experience in the Hadoop big data ecosystem
Experience in cloud environments, especially GCP
Experience in developing both batch and real-time streaming data pipelines
Experience as a tech lead on data engineering projects
Writes the complex SQL queries required for data acquisition and ingestion in data pipelines
Builds data pipelines and performs data engineering tasks using technologies such as Python, Hadoop, and Spark
Ensures the upkeep of the Hadoop data lake platform by monitoring Hortonworks HDFS
Monitors the data lake continuously and ensures that the appropriate support teams are engaged at the right times
Works in an Agile/Scrum environment, interacting with the scrum team as well as client stakeholders
Understands client requirements from Agile user stories and develops the low-level designs those stories require
Results-oriented and able to match the pace of the program's demands through continuous self-improvement
Preferably Google Cloud Certified Data Engineer
Obsessively focused on coding standards and code quality
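To illustrate the SQL-driven ingestion work described above, here is a minimal sketch of building a BigQuery-style deduplication query of the kind a pipeline might run before loading data; the table and column names (raw_events, event_id, event_ts) are hypothetical examples, not part of this role's actual environment.

```python
def build_dedup_query(source_table: str, key_col: str, ts_col: str) -> str:
    """Build a SQL query that keeps only the latest row per key.

    Uses ROW_NUMBER() partitioned by the business key and ordered by
    the timestamp column, a common pattern for deduplicating raw
    ingested data before it lands in a curated table.
    """
    return (
        f"SELECT * EXCEPT (rn) FROM ("
        f"SELECT *, ROW_NUMBER() OVER ("
        f"PARTITION BY {key_col} ORDER BY {ts_col} DESC) AS rn "
        f"FROM `{source_table}`"
        f") WHERE rn = 1"
    )

# Hypothetical table/column names for illustration only.
query = build_dedup_query("project.dataset.raw_events", "event_id", "event_ts")
print(query)
```

A query string like this would typically be submitted by an Airflow task or a Dataproc job as one step of a larger pipeline.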