Dice
Google Cloud Platform Data Engineer - Sunnyvale, CA/Bentonville, AR (Hybrid) - C
Dice, Sunnyvale, California, United States, 94087
Dice is the leading career destination for tech experts at every stage of their careers. Our client, Empower Professionals, is seeking the following. Apply via Dice today!
Role: Google Cloud Platform Data Engineer
Location: Sunnyvale, CA/Bentonville, AR (Hybrid)
Duration: 12+ Months
Requirements:
- 2+ years of recent Google Cloud Platform experience
- Experience building data pipelines in Google Cloud Platform
- Experience with Google Cloud Platform Dataproc, GCS, and BigQuery
- 5+ years of hands-on experience developing data warehouse solutions and data products
- 5+ years of hands-on experience developing a distributed data processing platform with Hadoop, Hive or Spark, and Airflow or another workflow orchestration solution
- 2+ years of hands-on experience modeling and designing schemas for data lakes or RDBMS platforms
- Experience with programming languages: Python, Java, Scala, etc.
- Experience with scripting languages: Perl, Shell, etc.
- Experience working with, processing, and managing large data sets (multi-TB/PB scale)
- Exposure to test-driven development and automated testing frameworks
- Background in Scrum/Agile development methodologies
- Able to deliver on multiple competing priorities with little supervision
- Excellent verbal and written communication skills
- Bachelor's degree in computer science or equivalent experience

Preferred:
- Gitflow
- Atlassian products: BitBucket, JIRA, Confluence, etc.
- Continuous integration tools such as Bamboo, Jenkins, or TFS