Falconsmartit
Data Engineer
Falconsmartit, Sunnyvale, California, United States, 94087
Job Title:
Data Engineer
Location:
Sunnyvale, CA (onsite)
Job Type:
Contract
Must Have Skills:
Hadoop - 8+ Yrs of Exp
Spark - 8+ Yrs of Exp
Scala - 8+ Yrs of Exp
GCP - 5+ Yrs of Exp
ETL Process / Data Pipeline experience - 8+ Yrs of Exp
Domain Experience (If any):
Retail
Responsibilities:
Design and develop big data applications using the latest open source technologies.
Collaborate in an offshore delivery model and take ownership of outcomes.
Develop logical and physical data models for big data platforms.
Automate workflows using Apache Airflow.
Create data pipelines using Apache Hive, Apache Spark, and Apache Kafka.
Provide ongoing maintenance and enhancements to existing systems and participate in rotational on-call support.
Learn the business domain and technology infrastructure quickly and share knowledge actively with others in the team.
Mentor junior engineers on the team.
Lead daily standups and design reviews.
Groom and prioritize the backlog using JIRA.
Act as the point of contact for your assigned business domain.
Requirements:
2+ years of recent GCP experience.
Experience building data pipelines in GCP.
Experience with GCP Dataproc, GCS, and BigQuery.
5+ years of hands-on experience developing data warehouse solutions and data products.
5+ years of hands-on experience developing distributed data processing platforms with Hadoop, Hive, or Spark, and with Airflow or another workflow orchestration solution.
2+ years of hands-on experience modeling and designing schemas for data lakes or RDBMS platforms.
Experience with programming languages: Python, Java, Scala, etc.
Experience with scripting languages: Perl, Shell, etc.
Practical experience working with, processing, and managing large data sets (multi-TB/PB scale).
Exposure to test-driven development and automated testing frameworks.
Background in Scrum/Agile development methodologies.
Capable of delivering on multiple competing priorities with little supervision.
Excellent verbal and written communication skills.
Bachelor’s degree in computer science or equivalent experience.
The most successful candidates will also have experience in the following:
Gitflow
Atlassian products: Bitbucket, JIRA, Confluence, etc.
Continuous integration tools such as Bamboo, Jenkins, or TFS.