Inficare
Senior Data Engineer With GCP
Inficare, Sunnyvale, CA, United States
Job Title: Senior Data Engineer
Location: Sunnyvale, CA (Day 1 Onsite)
Duration: 12-Month Contract
Job Description:
We are seeking a highly skilled and collaborative Senior Data Engineer to design and develop big data applications using the latest open-source technologies. The ideal candidate will have a proven track record of building data pipelines, working with distributed data platforms, and using Google Cloud Platform (GCP) technologies.
Responsibilities:
- Design and develop scalable big data applications and pipelines using GCP and open-source tools.
- Build logical and physical data models for big data platforms.
- Automate workflows using Apache Airflow.
- Create and manage data pipelines with Apache Hive, Spark, and Kafka.
- Provide maintenance and enhancements to existing systems, including rotational on-call support.
- Mentor junior engineers and conduct design reviews.
- Participate in Agile ceremonies like daily standups and backlog grooming using JIRA.
- Act as the technical point of contact for assigned business domains.
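To give a flavor of the transform work described above, here is a toy sketch of a keyed aggregation step, the kind of operation a Hive or Spark job would run at multi-TB scale. It uses only the Python standard library; the function name and sample records are illustrative and not part of any actual Inficare pipeline:

```python
from collections import defaultdict

def aggregate_by_key(records):
    """Group event records by key and sum their values --
    a stand-in for the aggregation a Spark or Hive job
    would perform over a distributed dataset."""
    totals = defaultdict(float)
    for rec in records:
        totals[rec["key"]] += rec["value"]
    return dict(totals)

# Hypothetical sample input records.
events = [
    {"key": "clicks", "value": 3.0},
    {"key": "views", "value": 10.0},
    {"key": "clicks", "value": 2.0},
]

print(aggregate_by_key(events))  # {'clicks': 5.0, 'views': 10.0}
```

In a production pipeline this logic would be expressed as a Spark transformation and scheduled as an Airflow task rather than run as a standalone script.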
Requirements:
- GCP Expertise: 3+ years of recent experience with Dataproc, GCS, and BigQuery.
- Big Data Platforms: 7+ years of hands-on experience with Hadoop, Hive, Spark, and Airflow.
- Data Modeling: 4+ years of experience designing schemas for data lakes or RDBMS platforms.
- Programming Languages: Proficiency in languages such as Python, Java, and Scala.
- Scripting Languages: Experience with scripting languages such as Perl and shell scripting.
- Data Handling: Experience processing and managing large datasets (multi-TB/PB scale).
- Agile Methodologies: Background in Scrum/Agile development and test-driven development.
- Strong problem-solving, analytical, and communication skills.
- Bachelor's Degree in Computer Science or equivalent experience.
- Familiarity with Gitflow and Atlassian tools like Bitbucket, JIRA, and Confluence.
- Experience with CI/CD tools such as Bamboo, Jenkins, or TFS.
What We Offer:
- Opportunity to work on cutting-edge technologies and large-scale data platforms.
- Collaborative and fast-paced work environment with opportunities to mentor and lead.
If you are a passionate Data Engineer with expertise in GCP and big data technologies, we encourage you to apply for this exciting opportunity.