GoodRx Holdings Inc.

Sr. Data Engineer (Platform)



Locations: Santa Monica, CA; Remote USA; Seattle, WA; New York City, NY; San Francisco, CA

Time Type: Full time

Posted on: 22 Days Ago

GoodRx is the leading prescription savings platform in the U.S.

Trusted by more than 25 million consumers and 750,000 healthcare professionals annually, GoodRx provides access to savings and affordability options for generic and brand-name medications at more than 70,000 pharmacies nationwide, as well as comprehensive healthcare research and information. Since 2011, GoodRx has helped consumers save nearly $75 billion on the cost of their prescriptions.

Our goal is to help Americans find convenient and affordable healthcare. We offer solutions for consumers, employers, health plans, and anyone else who shares our desire to provide affordable prescriptions to all Americans.

About the Role:

GoodRx is looking for smart, curious data engineers who are deft at working with a wide variety of languages and raw data formats, such as Parquet, in a fast-paced and friendly environment. You will collaborate with teams across GoodRx to build outstanding data pipelines and processes that stitch together complex sets of data stores to guide business decisions.

Responsibilities:

Collaborate with product managers, data scientists, data analysts, and engineers to define requirements and data specifications.

Plan, design, build, test, and deploy data warehouse and data mart solutions.

Lead small- to medium-sized projects, solving data problems through the documentation, design, and creation of ETL jobs and data marts.

Increase the usage and value of the data warehouse and ensure the integrity of the data delivered.

Develop and implement standards, and promote their use throughout the warehouse.

Develop, deploy, and maintain data processing pipelines using cloud technologies such as AWS, Kubernetes, Airflow, Redshift, Databricks, and EMR.

Define and manage overall schedule and availability for a variety of data sets.

Work closely with other engineers to enhance infrastructure and improve reliability and efficiency.

Make smart engineering and product decisions based on data analysis and collaboration.

Act as an in-house data expert and make recommendations regarding standards for code quality and timeliness.

Architect cloud-based data pipeline solutions to meet stakeholder needs.

Skills & Qualifications:

Bachelor’s degree in analytics, engineering, math, computer science, information technology, or related discipline.

8+ years of professional experience in the big data space.

8+ years of experience engineering data pipelines on large-scale datasets using big data technologies (Spark, Flink, etc.).

Expert knowledge of PySpark, SQL, and dbt, with ETL development experience processing extremely large datasets.

Expertise in applying slowly changing dimension (SCD) types on an S3 data lake using Databricks/Delta Lake.

Experience with data model principles and data cataloging.

Experience with a job scheduler such as Airflow.

Demonstrated ability to analyze large data sets to identify gaps and inconsistencies, provide data insights, and advance effective product solutions.

Deep familiarity with AWS services (S3, EventBridge, Kinesis, Glue, EMR, Lambda).

Experience with data warehouse platforms such as Redshift, Databricks, BigQuery, or Snowflake.

Ability to quickly learn complex domains and new technologies.

Innately curious and organized, with the drive to analyze data to identify deliverables, anomalies, and gaps, and to propose solutions that address these findings.

Thrives in a fast-paced startup environment.

Good To Have:

Experience with customer data platform tools such as Segment.

Experience with data streaming technologies such as Kafka.

Experience using Jira, GitHub, Docker, Codefresh, and Terraform.

Experience contributing to full lifecycle deployments with a focus on testing and quality.

Experience with data quality processes, data quality checks, validations, data quality metrics definition, and measurement.

AWS/Kafka/Databricks or similar certifications.

Engineering teams are responsible for supporting appropriate security controls, including management, operational, and technical controls in addition to general GoodRx best practices.

At GoodRx, pay ranges are determined based on work locations and may vary based on where the successful candidate is hired. The pay ranges below are shown as a guideline, and the successful candidate’s starting pay will be determined based on job-related skills, experience, qualifications, and other relevant business and organizational factors.

San Francisco and Seattle Offices: $161,000.00 - $257,000.00

New York Office: $147,000.00 - $235,000.00

Santa Monica Office: $134,000.00 - $214,000.00

Other Office Locations: $121,000.00 - $193,000.00

GoodRx also offers additional compensation programs such as annual cash bonuses and annual equity grants for most positions as well as generous benefits.

We’re committed to growing and empowering a more inclusive community within our company and industry. That’s why we hire and cultivate diverse teams of the best and brightest from all backgrounds, experiences, and perspectives.

GoodRx is committed to providing reasonable accommodations for candidates with disabilities during our recruiting process. If you need any assistance or accommodations due to a disability, please reach out to us at accommodations@goodrx.com.
