W3Global

Senior Data Engineer

W3Global, Seattle, Washington, US 98127


Role: Senior Data Engineer

Location: Seattle, WA, USA (Remote)

Full Time

Skills: SQL, ETL

Description

Subscriber Data Solutions builds and maintains best-in-class data products enabling business teams to analyze and measure subscriber movements and support revenue generation initiatives.

The Senior Data Engineer will contribute to the Company's success by partnering with business, analytics, and infrastructure teams to design and build data pipelines that facilitate measuring subscriber movements and metrics. Collaborating across disciplines, they will identify internal and external data sources, design table structures, define ETL strategy, and automate data quality checks.

Responsibilities:
- Test and validate ETL logic and data pipelines
- Own the quality of every release into production with a data-driven approach
- Create, deliver, and continuously improve quality processes for delivering operational data covering subscriber and commerce operations to our stakeholders
- Partner with Data Analysts, Product, and Engineering teams to deeply understand the behavior of the underlying transactional systems and their business use cases
- Translate reporting and operational technical specifications, including calculations, custom groups, parameters, filtering criteria, and/or aggregations, into test requirements
- Build automated and reusable tests for data stores to improve quality and development velocity
- Diagnose issues, report defects, and propose regression tests to catch recurring bugs

Basic Qualifications:
- 9+ years of experience in quality validation of ETL pipelines and data warehouses
- Expert SQL knowledge and experience working with relational databases, plus working familiarity with a variety of other databases
- Solid experience in data analytics, data engineering, data modeling, data warehousing, and big data platforms
- Strong programming (Scala/Java) or scripting (Python) skills
- Experience with JUnit, TestNG, BDD, or similar testing tools
- Experience with a range of common big data tools and technologies such as Airflow, Hive, Snowflake, Databricks, and Spark
- Experience working with large datasets (terabytes or more)
- Ability to operate effectively in a team-oriented and collaborative environment
- Excellent communication skills and the ability to interact with all levels of end users and technical resources
