Reuben Cooley, Inc.

Sr Data Engineer

Reuben Cooley, Inc., Glendale, California, US, 91222


Job Description:

The Company

Headquartered in Los Angeles, this leader in the Entertainment & Media space is focused on delivering world-class stories and experiences to its global audience. To offer the best entertainment experiences, its technology teams focus on continued innovation and the use of cutting-edge technology.

Platform / Stack

You will work with technologies that include Python, AWS, Airflow, and Snowflake.

What You'll Do As a Sr Data Engineer:

- Contribute to maintaining, updating, and expanding existing Core Data platform data pipelines (see the pipeline sketch after this list)
- Build tools and services to support data discovery, lineage, governance, and privacy
- Collaborate with other software/data engineers and cross-functional teams
- Work on a tech stack that includes Airflow, Spark, Databricks, Delta Lake, and Snowflake
- Collaborate with product managers, architects, and other engineers to drive the success of the Core Data platform
- Contribute to developing and documenting both internal and external standards and best practices for pipeline configurations, naming conventions, and more
- Engage with and understand our customers, forming relationships that allow us to understand and prioritize both innovative new offerings and incremental platform improvements
- Maintain detailed documentation of your work and changes to support data quality and data governance requirements
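For illustration only, here is a minimal sketch of the kind of scheduled Airflow pipeline this role maintains, assuming Airflow 2.4+ with the TaskFlow API; the DAG name, tasks, and data are hypothetical placeholders, not the company's actual Core Data platform pipelines.

```python
# Minimal illustrative Airflow DAG using the TaskFlow API (assumes Airflow 2.4+).
# All names and data below are hypothetical placeholders.
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def core_data_example_pipeline():
    @task
    def extract():
        # A real pipeline would read from an upstream source (e.g. S3 or an API).
        return [{"id": 1, "value": 42}]

    @task
    def load(rows):
        # A real pipeline would write to Snowflake or a Delta Lake table.
        print(f"loaded {len(rows)} rows")

    load(extract())


core_data_example_pipeline()
```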

Qualifications

You could be a great fit if you have:

- 5+ years of data engineering experience developing large data pipelines
- Proficiency in at least one major programming language (e.g. Python, Java, Scala)
- Strong SQL skills and the ability to write queries that analyze complex datasets
- Hands-on production experience with distributed processing systems such as Spark
- Hands-on production experience with data pipeline orchestration systems such as Airflow for creating and maintaining data pipelines
- Experience with at least one major Massively Parallel Processing (MPP) or cloud database technology (Snowflake, Databricks, BigQuery)
- Experience developing APIs with GraphQL
- Deep understanding of AWS or other cloud providers, as well as infrastructure as code
- Familiarity with data modeling techniques and data warehousing standard methodologies and practices