The Walt Disney Company (Germany) GmbH
Senior Software Engineer - Data Capture Team
The Walt Disney Company (Germany) GmbH, Seattle, Washington, US 98127
Disney Entertainment & ESPN Technology
On any given day at Disney Entertainment & ESPN Technology, we’re reimagining ways to create magical viewing experiences for the world’s most beloved stories while also transforming Disney’s media business for the future. Whether that’s evolving our streaming and digital products in new and immersive ways, powering worldwide advertising and distribution to maximize flexibility and efficiency, or delivering Disney’s unmatched entertainment and sports content, every day is a moment to make a difference to partners and to hundreds of millions of people around the world.

A few reasons why we think you’d love working for Disney Entertainment & ESPN Technology:

Building the future of Disney’s media business:
DE&E Technologists are designing and building the infrastructure that will power Disney’s media, advertising, and distribution businesses for years to come.
Reach & Scale:
The products and platforms this group builds and operates delight millions of consumers every minute of every day – from Disney+ and Hulu, to ABC News and Entertainment, to ESPN and ESPN+, and much more.
Innovation:
We develop and execute groundbreaking products and techniques that shape industry norms and enhance how audiences experience sports, entertainment & news.
The Product & Data Engineering team is responsible for end-to-end development of Disney’s world-class consumer-facing products, including the streaming platforms Disney+, Hulu, and ESPN+, and digital products & experiences across ESPN, Marvel, Disney Studios, NatGeo, and ABC News. The team drives innovation at scale for millions of consumers around the world across Apple, Android, Smart TVs, game consoles, and the web, with our platforms powering core experiences like personalization, search, messaging, and data.

About The Role
The Data Capture team, part of the Data organization within DE&ET, is in search of a Senior Software Engineer. As a member of the Data Capture team, you will establish the foundational core platform frameworks and pipelines that enable dozens of engineering and analytical teams to unlock the power of data, driving key business decisions and providing engineering, analytics, and operational teams the critical information necessary to scale the largest streaming service. Expanding, scaling, and standardizing these foundations through consistent observability, lineage, data quality, logging, and alerting across all engineering teams in the Data organization is essential to creating a single pane of glass. The Data Capture team is looking to grow its team of world-class Software Engineers who share its enthusiasm for making a positive impact.

Responsibilities
Contribute to maintaining, updating, and expanding the existing Data Capture platform, including the Spark data pipelines, while meeting strict uptime SLAs.
Extend functionality of current Data Capture platform offerings, including metadata parsing, extending the metastore API, and building new integrations with APIs both internal and external to the Data organization.
Implement the Lakehouse architecture, working with customers, partners, and stakeholders to shift toward a Lakehouse-centric data platform.
Architect, design, and code shared libraries in Scala and Python that abstract complex business logic to allow consistent functionality across the Data organization.
Collaborate with product managers, architects, and other engineers to drive the success of the Data Capture platform.
Lead the development and documentation of internal and external standards and best practices for pipeline configurations, naming conventions, partitioning strategies, and more.
Ensure high operational efficiency and quality of the Data Capture platform datasets so that our solutions meet SLAs and project reliability and accuracy to all our stakeholders (Engineering, Data Science, Operations, and Analytics teams).
Be an active participant and advocate of agile/scrum ceremonies to collaborate and improve processes for our team.
Engage with and understand our customers, forming relationships that allow us to understand and prioritize both innovative new offerings and incremental platform improvements.
Maintain detailed documentation of your work and changes to support data quality and data governance requirements.
Provide mentorship and guidance for team members; evangelize the platform, best practices, and data-driven decisions; identify new use cases and features and drive adoption.
Tech stack includes Airflow, Spark, Databricks, Delta Lake, Snowflake, Scala, and Python.
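As a loose illustration of the standardization work described above, the sketch below shows what a naming-convention and partitioning check might look like in Python. The convention, the dataset names, and the S3 layout here are entirely hypothetical, not the team's actual standards.

```python
import re
from datetime import datetime

# Hypothetical naming convention: <domain>.<dataset>_v<version>,
# e.g. "playback.sessions_v2". Lowercase only, explicit version suffix.
DATASET_NAME = re.compile(r"^[a-z][a-z0-9]*\.[a-z][a-z0-9_]*_v\d+$")


def valid_dataset_name(name: str) -> bool:
    """Check a dataset name against the (assumed) convention."""
    return bool(DATASET_NAME.match(name))


def partition_path(dataset: str, ds: str) -> str:
    """Build a date-partitioned storage path, validating the date first."""
    datetime.strptime(ds, "%Y-%m-%d")  # raises ValueError on a malformed date
    domain, table = dataset.split(".", 1)
    return f"s3://lake/{domain}/{table}/ds={ds}/"
```

Checks like these, shipped as a shared library, are one way a platform team keeps dozens of pipelines converging on the same conventions instead of enforcing them by review alone.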
Basic Qualifications
5+ years of software engineering experience developing backend applications.
2+ years of data engineering experience developing large data pipelines.
Strong algorithmic problem-solving expertise.
Strong fundamental Scala and Python programming skills.
Basic understanding of AWS or other cloud provider resources (e.g., S3).
Strong SQL skills and ability to create queries to analyze complex datasets.
Hands-on production environment experience with distributed processing systems such as Spark.
Hands-on production experience with orchestration systems such as Airflow.
Some scripting language experience.
Willingness and ability to learn and pick up new skillsets.
Self-starting problem solver with an eye for detail and excellent analytical and communication skills.
Preferred Qualifications
Experience with at least one major Massively Parallel Processing (MPP) or cloud database technology (e.g., Snowflake, Redshift, BigQuery).
Experience in developing APIs with GraphQL.
Deep understanding of AWS or other cloud providers, as well as infrastructure as code.
Familiarity with Data Modeling techniques and Data Warehousing standard methodologies and practices.
Familiar with Scrum and Agile methodologies.
Master’s Degree a plus.
Required Education
Bachelor’s Degree in Computer Science, Information Systems, or a related field, or equivalent industry experience.
Additional Information
#DISNEYTECH

The hiring range for this position in Santa Monica, California is $136,038 to $182,490 per year, in Bristol, Connecticut is $136,038 to $182,490 per year, in Seattle, Washington is $142,516 to $191,180 per year, in New York City, NY is $142,516 to $191,180 per year, and in San Francisco, California is $149,000 to $199,000 per year. The base pay actually offered will take into account internal equity and also may vary depending on the candidate’s geographic region, job-related knowledge, skills, and experience, among other factors. A bonus and/or long-term incentive units may be provided as part of the compensation package, in addition to the full range of medical, financial, and/or other benefits, depending on the level and position offered.