The Knot Worldwide

Senior Data Engineer

The Knot Worldwide, Boston, Massachusetts, US, 02298


ABOUT THE ROLE AND YOUR TEAM:

The Knot Worldwide's Data team is looking for a highly motivated, energetic team player with a strong analytical mindset to join as a Senior Data Engineer supporting end-to-end data pipelines for our business.

In this role, you will partner closely with our stakeholders and with other Data teams to develop data models and data-driven solutions that enable deep analysis and self-service analytics for our Business Domains (Product, Marketing, Revenue and Core).

We see the Senior Data Engineer as a hybrid role: primarily data engineering, with the ability to pivot into business intelligence or reporting work. You will be responsible for building and leading end-to-end data pipelines using tools such as dbt, Airbyte, Airflow and other components of the modern data stack. The role requires proficiency in SQL, Python, data modeling, data transformation and data visualization, as well as collaboration skills and business acumen, all of which play a crucial role in enabling data-driven decision-making within our organization.
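To give a flavor of the day-to-day work, the SQL-based dbt transformations mentioned above typically look like the sketch below. This is a minimal, hypothetical example (the model, table and column names such as stg_orders and fct_orders are invented for illustration), not code from our actual pipelines.

-- models/marts/fct_orders.sql (hypothetical dbt model)
-- Builds a simple fact table from staging models, following dimensional modeling ideas.
{{ config(materialized='table') }}

with orders as (
    -- stg_orders and stg_customers are assumed upstream staging models
    select * from {{ ref('stg_orders') }}
),

customers as (
    select * from {{ ref('stg_customers') }}
)

select
    orders.order_id,
    orders.ordered_at,
    customers.customer_id,
    orders.order_total
from orders
left join customers
    on orders.customer_id = customers.customer_id

Models like this are version-controlled, tested and documented in dbt, then orchestrated alongside ingestion jobs (for example Airbyte syncs scheduled through Airflow).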

The position is based at Wedding Planner's head office in Barcelona and offers the flexibility of working remotely. As a Senior Data Engineer, you will have the opportunity to work from your preferred location within Spain, collaborating with a distributed team of professionals. We embrace a remote-first culture that values work-life balance and provides the necessary tools and resources for effective remote collaboration.

RESPONSIBILITIES:

- Design, build, and maintain end-to-end scalable data pipelines to collect, process, and analyze large and complex datasets from various sources, including (but not limited to) website visitor behavioral tracking, customer data, transactional data and other business data, using tools like Airbyte, Python and dbt to write mainly SQL-based data transformations and ensure the data is clean and ready for analysis.
- Apply dimensional modeling techniques to design tables and views that map business processes into an enterprise data model.
- Develop and support complex ETL infrastructure to ensure the delivery of clean and reliable data to the organization.
- Automate manual processes to improve efficiency, robustness, and speed.
- Participate in the overall architecture and strategy for the deployment of our end-to-end data pipelines.
- Partner with the rest of the Data Platform Engineering teams to provide the business logic needed to design, develop and move data pipelines into production.
- Partner with the Director / Lead to define best practices, templates and scalable code for our tech stack, mainly in dbt.
- Assist with performance and tuning of our existing models, pipelines and data applications.
- Foster data governance practices, ensuring data privacy and security, and document data pipelines, transformations and models, contributing to a data-driven culture within the organization.
- Understand our business and processes, including how our data pipelines and data apps support them, and apply this knowledge to best solve problems.
- Work and communicate with end users and stakeholders across the organization to understand their needs, provide insights, and train them on the use of Information Mart models, data applications or other actionable results in support of decisions.
- Clearly scope, track, execute, and communicate on projects in an Agile environment.
- Mentor other Data Engineering team members, providing technical leadership and guidance in data engineering projects and initiatives.
- Oversee the top-of-funnel metrics for at least one Business Domain or Pillar and ensure consistency in their usage across models and reporting by maintaining great documentation.

SUCCESSFUL SENIOR DATA ENGINEER CANDIDATES HAVE:

- Bachelor's degree (Computer Science, Engineering, Information Systems or related functional field).
- 4+ years of experience including design, development, data management, administration and support, building data models, data pipelines and data warehouses (Snowflake preferred).
- Advanced proficiency in modeling using SQL and dbt; strength in Python.
- Experience in data integration and orchestration (Airbyte and Airflow) is a plus.
- Solid understanding of modern data engineering and architecture concepts and practices, including data warehousing, data pipelines, data marts, and data integration techniques and tools, mostly in cloud infrastructure.
- Solid understanding of different business process models and reporting needs, and the ability to convert requirements into models and end-to-end data pipeline designs.
- Self-motivated and able to work both independently and with others.
- Versatile and quick learner with the ability to pick up any new skills necessary to get the job done.
- Positive attitude and the ability to receive and provide objective and constructive feedback.
- Strong organizational and troubleshooting skills with attention to detail, and the curiosity to detect and explain data anomalies.
- Strong interpersonal skills with the ability to work effectively in a cross-functional team.
- Excellent communication skills, with the ability to explain complex data insights to non-technical stakeholders.
- Experience in different reporting tools or data applications such as Qlik, Tableau, Power BI, Looker or Streamlit is a plus.
- Experience working with Agile methodology and JIRA is a plus.
- Fluency in spoken and written English, as required by our international work environment.