Kustomer
Data Engineer
Kustomer, Little Ferry, New Jersey, US, 07643
About Kustomer
Kustomer is the industry-leading conversational CRM platform perfecting every customer experience. Built with intelligent tools such as AI and automation, no-code configuration, and a connected data platform that unifies data from multiple sources through a single timeline, Kustomer empowers businesses to operate with greater efficiency and deliver more personalized service to customers across any channel, making every interaction more meaningful and memorable. Today, Kustomer is the core platform for some of the leading customer service brands like Ring, Glovo, Away Travel, Priceline and Sweetgreen.
Kustomer was founded in 2015 by serial entrepreneurs Brad Birnbaum and Jeremy Suriel and has raised over $200M in funding backed by leading VCs. Meta announced its intention to acquire Kustomer in 2020 and completed the transaction in 2022. Kustomer joined Meta's Business Messaging Group to transform the way people and businesses communicate through modern messaging channels. In 2023, Kustomer spun out from Meta as a standalone company backed by its original partners, Battery, Redpoint and Boldstart Ventures, who have invested $60M in capital, ensuring Kustomer's growth and success for many years to come.
Our Krew is made up of passionate and collaborative people who really care about what they do and the people they help. We look for people who are passionate about enhancing the customer service experience for everyone involved, as it's the core of what we do. We're growing our business with no plans of slowing down. We actively seek individuals who want to learn and be challenged every day. We have also transitioned to a remote-friendly company, with Krew members located throughout the U.S. coming together for Kamp Kustomer each year.

About the Role
As a data engineer on Kustomer's Engineering team, you will be responsible for growing our business by leveraging advanced data-engineering techniques to enhance our data architecture and streamline data-ingestion processes. Your expertise will be instrumental in improving our data-driven product offerings and in providing valuable insights that drive strategic product decision-making and help us grow globally. We believe in ownership and are looking for people driven to continuously deliver excellence across all dimensions of their role. This role requires a deep understanding of both the technical and strategic aspects of big data solutions, including data modeling, data warehousing, and data integration.

What You'll Do:
Data Engineering:
Write and adapt tools to classify, ingest and reconcile data. Manage ETL processes to move data among various systems. Onboard datasets, explore data and automate tasks using a modern data stack.
Data Analysis/Business Intelligence:
Parse, analyze and understand data sets. Work closely with key stakeholders to identify valuable insights that can be realized from the amalgamation of our data and integration of disparate data sources. Partner with stakeholders to effectively surface and visualize data.
Data Debugging:
Find anomalies in datasets and debug issues relating to data availability, access, integrity, privacy and security.
Production Support:
Provide proactive oversight of our data pipeline, handle inquiries from internal customers and resolve issues in a timely manner.
Best Practices:
Conduct architecture, systems and data reviews across the platform. Provide education and support to the engineering team in data architecture and design. Identify and resolve deficiencies and deviations from best practices. Participate in data scalability initiatives across the platform.

Our Tech Stack:
MongoDB, Redis, BigQuery, Elasticsearch, DynamoDB, Aurora, Snowflake
GCP, AWS
JavaScript/TypeScript (React/Node.js), Python, Go
Datadog, Coralogix, ELK

Minimum Requirements:
Bachelor's degree in computer science, data engineering, a relevant technical field, or equivalent practical experience
5+ years of experience in all or most of the following:
MongoDB architecture, data management and ETL
Relational databases and caches (Redis)
BigQuery, data warehouses/virtualization tools, and analytical and data visualization products such as Google Data Studio
BigQuery integration with AWS products such as S3, Kinesis and AWS Batch
Cloud environments, especially GCP and AWS, with a focus on the administration, architecture and configuration of relational and nonrelational data stores and the integration of S3 and Kinesis with BigQuery
SQL, MQL, shell scripting, database scripting and ETL processing
Securing data privacy (PII, encryption, regulatory compliance)
SRE/observability tools such as Datadog, ELK stack, Coralogix and Sentry
Data governance

Nice To Have:
Certification in cloud or database technologies
Working understanding of JavaScript/Node.js, Python and/or Go
Familiarity with SOLID principles
Big Data management and analysis
Elasticsearch and/or Rockset experience
Experience with infrastructure as code using Terraform

HIPAA Compliance
All roles at Kustomer may involve handling sensitive personal data.

Benefits
Kustomer offers an array of benefits including competitive salaries, stock options, 100% healthcare coverage, 401K, WiFi and mobile reimbursement, and a generous vacation policy.

Diversity, Equity & Inclusion at Kustomer
Kustomer is committed to bringing together individuals from different backgrounds and perspectives. We strive to create an inclusive environment where everyone can thrive, feel a sense of belonging, and do great work together. We are proud to be an equal opportunity employer open to all qualified applicants regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender identity or expression, Veteran status, or any other legally protected status.

Disclaimer:
Kustomer only contacts candidates from company email addresses ending in kustomer.com and does not seek funds from candidates under any circumstances.