Netomi
Data Engineer
Netomi, San Mateo, California, United States, 94409
Job Title: Data Engineer
Company: Netomi
Location: San Mateo, CA, United States
Date Posted: 02 Aug, 2023
Job Type: Full Time
Experience Required: 3+ years
Remote Work: No
Stock Options: No
Vacancies: 1 available
At Netomi AI, we are on a mission to create artificial intelligence that builds customer love for the world’s largest global brands. Some of the largest brands already use Netomi AI’s platform to solve mission-critical problems, so you will work with top-tier clients at the senior level and build your network.

Backed by the world’s leading investors, such as Y Combinator, Index Ventures, Jeffrey Katzenberg (co-founder of DreamWorks), and Greg Brockman (co-founder and President of OpenAI/ChatGPT), you will become part of an elite group of visionaries defining the future of AI for customer experience. We are building a dynamic, fast-growing team that values innovation, creativity, and hard work. You will have the chance to significantly impact the company’s success while developing your skills and career in AI.

We are looking for a Data Engineer with a passion for using data to discover and solve real-world problems. You will enjoy working with rich data sets and modern business intelligence technology, and seeing your insights drive features for our customers. You will also have the opportunity to contribute to the development of policies, processes, and tools that address product quality challenges in collaboration with other teams.

Responsibilities:
- Partner with teammates to create complex data processing pipelines that solve our clients’ most ambitious challenges.
- Collaborate with Data Scientists to design scalable implementations of their models.
- Write clean, iterative code based on TDD, and leverage continuous delivery practices to deploy, support, and operate data pipelines.
- Advise and educate clients on how to use different distributed storage and computing technologies.
- Develop and operate modern data architecture approaches to meet key business objectives and provide end-to-end data solutions.
- Create data models and discuss the tradeoffs of different modeling approaches.
- Incorporate data quality into day-to-day work as well as into the delivery process.

Requirements:
- 3+ years of experience.
- Expertise in SQL, PL/SQL, and general software engineering (proficiency coding in Python/Java).
- Experience with MySQL 8.0+ and AWS Aurora required.
- Expert-level SQL query optimization.
- Good understanding of data modeling and experience with data engineering tools.
- Comfortable taking a data-driven approach and applying a data security strategy to solve business problems.
- Excited about data infrastructure and operations, with familiarity working in cloud environments.
- Experience building and operating data pipelines and maintaining data storage within distributed systems.
- Ability to ensure effective collaboration between Netomi and clients’ teams, encouraging open communication and advocating for shared outcomes.
- Experience writing data quality unit tests and functional tests.
- Strong experience with a relational database (preferably Aurora MySQL).
- Experience with AWS Lambda, Kinesis, RDS, EC2, and QuickSight.
- Experience with streaming platforms such as Kafka and Kinesis.

Netomi is an equal opportunity employer committed to diversity in the workplace. We evaluate qualified applicants without regard to race, color, religion, sex, sexual orientation, disability, veteran status, and other protected characteristics.