Dexterity
Staff Data Platform Engineer
Dexterity, Redwood City, California, United States, 94061
Location: Redwood City, CA
Travel Required: No
Job Classification: Exempt FT
Job Functions: Engineering, Data
Reports to: Head of Platform
About Dexterity
At Dexterity we’re building robotic automation systems to perform pick-place-pack tasks in warehouses. Our end-to-end automation systems use intelligent software to enable human-like dexterity in commodity robot arms. Such intelligent robots will deeply impact the logistics industry and help realize a step change in logistics automation and supply chain productivity.
Dexterity is one of the fastest growing and best funded startups in the robot manipulation space. Come join our team of robot-obsessed engineers and help make intelligent robots a reality.
About the Role
The Data group at Dexterity is designing one of the most distributed data infrastructures on the planet. Our charter is to build a platform that connects the world’s warehouses, spanning from the edge devices to the cloud.
With a growing number of robots deployed across the world, Dexterity’s robots collect terabytes of operational data at the edge, which is used to improve our operations as well as those of our customers. Our data platform ingests raw robotic sensor data, video and still-image feeds, and a variety of log data, while handling considerations like low-bandwidth connections. We extensively utilize cutting-edge collection and transformation tools to unlock the potential of our data. We are looking for talented engineers to shape and build the next generation of data engineering pipelines to bridge the divide between the edge and the cloud.
As a Staff Data Platform Engineer, your responsibilities will include:
Shape and build infrastructure that ingests sub-millisecond sensor data as well as application logs generated on-premises, and makes them available to cloud data consumers with real-time performance
Design, implement, and optimize the performance of distributed data platforms spanning from the edge to the cloud, with security, reliability, and network constraints in mind
Architect and design data pipelines with AI/ML-heavy workloads in mind, in order to extract crucial insights from the data
Work with product teams across the company to identify data needs and build out features
Have a good knowledge of data modeling and distributed systems such as Spark, Hadoop, Kafka, and Elasticsearch
Help build a fast-growing team by mentoring other engineers and contributing to Dexterity’s strong customer-focused culture
Tech-lead several key initiatives and define the future of Dexterity’s big data pipelines and platform
Basic Qualifications:
BA/BS degree in Computer Science or a related technical discipline, or related practical experience
10+ years of experience in software design, data engineering, data processing at scale, and algorithm-related solutions
7+ years of programming experience in Java, Python, C/C++, Rust, Go, or other relevant languages
Hands-on experience developing and managing distributed systems or other large-scale systems
Preferred Qualifications:
Experience with data infrastructure technologies such as Kafka, Elasticsearch, HDFS, and distributed tracing
Experience with an observability/telemetry stack such as Prometheus, VictoriaMetrics, Grafana, and Alertmanager
Experience building solutions on a public cloud platform such as GCP or Azure
Experience with industry, open-source projects, and/or academic research in data management, data modeling, relational databases, and/or large-data, parallel, and distributed systems
Experience deploying solutions on on-premises systems and on public cloud platforms
Experience building data infrastructure for AI/ML ecosystems
Equal Opportunity Employer
We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability status.