Deepgram

Research Scientist - LLMs

Deepgram, San Francisco, California, United States, 94199



Date Posted: 02 Aug, 2023
Work Location: San Francisco, United States
Salary Offered: $150,000 - $250,000 yearly
Job Type: Full Time
Experience Required: 3+ years
Remote Work: Yes
Stock Options: No
Vacancies: 1 available

At Deepgram, we spend every day tackling big, real-world challenges in speech. Our customers hire us to solve their hardest problems in speech, taking real, complex audio and transforming it into novel insights. To raise the bar, everything we build needs scale in its DNA; we aren't content with simple horizontal scaling: we intend to replace entire data centers dedicated to speech analytics with a single rack of servers. These challenges provide opportunities for creativity and innovative problem-solving every day. Deepgram's Research Scientists tackle some of the most exciting and difficult problems at the forefront of ASR and NLU technologies. You'll have the freedom to innovate and uncover breakthroughs, and to influence our product roadmap in turn. We look forward to you bringing your whole self to work, sharing learnings from your latest experiments, and collaborating with us to advance the state of speech technology.

The Role

Deepgram is looking for an experienced Research Scientist who has worked extensively with Large Language Models (LLMs) and has a deep understanding of transformer architecture. This individual should have extensive experience with the hard technical aspects of LLMs, such as data curation, distributed large-scale training, optimization of transformer architectures, and Reinforcement Learning (RL) training.

What You'll Do

- Brainstorming and collaborating with other members of the research team to define new LLM research initiatives
- Broadly surveying the literature; evaluating, classifying, and distilling current methods
- Designing and carrying out experimental programs for LLMs
- Driving transformer (LLM) training jobs successfully on distributed compute infrastructure and deploying new models into production
- Documenting and presenting results and complex technical concepts clearly for a target audience
- Staying up to date with the latest advances in deep learning and LLMs, with a particular eye toward their implications and applications within our products

You'll Love This Role If You

- Are passionate about AI and excited about working on state-of-the-art LLM research
- Have an interest in producing and applying new science to help us develop and deploy large language models
- Enjoy building from the ground up and love to create new systems
- Have strong communication skills and can communicate complex concepts clearly
- Are highly analytical and enjoy delving into detailed analyses when necessary

It's Important To Us That You Have

- 3+ years of experience in applied deep learning research, with a solid understanding of the applications and implications of different neural network types, architectures, and loss mechanisms
- Proven experience working with large language models (LLMs), including data curation, distributed large-scale training, optimization of transformer architectures, and RL training
- Strong experience coding in Python and working with PyTorch
- Experience with various transformer architectures (auto-regressive, sequence-to-sequence, etc.)
- Experience with distributed computing and large-scale data processing
- Prior experience conducting experimental programs and using the results to optimize models

It Would Be Great if You Had

- A deep understanding of transformers, causal LMs, and their underlying architecture
- An understanding of distributed training and distributed inference schemes for LLMs
- Familiarity with RLHF labeling and training pipelines
- Up-to-date knowledge of recent LLM techniques and developments
- Published papers in deep learning research, particularly related to LLMs and deep neural networks

About Deepgram

Building foundational AI for speech transcription and understanding.

Company Size: 51 - 250 People
Year Founded: 2015
Country: United States
Company Status: Actively Hiring
