
LIVE Model and Enforcement Strategy Policy Manager - Trust and Safety

TikTok, San Jose, CA


Responsibilities

TikTok is the leading destination for short-form mobile video. At TikTok, our mission is to inspire creativity and bring joy. TikTok's global headquarters are in Los Angeles and Singapore, and its offices include New York, London, Dublin, Paris, Berlin, Dubai, Jakarta, Seoul, and Tokyo.

Why Join Us:
Creation is the core of TikTok's purpose. Our platform is built to help imaginations thrive. This is doubly true of the teams that make TikTok possible. Together, we inspire creativity and bring joy - a mission we all believe in and aim towards achieving every day. To us, every challenge, no matter how difficult, is an opportunity; to learn, to innovate, and to grow as one team. Status quo? Never. Courage? Always. At TikTok, we create together and grow together. That's how we drive impact - for ourselves, our company, and the communities we serve. Join us.

About the Team:
Our mission is to build a world where people can safely discover, create, and connect. The Trust & Safety (T&S) team at TikTok helps ensure that our global online community is safe and empowered to create and enjoy content across all of our applications. We have invested heavily in human and machine-based moderation to remove harmful content quickly, often before it reaches our general community.

The Trust & Safety Policy team develops, reviews, and implements the policies and processes that underpin our Community Guidelines to promote a positive and safe environment for all of our users and content creators to enjoy and express themselves. As a LIVE Policy Manager, you will develop policies and manage content intervention strategies for TikTok LIVE. You will work closely with a variety of internal partners, including members of the Trust & Safety, Product, Engineering, and Legal teams.

As a LIVE Model and Enforcement Strategy Policy Manager, you will shape intervention strategies for livestreams on our platform and leverage AI models and products to enable accurate, effective, and fair machine moderation. This involves ensuring seamless updates to machine moderation systems and large language models, and improving the quality of enforcement model training and development. You will help innovate and define safety approaches based on leading industry practices and the latest market trends. Your work will be critical in shaping ByteDance's prioritised initiatives and ensuring content safety on our platforms.

This role may involve exposure to harmful content, whether as part of the core role, as part of a project, in response to escalation requests, or by chance. This may occur in the form of images, video, and text related to everyday life, but it can also include (but is not limited to) bullying; hate speech; child safety; depictions of harm to self and others; and harm to animals.

Responsibilities:
- Shape policies and intervention strategies that impact TikTok LIVE;
- Identify and anticipate product and content safety risks to support launches of new models, features, and products;
- Partner with relevant teams to maintain AI pipelines, optimize model deployment, and scale implementation;
- Ensure that models are deployed fairly and in line with industry best practices;
- Partner with product, engineering, research, operations, and legal teams to enhance our safety approaches based on the latest developments and best practices.

Qualifications

Minimum Qualifications:
- You have 3-5 years of experience working in the technology industry, with exposure to model training, policy development, program management, and/or risk assessment;
- You have a deep understanding and interest in the key policy issues that impact online safety and AI;
- You have a bachelor's or master's degree in artificial intelligence, public policy, politics, law, economics, behavioral sciences, or another related field;
- You are a confident self-starter with excellent judgment who can balance multiple trade-offs to develop principled, enforceable, and defensible policies and strategies. You communicate persuasively, both orally and in writing, with the ability to translate complex challenges into simple, clear language and to persuade cross-functional partners in a dynamic, fast-paced, and often uncertain environment;
- You have experience working with international partners across different time zones and cultures.

Preferred Qualifications:
- Proven ability to develop sound research methodologies and collect, synthesize, analyze, and interpret data;
- Experience working in a start-up, or being part of new teams in established companies.

Trust & Safety recognizes that keeping our platform safe for TikTok communities is no ordinary job: it can be rewarding, but also psychologically demanding and emotionally taxing for some. This is why we share the potential hazards, risks, and implications of this unique line of work from the start, so our candidates are well informed before joining. We are committed to the wellbeing of all our employees and promise to provide comprehensive, evidence-based programs to promote and support physical and mental wellbeing throughout each employee's journey with us. We believe that wellbeing is a relationship in which everyone has a part to play, so we work in collaboration and consultation with our employees and across our functions to ensure a truly person-centred, innovative, and integrated approach.

TikTok is committed to creating an inclusive space where employees are valued for their skills, experiences, and unique perspectives. Our platform connects people from across the globe and so does our workplace. At TikTok, our mission is to inspire creativity and bring joy. To achieve that goal, we are committed to celebrating our diverse voices and to creating an environment that reflects the many communities we reach. We are passionate about this and hope you are too.

TikTok is committed to providing reasonable accommodations in our recruitment processes for candidates with disabilities, pregnancy, sincerely held religious beliefs or other reasons protected by applicable laws. If you need assistance or a reasonable accommodation, please reach out to us at https://shorturl.at/cdpT2