TikTok

Product Policy Lead, Child Safety & Exploitation and Abuse - Trust and Safety

TikTok, San Jose, CA


Responsibilities

TikTok is the leading destination for short-form mobile video. At TikTok, our mission is to inspire creativity and bring joy. TikTok's global headquarters are in Los Angeles and Singapore, and its offices include New York, London, Dublin, Paris, Berlin, Dubai, Jakarta, Seoul, and Tokyo.

Why Join Us:
Creation is the core of TikTok's purpose. Our platform is built to help imaginations thrive. This is doubly true of the teams that make TikTok possible. Together, we inspire creativity and bring joy - a mission we all believe in and aim to achieve every day. To us, every challenge, no matter how difficult, is an opportunity: to learn, to innovate, and to grow as one team. Status quo? Never. Courage? Always. At TikTok, we create together and grow together. That's how we drive impact - for ourselves, our company, and the communities we serve. Join us.

About the Team:
We are building a world where people can safely discover, create, and connect. The Trust & Safety (T&S) team at TikTok helps ensure that our global online community is safe and empowered to create and enjoy content across all of our applications. We have invested heavily in human and machine-based moderation to remove harmful content quickly, often before it reaches our general community.

As the Product Policy Lead for Child Safety & Exploitation and Abuse on our Trust & Safety team, you will develop and drive TikTok's enforcement strategies to ensure a safe online platform. You will work to safeguard the platform from dangerous individuals operating on the internet, preventing or removing harmful content as quickly as possible. You will implement and enforce our policies across video content while defining and executing operational plans on a global scale, collaborating with stakeholders in product, policy development, and child safety teams.

In this role, you will work closely with the Exploitation and Abuse Policy Development team, focusing on the risks related to child safety and adult sexual exploitation. This role will require an entrepreneurial, data-driven, forward-thinking policy mindset with a passion to grow our policy enforcement in this area.

It is possible that this role will be exposed to harmful content, whether as part of the core role, as part of a project, in response to escalation requests, or by chance. This may occur in the form of images, video, and text related to everyday life, but it can also include (but is not limited to) bullying; hate speech; child safety; depictions of harm to self and others; and harm to animals.

Responsibilities:
- Tracking and monitoring performance against key metrics;
- Driving and utilizing data insights, and turning data into digestible, actionable enforcement strategies for cross-functional teams;
- Analyzing, identifying, prioritizing, and delivering strategies to reduce potentially harmful content;
- Developing enforcement signals and moderation strategies;
- Strategizing proactive and reactive enforcement methodologies beyond human moderation, working with product teams to enhance model enforcement;
- Reviewing graphic, controversial, and sometimes offensive video content;
- Reviewing training materials and advocating for tooling requirements;
- Partnering with policy, product, engineering, research, ops, legal, and PR teams to innovate our safety approaches based on the latest developments and best practices.

Qualifications

Minimum Qualifications:
- You have a bachelor's or master's degree in artificial intelligence, public policy, politics, law, economics, behavioral sciences, or another related field;
- You have 3-5 years of experience working in Trust and Safety, with exposure to child safety, policy enforcement, and policy development;
- You are a confident self-starter with excellent judgment, and can balance multiple trade-offs to develop principled enforcement strategies;
- You have persuasive oral and written communication skills;
- You have experience working with international partners across different time zones and cultures.

Preferred Qualifications:
- Experience being part of new teams in established companies;
- Experience launching new strategies and Zero-to-One initiatives.

Trust & Safety recognizes that keeping our platform safe for TikTok communities is no ordinary job: it can be both rewarding and, for some, psychologically demanding and emotionally taxing. This is why we are sharing the potential hazards, risks, and implications of this unique line of work from the start, so our candidates are well informed before joining.

We are committed to the wellbeing of all our employees and promise to provide comprehensive, evidence-based programs to promote and support physical and mental wellbeing throughout each employee's journey with us. We believe that wellbeing is a relationship and that everyone has a part to play, so we work in collaboration and consultation with our employees and across our functions to ensure a truly person-centred, innovative, and integrated approach.

TikTok is committed to creating an inclusive space where employees are valued for their skills, experiences, and unique perspectives. Our platform connects people from across the globe and so does our workplace. At TikTok, our mission is to inspire creativity and bring joy. To achieve that goal, we are committed to celebrating our diverse voices and to creating an environment that reflects the many communities we reach. We are passionate about this and hope you are too.

TikTok is committed to providing reasonable accommodations in our recruitment processes for candidates with disabilities, pregnancy, sincerely held religious beliefs or other reasons protected by applicable laws. If you need assistance or a reasonable accommodation, please reach out to us at https://shorturl.at/cdpT2