
Content Moderation Specialist

Anthropic

Type
Full Time
Level
Mid-level
Location
New York City, NY; San Francisco, CA; Washington, DC
Posted 9h ago

Job Description

About Anthropic

Anthropic's mission is to create reliable, interpretable, and steerable AI systems. We want AI to be safe and beneficial for our users and for society as a whole. Our team is a quickly growing group of committed researchers, engineers, policy experts, and business leaders working together to build beneficial AI systems.

About the role

Anthropic's Integrity Compliance (IC) function is building the systems that let us scale responsibly as our products reach more people, more enterprises, and more regulated industries. Our global compliance program is bespoke, reflecting our unique mission and position as one of the leading AI labs operating on the frontier. Our Regulatory Programs pillar is a key part of our overall Integrity Compliance function and covers a range of compliance domains, including economic sanctions, US export controls, and regulatory compliance programs stemming from global AI safety regulation.

As a Content Moderation Specialist, you'll own day-to-day program management of Anthropic's global content moderation and online safety regulatory compliance program. Online safety regulation is one of the fastest-moving areas of technology law, and AI sits squarely in its sights. Regimes including the EU Digital Services Act, the UK Online Safety Act, the Australia Online Safety Act, and a growing set of emerging frameworks globally create novel obligations for how AI products are built, deployed, and governed. You will be at the forefront of translating those obligations into a defensible, well-documented compliance program, with regulatory risk assessments as the core of the work.

This is a deeply cross-functional role. You'll partner closely with internal counsel, Safeguards, and operations teams across Anthropic to build the compliance program and frameworks that demonstrate Anthropic meets its obligations under content regulation.
This is a builder's role at a company that takes integrity seriously and moves fast: you'll exercise independent judgment on issues without clear precedent and help build durable programs that let Anthropic move quickly while honoring its obligations to regulators, customers, and the public.

Key responsibilities

- Own the global content regulation risk assessment program, including the roadmap of required assessments across jurisdictions, a consistent and repeatable risk assessment methodology and framework, and the coordination of inputs, consultation, and approvals for each assessment
- Build and maintain systems and trackers to assess, operationalize, and report on relevant regulatory requirements across Anthropic's products and jurisdictions
- Partner with internal counsel, Safeguards, Policy, engineering, and operations teams to align internal practices with external commitments and legal obligations
- Maintain a controls inventory and the compliance documentation library for content regulation, ensuring documentation is drafted, reviewed by the right stakeholders, and kept current
- Cond

Required Skills

Go, R, Git