Introduction to the Moderation Platform
The Moderation Platform makes toxicity management accessible, impactful, and insightful by providing you with the tools you need to grow and maintain healthy communities within your games.
The Moderation Platform is where you take action on incidents, review evidence that comes from Safe Text, and monitor community health scores. You gain access to this dashboard when the product is included in your project.
The dashboard allows you to:
Gather and review incidents from your game through Safe Text with the Moderation SDK.
Review evidence that is automatically collected from Safe Text when user reports are submitted.
Search and filter incidents in the Moderation queue to prioritize the most toxic reports.
Action incidents manually or through automated processes with the Moderation API (see the sketch after this list). Gain confidence in your review by referencing AI analysis and detections from Safe Text.
Assign moderation roles to project members to action player reports and monitor community health.
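As a rough illustration of what an automated actioning flow could look like, the sketch below polls for open incidents and applies a mute action over HTTP. The base URL, routes, field names, and the service-account token flow shown here are assumptions for illustration only; refer to the Moderation API reference for the actual endpoints, authentication, and payload shapes.

```python
# Hypothetical sketch of automating incident actioning via the Moderation API.
# Endpoint paths, query parameters, and response fields are placeholders, not
# the documented API surface.
import os
import requests

BASE_URL = "https://moderation.services.api.unity.com/v1"   # placeholder base URL
PROJECT_ID = os.environ["UNITY_PROJECT_ID"]
TOKEN = os.environ["UNITY_SERVICE_ACCOUNT_TOKEN"]           # service-account credential

HEADERS = {"Authorization": f"Bearer {TOKEN}"}


def list_open_incidents():
    """Fetch incidents that have not been actioned yet (hypothetical route)."""
    resp = requests.get(
        f"{BASE_URL}/projects/{PROJECT_ID}/incidents",
        headers=HEADERS,
        params={"state": "open"},
    )
    resp.raise_for_status()
    return resp.json().get("incidents", [])


def mute_player(incident_id: str, duration_hours: int = 24):
    """Apply a mute action to the reported player (hypothetical route and fields)."""
    resp = requests.post(
        f"{BASE_URL}/projects/{PROJECT_ID}/incidents/{incident_id}/actions",
        headers=HEADERS,
        json={"action": "mute", "durationHours": duration_hours},
    )
    resp.raise_for_status()


if __name__ == "__main__":
    # Example policy: auto-mute players on incidents flagged with high toxicity.
    for incident in list_open_incidents():
        if incident.get("toxicityScore", 0) >= 0.9:
            mute_player(incident["id"])
```

A policy like this would typically run alongside, not instead of, manual review: automated actions handle the clearest cases, while the Moderation queue remains the place to triage everything else.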
Important: Moderation requires the use of Unity's Authentication Service (UAS).
Moderation products
Safe Text
Safe Text is a suite of safety tools you can use to tailor in-game communication rules, filter harmful messages, and collect evidence on toxic players.
Learn more about Safe Text in the Safe Text documentation.