Introduction to the Moderation Platform
The Moderation Platform makes toxicity management accessible, impactful, and insightful by providing you with the tools you need to grow and maintain healthy communities within your games.
The Moderation Platform is where you action reports, review evidence that comes from Safe Voice or Safe Text, and monitor community health scores. You gain access to this dashboard when either product is included in your project.
The dashboard allows you to:
- Gather and review user reports from your game through Safe Voice or Safe Text with the Moderation SDK.
- Review evidence that is automatically collected from Safe Voice or Safe Text when user reports are submitted.
- Search and filter incidents to prioritize reports.
- Action player reports with confidence by referencing AI analysis and detections from Safe Voice and Safe Text.
- Assign moderation roles to project members to action player reports and monitor community health.
Important: Moderation requires the use of Unity's Authentication Service (UAS).
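Because Moderation depends on Unity Authentication, players must be signed in before reports can be attributed to them. A minimal sketch using the standard Unity Gaming Services initialization and anonymous sign-in calls (the class name `ServicesBootstrap` is illustrative) might look like:

```csharp
using UnityEngine;
using Unity.Services.Core;
using Unity.Services.Authentication;

public class ServicesBootstrap : MonoBehaviour
{
    async void Awake()
    {
        // Initialize Unity Gaming Services before using any UGS product.
        await UnityServices.InitializeAsync();

        // Sign the player in; anonymous sign-in is the simplest option.
        if (!AuthenticationService.Instance.IsSignedIn)
        {
            await AuthenticationService.Instance.SignInAnonymouslyAsync();
            Debug.Log($"Signed in as: {AuthenticationService.Instance.PlayerId}");
        }
    }
}
```

Other sign-in methods (for example, platform or username/password sign-in) work equally well; the Moderation Platform only requires that the player has an authenticated Unity player ID.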
Moderation products
Safe Voice
Safe Voice is a machine learning-based voice analytics service that records player voice communications and provides both the recording and an analysis of it to support customer moderation.
Learn more about Safe Voice in the Safe Voice documentation.
Safe Text
Safe Text is a suite of safety tools you can use to tailor in-game communications rules, filter harmful messages, and collect evidence on toxic players.
Learn more about Safe Text in the Safe Text documentation.