Introduction to the Moderation dashboard
An overview of the moderation dashboard and how it helps manage player safety.
The Moderation dashboard makes toxicity management accessible, impactful, and insightful by providing you with the tools you need to grow and maintain healthy communities within your games. The dashboard is where you take action on incidents, review evidence that comes from Safe Text, and monitor community health scores. You gain access to this dashboard when the product is included in your project. The dashboard allows you to:
- Gather and review incidents from your game through Safe Text with the Moderation SDK.
- Review evidence that is automatically collected from Safe Text when user reports are submitted.
- Search and filter incidents in the Moderation queue to prioritize the most toxic reports.
- Action incidents manually or through automated processes with the Moderation API (see the sketch after this list). Gain confidence in your review by referencing AI analysis and detections from Safe Text.
- Assign moderation roles to project members to action player reports and monitor community health.
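As a rough illustration of what automating an incident action through the Moderation API might look like, here is a minimal sketch. The base URL, endpoint path, and field names (`projectId`, `action`, and so on) are hypothetical placeholders, not the documented schema; consult the Moderation API reference for the real endpoints, authentication flow, and payloads.

```python
import os

import requests  # third-party HTTP client: pip install requests

# All URLs, paths, and field names below are hypothetical placeholders;
# consult the Moderation API reference for the real endpoints and schema.
BASE_URL = "https://moderation.example.com/v1"  # placeholder host
TOKEN = os.environ["MODERATION_API_TOKEN"]      # assumed service credential


def action_incident(project_id: str, incident_id: str, action: str) -> dict:
    """Apply a moderation action (for example, mute or ban) to one incident."""
    response = requests.post(
        f"{BASE_URL}/projects/{project_id}/incidents/{incident_id}/action",
        headers={"Authorization": f"Bearer {TOKEN}"},
        json={"action": action},
        timeout=10,
    )
    response.raise_for_status()
    return response.json()


if __name__ == "__main__":
    # Example: automatically mute the player behind a high-severity report.
    result = action_incident("my-project-id", "incident-123", "mute")
    print(result)
```

A workflow like this would typically run alongside manual review: automation handles clear-cut, high-severity reports, while moderators use the dashboard queue and the AI analysis from Safe Text for anything that needs human judgment.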