Moderation SDK

Report a player

Once the setup is complete, you can report players by using their UAS ID and providing a reason for the report.

Once the player is reported, you can navigate to the Unity Dashboard to review incidents with the attached report information.

If you’re using the Vivox package v16.0.0 or higher, the last 15 minutes of conversation from the channel that both the reporter and the reported player were in will be attached to the report.

Below is an example of a report being submitted:

using UnityEngine;
using Unity.Services.Moderation;
using Unity.Services.Moderation.Models;
using Unity.Services.Moderation.Exceptions;

public async void Report(string userId)
{
    // userId should be the UAS ID of the reported player.
    // For example, the currently logged-in user's ID is accessible with:
    // var myUserId = AuthenticationService.Instance.PlayerId;

    try
    {
        var report = Moderation.Instance.NewReport(userId,
            new ReportReason(ReportReason.Threat));
        await Moderation.Instance.ReportPlayer(report);
        Debug.Log("Report submitted!");
    }
    catch (ModerationServiceException e)
    {
        if (e.Reason == ModerationServiceExceptionReason.SelfReportError)
        {
            Debug.Log("Error: you can't report yourself");
        }
        else
        {
            Debug.Log($"Error: {e.Reason}");
        }
    }
}

Report reason

Reasons are defined by Unity and are available as an array to display in your game:

using UnityEngine;
using Unity.Services.Moderation.Models;

public void ListReasons()
{
    foreach (var reason in ReportReason.PossibleReasons)
    {
        Debug.Log(reason);
    }
}

The constants for each reason can be found in Moderation > Runtime > Models > ReportReasons.cs.

public const string AimSnapping = "aim snapping";
public const string Boosting = "boosting";
public const string Exploiting = "exploiting";
public const string Hacking = "hacking";
public const string Smurfing = "smurfing";
public const string UnrealisticMovement = "unrealistic movement";
public const string CollusionWithOpponent = "collusion with opponent";
public const string LeftMatch = "left the match";
public const string Inactive = "inactive";
public const string Sabotage = "sabotage";
public const string Spamming = "spamming";
public const string HateSpeech = "hate speech";
public const string PredatoryBehavior = "predatory behavior";
public const string NoiseDisruption = "noise disruption";
public const string Scamming = "scamming";
public const string Ads = "ads";
public const string Threat = "threat";
public const string VerbalAbuse = "verbal abuse";
public const string InappropriatePlayerName = "player name";

Actions

Actions are applied through the Dashboard when reviewing incidents.

When an action is applied and the actioned user logs in, you will receive error messages from the various UGS packages integrated with the Moderation service. Errors follow the access control convention described in Error responses. This is the expected behavior.

Some SDKs wrap these errors in exceptions of their own; for example, the Vivox SDK emits a MintException when a policy prevents a user from accessing communications.

Below is the supported sanctions error list:

Package name: com.unity.services.authentication
Exception: RequestFailedException
Use case: When a player is banned, you receive this exception with an ACCOUNT_DISABLED message in its innerException.

Package name: com.unity.services.vivox
Exception: MintException
Use case: If the player is prevented from accessing communications channels, this exception is thrown with a 403 status code.
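As a sketch of how a game might detect a ban at sign-in, assuming an anonymous sign-in flow via the Authentication SDK (your project's sign-in call may differ):

```csharp
using UnityEngine;
using Unity.Services.Core;
using Unity.Services.Authentication;

public async void SignIn()
{
    try
    {
        await UnityServices.InitializeAsync();
        await AuthenticationService.Instance.SignInAnonymouslyAsync();
        Debug.Log("Signed in.");
    }
    catch (RequestFailedException e)
    {
        // A banned player surfaces as a RequestFailedException whose
        // inner exception carries an ACCOUNT_DISABLED message.
        if (e.InnerException != null &&
            e.InnerException.Message.Contains("ACCOUNT_DISABLED"))
        {
            Debug.Log("This account has been banned.");
        }
        else
        {
            Debug.Log($"Sign-in failed: {e.Message}");
        }
    }
}
```

Checking the inner exception's message string is one way to distinguish a ban from other sign-in failures; adapt the check to however your project inspects RequestFailedException details.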

Evidence management

If Vivox version 16.0.0 or higher is installed in the project, the SDK automatically includes the list of channels that both the reporter and the reported player are part of, as well as the list of extra players who were in those channels.

This information adds detail to the evidence in the report: the extra players' tracks are included in the Safe Voice screenings, and each channel in the channel list is screened.

To better control the report context, you can alter the list of channels and players included in the report before it's sent.

var report = Moderation.Instance.NewReport(userId,
    new ReportReason(ReportReason.Threat));

// The list of extra players you want to include in Safe Voice screenings.
report.Players = new List<PlayerContext>();

// The list of channels to be screened by Safe Voice.
report.VivoxChannels = new List<VivoxChannel>();

Note: Channels and players are automatically cached for 45 minutes; this is the maximum window Safe Voice can go back in time to screen.
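For instance, rather than clearing the lists entirely, you could filter the pre-populated context down to the channel where the incident occurred. This is only a sketch: it assumes VivoxChannel exposes a Name property and uses a hypothetical channel name ("lobby-voice"); check the types under Moderation > Runtime > Models for the actual shape.

```csharp
using System.Linq;

// Keep only the channel relevant to the incident. The channel name
// "lobby-voice" and the VivoxChannel.Name property are assumptions
// for illustration; verify them against the Models in your SDK version.
report.VivoxChannels = report.VivoxChannels
    .Where(c => c.Name == "lobby-voice")
    .ToList();

await Moderation.Instance.ReportPlayer(report);
```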