
Player Reporting in Minecraft: Java Edition

Our games bring millions of players together, from all over the world, all united by the goal of crafting, exploring, and having fun. That’s why it’s so important that our games are a safe and welcoming place for all players.  
 
To achieve that, we have a dedicated team of passionate people who work every day to celebrate and protect our community’s creativity and passion while ensuring interactions remain safe. As part of that work, Mojang Studios moderates harmful content and behaviors that don’t follow the Microsoft Services Agreement, as explained in the Community Standards for Minecraft and Xbox, but we can’t be everywhere at once. That’s where you, our players, come in. You are our community, and you help us keep Minecraft fun and safe. 

Empowering fun, welcoming, and safe experiences

The goal of effective and intuitive Player Reporting mechanisms in our games is to empower the community to let us know when harmful behavior is getting in the way of the fun and intended experiences in Minecraft. 

There are different ways to file a report. In addition to the existing report a concern form and the option to report a player in Minecraft, it is now possible to report a player for abusive messages directly in the Minecraft: Java Edition game client. 

You may also have seen our new profanity filter on Realms – these two functions are different. A chat report is always initiated by a player. No reports are created automatically. 

Submitting a Player Report in Minecraft Java

When you submit a player report, you are required to select the individual chat messages that contain the objectionable content, as well as the category of the report, to provide the best context for our moderation team to take action. 

Reporting can be accessed via the social interactions screen (default keybind is P) or via the pause menu. 

  • Multiple chat messages can be selected for reporting – additional context of surrounding chat messages will also be included in the report. 
  • The category of the player report is selected from a list of Report Categories. 
  • Additional comments can be entered to provide more details and information regarding the report. 
  • Evidence of the authenticity of the reported chat is also included with the report.   
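
For a concrete picture of what a submitted report bundles together, here is a purely illustrative sketch in Java. The type and field names are hypothetical and do not reflect Mojang’s actual implementation; they simply mirror the items listed above:

    // Illustrative only: a hypothetical shape for the data a chat report carries.
    import java.util.List;
    import java.util.UUID;

    record ChatReport(
            UUID reportedPlayerId,           // hypothetical: the player being reported
            List<String> selectedMessages,   // the chat messages you selected as evidence
            List<String> surroundingContext, // nearby messages included automatically for context
            String category,                 // one of the Report Categories listed below
            String additionalComments,       // optional extra details you typed in
            byte[] chatAuthenticityEvidence  // stands in for the proof that the chat is genuine
    ) { }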

Player Report Categories 

  • Imminent harm - Self-harm or suicide. 
      • Someone is threatening to harm themselves in real life or talking about harming themselves in real life. 
  • Child sexual exploitation or abuse. 
      • Someone is talking about or otherwise promoting indecent behavior involving children. 
  • Terrorism or violent extremism. 
      • Someone is talking about, promoting, or threatening acts of terrorism or violent extremism for political, religious, ideological, or other reasons. 
  • Hate speech. 
      • Someone is attacking you or another player based on characteristics of their identity, like religion, race, or sexuality. 
  • Imminent harm - Threat to harm others. 
      • Someone is threatening to harm you or someone else in real life. 
  • Non-consensual intimate imagery. 
      • Someone is talking about, sharing, or otherwise promoting private and intimate images. 
  • Harassment or bullying. 
      • Someone is shaming, attacking, or bullying you or someone else. This includes when someone is repeatedly trying to contact you or someone else without consent or posting private personal information about you or someone else without consent (“doxing”). 
  • Defamation, impersonation, false information.
      • Someone is damaging someone else's reputation, pretending to be someone they're not, or sharing false information with the aim to exploit or mislead others. 
  • Drugs or alcohol. 
      • Someone is encouraging others to partake in illegal drug-related activities or encouraging underage drinking. 

The Lifetime of a Player Report 

Let us walk through what happens when a player is reported. 

  • A player creates a chat report, selects the offending messages, the category, and any additional details, and submits it. 
  • The report is sent to our team of Minecraft Investigators. 
  • A moderator reviews the report and the evidence and assigns an appropriate action (if any). 
  • If action is taken, the offending player’s account is suspended from online play for a period of time or, in extreme cases, permanently. 

Reviewing reports

When reports are submitted, our team reviews not only the reported messages but also the surrounding context and the authenticity of the report to determine whether our community standards were violated. 
 
When someone doesn’t follow our community standards, it is possible that their account will get suspended. If your account has been suspended and you would like more information, please submit a review via this link, or scroll to the bottom of any page in our Help Center and select 'Case Review' to send a ticket to the appropriate team.

Abuse of the Player Reporting system

You are responsible for the reports you submit. Knowingly sending incorrect reports to try to get another player banned, excessively sending irrelevant reports, or otherwise abusing the player reporting system can lead to repercussions for your account. Do not incite others to use the player reporting system on your behalf. 

 

Last Updated - 2022-08-08 16:46:16 UTC
