Allow users to report a message and automatically hide the reported message.
How does this mitigate hate?
Users should be able to report any harassing or hateful message. Submitting a message for review should hide that message on the reporter’s end.
When to use it?
Within messaging spaces, all participants should have the ability to report an instance of abuse or violation of policies. An option to report any message should be accessible, concise, and straightforward for all participants to use.
How does it work?
After a report is submitted, the offending message should be hidden for the reporter, who should also be able to unhide it. Both the participants in the messaging space and the offender should be notified that the message has been reported and why. If a message is reported multiple times, further action should be taken, with updates provided on the review and appeals processes.
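The flow above can be sketched in code. This is a minimal illustration, not a reference implementation: the message model, the notification format, and the three-report escalation threshold are all assumptions made for the example.

```python
from dataclasses import dataclass, field

REVIEW_THRESHOLD = 3  # hypothetical: number of reports that triggers escalation


@dataclass
class Message:
    msg_id: str
    author: str
    hidden_for: set = field(default_factory=set)   # reporters who have hidden it
    reports: list = field(default_factory=list)    # (reporter, reason) pairs


def report_message(msg: Message, reporter: str, reason: str) -> list:
    """Record a report, hide the message for the reporter, and return notifications."""
    msg.reports.append((reporter, reason))
    msg.hidden_for.add(reporter)  # hidden only on the reporter's end
    # Notify the offender that the message was reported, and why.
    notices = [(msg.author, f"Your message {msg.msg_id} was reported: {reason}")]
    # Repeated reports escalate the message for review.
    if len(msg.reports) >= REVIEW_THRESHOLD:
        notices.append(("moderation-queue", f"{msg.msg_id} escalated for review"))
    return notices


def unhide(msg: Message, reporter: str) -> None:
    """The reporter may choose to reveal the message again."""
    msg.hidden_for.discard(reporter)


def is_visible(msg: Message, viewer: str) -> bool:
    return viewer not in msg.hidden_for
```

Hiding is tracked per reporter rather than globally, so a report affects only the reporter’s view until the review process decides otherwise.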
Reporting systems empower users to bring instances of hate to the platform’s attention. Messaging spaces are often less regulated than public social media spaces, so many hateful messages can go undetected by the platform. Reporting helps targeted individuals protect themselves and brings specific users to the platform’s attention while providing concrete information about the violations.
Spam reporting can itself be used to target specific users or groups within a messaging space, especially when the reporting system is not robust or the platform lacks the resources to effectively mitigate abuse of the reporting process.
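One common safeguard against this kind of report-bombing is a per-reporter rate limit. The sketch below uses a sliding time window; the cap of five reports per hour is an illustrative assumption, not a recommended policy.

```python
import time
from collections import deque

MAX_REPORTS = 5        # hypothetical cap on reports per reporter per window
WINDOW_SECONDS = 3600  # hypothetical one-hour sliding window


class ReportRateLimiter:
    """Reject reports from a user who has exceeded the cap within the window."""

    def __init__(self):
        self._history = {}  # reporter -> deque of report timestamps

    def allow(self, reporter: str, now: float = None) -> bool:
        now = time.time() if now is None else now
        q = self._history.setdefault(reporter, deque())
        # Drop timestamps that have aged out of the sliding window.
        while q and now - q[0] > WINDOW_SECONDS:
            q.popleft()
        if len(q) >= MAX_REPORTS:
            return False  # over the cap: hold the report for manual review instead
        q.append(now)
        return True
```

Rate limiting alone does not stop coordinated abuse by many accounts; it is one layer alongside reviewer judgment and appeals.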
Facebook Messenger’s message report interstitial