Voice Chat Report

The ability to report another user within a voice chat.

How does this mitigate hate?

Voice chat reports help prevent harmful speech by empowering participants and hosts to be proactive in maintaining a safe environment. On social voice platforms, where a diverse user base may be present, it is important to display rules and policies to everyone while encouraging users to flag offenders or harmful content in real time.


When to use it?

This pattern applies after a report has been successfully submitted, whether during a chat or after the chat room has ended. The follow-up process should be transparent to the reporting user as well as to the offender; to minimize appeals, evidence should be provided in accordance with the platform's moderation and privacy policies.

How does it work?

After a report is successfully received, the reporting user should be notified that the platform has received it, and later notified when offenders are removed. Officially blocked or banned users should be muted and hidden from the reporting user, and the blocked user should have no access to, or visibility of, the chat room or the reporting user, ensuring safety by prioritizing privacy.
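
The sketch below illustrates one possible shape of this flow in TypeScript. The type and function names (Report, VoiceRoom, notify, removeOffender) are hypothetical placeholders rather than any specific platform's API, and a real system would persist reports and route them through moderation review before removal.

    // Minimal sketch of the report-handling flow described above.
    // All names here are illustrative assumptions, not a real API.

    type UserId = string;

    interface Report {
      id: string;
      roomId: string;
      reporterId: UserId;
      offenderId: UserId;
      reason: string;
    }

    interface VoiceRoom {
      id: string;
      participants: Set<UserId>;
      // For each user, the set of other users who must stay muted and hidden from them.
      hiddenFrom: Map<UserId, Set<UserId>>;
    }

    function notify(userId: UserId, message: string): void {
      // Stand-in for the platform's real notification channel (push, in-app, etc.).
      console.log(`[notify ${userId}] ${message}`);
    }

    function acknowledgeReport(report: Report): void {
      // The reporting user immediately learns that the report was received.
      notify(report.reporterId, `Your report ${report.id} was received and is under review.`);
    }

    function removeOffender(room: VoiceRoom, report: Report): void {
      // Remove the offender from the room.
      room.participants.delete(report.offenderId);

      // Hide the reporter and offender from each other so neither side
      // can see or hear the other going forward.
      const hideFor = (viewer: UserId, hidden: UserId): void => {
        const set = room.hiddenFrom.get(viewer) ?? new Set<UserId>();
        set.add(hidden);
        room.hiddenFrom.set(viewer, set);
      };
      hideFor(report.reporterId, report.offenderId);
      hideFor(report.offenderId, report.reporterId);

      // Close the loop with the reporting user once action has been taken.
      notify(report.reporterId, "The user you reported has been removed from the room.");
    }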

Advantages

Allowing users to report bad actors in voice chat brings this mode to parity with text-based systems, which already support reporting.

Reports from users who are more active on the platform may be given higher priority. This can help maintain the integrity of the reporting system while minimizing spam reports.
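
One way to implement such prioritization is sketched below; the ReporterStats fields and the weighting constants are assumptions chosen for illustration, not a known platform formula.

    // Sketch: weight report priority by the reporter's activity and track record.
    // Field names and weights are illustrative assumptions.

    interface ReporterStats {
      accountAgeDays: number;     // how long the account has existed
      sessionsLast30Days: number; // rough measure of recent engagement
      upheldReports: number;      // earlier reports that moderators confirmed
      dismissedReports: number;   // earlier reports found to be unfounded
    }

    function reportPriority(stats: ReporterStats): number {
      // Engaged, established accounts get a modest boost.
      const activity = Math.min(stats.sessionsLast30Days / 30, 1);
      const tenure = Math.min(stats.accountAgeDays / 365, 1);

      // Track record matters most: upheld reports raise priority,
      // dismissed ones lower it, which also dampens report spam.
      const total = stats.upheldReports + stats.dismissedReports;
      const accuracy = total > 0 ? stats.upheldReports / total : 0.5;

      // Weighted sum in [0, 1]; higher scores are reviewed sooner.
      return 0.3 * activity + 0.2 * tenure + 0.5 * accuracy;
    }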

Disadvantages

Users who spam the report feature will have less impact. Platforms may not be able to stop hateful speech in real time; they can, however, empower users to restrict offenders immediately after the harmful speech is recognized.
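
A sketch of the user-side restriction mentioned above: even before the platform reviews a report, the reporting user's client can immediately mute and block the offender locally. The ClientSession shape is a hypothetical assumption for illustration.

    // Sketch: immediate, client-side restriction of an offender.
    // ClientSession is an assumed shape, not a specific platform's model.

    interface ClientSession {
      userId: string;
      mutedUsers: Set<string>;   // audio from these users is not played
      blockedUsers: Set<string>; // these users are hidden and cannot interact
    }

    function restrictOffenderLocally(session: ClientSession, offenderId: string): void {
      // Takes effect immediately on the reporter's client, independent of
      // whatever the platform later decides about the report.
      session.mutedUsers.add(offenderId);
      session.blockedUsers.add(offenderId);
    }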

Examples


Report screens for Twitter Spaces allow users to indicate the severity or type of issue being reported about the space.
