Livestream Broadcast Delay
A platform briefly delays a livestreamed broadcast so that automated or human moderation can act before content reaches viewers.
How does this mitigate hate?
Moderation is essential to removing and filtering harmful content and unwanted experiences on a social platform. A momentary delay in broadcasting livestreams gives platforms, moderators, and the community a window to moderate content in near real time, before it reaches the audience.
When to use it?
The delay should run continuously for the duration of a broadcast. It should be long enough for an algorithm or human moderators to filter out potentially harmful content, yet short enough that the stream still feels live and natural.
How does it work?
A brief delay can work in tandem with algorithms or human moderators to filter out content that violates safety rules. A delay can apply to both the back end and the front end of a platform's interface. For example:
– A streamer should have the option to delay their own broadcast if they have been targeted in the past, and to request that their livestream be actively monitored.
– A platform should be able to delay the streams of previously flagged channels or users at its discretion, so long as those users are aware they are being actively monitored because of their past behavior.
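The back-end half of this mechanism can be sketched as a buffer that holds each incoming stream segment for a fixed delay and runs a moderation check before release. This is a minimal illustration under stated assumptions, not any platform's actual pipeline: the `DelayedBroadcastBuffer` name and the pluggable `is_harmful` classifier are hypothetical.

```python
import time
from collections import deque
from typing import Callable, Optional

class DelayedBroadcastBuffer:
    """Holds incoming stream segments for `delay_seconds` before release,
    giving an automated filter (and human moderators) time to act."""

    def __init__(self, delay_seconds: float,
                 is_harmful: Callable[[str], bool]) -> None:
        self.delay_seconds = delay_seconds
        # Moderation check: returns True if a segment should be withheld.
        # In practice this could be an ML classifier or a human review queue.
        self.is_harmful = is_harmful
        self._buffer: deque = deque()  # FIFO of (arrival_time, segment)

    def ingest(self, segment: str, now: Optional[float] = None) -> None:
        """Accept a segment from the broadcaster and start its delay clock."""
        arrival = now if now is not None else time.monotonic()
        self._buffer.append((arrival, segment))

    def release(self, now: Optional[float] = None) -> list:
        """Return segments whose delay has elapsed and that pass moderation."""
        current = now if now is not None else time.monotonic()
        released = []
        while self._buffer and current - self._buffer[0][0] >= self.delay_seconds:
            _, segment = self._buffer.popleft()
            if not self.is_harmful(segment):  # flagged segments never air
                released.append(segment)
        return released
```

With a 30-second delay, a flagged segment simply never reaches the audience, while clean segments are released once their delay elapses. A real system would operate on audio/video chunks rather than strings, and would combine automated scoring with escalation to human reviewers.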
– Delays give platforms the opportunity to moderate or remove a livestream before the audience is exposed to harmful content.
– They offer an opportunity to display reminders or reiterate conduct rules.
– They allow faster moderation and reporting of harmful streams.
Delays require proper moderation to be useful. Many platforms currently lack the capacity for moderation at this scale, which puts the onus on users to report diligently.
Twitch allows streamers to set a delay on their stream, giving time for real-time moderation to happen.
ADL. “How Platforms Can Stem Abuses of Livestreaming after the Storming of the Capitol.” Anti-Defamation League, January 15, 2021. https://www.adl.org/blog/how-platforms-can-stem-abuses-of-livestreaming-after-the-storming-of-the-capitol.
ADL. “The Unique Challenges of Audio Content Moderation Part Two: Static vs. Livestreaming Audio.” Anti-Defamation League, June 30, 2021. https://www.adl.org/blog/the-unique-challenges-of-audio-content-moderation-part-two-static-vs-livestreaming-audio.
Douek, Evelyn, and Quinta Jurecic. “The Lawfare Podcast: The Challenges of Audio Content Moderation.” Podcast. Lawfare, April 22, 2021. https://www.lawfareblog.com/lawfare-podcast-challenges-audio-content-moderation.
“How Do I Moderate the Chat on My Events.” Vimeo Livestream. Vimeo. Accessed September 28, 2021. https://help.livestream.com/hc/en-us/articles/360002069027-How-Do-I-Moderate-the-Chat-on-My-Events-.
Israel, Samuel. “Hate the Player and the Game? How Hate Speech Spreads in Online Gaming Communities.” Chicago Policy Review, September 29, 2020. https://chicagopolicyreview.org/2020/09/29/hate-the-player-and-the-game-how-hate-speech-spreads-in-online-gaming-communities/.
Jiang, Jialun Aaron, Charles Kiene, Skyler Middler, Jed R. Brubaker, and Casey Fiesler. “Moderation Challenges in Voice-Based Online Communities on Discord.” Proceedings of the ACM on Human-Computer Interaction 3, no. CSCW (November 7, 2019): 1–23. https://doi.org/10.1145/3359157.
Koss, Hal. “Are Content Moderators Ready for Voice-Based Chat Rooms?” Built In, October 27, 2020. https://builtin.com/product/content-moderation-voice-audio-chat.
Shadijanova, Diyora. “The Problem with Clubhouse.” Vice, February 10, 2021. https://www.vice.com/en/article/z3vkde/clubhouse-app-misinformation-problem.
Sultan, Ahmad. “Livestreaming Hate: Problem Solving through Better Design.” Anti-Defamation League, May 13, 2019. https://www.adl.org/news/article/livestreaming-hate-problem-solving-through-better-design.