Civil Society & Independent Media Referrals
Civil society and independent media organizations should be able to flag hateful and abusive content and escalate reports and appeals on behalf of individual users and/or employees through a formal, in-platform communication channel.
How does this mitigate hate?
Users experiencing hate and harassment sometimes turn to civil society organizations or to their employers for help. When these organizations can quickly and efficiently flag hateful and abusive content and escalate reports and appeals, targeted users benefit from faster response times and more effective content moderation. By formally collaborating with civil society and independent media organizations, platforms can slow the spread of harmful content and address harassment more quickly.
When to use it?
When a platform’s content moderation system fails, or when an urgent episode of hate or harassment requires a time-sensitive response, civil society and independent media organizations need to be able to quickly and efficiently flag hateful and abusive content and escalate reports and appeals on behalf of individual users.
How does it work?
Platforms should provide civil society and independent media organizations with a formal, dedicated escalation channel that is integrated into the platform’s primary user experience.
Civil society and independent media organizations can use this channel to advocate on behalf of an individual under attack, such as a journalist or human rights defender. Advocacy would include the ability to flag hateful and abusive content and to escalate reports and appeals directly to the platform for expedited review.
Platforms can expedite and strengthen their content moderation processes by allowing trusted civil society and independent media organizations to flag content that violates the platform’s policies and to escalate reports and appeals. This is especially relevant in cases of malicious or inaccurate content takedowns, and in time-sensitive cases where a delay in restoring content or accounts could be harmful, for example during urgent political debate or crises.
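The escalation channel described above is a policy recommendation rather than a technical specification, but a minimal sketch can make its moving parts concrete. The types, field names, and priority rules below (`EscalationType`, `review_priority`, and so on) are hypothetical illustrations, not any platform’s actual API:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from enum import Enum


class EscalationType(Enum):
    # The three actions the document recommends partners be able to take.
    CONTENT_FLAG = "content_flag"            # flag hateful or abusive content
    REPORT_ESCALATION = "report_escalation"  # expedite an existing user report
    APPEAL_ESCALATION = "appeal_escalation"  # contest a takedown or suspension


@dataclass
class Escalation:
    """One escalation filed by a trusted civil society or media organization."""
    partner_org: str                 # verified partner filing on a user's behalf
    affected_user: str               # e.g. a journalist or human rights defender
    escalation_type: EscalationType
    content_urls: list[str]          # the content at issue
    time_sensitive: bool = False     # e.g. an election, protest, or other crisis
    filed_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)
    )


def review_priority(e: Escalation) -> int:
    """Rank escalations for the moderation queue (lower = reviewed sooner)."""
    if e.time_sensitive:
        return 0  # delays during crises can cause real-world harm
    if e.escalation_type is EscalationType.APPEAL_ESCALATION:
        return 1  # wrongful takedowns silence the targeted user
    return 2      # routine flags still jump the ordinary reporting queue
```

The ordering in `review_priority` mirrors the document’s emphasis: time-sensitive cases first, then appeals against malicious or inaccurate takedowns, then routine flags.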
A civil society or independent media organization can make mistakes, such as flagging content that is not a policy violation. Developing a list of trusted, independent civil society and media organizations will therefore require time, knowledge, and care.
YouTube has a Trusted Flagger program that allows vetted partners to flag videos that violate its Community Guidelines in bulk. Videos flagged by Trusted Flaggers are prioritized for review by content moderators, and Trusted Flaggers gain visibility into the decisions made on the content they flag. Flagged videos are still reviewed by YouTube content moderators against the same Community Guidelines as any other report.
Several social media platforms already allow some trusted civil society organizations to flag hateful and abusive content and escalate individual cases; however, these channels are informal, idiosyncratic, little known, and rarely integrated into the platform itself.
Written by PEN America