Feed Inputs: User-Defined Algorithm
Users select the topics or people they want to see in their feed, and the recommendation algorithm weights those explicit selections above inferred user activity when making recommendations.
How does this mitigate hate?
A single watched video can tip activity-based recommendations toward hate content and misinformation. That one action then feeds the algorithm and can override any explicit selection of content or people the user has made.
Prioritizing explicit user selections within the recommendation algorithm steers the feed toward safer, more intentional content.
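The prioritization described above can be sketched as a weighted ranking function. This is an illustrative assumption, not any platform's actual algorithm: the weights, field names, and topic labels are all hypothetical, and the point is only that explicit selections outweigh inferred activity.

```python
# Hypothetical sketch: score candidate posts by combining an inferred
# activity signal with the user's explicit topic selections, weighting
# the explicit selections more heavily. All names and weights are
# illustrative assumptions, not a real platform's algorithm.

# Weight explicit choices well above inferred activity, so a single
# watched video cannot override the user's stated interests.
SELECTION_WEIGHT = 0.8
ACTIVITY_WEIGHT = 0.2

def score_post(post_topics, selected_topics, activity_affinity):
    """Return a relevance score in [0, 1] for one candidate post.

    post_topics: set of topic labels on the post
    selected_topics: topics the user explicitly chose for their feed
    activity_affinity: dict of topic -> inferred affinity in [0, 1]
    """
    if not post_topics:
        return 0.0
    selection_score = len(post_topics & selected_topics) / len(post_topics)
    activity_score = sum(activity_affinity.get(t, 0.0)
                         for t in post_topics) / len(post_topics)
    return SELECTION_WEIGHT * selection_score + ACTIVITY_WEIGHT * activity_score

def rank_feed(posts, selected_topics, activity_affinity):
    """Sort candidate posts by score, highest first."""
    return sorted(
        posts,
        key=lambda p: score_post(p["topics"], selected_topics, activity_affinity),
        reverse=True,
    )

# Example: the user selected "gardening"; one watched video inflated
# activity affinity for "conspiracy", but the explicit selection still
# dominates the ranking.
posts = [
    {"id": "a", "topics": {"gardening"}},
    {"id": "b", "topics": {"conspiracy"}},
]
ranked = rank_feed(posts, {"gardening"}, {"conspiracy": 0.9})
```

With these weights, the post matching the user's selection scores 0.8 while the activity-driven post scores 0.18, so one spike in watch activity cannot push it to the top.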
When to use it?
Feed input selection should be presented at account creation and remain configurable throughout the account’s existence. A person’s interests change over time, so feed inputs should be easy to access and edit.
How does it work?
Users should be prompted to select the types of content they want to see when they first open an account. They should also be able to easily access and edit their feed inputs. Consider including an option to toggle this feature on or off.
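One way to model the mechanics above is a small, editable preferences object with an on/off toggle. This is a minimal sketch under stated assumptions; the class and field names are invented for illustration.

```python
# Illustrative sketch of storing editable feed inputs with an on/off
# toggle. All names here are assumptions for this example only.
from dataclasses import dataclass, field

@dataclass
class FeedInputs:
    """A user's explicit feed selections, editable at any time."""
    selected_topics: set = field(default_factory=set)
    followed_people: set = field(default_factory=set)
    enabled: bool = True  # toggle the whole feature on or off

    def add_topic(self, topic):
        self.selected_topics.add(topic)

    def remove_topic(self, topic):
        self.selected_topics.discard(topic)

    def toggle(self):
        self.enabled = not self.enabled

# Onboarding prompts for the initial selections; later edits reuse the
# same object, so the inputs stay accessible for the life of the account.
prefs = FeedInputs(selected_topics={"cooking", "hiking"})
prefs.add_topic("gardening")
prefs.remove_topic("hiking")
prefs.toggle()  # user opts out; recommendations fall back to defaults
```

Keeping the toggle on the same object as the selections means the settings screen that edits feed inputs can also expose the opt-out, which matches the pattern's requirement that the feature be easy to find and change.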
Advantages
This pattern gives users more control over the content they see and interact with. It can also interrupt the tangents and rabbit holes that surface hateful content as a person scrolls through their feed.
Disadvantages
While this pattern can mitigate harmful content consumption, it might also limit the reach of positive content the user did not think to select. Pairing user-defined feed inputs with a mechanism that distinguishes harmful content from safe content could address this limitation.
Examples
References
Koponen, Jarno. “The Future of Algorithmic Personalization.” TechCrunch, June 25, 2015. https://techcrunch.com/2015/06/25/the-future-of-algorithmic-personalization/.
Luria, Michal. “‘This Is Transparency to Me’: User Insights into Recommendation Algorithm Reporting.” Center for Democracy & Technology, October 4, 2022. https://cdt.org/insights/this-is-transparency-to-me-user-insights-into-recommendation-algorithm-reporting/.
Research prototypes for the report: https://cdt.org/insights/this-is-transparency-to-me-research-prototypes/.
Merten, Lisa. “Block, Hide or Follow—Personal News Curation Practices on Social Media.” Digital Journalism, November 13, 2020, 1–22. https://doi.org/10.1080/21670811.2020.1829978.
Wu, Katherine. “Radical Ideas Spread through Social Media. Are the Algorithms to Blame?” www.pbs.org, March 28, 2019. https://www.pbs.org/wgbh/nova/article/radical-ideas-social-media-algorithms/.