YouTube Plans to Give Users More Control Over Its Controversial Algorithm
Responding to recent criticism of its suggestion algorithm, YouTube plans to introduce several changes that will give users more control over the videos and channels it suggests based on their viewing history.
YouTube to Give Users Tools to Police Its Suggestion Algorithm Themselves
In a post on its official blog, YouTube announced a series of changes to the algorithm that generates suggested content for users based on their viewing history and on what other viewers are watching.
"We want to help viewers find new interests and passions — such as a new favorite artist, a new creator they can follow or simply the best food recipes," the company said. "But there's one true expert in what you want to watch: you. One thing we’ve consistently heard from you is that you want more control over what videos appear on your homepage and in Up Next suggestions. So we're doing more to put you in the driver's seat."
The three major changes outlined by the company are a new way to explore suggested content from the homepage that should, in theory, steer users toward more relevant content; the option to remove suggestions from channels they don't want to watch; and an explanation of why a video or channel was suggested in the first place.
While these changes are certainly worthwhile, they do little to change the fundamental nature of the algorithm itself, which will continue to produce highly problematic results.
As Gizmodo points out, this algorithm has had the pernicious effect of recommending home videos featuring children to pedophiles, simply because other pedophiles keep clicking on videos of children. There is no built-in 'topic' for this kind of thing (and we should sincerely hope there never is), but YouTube's algorithm builds up viewer profiles automatically and makes uncanny connections between what one person is watching and what someone else with a similar viewing profile is watching.
It then recommends those videos to the other viewer on the assumption that they will want to watch them too. In the case of pedophiles, that assumption is correct, which is precisely what makes it so problematic. Today's announced changes don't do much to address this concern, because the problem lies with the algorithm itself.
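To make that mechanism concrete, here is a minimal sketch in Python of the general viewer-similarity technique at work (an illustration, not YouTube's actual code): each viewer is reduced to the set of videos they have watched, and videos watched by people with overlapping histories are surfaced as suggestions, with no notion of 'topic' anywhere in the loop.

```python
# Minimal sketch of viewer-similarity ("people who watched this also watched...")
# recommendation. Purely illustrative; YouTube's real system is far more complex.

def overlap(a: set, b: set) -> float:
    """Jaccard similarity between two watch histories (0.0 to 1.0)."""
    return len(a & b) / len(a | b) if a | b else 0.0

def suggest(target_history: set, all_histories: dict, top_n: int = 5) -> list:
    """Rank videos watched by similar viewers that the target hasn't seen yet."""
    scores: dict = {}
    for viewer, history in all_histories.items():
        sim = overlap(target_history, history)
        for video in history - target_history:
            scores[video] = scores.get(video, 0.0) + sim
    return sorted(scores, key=scores.get, reverse=True)[:top_n]

histories = {
    "viewer_a": {"vid1", "vid2", "vid3"},
    "viewer_b": {"vid2", "vid3", "vid4"},
    "viewer_c": {"vid7", "vid8"},
}
print(suggest({"vid2", "vid3"}, histories))  # "vid1" and "vid4" rank highest
```

Nothing in that loop knows what the videos contain; the connections come entirely from who watches what, which is exactly how innocuous home videos end up clustered together for the wrong audience.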
The same dynamic feeds YouTube's other major problem: its role in radicalizing viewers on the fringes of dangerous political and ideological movements by suggesting videos similar to the ones they are already watching. If those videos sit at the extreme end of the "similar" spectrum, clicking through to one of them produces a fresh set of recommendations with that former outlier now at the center of the spectrum.
Like a digital Overton window, ever more extreme content becomes the baseline the algorithm measures against, which pushes content that is more extreme still to the edges of each new set of suggestions. Click on those videos, and the process becomes a rabbit hole that can quickly lead to Nazis, white supremacists, and other hate groups that have been at the core of a host of recent attacks targeting mosques, synagogues, women, and the LGBTQ community.
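The drift described above is easy to see in a toy model. The sketch below is an assumption-laden illustration, not YouTube's code: it gives every video an 'extremity' score, generates suggestions clustered around whatever the viewer watched last, and assumes the viewer always clicks the most extreme suggestion. Re-centering on each click ratchets the baseline outward step by step.

```python
# Toy model of the re-centering feedback loop: suggestions cluster around the
# last video watched, and each click on an outlier moves the baseline toward it.
import random

def suggestions(center: float, spread: float = 0.15, n: int = 10) -> list:
    """Candidate videos with 'extremity' scores near the current baseline."""
    return [min(1.0, max(0.0, random.gauss(center, spread))) for _ in range(n)]

baseline = 0.2  # start with fairly mainstream content (0 = mild, 1 = extreme)
for step in range(10):
    picked = max(suggestions(baseline))  # the viewer clicks the most extreme option
    baseline = picked                    # that choice becomes the new center
    print(f"step {step}: baseline extremity = {baseline:.2f}")
```

Within a handful of clicks the baseline saturates at the top of the scale, even though each individual set of suggestions looked only slightly more extreme than the last.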
The problem with YouTube's announced changes is that they leave in place the algorithm that produces these problematic and even dangerous outcomes. Rather than making a real attempt to police the algorithm, YouTube is giving users the tools to police themselves. If users were capable of policing themselves, however, we wouldn't be in this situation in the first place. Nor is the problem limited to YouTube: Facebook, Twitter, and others work in much the same way, and all have failed to adequately police their platforms.