YouTube moves to make conspiracy videos harder to find
SAN FRANCISCO >> Whether it is a video claiming the Earth is flat or that the moon landing was faked, conspiracy theories are not hard to find on Google’s YouTube. But in a significant policy change, YouTube said today that it planned to stop recommending them.
After years of criticism that YouTube leads viewers to videos that spread misinformation, the company said it was changing what videos it recommends to users. In a blog post, YouTube said it would no longer suggest videos with “borderline content” or those that “misinform users in a harmful way” even if the footage does not violate its community guidelines.
YouTube said the number of videos affected by the policy change amounted to less than 1 percent of all videos on the platform. But given the billions of videos in YouTube’s library, it is still a large number.
YouTube and other powerful technology platforms have faced rising criticism for failing to police the content that users post.
YouTube’s recommendation engine has been denounced for pushing users to troubling content even when they showed little interest in such videos. It has also been blamed for widening the political divide in the country, pushing already partisan viewers to more extreme points of view.
The new policy is the latest example of YouTube taking a more aggressive approach to content that many find distasteful, even if it does not violate the service’s community guidelines.
In late 2017, YouTube started putting “controversial religious or supremacist” content in a “limited state,” so the videos are not monetized with advertising and features such as comments and likes are turned off. Some videos appear behind a brief message warning that they may be inappropriate or offensive.
YouTube provided only three examples of the types of videos it would stop recommending: those promoting a phony miracle cure for a serious illness, those claiming the Earth is flat, and those making blatantly false claims about historical events like 9/11.
The company declined to provide more detail on what other videos would be classified as borderline.
YouTube is not taking down the targeted videos, and it will still recommend them to users who subscribe to a channel that creates such content. Also, YouTube will not exclude the borderline videos from search results.
“We think this change strikes a balance between maintaining a platform for free speech and living up to our responsibility to users,” YouTube wrote in the blog post.
YouTube said it was constantly adjusting its recommendation system, noting that it made hundreds of changes last year. In its early years, YouTube said, it suggested videos the company thought would lead to more clicks or views, but it found that video creators began gaming the system with clickbait titles.
YouTube recently said it wanted to recommend videos that viewers would consider “time well spent.” YouTube also said it had been working to broaden recommendations so that they aren’t too similar to the most recent video.
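YouTube has not described how that broadening works. As a purely hypothetical illustration of one common diversification technique, the sketch below re-ranks candidate videos by penalizing similarity to the most recently watched video; the diversify function, the embeddings, and the scores are assumptions for illustration, not YouTube’s actual system.

```python
# Hypothetical sketch of recommendation diversification: re-rank candidate
# videos so the top picks aren't all near-duplicates of the video just
# watched. The embeddings, scores, and weighting are illustrative only.
import numpy as np

def diversify(candidates, scores, last_watched, weight=0.5):
    """Rank candidates by relevance minus similarity to the last video.

    candidates:   (n, d) array of candidate video embeddings
    scores:       (n,) relevance scores from the recommender
    last_watched: (d,) embedding of the most recently watched video
    weight:       how strongly to penalize similarity
    """
    # Cosine similarity between each candidate and the last video watched.
    sims = candidates @ last_watched / (
        np.linalg.norm(candidates, axis=1) * np.linalg.norm(last_watched)
    )
    adjusted = scores - weight * sims
    return np.argsort(adjusted)[::-1]  # best-first ordering

# Toy usage with random data in place of real video embeddings.
rng = np.random.default_rng(0)
candidates = rng.normal(size=(5, 8))
order = diversify(candidates, rng.random(5), rng.normal(size=8))
print(order)
```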
Much as its parent company, Google, is secretive about the powerful and opaque algorithms that govern its search results, YouTube is secretive about the factors its systems weigh in deciding which videos get recommended.
YouTube did not reveal much about how it would determine which videos would be excluded from its recommendations. The decisions on specific videos will not be made by YouTube employees, but by machine-learning algorithms.
Human raters from “all over the U.S.,” the company said, will watch different YouTube videos and provide feedback on the quality of those videos. Those judgments will help inform what the algorithm flags.
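That arrangement, in which humans label examples and a model generalizes from those labels, is the standard supervised-learning setup. The sketch below shows, in purely illustrative terms, how rater judgments could train such a classifier; the features, labels, and model here are assumptions, not YouTube’s disclosed design.

```python
# Hypothetical sketch: training a "borderline content" classifier from
# human-rater labels. The features, labels, and model are assumptions
# for illustration; this is not YouTube's actual system.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy training data: video metadata paired with rater judgments
# (1 = borderline/misinforming, 0 = fine).
titles = [
    "Proof the Earth is flat",
    "Miracle cure doctors don't want you to know about",
    "How to bake sourdough bread",
    "Beginner guitar lesson: first three chords",
]
rater_labels = [1, 1, 0, 0]

# Text features feeding a linear classifier, fit on the rater labels.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(titles, rater_labels)

# Score a new video; a high probability would demote it from
# recommendations (the video itself stays up and searchable).
prob = model.predict_proba(["The moon landing was staged"])[0][1]
print(f"borderline probability: {prob:.2f}")
```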
Google has adopted a similar approach in determining the quality of its search results.
YouTube said the change would roll out gradually, starting with a small set of videos in the United States, and would be introduced globally as the system becomes more accurate.
© 2019 The New York Times Company