YouTube fine-tunes recommendation algorithm - changes may isolate users

Video giant YouTube has tweaked its recommendation feature so that it only suggests videos closely similar to the ones viewers are already choosing. There's a concern that this change to the platform will 'trap' users in a 'bubble of misinformation'.

The feature relies on data from an ongoing user survey; it uses the collated information to predict what a user will want to watch, then pushes videos with similar tags, titles and genres. YouTube already uses a machine learning algorithm to generate personalised recommendations for every single user.
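
For illustration only, here is a minimal sketch of how a content-based recommender of this broad kind could score candidate videos against a viewer's watch history, using overlap in tags and genre as a stand-in for similarity. The data structures and scoring here are assumptions made for the example, not YouTube's actual system, which is far larger and machine-learned.

```python
from dataclasses import dataclass, field


@dataclass
class Video:
    title: str
    genre: str
    tags: set = field(default_factory=set)


def similarity(candidate: Video, watched: Video) -> float:
    """Crude content similarity: Jaccard overlap of tags plus a bonus for a matching genre."""
    union = candidate.tags | watched.tags
    tag_overlap = len(candidate.tags & watched.tags) / len(union) if union else 0.0
    genre_bonus = 1.0 if candidate.genre == watched.genre else 0.0
    return tag_overlap + genre_bonus


def recommend(history: list, candidates: list, top_n: int = 3) -> list:
    """Rank candidates by their best similarity to anything the viewer has already watched."""
    scored = [(max(similarity(c, w) for w in history), c) for c in candidates]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [video for _, video in scored[:top_n]]


# A viewer with a gaming-heavy history is steered towards more gaming content.
history = [
    Video("Speedrun world record", "gaming", {"speedrun", "retro"}),
    Video("Indie game review", "gaming", {"indie", "review"}),
]
candidates = [
    Video("Retro console teardown", "gaming", {"retro", "hardware"}),
    Video("How to bake sourdough", "cooking", {"baking"}),
    Video("Indie developer interview", "gaming", {"indie", "interview"}),
]
for video in recommend(history, candidates):
    print(video.title)
```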

Recommendations now power 70 percent of the combined “watch time” on YouTube, compared with 40 percent in early 2014, YouTube has said. 

"The goal is to prevent the negative sentiments that can arise when people watch hours and hours of uninspired programs." said Jim McFadden and Cristos Goodrow, who work on recommendation technology at YouTube.

The more time people spend watching, the more ad slots YouTube can sell. YouTube advertising is one of Google's biggest earners.

However, while this tweak may lead users to watch more content without having to actively choose it, it comes at a time when YouTube and other sites are under heavy scrutiny for failing to adequately police the content appearing on their platforms.

There's a worry that viewers will become indoctrinated by recommended content

YouTube itself appears to be wary of shoehorning its user base into limited thinking.

The platform has begun to measure satisfaction by surveying its users about which videos they did or didn't enjoy. A version of the survey asks whether a video watched in the last week was “one of the best,” “great,” “about average,” “poor” or “one of the worst.”
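
As a rough sketch of how such responses could be turned into a signal, each option might be mapped onto a scale and averaged per user or per video. The numeric values below are assumptions made for illustration, not YouTube's internal scoring.

```python
# Hypothetical mapping of the five survey options onto a numeric scale.
SURVEY_SCALE = {
    "one of the best": 2,
    "great": 1,
    "about average": 0,
    "poor": -1,
    "one of the worst": -2,
}


def satisfaction_signal(responses: list) -> float:
    """Average the scored responses; positive values suggest the viewer was satisfied."""
    if not responses:
        return 0.0
    return sum(SURVEY_SCALE[r] for r in responses) / len(responses)


print(satisfaction_signal(["great", "about average", "one of the best"]))  # 1.0
```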

There is a concern that users will exploit this mechanism to push inappropriate, misinformed or propagandist content.

Johanna Wright, vice president of product management at YouTube, said in an interview that the company is taking steps to combat misinformation.

"Next year, YouTube is planning to have a similar initiative around science videos to surface “the established belief on the topic” on science videos, she said.

YouTube’s main goal is to maximize viewing time. Alphabet Executive Chairman Eric Schmidt said recently that there wasn't a whole lot the company could do without a larger societal shift.

"The problem of filter bubbles will persist." Schmidt told an international security conference on Nov. 18, “until we decide collectively” that users should see content from “someone not like you.”

However, ex-Google engineer Guillaume Chaslot disagrees with this idea. 

“Users are not asking YouTube to optimise for truth,” Chaslot said.


 

Editor

Danielle Partis is editor of PocketGamer.biz and former editor of InfluencerUpdate.biz. She was named Journalist of the Year at the MCV Women in Games Awards 2019, as well as in the MCV 30 under 30 2020. Prior to Steel Media, she wrote about music and games at Team Rock.