Susan Wojcicki Explains Why She Ruined YouTube


In May 2020, YouTube CEO Susan Wojcicki gave an interview to the New York Times, which was publishing a podcast series accusing YouTube of pushing users toward political extremism with its algorithm.

The podcast series documents a change in the YouTube algorithm from one that promotes 'more of the same' to a more complex version that filters out more content.

Susan Wojcicki explains how and why this happened:

"One of the biggest realisations for me was that we needed to start adapting and changing, and that we needed a very different set of ways of managing information than you want to manage entertainment."

The New York Times host asked Wojcicki if there was a moment that crystallised this view:

"In 2016 there was a terrorist attack that had happened in France, and I remember reading that and being extremely upset, and thinking: our users need to know about it."

YouTube began pushing news about the attack onto the homepages of French users, and search results about the attack were algorithmically re-ranked to prioritise "authoritative sources".
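A re-ranking step of the kind described might look roughly like the sketch below. To be clear, this is only an illustration of the general idea: the channel list, the engagement scores, and the boost factor are all invented for the example, not details of YouTube's actual system.

```python
# Hypothetical sketch of "authoritative source" re-ranking for
# breaking-news searches. All names and weights here are assumptions.

AUTHORITATIVE_CHANNELS = {"France 24", "BBC News", "Le Monde"}

def rerank(results, authority_boost=2.0):
    """Re-order search results so known news outlets outrank
    higher-engagement uploads during a breaking-news event."""
    def score(video):
        base = video["engagement_score"]
        if video["channel"] in AUTHORITATIVE_CHANNELS:
            base *= authority_boost  # push trusted outlets up the list
        return base
    return sorted(results, key=score, reverse=True)

results = [
    {"title": "Eyewitness vlog",  "channel": "RandomUser", "engagement_score": 0.9},
    {"title": "Attack coverage",  "channel": "France 24",  "engagement_score": 0.6},
]
print([v["channel"] for v in rerank(results)])
```

Note how the outlet with lower raw engagement wins once the boost is applied, which is exactly the trade-off Wojcicki goes on to describe: the "authoritative" result is shown even though users engage with it less.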

"But that didn't perform very well on our platform. The users don't want to actually see it. And so, what do you do as a platform?

Do you show it to them anyway?

I remember that very clearly because that was the first time I said to them [YouTube software engineers]: 'You know, it doesn't matter. We have a responsibility. Something happened in the world and it's important for our users to know.'

And it was the first time we started using the word responsibility, and the fact that we needed to put information in our site that was relevant even if our users were not necessarily engaging with it in the same way they would with the entertainment videos."

This is Wojcicki admitting that YouTube moved from prioritising the user to prioritising "responsibility".

The obvious question being: what limits this "responsibility"?

Following 2016, several scandals were manufactured. The first was the moral panic invented by the Wall Street Journal over Coca-Cola adverts being shown on videos promoting extremist ideologies. Not that anyone believed Coca-Cola was endorsing the videos in the first place: the obvious truth is that a bot matches ads to users, not to videos.

Another is of Caleb Cain, a man "radicalised by alt-right figures via their persuasive YouTube videos", as featured on CNN.

The NYT spoke to Caleb in a podcast after YouTube changed their algorithm. Since the change, he found himself falling down a far-left rabbit hole instead.

As the NYT puts it: "That's when YouTube started making these big changes". These big changes included banning Alex Jones and limiting the reach of figures such as Gavin McInnes and Stefan Molyneux.

"One of the biggest changes that we made from our recommendation system [...] We realised that there was a set of content that, even if people were repeatedly engaging with it, that it was important to make the recommendations more diversified and to show them more quality content alongside.

We began to identify content that we called borderline content; and that if users were repeatedly engaging with this borderline content, we will show them other content alongside that we've ranked as more high-quality content."
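In spirit, the diversification Wojcicki describes could be sketched as below. The `borderline` flags, the threshold, and the interleaving strategy are assumptions made for illustration; YouTube's real pipeline is, as she says later, "a whole process and algorithm".

```python
# Hedged sketch of "diversified recommendations": if a user's feed is
# dominated by borderline-flagged videos, interleave higher-quality
# content alongside them. Flags and threshold are illustrative only.

def diversify(recommendations, quality_pool, borderline_threshold=0.5):
    """Interleave quality videos into a feed that repeatedly
    surfaces borderline-flagged content."""
    borderline = [v for v in recommendations if v["borderline"]]
    ratio = len(borderline) / len(recommendations)
    if ratio < borderline_threshold:
        return recommendations  # feed looks fine; leave it alone
    mixed = []
    quality_iter = iter(quality_pool)
    for video in recommendations:
        mixed.append(video)
        if video["borderline"]:
            # place a higher-ranked "quality" video next to it
            nxt = next(quality_iter, None)
            if nxt:
                mixed.append(nxt)
    return mixed

feed = [
    {"id": "clip1", "borderline": True},
    {"id": "clip2", "borderline": False},
]
quality = [{"id": "news1", "borderline": False}]
print([v["id"] for v in diversify(feed, quality)])
```

The key design point matches her description: nothing is deleted, but borderline videos stop travelling alone, which mechanically dilutes their share of views.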

If a video breaks YouTube's terms of service regarding 'hate speech', it is removed outright. But Wojcicki is saying there are videos that do not break the terms of service which she nevertheless does not want to promote.

She introduced the policy of borderline content to shadow-ban these videos.

"And what that has done is... It's actually had a 70% reduction of views of borderline content in the US, and we have been in the process of rolling that out globally. It's in most English-language countries and a few large markets like France or Germany."

If you are a creator, or someone concerned about censorship, you might wonder what 'borderline content' is.

When Wojcicki was asked to define borderline content, she would not.

"It's a complicated and nuanced area [...] We have a whole process and algorithm to find it."

That is how YouTube went from a platform that focused on the demands of the users, to a platform that focuses on the demands of the corporate media.
