Lionel Laurent, Tribune News Service
Social networks such as YouTube and Facebook have the power to make content go “viral,” spreading it at an unprecedented and uncontrollable pace. That seems innocent enough when you’re looking at a cat video, but when the content is, say, a murder, the absence of any way to stop the virus becomes glaring.
After the New Zealand mosque shootings were streamed live in March, attempts to scrub the video from online platforms proved hopeless. Facebook took it down 1.5 million times, and it still reappeared. Copies were re-uploaded to YouTube as often as once every second during the first 24 hours, according to New Zealand Prime Minister Jacinda Ardern, who’s teaming up with French President Emmanuel Macron to try to tackle the plague of harmful online content.
The two leaders’ initiative, known as the “Christchurch Call,” is the start of a long process that will need to balance all sorts of issues – from freedom of speech to privacy. But for it to have any effect at all, policymakers must get under the hood of the social media companies’ software and understand how content gets fed to viewers in the first place. Macron has taken the encouraging first step of embedding government officials in Facebook’s offices to monitor how it polices content, but it remains to be seen how deeply those officials will grasp the technology, or how much access they will get to the company’s real “secret sauce”: its algorithms.
This is certainly an interesting change of direction, though. Much of the debate so far has been about how to get the tech giants to better regulate themselves. That has included deploying more human moderators to police content, beefing up artificial intelligence tools to stop bad stuff spreading, and “de-platforming” the worst online offenders. But it’s been like fighting a forest fire with a water pistol. The heart of the problem lies with those dopamine-feeding algorithms.
Alphabet Inc.’s Google and Facebook Inc. are in the business of advertising, and it’s their ability to grab the attention of their users that makes them so powerful and “sticky.” For all the negative headlines about data breaches and toxic content over the past year, Facebook and YouTube still both reach about 2 billion users a month. That’s because they’re very good at three things: knowing what users want, serving it to them automatically, and encouraging a feedback loop of engagement. This is all thanks to the constant fine-tuning and updating of the algorithms that dictate content filtering, promotion and recommendations. The aim is to keep people on the site for as long as possible with minimal effort. It works: The human mind is no match for a supercomputer.
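To make that feedback loop concrete, here is a deliberately toy sketch in Python of an engagement-driven feed ranker. Every name and number in it is an invented assumption for illustration; no platform’s actual ranking system is public, and the real ones are vastly more complex.

```python
from dataclasses import dataclass

@dataclass
class Item:
    title: str
    clicks: int = 0
    views: int = 0

    @property
    def engagement(self) -> float:
        # Share of viewers who interacted with the item.
        return self.clicks / self.views if self.views else 0.0

def rank_feed(items, top_n=3):
    # Serve whatever has historically grabbed the most attention.
    return sorted(items, key=lambda i: i.engagement, reverse=True)[:top_n]

def simulate_sessions(items, rounds=5):
    # Each round, the top-ranked items are shown (views), and the most
    # gripping one gets the click, which boosts its ranking next round.
    # That self-reinforcement is the feedback loop described above.
    for _ in range(rounds):
        feed = rank_feed(items)
        for item in feed:
            item.views += 1
        feed[0].clicks += 1  # attention begets attention

catalog = [
    Item("cat video", clicks=5, views=100),
    Item("outrage clip", clicks=20, views=100),
    Item("news report", clicks=8, views=100),
]
simulate_sessions(catalog)
for item in rank_feed(catalog):
    print(item.title, round(item.engagement, 3))
```

Run it and the “outrage clip” pulls further ahead each round: nothing in the loop asks what the content is, only whether it holds attention.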
Where it all gets complicated is the potential business conflict between maximizing engagement, which is what advertisers (and shareholders) want, and minimizing extreme content, which is what politicians are demanding.
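One way to picture that tension: if a ranking score is built purely from predicted engagement, any safety goal has to enter as a competing penalty term. The weighted score below is purely hypothetical, with an invented knob, lambda_harm, standing in for how hard a platform leans against extreme content.

```python
# Hypothetical trade-off between engagement and moderation in one score.
# 'lambda_harm' is an invented knob: raising it demotes extreme content,
# but also cuts into the engagement the business model optimizes for.

def score(predicted_engagement: float, harm_estimate: float,
          lambda_harm: float) -> float:
    return predicted_engagement - lambda_harm * harm_estimate

# A gripping-but-toxic item vs. a benign one:
print(round(score(0.9, 0.8, lambda_harm=0.1), 2))  # 0.82: toxic item wins
print(round(score(0.4, 0.0, lambda_harm=0.1), 2))  # 0.40: benign item loses
print(round(score(0.9, 0.8, lambda_harm=1.0), 2))  # 0.10: heavy penalty demotes it
```

Set the knob low and the gripping-but-toxic item outranks everything; set it high and engagement, and with it revenue, takes the hit. That is the conflict regulators are now circling.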
And just how much control do these companies even have over what their black-box filters pump out? That is a question Ardern is asking. The Christchurch killer’s video was designed to go viral, loaded with the kind of emotional charge the algorithms reward, and the social networks proved unable to stop it.
Facebook’s founder Mark Zuckerberg has made much of his desire to “be regulated” to try to fix the problem of extreme content and fake news, but it’s too early to say whether initiatives like Macron’s will really be given the keys to the kingdom.
Bruno Patino, dean of Sciences Po’s journalism school, makes some useful suggestions in his new book, “The Goldfish Civilization”: greater transparency on how the algorithms function, a more ethical approach to how they are designed in the first place, and a much clearer divide between advertising and content. If that leaves new media looking more like old media – and with a potential valuation discount to boot – it seems a worthwhile price to pay.