As Boris Johnson’s general election campaign heats up, his spin doctors’ shameless tactics have plunged to new and indefensible lows. We are faced not with a single scandal but with a whole range of Conservative Party campaign missteps, so many, in fact, that you may well be wondering which one I mean this time.
For clarity, it is the return of digital trickery that concerns me: in this case, the doctoring of a video of Labour MP and shadow Brexit secretary Keir Starmer, in which his answer to the Good Morning Britain hosts’ questioning over his party’s position on Brexit was edited to make it look as though the interview had stumped him entirely. Sound familiar?
If you cast your mind back, it was the same tactic exploited by Donald Trump and his supporters in May. Thanks to manipulated “shallowfake” footage, House Speaker Nancy Pelosi was depicted as slurring her way through a speech at the Center for American Progress, in a bid to deflect attention from Trump’s refusal to comply with congressional investigations. The problem became so pronounced that Congress launched an investigation into the dangers of deepfake technology, and a number of bills have been passed to stave off the influence of these videos, many of which are intended to sway easily manipulated voters on social media.
The intention is the same in this case too. The footage of Starmer, shared on the Conservative Party’s social media platforms, is not strictly a deepfake, a term that describes footage fabricated with artificial intelligence to depict events that never happened; it is rather a misleading edit of genuine film. That this tactic is featuring more prominently in politics is alarming nevertheless.
This is not the first time UK politicians on the right have been accused of leaning on the art of deception while electioneering. The question of Russian interference in the EU referendum and the 2017 snap election is still under debate, with a key report kept under wraps by Johnson.
So too is the issue of alleged breaches of election law by Vote Leave and BeLeave, currently under investigation by the Metropolitan Police. Just days ago, Michael Gove shared an antisemitic tweet from a fake account purporting to belong to a member of Momentum, Labour’s grassroots activist network. When challenged, Gove refused to admit he had been wrong to share such unverified material online, and continued to target prominent figures on the left under the guise of wanting to “deal with antisemitism within Labour”.
At this stage, we must conclude that this form of digital behaviour is deliberate, not just because of the track record of using these tactics but because, let’s not shy away from it, it works. Parliament’s interim report on disinformation and fake news, published last year, found that “most (social media) users do not understand how the content they read has got there, but accept it without question”, reflecting a widespread lack of understanding of how social media works. As part of the inquiry, Professor Stephan Lewandowsky of the University of Bristol also said that “if we try to correct people’s beliefs based on what they have heard they may adjust their belief slightly but there is a lot of evidence to suggest that they continue to rely on that information nonetheless. (…) The cognitive consequences of fake news are pervasive.”
That explains why, instead of apologising and sharing on its social media channels the genuine, considerably more comprehensive answer that Starmer gave on that breakfast TV show, the Conservative Party doubled down on its heavily edited video.