If you caught sight of a screen on November 12 you could have been forgiven for doing a double take as Boris Johnson tipped Jeremy Corbyn for Prime Minister and the Labour leader returned the favour. It wasn’t a softening of either’s stance towards the other – their testy exchanges in telly debates since attest to that. Both clips were deepfakes, videos that use deep learning to imitate a person.
“As we see with Donald Trump now dismissing things as fake news, it’s very easy to see a future where people will be dismissing things as deepfakes,” says Areeq Chowdhury, the head of think tank Future Advocacy (FA), the group behind the deepfakes. “That is the big threat and I don’t know what the answer is currently.”
A video showing Boris Johnson endorsing Jeremy Corbyn for Prime Minister has just landed online, another shows Corbyn backing Johnson.
— Catrin Nye (@CatrinNye) November 12, 2019
The Corbyn and Johnson deepfakes thrust the issue into the spotlight when they featured on BBC show Victoria Derbyshire – such coverage has been rare amid Brexit-dominated political news.
But FA’s effort struck a chord due to the high-profile pair it sought to emulate with the deepfake videos. That was the result of six months’ work, after Chowdhury and his team joined forces with UK artist Bill Posters – who created an attention-grabbing deepfake of Facebook chief Mark Zuckerberg earlier in the year – Israeli company Canny AI and Britain’s Got Talent impressionist Darren Altman.
In anticipation of a winter general election, FA spent two months recreating Johnson and Corbyn’s faces from the nose to the chin before using Altman’s talents to recreate each politician’s voice and then ensuring that the lip-syncing matched the audio.
The results are impressive – if not completely convincing. Corbyn’s voice in particular gives the game away slightly. And that remains the last hurdle to get over, according to Chowdhury.
“Working on the deepfakes has given me a bit of hope to be honest because it took us so long,” he tells The Big Issue. “It’s much easier now to make a deepfake than it used to be. Previously it was only Hollywood production companies that could do it. Now you’re having think tanks do it. Once you get over that barrier of needing a voice impressionist and having that human as part of the process, then it becomes quite worrying. But I think we’re at least a few years from that.”
There is progress being made on the voice front too – take actor Jordan Peele’s Barack Obama deepfake in October, in which he used AI to imitate the former president and warn of the threat the technology poses.
But, away from the political arena, the voice is barely an issue. According to a report by Deeptrace Labs – a service designed to identify online deepfakes – 96 per cent of the 14,678 deepfakes they found were pornographic videos of female celebrities. When the goal is fulfilling sexual fantasies without the woman’s consent, a realistic voice is unsurprisingly not a priority.
Until the technology improves to the point where it could meddle with democracy – imagine a time when no video you see can be verified – this use of deepfake tech remains the most imminent threat to most people.
The scope for creating revenge porn or cyberbullying is endless. But is the law up to stopping these videos from becoming commonplace? Traditionally, politicians have lagged behind the technology – by the time any proposed legislation has made its way through Parliament the tech has moved on. Add in Brexit’s tendency to clog up proceedings in Westminster and Chowdhury is pessimistic about preventing the coming storm.
“These pornographic deepfakes are getting a lot of traffic on the internet and therefore raising a lot of revenue for the people who are creating them, and that adds the incentive to create more and create better ones,” he says. “We should be thinking about ways in which we can limit the damage and compensate victims, for example.
“With technology we always need to be thinking about what will be happening in five or 10 years’ time or what can happen. The laws will be slow to keep up and we need to be thinking about this now.”
When you head to the polls next week, Brexit and the NHS may be top of the agenda – but when the integrity of what we are seeing is called into question it may be a case of eyes rather than ayes having the most profound impact on future politics.