In a speech that John F Kennedy never gave, he was to talk of America’s steps “to carry our message of truth and freedom to all the far corners of the earth.”1
Recently, an initiative by The Times recreated the 22-minute speech that JFK was meant to give in Dallas, in his own voice, using artificial intelligence.2 It is, they say, the ‘unsilencing’ of JFK, and listeners can hear it as it would have been delivered had he not been assassinated that day.3
Is it ironic that his message of truth is somewhat untruthful, recreated as it has been from an event that never actually happened?
At the very least, it’s a question that society and business will have to grapple with as deepfakes – computer-generated replications of people saying and doing things they didn’t – continue to grow in sophistication and frequency.
There is a blurring of reality happening in the media, and it isn’t just about fake news. In China, the world’s first AI news anchor was unveiled, replicating a human newsreader, Xinhua’s Qiu Hao.4 Qiu’s digital likeness can deliver news 24/7, from anywhere his image can be superimposed, and will say whatever text it is fed.
At the other end of the spectrum, deepfakes – fake videos concocted from real ones – emerged from the murkier corners of the online forum Reddit. Using machine learning – specifically, a form of deep learning (hence the ‘deep’ in deepfakes) – AI has learnt to generate new data from old. The technique was first used to create fake pornographic videos by superimposing celebrity faces onto adult film stars’ bodies.5
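For readers curious about the mechanics, here is a minimal sketch of the shared-encoder autoencoder idea behind the early face-swap tools. It is written in PyTorch purely as an illustration – the architecture, layer sizes and image resolution are simplified assumptions, not the original software:

```python
# A minimal sketch of the shared-encoder autoencoder idea behind early
# face-swap deepfakes: one encoder learns features common to both faces,
# while each identity gets its own decoder. The swap happens by decoding
# person A's encoded face with person B's decoder.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1),   # 64x64 -> 32x32
            nn.ReLU(),
            nn.Conv2d(32, 64, 4, stride=2, padding=1),  # 32x32 -> 16x16
            nn.ReLU(),
            nn.Flatten(),
            nn.Linear(64 * 16 * 16, 256),               # latent face code
        )

    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(256, 64 * 16 * 16)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1),  # 16 -> 32
            nn.ReLU(),
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1),   # 32 -> 64
            nn.Sigmoid(),  # pixel values in [0, 1]
        )

    def forward(self, z):
        x = self.fc(z).view(-1, 64, 16, 16)
        return self.net(x)

encoder = Encoder()                          # shared between both identities
decoder_a, decoder_b = Decoder(), Decoder()  # one decoder per identity

# Training (not shown) reconstructs each person's faces through their own
# decoder. At swap time, a frame of person A is pushed through person B's
# decoder, producing B's face with A's pose and expression.
frame_of_a = torch.rand(1, 3, 64, 64)        # stand-in for a cropped face
swapped = decoder_b(encoder(frame_of_a))
```

The swap works because the shared encoder captures pose and expression in a form either decoder can render as its own person’s face.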
Soon enough, free software appeared that let anyone make the videos, regardless of technical aptitude. Since then, presidents have been morphed into delivering statements they didn’t make, movie stars have been de-aged or brought back from the dead for films (one of the rare legitimate uses of deepfakery), and Nicolas Cage has been inserted into every movie known to man.6 The genre is only getting more sophisticated, moving from lip-syncing to whole-body swaps.
We’ve all heard the adage ‘seeing is believing’.
But as Eric Goldman, a Santa Clara University professor, recently told The Verge, “It absolutely bears repeating that so much of our brains’ cognitive capacities are predicated on what we see. The proliferation of tools to make fake photos and fake videos that are indistinguishable from real photos and videos is going to test that basic, human capacity.”7
With fake news anchors reporting real news, and real politicians seemingly presenting fake news, the implications for society of the proliferation of these manipulations could be immense. In an era where trust is becoming ever more important, how will we decide whether to extend it to what we see?
Journalists are already sensing the potential threat. The Wall Street Journal has launched a task force of editors trained in deepfake detection.8 Others, such as the Australian Broadcasting Corporation, are teaching their audiences how to identify doctored videos.9
The ability to tell truth from fiction – sometimes required at speed, in critical moments – is essential to understanding and reacting to the world around us. Politics and the potential for war, history and human rights, justice and abuses of power: all could be irreparably altered by one convincing fake.
On a less global but still potentially dramatic scale, individual businesses will also need to be aware of the implications of deepfake videos. In an age of shareholder activism and corporate machinations, it is not hard to envision video manipulation being used for brand and reputational damage by competitors, ex-employees (or employees’ exes, as with revenge porn) or professional scammers.
While companies could spend time educating staff to spot fake videos – telltale signs include odd blinking patterns, strange continuity, pixel and metadata manipulation, and metallic-sounding audio – the technology will continue to improve, to the point where it will be virtually impossible to identify a fake without in-depth forensics.
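To illustrate just one of those telltale signs, here is a minimal sketch of the blink-rate heuristic early deepfake research relied on: track the eye aspect ratio across frames and flag clips that barely blink. It assumes OpenCV, the dlib library and its publicly downloadable 68-point landmark model; the threshold and the benchmark blink rate are illustrative only, and newer fakes have largely learned to blink:

```python
# A minimal sketch of a blink-based deepfake heuristic: compute the eye
# aspect ratio (EAR) per frame and count blinks; an implausibly low blink
# rate is a flag for closer forensic scrutiny. Requires dlib's public
# 68-point landmark model file to be downloaded separately.
import cv2
import dlib
import numpy as np

detector = dlib.get_frontal_face_detector()
predictor = dlib.shape_predictor("shape_predictor_68_face_landmarks.dat")

def eye_aspect_ratio(pts):
    # EAR: vertical eye openings divided by horizontal eye width.
    vertical = np.linalg.norm(pts[1] - pts[5]) + np.linalg.norm(pts[2] - pts[4])
    horizontal = np.linalg.norm(pts[0] - pts[3])
    return vertical / (2.0 * horizontal)

def blinks_per_minute(video_path, ear_threshold=0.21):
    cap = cv2.VideoCapture(video_path)
    blinks, eye_closed, frames = 0, False, 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        frames += 1
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        for face in detector(gray):
            shape = predictor(gray, face)
            # One eye: landmarks 36-41 in the 68-point scheme.
            pts = np.array([[shape.part(i).x, shape.part(i).y]
                            for i in range(36, 42)])
            if eye_aspect_ratio(pts) < ear_threshold:
                eye_closed = True
            elif eye_closed:       # eye reopened: count one blink
                blinks += 1
                eye_closed = False
    fps = cap.get(cv2.CAP_PROP_FPS) or 25.0
    cap.release()
    minutes = frames / fps / 60.0
    return blinks / max(minutes, 1e-6)

# Humans blink roughly 15-20 times a minute; a talking-head clip far
# below that deserves a closer look.
print(blinks_per_minute("suspect_clip.mp4"))  # hypothetical file
```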
Technologies are being developed that may help, both in spotting altered video and in verifying a video’s veracity at the moment it is captured.10 But the faking techniques are ever-changing, and detection will struggle to keep pace.
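Capture-time verification is easier to pin down. One approach such tools share is to cryptographically sign content as it is recorded, so that any later edit breaks the signature. The sketch below uses Python’s cryptography library (Ed25519) and is deliberately simplified – a real system would sign per frame and bind keys to the recording device:

```python
# A minimal sketch of capture-time provenance: hash the recorded bytes and
# sign the digest at recording time; any later tampering invalidates the
# signature. Uses the `cryptography` package's Ed25519 primitives.
import hashlib
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

private_key = Ed25519PrivateKey.generate()  # would live in secure hardware
public_key = private_key.public_key()

def sign_recording(video_bytes: bytes) -> bytes:
    digest = hashlib.sha256(video_bytes).digest()
    return private_key.sign(digest)

def verify_recording(video_bytes: bytes, signature: bytes) -> bool:
    digest = hashlib.sha256(video_bytes).digest()
    try:
        public_key.verify(signature, digest)
        return True
    except InvalidSignature:
        return False

original = b"...raw video stream..."        # stand-in for captured footage
sig = sign_recording(original)
print(verify_recording(original, sig))                # True
print(verify_recording(original + b"edit", sig))      # False: tamper detected
```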
PwC UK’s Arnav Joshi, an expert in data ethics and digital trust, believes that for now, business needs to be practical in its approach. He suggests that companies keep three guiding principles in mind.
Deception-identifying software may eventually help us separate fact from fiction at scale, but we will probably have to get used to the idea that what we see needs checking. As Goldman puts it in his Verge article, “I think we have to prepare for a world where we are routinely exposed to a mix of truthful and fake photos and videos.”
Checking will undoubtedly become commonplace; it may even come to seem strange that there was a time when viewing something – at least outside of entertainment – meant it was automatically assumed to be authentic.
The Guardian’s Alex Hern believes we may already be there, deciding after a year of observing the phenomenon that, “deepfakes aren’t dangerous because they’ll change the world. They’re dangerous because the world has already changed, and we’re less ready to tackle their reality distortion than we have been for decades.”11
Whether we are already in that new era or still approaching it, everyone – individuals, businesses and wider society – will need to be vigilant in their consumption, and production, of video content.
Amy is the Editor in Chief for Digital Pulse, PwC Australia
References