
#WavellReviews “Trust No One: Inside The World of Deepfakes” by Michael Grothaus

Available here from Hachette

If you want to dip your toes into the world of deepfakes, then Michael Grothaus’ Trust No One is an excellent place to begin.  At its heart lies the author’s investigation into the world of deepfakers: those responsible for making these short films, their motives, and their ‘code’.  Engrossing and disturbing in equal measure, Trust No One is a warning, and shows a far more threatening side to deepfakes than NATO’s 2020 Deepfake Primer, which concluded that “while the threat from deepfakes is real, the risks are somewhat narrower than are frequently portrayed”.  Two years is a long time in the world of deepfakes, and I’m with Grothaus on this one.

Grothaus makes some big, bold statements about the impact of deepfakes, and my initial impression was that he was simply sensationalising the subject.  However, it becomes clear very early on that he is merely spelling out how dangerous deepfakes are.  Given the rapid advances in deepfake technology, they will change much of what we see, hear, and believe on our computer and phone screens.  From politics to entertainment, from history to healthcare, and perhaps even to how we view death itself.

You don’t need an in-depth knowledge of the technology behind deepfakes to engage with the book.  Grothaus explains the basics of deep learning versus machine learning and the key role of generative adversarial networks (GANs): two networks that compete against each other to manipulate a piece of video, one acting as the forger and the other as an inspector.  Describing today’s results as ‘amateurish’, it is Grothaus’ view that within the decade, deepfake software will have advanced so rapidly that real-world footage will no longer be required.  Even today, deepfakers don’t need to create a face from scratch.  Simply tailoring a politician’s lips with subtle audio changes can produce the desired effect.
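For readers who want a feel for that forger-and-inspector dynamic, the sketch below is a minimal, illustrative GAN training loop in PyTorch.  It is not taken from the book: the network sizes and the toy “real” data are placeholders, but the adversarial back-and-forth is the same mechanism Grothaus describes.

```python
# Minimal, illustrative GAN training loop (PyTorch).
# The "forger" (generator) tries to produce samples the "inspector"
# (discriminator) cannot tell apart from real data; the inspector is
# trained to spot the fakes. Shapes and toy data are placeholders.
import torch
import torch.nn as nn

latent_dim, data_dim = 16, 64  # arbitrary toy sizes

generator = nn.Sequential(          # the forger
    nn.Linear(latent_dim, 128), nn.ReLU(),
    nn.Linear(128, data_dim), nn.Tanh(),
)
discriminator = nn.Sequential(      # the inspector
    nn.Linear(data_dim, 128), nn.LeakyReLU(0.2),
    nn.Linear(128, 1), nn.Sigmoid(),
)

opt_g = torch.optim.Adam(generator.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(discriminator.parameters(), lr=2e-4)
loss_fn = nn.BCELoss()

for step in range(1000):
    real = torch.randn(32, data_dim)   # stand-in for real footage features
    noise = torch.randn(32, latent_dim)
    fake = generator(noise)

    # Train the inspector: label real samples 1, forged samples 0.
    d_loss = loss_fn(discriminator(real), torch.ones(32, 1)) + \
             loss_fn(discriminator(fake.detach()), torch.zeros(32, 1))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # Train the forger: try to make the inspector call fakes real.
    g_loss = loss_fn(discriminator(fake), torch.ones(32, 1))
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()
```

In a real deepfake pipeline the toy vectors above would be replaced by face images or video frames, but the underlying contest between forger and inspector is unchanged.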

To give the reader a sense of the scale and speed of deepfake progress, Grothaus provides a brief ‘historiography’ of the technology involved, starting with the arrival of Face Swap Live in 2015.  In many instances, what passed for deepfakes were in fact shallowfakes: two videos crudely stitched together, requiring time-consuming manual inputs to achieve a result that today would be regarded as nothing but cringeworthy.  Those with an inquiring mind will invariably view shallowfakes with suspicion.  But not all the world is so suspicious or has such a rounded view.  In 2016, the shallowfake entitled The Hillary Song attracted over 3.5 million views on YouTube.  Simply slowing down a video of Nancy Pelosi made it seem she was drunk and slurring her words.  Basic techniques that threw pebbles into the already turbulent misinformation pond of US politics.

In December 2017, the first AI-generated fake celebrity porn videos began to appear, the work of a computer programmer who went by his Reddit username: Deepfakes.  Reface’s arrival on the scene in 2020 meant that instead of needing a database of hundreds if not thousands of photos to create a deepfake, the user needed just a single selfie.  It is true that deepfake detection tools can be used to find new ‘tells’ or giveaways, but GAN technology means those imperfections can be removed almost instantaneously.  The world of deepfakes has its own AI-driven OODA loop.

Throughout the book, Grothaus invites us to look into the world of deepfakes and the deepfakers who are responsible for some of the material.  Needless to say, it’s a secretive world of nicknames and pseudonyms.  Even here, Grothaus discovers a sense of self-imposed rules and regulations – boundaries that even your average deepfaker would not cross.  This part of the book gives Trust No One a similar feel to Patrikarakos’ War in 140 Characters.  Just as Patrikarakos identified ‘a new breed of warriors’, Grothaus has shone a light on a number of individuals and technologies that are already having an impact on what we are watching and, in some cases, believing.

Although every chapter covers a different aspect of the world of deepfakes, they all have one thing in common: the overwhelming sense that this world is depressingly dark and sinister.  Grothaus does try to make a case for the positive use of deepfakes: in video games, in giving Siri or Alexa a physical form, or in deepfaking celebrities to spread positive social care or health messages in a range of languages.  They are already an invaluable asset as companies generate their own virtual influencers who will never age and never embarrass their sponsors.  One such TikTok and social media star is Aliona Pole, an accomplished 18-year-old fashion designer who has modelled for Vogue.  She has achieved great things in less than three years because her creators can produce new videos using the latest deepfake software in a matter of hours rather than weeks.

Unfortunately for the likes of Aliona, this sheen of positivity is scant comfort when the balance is overwhelmingly in favour of subjects such as non-consensual and celebrity porn, political misinformation, bribery and blackmail.  Some of the areas seem bizarre.  Deepfaking internal organs on medical scans, for example, might convince a target individual that they have only months to live and prompt their resignation, or even convince that individual’s medical staff that risky invasive surgery is required.

What of the ability to manipulate history?  Less than one hundred years ago, Stalin’s purges led to the most extensive photo manipulation of the 20th century, as pictures of his now-dead comrades and colleagues were destroyed.  Private citizens were also encouraged to ink over or tear out pictures of their family members who had also ‘disappeared’.  Simply watch In Event of Moon Disaster to see President Nixon announcing the failure of Apollo 11.  What next: a trusted news correspondent telling the world of a terror attack, stray drone strike or invasion?

The impact of deepfakes has already attracted the attention of lawmakers, particularly in the United States.  Who wouldn’t want to ban deepfakes for all the damage they can do?  There is, of course, a problem: state laws in America are trying to ban deepfake technology, but many argue that such content is protected by the First Amendment’s guarantee of freedom of speech.

Grothaus opens up a seemingly surreal artistic and intellectual clash, in which a blanket ban is viewed by production companies such as Disney as something that will close the door on a technology that could transform the film industry just as sound did in the 1920s or colour in the 1950s.  For China, the legislation is simple: it has banned the use of deepfake technology in its entirety.  A useful ploy, of course, which means it can label any audio or visual content that is detrimental to the party or social order – even if it’s authentic – as a deepfake.

If the chapters ‘The End of History’ and ‘The End of Trust’ make depressing reading, ‘The End of Life’, the final chapter, brings ethical matters to a head.  Grothaus describes watching a short film of his father filming himself on a smartphone as he walks through the park greeting passing joggers; he gives one final wave to the camera, smiles, and the film ends.  Grothaus’ father died in 1999.  He never owned or knew how to operate a smartphone.  He was brought back to life using deepfakes.  As Grothaus says, “everything about deepfakes is complex – except for the expertise needed to create them”.

Phil Clare

Phil Clare is a former RAF Logistics Officer.  He has over 30 years’ experience of single and joint service environments, as well as operational experience spanning Op GRANBY to Op HERRICK.
