In the past few weeks, pornographic footage has been circulating on the internet in which the faces of porn actors and actresses have been replaced by celebrities' faces. This has been done by means of freely available software called FakeApp, which makes use of deep learning. The software runs on an ordinary desktop with an Nvidia GPU and enables anyone to train a model on a dataset of pictures of a face, which can then be transferred onto footage of another person.
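To make the technique concrete, below is a minimal sketch of the shared-encoder, per-identity-decoder autoencoder approach commonly associated with deepfake tools of this kind. This is an illustrative toy, not FakeApp's actual implementation; the architecture, layer sizes, and variable names are all assumptions for the sake of the example.

```python
# Toy face-swap autoencoder (PyTorch). Assumption: deepfake tools are
# commonly described as training one shared encoder plus one decoder per
# identity; everything below is illustrative, not FakeApp's real code.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 64, 4, stride=2, padding=1), nn.ReLU(),    # 64x64 -> 32x32
            nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.ReLU(),  # 32x32 -> 16x16
            nn.Flatten(),
            nn.Linear(128 * 16 * 16, 512),  # latent code capturing pose/expression
        )
    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(512, 128 * 16 * 16)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(),   # 16 -> 32
            nn.ConvTranspose2d(64, 3, 4, stride=2, padding=1), nn.Sigmoid(),  # 32 -> 64
        )
    def forward(self, z):
        h = self.fc(z).view(-1, 128, 16, 16)
        return self.net(h)

encoder = Encoder()
decoder_a = Decoder()  # learns to reconstruct faces of person A
decoder_b = Decoder()  # learns to reconstruct faces of person B

# Training step: each decoder reconstructs its own person's faces from
# the shared latent code (faces_a / faces_b stand in for 64x64 RGB batches).
faces_a = torch.rand(8, 3, 64, 64)
faces_b = torch.rand(8, 3, 64, 64)
loss = nn.functional.mse_loss(decoder_a(encoder(faces_a)), faces_a) \
     + nn.functional.mse_loss(decoder_b(encoder(faces_b)), faces_b)
loss.backward()

# The swap: encode a frame of person A, then decode it with B's decoder,
# yielding B's face with A's pose and expression.
with torch.no_grad():
    swapped = decoder_b(encoder(faces_a))
```

Because the encoder is shared between both identities, it is pushed to capture identity-independent features such as pose and lighting, which is what makes the cross-decoding trick work.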
Until now, manipulating video at an advanced level has been difficult, as it usually requires substantial resources and manual work. In line with the democratization of other digital creative domains (e.g. photography, 3D printing), FakeApp shows that video manipulation will become increasingly easy as cheap but advanced tools proliferate. One consequence of lowering the threshold for manipulating visual content is a further erosion of trust in information. In response, there is a need for safeguards that can restore that trust.
Hence, to combat the resulting distrust of information, sources will increasingly rely on validation mechanisms through which authenticity can be proven, such as digital signatures and blockchains. Furthermore, platforms are already being pressured to remove deepfakes (e.g. Gfycat). However, removal often triggers the so-called Streisand effect, whereby attempts to suppress content only stimulate its further distribution. On a more positive note, these tools will also be used artistically, creating novel and surprising content.
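As a rough sketch of what signature-based validation could look like in practice, the example below signs a video's raw bytes with Ed25519 using Python's `cryptography` library. The key handling and payload are illustrative assumptions; a real deployment would additionally bind the public key to the publisher's identity (e.g. via a certificate or a blockchain-based registry).

```python
# Minimal sketch: proving a video's authenticity with a digital signature.
# Assumptions: keys are generated ad hoc here; video_bytes is a placeholder.
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# Publisher side: sign the footage at publication time.
private_key = Ed25519PrivateKey.generate()
public_key = private_key.public_key()
video_bytes = b"...raw video file contents..."  # placeholder payload
signature = private_key.sign(video_bytes)

# Viewer side: verify the footage against the publisher's public key.
# Any manipulation of the bytes after signing invalidates the signature.
try:
    public_key.verify(signature, video_bytes)
    print("Signature valid: footage matches what the publisher signed.")
except InvalidSignature:
    print("Signature invalid: footage was altered after signing.")
```

Note that a signature only proves the footage is unchanged since signing; it says nothing about whether the signed content was truthful to begin with, which is why provenance schemes focus on who signed rather than what was signed.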