“This is going to become a huge hit. Will get acquired very soon,” commented a ProductHunt user in late 2017 about Mug Life, a newly launched iOS app. The app may have suffered from a terrible UI, but its capabilities were nothing short of scary: it brought the beginning of the deepfake revolution to everyone’s smartphone.
Mug Life has evolved a great deal since then. You can upload a real face or even a random meme haphazardly drawn in MS Paint and it will be brought to life. Mug Life’s Android release has meant better revenues for the company – SensorTower estimates that the app earned a net $40,000 last month. And now there are a dozen apps just like it; they use algorithms that go far beyond your ordinary dog face filters, turning images into videos that are soon going to be impossible to tell apart from reality.
Deepfakes are created using a Generative Adversarial Network (GAN): a machine learning architecture in which two neural networks are pitted against each other – a generator that produces fake content and a discriminator that tries to tell fake from real. Fed a large volume of data (e.g. videos of a person), the generator gradually learns to produce fakes convincing enough to fool the discriminator. The availability of open-source programs means you could literally buy a PC with 8GB of RAM and train the software to fake someone’s face within a week.
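The generator-vs-discriminator tug of war can be sketched in miniature. This is not Mug Life’s code or anything close to a real face model – just a toy one-dimensional GAN where both “networks” are reduced to a single affine/logistic unit, and the “real data” is assumed to be samples from a normal distribution centred at 4. Even at this scale, the same loop applies: the discriminator learns to flag fakes, the generator learns to fool it.

```python
import numpy as np

rng = np.random.default_rng(0)
REAL_MEAN = 4.0  # our stand-in "real data": samples from N(4, 1)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Generator G(z) = a*z + b maps random noise z to a fake sample.
# Discriminator D(x) = sigmoid(w*x + c) scores how "real" x looks.
a, b, w, c = 1.0, 0.0, 0.0, 0.0
lr, batch = 0.05, 64

for step in range(3000):
    z = rng.standard_normal(batch)
    x_real = REAL_MEAN + rng.standard_normal(batch)
    x_fake = a * z + b

    # Discriminator step: push D(real) toward 1 and D(fake) toward 0.
    d_real = sigmoid(w * x_real + c)
    d_fake = sigmoid(w * x_fake + c)
    w -= lr * np.mean((d_real - 1) * x_real + d_fake * x_fake)
    c -= lr * np.mean((d_real - 1) + d_fake)

    # Generator step: push D(fake) toward 1 (non-saturating loss).
    d_fake = sigmoid(w * (a * z + b) + c)
    g = (d_fake - 1) * w          # gradient of -log D(fake) w.r.t. the fake sample
    a -= lr * np.mean(g * z)
    b -= lr * np.mean(g)

# After training, the generator's output should cluster near the real mean.
samples = a * rng.standard_normal(1000) + b
```

Swap the one-parameter generator for a deep convolutional network and the scalar samples for video frames, and you have the basic recipe behind every face-swapping app – which is exactly why commodity hardware is enough to get started.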
What does this mean for ordinary citizens? A slim chance of financial fraud or revenge porn. It’s going to be a whole lot worse for celebs, politicians and billionaires whose livelihoods depend on their reputation. They’ll either have to normalize being part of traumatic adult videos or continuously spend $$$ on counter-propaganda.
Another issue might be politicians dismissing real videos as deepfakes – how will anyone prove what’s real and what isn’t? Epstein obviously killed himself, because if any evidence of crime exists, it will instantly be “proven” in court to be deepfake content.
It’s not all bad, though. Deepfake tech can put a smile on everyone’s face. Like when it was used to put Carrie Fisher in Star Wars Episode IX. Or Paul Walker in the 8326th instalment of Fast & Furious. Or Donald Trump in the White Hou- oh sorry, that actually happened.
A few years from now, pretty much everyone will be able to fabricate their choice of deepfake videos on their smartphones. Whether it will be put to use for the greater good or bad remains to be seen, but we can be sure of one thing: Indian aunties & uncles will forward a lot more nonsense on family groups in 2025.