
After fake news, fake porn: the software now exists, and it’s scary!


This is a technology you will be hearing about: if you haven’t yet, we advise you to read carefully. An application now makes it possible to replace one actor’s face with another’s. How does it work? What abuses does it open the door to? Here is a breakdown.

How this all started

The idea came from an active member of Reddit (NSFW) with the evocative pseudonym Deepfakes, who had been amusing himself since late 2017 by making fake porn videos: transferring a celebrity’s face onto the body of a pornographic actress.

Video excerpt from a fake porn video made by UnobstrusiveBot. The redditor transferred Jessica Alba’s face onto the body of Melanie Rios, a pornographic actress.

And the practice is spreading: another redditor has in turn released FakeApp, a user-friendly tool meant to give everyone the means to create their own fakes. This redditor, who goes by the nickname deepfakeapp, told Motherboard that he intends to keep refining the application so that any user “could simply select a video on their computer […] and create their video in one click”.

Easy fake porn with any celebrity

Jessica Alba, Daisy Ridley, Emma Watson, and so on: dozens of celebrities have already fallen into the hands of the algorithm’s users, who need only a handful of photos to produce their montage.

Video excerpt from nuttynutter6969 featuring Daisy Ridley (or rather her face) in a pornographic film.

There are still limits to its use: running the algorithm requires suitable hardware, including a powerful (and recent) graphics card and a sturdy computer.
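For the technically curious, here is a rough idea of what is going on under the hood. Public write-ups of the original deepfakes approach describe a pair of autoencoders sharing a single encoder: each decoder learns to reconstruct one person’s face, and the swap happens by pushing one person’s frames through the other person’s decoder. The sketch below (in PyTorch, with layer sizes and names chosen purely for illustration, not taken from FakeApp itself) shows that structure, and hints at why a powerful graphics card matters: the model has to be trained on many aligned face crops.

```python
# Minimal sketch of a shared-encoder / two-decoder face-swap model.
# All sizes and names are illustrative assumptions, not FakeApp's actual code.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    """Compresses a 64x64 RGB face crop into a small latent code."""
    def __init__(self, latent_dim=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 64, 4, stride=2, padding=1), nn.ReLU(),    # 64 -> 32
            nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.ReLU(),  # 32 -> 16
            nn.Conv2d(128, 256, 4, stride=2, padding=1), nn.ReLU(), # 16 -> 8
            nn.Flatten(),
            nn.Linear(256 * 8 * 8, latent_dim),
        )

    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    """Reconstructs a 64x64 face from the shared latent code."""
    def __init__(self, latent_dim=256):
        super().__init__()
        self.fc = nn.Linear(latent_dim, 256 * 8 * 8)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(256, 128, 4, stride=2, padding=1), nn.ReLU(),  # 8 -> 16
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(),   # 16 -> 32
            nn.ConvTranspose2d(64, 3, 4, stride=2, padding=1), nn.Sigmoid(),  # 32 -> 64
        )

    def forward(self, z):
        return self.net(self.fc(z).view(-1, 256, 8, 8))

# One shared encoder, one decoder per person: training teaches the encoder a
# face representation common to both, so feeding person B's frames through
# person A's decoder renders A's face with B's pose and expression.
encoder = Encoder()
decoder_a = Decoder()  # trained on the celebrity's photos
decoder_b = Decoder()  # trained on the target video's frames

faces_b = torch.rand(4, 3, 64, 64)      # stand-in for aligned face crops
swapped = decoder_a(encoder(faces_b))   # B's frames rendered with A's face
print(swapped.shape)                    # torch.Size([4, 3, 64, 64])
```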

A stunning technology that raises questions

It must be said that the result is genuinely striking. If an algorithm is all it takes to produce a convincing video featuring any celebrity, or anyone you know, how do you tell the real from the fake?

Another redditor, derpfake, wanted to demonstrate the application’s potential by producing the same kind of “fake video” as the giants at Walt Disney Studios did for the blockbuster Rogue One: A Star Wars Story.

“Above, footage from Rogue One with a weird CGI Carrie Fisher. Budget: $200 million. Below, a fake made in 20 minutes that could have been done in much the same way with a lookalike actress. My budget: $0 and a few Fleetwood Mac songs,” says derpfake.

Beyond the quality of the likeness or the money saved, this demonstration raises several questions: today, famous actresses can be cast in porn without their knowledge, so what else could be done to others? How much weight could fake news carry when backed by a high-quality fake video?

Several organizations have already succeeded in producing very convincing videos of political figures such as Barack Obama or Donald Trump, and it may not stop there. Many specialists maintain that it will always be possible to detect the forgery even when the naked eye can no longer see it; but does that mean the people building these technologies will also be the only ones able to tell true from false?
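To make the specialists’ claim a little more concrete, here is a deliberately simplified sketch, under broad assumptions, of one common family of detection approaches: a small classifier trained to label face crops as real or synthetic, with per-frame scores averaged over a clip. Real detectors are far more sophisticated, and nothing below reflects a specific tool named in this article.

```python
# Purely illustrative: a tiny binary classifier over aligned face crops,
# trained elsewhere on real vs. generated faces (training loop omitted).
import torch
import torch.nn as nn

detector = nn.Sequential(
    nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),    # 64 -> 32
    nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),   # 32 -> 16
    nn.Conv2d(64, 128, 3, stride=2, padding=1), nn.ReLU(),  # 16 -> 8
    nn.AdaptiveAvgPool2d(1),
    nn.Flatten(),
    nn.Linear(128, 1),  # one logit: how likely the crop is synthetic
)

frames = torch.rand(8, 3, 64, 64)        # stand-in for face crops from a clip
fake_scores = torch.sigmoid(detector(frames))  # per-frame "fake" probability
print(fake_scores.mean().item())         # aggregate score for the whole clip
```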

For more on this, we invite you to find out how to recognize fake news, and how FakeApp works.
