Sunday, April 29, 2018

DeepFace


Images of celebrities as minors are showing up in datasets used in making AI-generated fake porn.

I kind of hate the Deep___ thing. DeepState, DeepFake, DeepWaste, DeepFace, DeepHate. Mostly because I wrote a book about Deep Learning and Olfaction and couldn't market it fast enough to catch the wave. DeepLate

Still, here at Network Address we can't help but record all this talk. Especially when it comes to accidentally making child porn.

You might not know that people are making fake porn using what amounts to Photoshop-for-videos. You also might not know that there are collections of face images (facesets, yes) used as source material for these fake videos. This is a logical extension of the faceswapping we saw almost 10 years ago.

The problem is when you're trawling up one of these facesets and accidentally pick up some photos from when the person was a kid. You know, like catching a porpoise, or a plastic bottle, when you're trying to get some tuna.

Except that now your fake porno can get you put in jail, because it's child porn. If you try to sell someone tuna and it's really a plastic bottle, they'll probably notice. But if you're watching a faceswapped porno generated by combining thousands of faceshots from thousands of different angles and lighting conditions, and a few of those faces are the same person at 17 instead of 18 (because, you know, to a face-recognition algorithm, 18 and 17 look so different)...
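Since nobody is going to eyeball ten thousand face crops, the fix you'd actually want is a boring metadata check, not a smarter face model. Here's a minimal sketch (hypothetical names throughout, assuming you somehow know the subject's date of birth and each photo's capture date) of filtering a faceset so nothing shot before the subject's 18th birthday gets in:

```python
# A minimal sketch (all names here are hypothetical) of the sanity check a
# faceset builder could run: drop any photo taken before the subject's
# 18th birthday, using the capture date rather than trusting a face model
# to tell 17 from 18.
from datetime import date
from typing import NamedTuple

class FacePhoto(NamedTuple):
    path: str
    taken_on: date  # capture date, e.g. pulled from EXIF metadata

def is_adult_photo(photo: FacePhoto, birthdate: date) -> bool:
    """True if the subject was at least 18 when the photo was taken."""
    eighteenth_birthday = birthdate.replace(year=birthdate.year + 18)
    return photo.taken_on >= eighteenth_birthday

def filter_faceset(photos: list[FacePhoto], birthdate: date) -> list[FacePhoto]:
    """Keep only photos taken on or after the subject's 18th birthday."""
    return [p for p in photos if is_adult_photo(p, birthdate)]

if __name__ == "__main__":
    faceset = [
        FacePhoto("redcarpet_2016.jpg", date(2016, 5, 1)),  # subject is 17 here
        FacePhoto("premiere_2018.jpg", date(2018, 3, 12)),  # subject is 19 here
    ]
    kept = filter_faceset(faceset, birthdate=date(1999, 2, 14))
    print([p.path for p in kept])  # only the 2018 photo survives
```

Of course that assumes the EXIF dates are real and that you know when the celebrity was born, which is exactly the kind of diligence nobody trawling Google Images is doing.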

So not only did you just accidentally make child porn, somebody else just accidentally watched it! You're all sick!

Augmented Reality Faceswap circa 2012

image source: Christian Rex van Minnen


Notes:
Fake Porn Makers Are Worried About Accidentally Making Child Porn
Mar 2018, VICE
