Julia, 21, has received fake nude photos of herself generated by artificial intelligence. The phenomenon is exploding.
“I’d already heard about deepfakes and deepnudes (…) but I wasn’t really aware of it until it happened to me. It seemed like something anecdotal that happened in other people’s lives, not something that would happen in mine,” thought Julia, a 21-year-old Belgian marketing student and semi-professional model.
At the end of September 2023, she received an email from an anonymous sender. The subject line: “Realistic?” “We wonder which photo would best resemble you,” she reads.
Attached were five photos of her.
In the original content, posted on her social networks, Julia poses clothed. Before her eyes are the same photos. Only this time, Julia is completely naked.
Julia has never posed naked. She never took these photos. The Belgian model realises that she has been made the victim of a deepfake.
Social media as a business model? Yes, absolutely.
Is open source AI image generation a business model?
The companies that host and sell an online image-to-nude service, using a tuned version of that tool specifically designed to convert photos into nudes, are definitely running a business model.
I agree it’s impractical, and opens up dangerous free speech problems, to try to ban or regulate the general purpose software. But I don’t have a problem with regulating for-profit online image generation services that have been advertising the ability to turn photos into nudes, and have even been advertising on non-porn sites. Regulating those will at least raise the bar a bit and ensure that there isn’t a for-profit motive where capitalism encourages it happening even more.
We already have revenge porn laws that outlaw the spread of real nudes against someone’s will. I don’t see why the spread of fakes shouldn’t be outlawed similarly.
And I think if those companies can be identified as having made the offending image, they should be held liable. IMO, you shouldn’t be able to use a photo without the permission of the person in it.