- cross-posted to:
- technology@lemmy.world
If they’re going to go this way, I don’t think it should be limited to just porn. There are plenty of ways you could ruin someone’s life without a deepfake being sexually explicit.
There are already a lot of laws covering that. This one covers an additional angle, where people create a deepfake without provably publishing it; the intent is that showing it to friends and verbally threatening to “leak” it should be easier to prosecute.
If you create a deepfake and share it, you’re slapped with two crimes.
the intent is that showing it to friends and verbally threatening to “leak” it should be easier to prosecute.
That’s blackmail, which is already illegal.
Using a mobile phone while driving has always been illegal if you could argue that it was dangerous driving or driving without due care and attention. They then made a law specifically saying that using a mobile phone without hands-free is illegal anyway. This makes it easier to prosecute because you don’t need to argue that the person was driving dangerously or without due care.
I imagine that this law has the same intent of making this specific act illegal to prevent them having to argue that it fits another crime.
Yeah, the way people can recreate someone “in need of assistance” to trick family or associates is really scary, especially for people who aren’t exactly tech-savvy. That seems to me a worse crime than an explicit video that is pretty obviously doctored.
I have a hard time accepting this as a crime. What if the illustration is hand-drawn, or clothed but still sexual in character? Is caricature illegal, by this standard?
You’d better not have a particularly vivid imagination or else you’ll be prosecuted for daydreaming.
Yea, this is a funny thing to think about.
You can jerk off to photos of people, you can imagine some wild things involving other people etc.
If you just create some deepfake porn to masturbate by yourself to, I don’t see a big problem with that.
The only issue I can see is that, through negligence, someone else sees it, or even hears about it.
The problem starts with sharing. Like it would be sexual harassment to tell people who you are masturbating to, especially sharing with the “actors” of your fantasies.
There is however another way this could go:
Everyone can now share nudes with way less risk.
If anyone leaks them, just go: “That’s a deepfake. This is obviously a targeted attack by the heathens of the internet, who are jealous of my pure and upstanding nature. For me only married missionary with lights out.”
There’s a big difference between a deep fake and a caricature.
Yeah, but only one of degree.
How so?
It’s making an image of someone that portrays them in an unrealistic and offensive context.
So if I use AI to make pornography of 50 men gang banging you, you will consider that to be on the same level as going to a carnival and getting a caricature done?
The difference is so big, it easily becomes qualitative.
Ooohh, can’t wait to see us waste billions of dollars deliberating what is acceptable just like with copyright law.
This is another law that only exists to protect rich people. Poor people can’t afford a lawyer and don’t have time to show up in court.
You seriously can’t see why deep fakes are a serious problem to everyone?
This law won’t protect just the rich.
Imagine the chaos as some idiot teen creates a deep fake of some other teen in a compromising position.
Go talk to an attorney and see what they have to say about it.
Is caricature illegal, by this standard?
No.
The official government announcement is linked in the article btw.
I wonder what happens when it just accidentally looks like someone but was intended to be a fictional person. Also, how much can you base it on a real person before it’s considered a deep fake of that person? Would race-swapping be enough to make it a “new” person so it’s not illegal anymore? My intuition is that just eye colour or something wouldn’t be enough, but it’s a sliding scale where the line must be drawn somewhere even if it’s a fuzzy line.
What about an AI generated mashup of two people like those “what the child would look like” pictures back in the day. Does that violate both people or neither?
What about depicting a person older than they are now? That’s technically not somebody that exists, but might in the future.
What if you use AI but make it look like it’s hand-drawn or a cartoon?
What if you use AI to create sexual voice clips of a real person but use images that don’t look like them or no image at all?
There are just so many possibilities and questions that I feel it might be impossible to legislate in a way that isn’t always 10 steps behind or has a million unforeseen consequences.
There’s already laws against using someone’s likeness for commercial purposes without their consent, I’m guessing this will require the same fuzzy cutoff and basically just be up to the jury to decide or the judge to dismiss.
Well, let’s find out. Please give me 20 sample photos of you, 30 minutes of audio and 10 of video.
I’m going to have you get gangbanged by 100 German men and upload it to xvideos.
Now, that is probably something you deserve to consent to, isn’t it?
This is why we should be making laws around likeness rights. If you damage somebody by publicly using their name to spread falsehoods, that’s defamation or libel. But if you produce an image or video of their likeness instead of using their name, there’s no legal recourse. That makes no sense in this day and age.
Who decides how similar somebody is “allowed” to look to another? There are people who bear an uncanny resemblance to others. And what of identical twins? Can one sue the other if they do porn?
The courts, probably. That’s what they are for.
“Without consent.” I’m very curious who would consent to having deepfake porn made of themselves.
I can imagine a non-zero amount of people would consent to a deep-fake porn video of themselves having sex with some generic hot woman, just as one example.
That makes sense, I hadn’t thought of that sort of deepfake.
Better make sure the generic hot woman doesn’t resemble anyone real though.
Could be very lucrative if you are already in porn and want to make some money from your likeness. This guy’s gonna pay me $500 to make a video and I don’t even have to do anything?
Could also be very good for porn stars who have “aged out” but can still make videos using their younger bodies as weird as that may be.
A user shared a story a while back about his wife and her sister giving photos and agreeing to it. Lots of kinky people out there.
seems like the only way to deal with it… make it equivalent to sexual assault…
The only way to deal with it is to let so much of it flood the digital world that nobody cares anymore because there’s a deepfake porno of everyone.
This is a waste of money to ensure rich people don’t get porn made of them by poor people.
Poor people won’t be able to afford lawyers and aren’t able to take time off to show up in court.
you have a cracked view of jurisprudence… maybe you look at too much deepfake porn…
I prefer real porn.
A naked picture of me simply existing is not equivalent to sexual assault. If you want to make it illegal then treat it as its own thing.
“Deepfake pornography is a growing cause of gender-based harassment online and is increasingly used to target, silence and intimidate women — both on and offline,” Meta Oversight Board Co-Chair Helle Thorning-Schmidt said in a statement.
*considers*
I think that there’s an argument for taking the opposite position. If someone could make deepfake porn trivially and it were just all over the place, nobody would care about it; one knows that it’s fake.
In fact, it’d kind of make leaked actual pornography no-impact as a side effect, unless there were a way to distinguish deepfakes from real footage. And that’s a harder issue to resolve. I was reading a discussion yesterday about sextortion on here, talking about how technically difficult it would be to keep someone from recording sex video chats, and how there’d always be an analog hole at least. But there is another route to solve that, which is simply to make such a video valueless because there’s a flood of generated video.
Deep fake recognition is already available. And, while what you predict sounds logical, these criminals prey on emotions. I feel that a lot of innocent people will be victimized even if deep fake porn becomes common.
So what if the deepfake is consensual?
Seems like a waste of resources that could better be spent on other things.
Government grasping at straws. More news at 8.
deleted by creator