It's been trivial to make images, nude or otherwise, of anyone, provided you have a couple of photos of them, since roughly August or September of 2022. I know because I made some funny images (for a leaving party) of one of my friends (e.g. friend as Luke Skywalker, etc.), and this was with open source software running on a GPU that was a few years old.
The fact is, this particular cat is out of the bag and will never be going back inside. It's only getting easier, and more devices (including phones) have the graphics capability required to do this.
I think we just have to accept that, as has always happened, the world has evolved and there are new ways to hurt people. When I was a teen and the internet first appeared, people could get bullied via ICQ and MSN, and also got the shit kicked out of them in person too. Now kids can get bullied on Insta and have porn made of them too.
However, in my final year of high school, someone printed out a bunch of images of one of our teachers' heads photoshopped onto a porn star's body. It wasn't a perfect fit, but I'm sure the poor teacher didn't care about it not being "perfect."
With content-aware fill and other more advanced tools, it's been possible to Photoshop porn-a-likes which are pretty damn good for a long time, and I don't think the new tech really changes anything. Human nature stays the same, and that's what we should be trying to work on, through education, child care, etc. "Tools" for abuse or bullying will always exist and keep evolving, and trying to play whack-a-mole with them will never work, rather like trying to ban booze or weed. Trying to create a more caring society, improving schools, and budgeting for teachers and teacher training, social workers, etc., is the way forward here.
But, sadly and inevitably, we will go with the easy option, try to legislate our way out of this, which won't work at all, and the cycle of bullshit will continue.
Claiming "it's an extremely realistic deepfake" is becoming the all-in-one excuse when a compromising photo/video of you is leaked online. Not specifically related to this case, but something similar has already happened at least once before: https://www.theguardian.com/technology/article/2024/may/11/s...
What I would give to find out what public opinion of these laws will be like in 20 years. I can't tell if it'll be like seatbelt laws, which the public ultimately embraced, or maybe more like cannabis, where fewer and fewer people care about it from a moral standpoint.
>Angela Tipton was disgusted when she heard that her students were circulating a lewd image around their middle school. What made it far worse was seeing that the picture had her face on someone else’s naked body.
We did the same in 1998 with Photoshop ... nothing new under the sun. Yet another moral panic.
I agree. Of course it's a horrible thing, and AI maybe makes it easier or more realistic, but it's not new.
It must be fairly shocking to have it happen to you, but surely you just point out that the images are AI fakes. So the real damage is the fact that you're being bullied in this way, not the 'nude' images themselves.
And if you want to try to draw something positive from this: it's also now possible to deny any real nude images by claiming they are AI fakes.
I wonder whether I lack empathy. Sometimes I think "oh, just another shitstorm where someone tries to weaponise people's outrage," and I wonder how often that's an appropriate reaction.
EDIT: For example, people saying "AI bad!" and pointing at bad effects, not mentioning whether the same thing was done before AI, but sort of leaving the impression that it's new, relying on people's empathy to override their memories.