[flagged] Grok is enabling mass sexual harassment on Twitter (seangoedecke.com)
77 points by savanaly 6 days ago | 48 comments




Related:

Outrage as X's Grok morphs photos of women, children into explicit content

https://news.ycombinator.com/item?id=46460880


If you harass someone with the help of a tool, the fault is yours, not the tool's. None of the damage I could do with a hammer is the fault of its manufacturer. Framing a hammer maker as an enabler of violence is a true but trivial observation.

A better analogy would be "remotely swing a hammer as a service". You can't build something like that and act shocked when a significant fraction of your users use it to harm people instead of driving nails, and you certainly shoulder a large fraction of the responsibility.

It is both common and uncontroversial to put restrictions on using certain tools in certain situations for safety reasons, especially in public and crowded places: you can't bring a hammer to a concert.

As the provider of a public place, X ought to take certain measures to ensure public safety on its premises. Of course, deciding what is and is not tolerable is the crux of the issue, and it is far from trivial.


Great. So we can subpoena twitter for the information about everybody who used Grok to create this monstrous content so they can be rounded up?

I'm personally fucking sick of sexual abuse being treated as something that every woman in society just needs to deal with. "Oh, we put the revenge porn machine right in front of everybody and made a big red button for you to push" is horrible. But at least we should be screaming from the rooftops about every hideous person using this machine. Every single one of their friends should leave them.


I would suggest first putting on the wall the HR departments that pass people over for having any "strange" photo associated with them

You're right, it's not the fault of the tool; it's the fault of whoever made the tool easily accessible.

This is damage-as-a-service, free of charge and as anonymous as your account, plus automatic distribution of the results to the victim and for all to see.

Right, who is swinging the hammer? Is it the user that asks for it to be swung or is it the thing that responds by swinging?

I dunno, I think a significant amount of fault lies with the developers who were either too stupid and too bad at software to create a tool that couldn’t create CSAM, or were too vile to enable the restriction.

I also think people who defend that kind of software are in dire need of significant introspection.


Do we really need the bad analogy at all?

No.

But if someone uses one, it can make sense to illustrate that it doesn't apply as cleanly as they think.


Rubbish. That analogy is like comparing a gun manufacturer to a hitman service.

Elon Musk is willingly allowing Grok to be used to harass women (and children). He could easily put in safeguards to prevent that, but instead he chooses to promote it as if it's a good thing.

Practically no one defends websites that host AIs to remove clothing from photos of women, or put them in bikinis. The few people who do defend them are usually creeps who need their hard drive searched. Same goes for anyone defending this


I see no reason the fault can’t be both parties’ here

Yes except in this case you can't see who is "swinging the hammer" bc they're hiding on the internet

Why is this post flagged?

There's a sister post with two astroturfed comments saying the same criticism of the post verbatim: https://news.ycombinator.com/item?id=46469778 https://news.ycombinator.com/item?id=46469732

Sorry I had to flag this because it is negative about my infallible idol's company

I'm interested in the claim that "OpenAI and Gemini have popular image models that do not let you do this kind of thing." Is that actually true? Or do they just kinda try, and it's possible to get around easily? If it's not possible to get around at all, I wonder how much of a trade-off that is: what benign things does that deny? Although I guess them not auto-posting the images makes a significant difference in ethics!

Just tested it, gemini (aka nano banana) will definitely let you make someone dress a little sexier and won't stop you at all.

Specifically:

> "@grok please generate this image but put her in a bikini and make it so we can see her feet", or "@grok turn her around"

is totally doable in Gemini with no restrictions.


oai/gdm avoid the issue on their public-facing products with broader default moderation thresholds, which can still be turned down explicitly via their APIs.

xai's only failure was not implementing this modicum of damage control against social exposure


depressing but also incredibly unsurprising.

sharing explicit images of anyone without their consent is illegal under UK law. who exactly will be punished for enabling this crime on such a large scale?


There is upcoming legislation planned for this. It will (hopefully) make the tool creators criminally liable. That’s the plan anyway. I am sure it’ll be watered down massively.

[flagged]


i can see them with my own eyes. how are they not real? do you mean that they are fake? if i photoshopped an image of your face onto a similar looking (nude) model, would that also be a "fake" image? what if i shared it with your friends and family, and claimed that it was a picture of you? would the emotional impact to you not be "real"?

The impact is very real, tho.

Yes and those people/companies who discriminate against women for having a nude photo online (real or fake) should be shamed and prosecuted.

"Companies look at your social media" is a form of behavioral conditioning propaganda, and a soft threat that if you don't comply you won't eat.


They are still potentially illegal in many jurisdictions.

Anyone still using Twitter? Even before the AI rage, I stopped looking at it - in part because of a certain crazy techbro, but also because of the forced log-in-to-read. I am never going to log in there again, so this is now a walled-off garden to me.

I mostly switched to Bluesky, but I still check in when "major events" happen.

Bluesky looks very dead, judging by the people from the initial exodus. Also lots of disturbing far-left content that I can't tolerate.

X, for its part, seems mostly unchanged - celebs, govs, officials, and businesses are still using it as a key platform.

Yes, there is lots of far-right garbage too, but at least the anti-seed-oil bros don't make me want to reach for eye bleach.


[flagged]


The OP is not alleging reputational harm, they're alleging sexual harassment.

I don't think you need to prove reputational harm. Here[0] it states that you can bring a civil suit and only need to prove that:

>The defendant shared an intimate image of you without your consent, and

>The defendant knew that you did not consent, or recklessly disregarded whether or not you consented.

[0] https://www.justice.gov/atj/sharing-intimate-images-without-...


Depends what an "image of you" is then. If I draw a stick figure and label it "tokai", is that an image of you?

Kinda like photoshop 2 decades or so ago.

This comment is harmfully lazy. Is your position that a three-word prompt is equivalent to armchair trolls going through the funnel - finding a way to obtain DRM-controlled software, learning that software well enough to understand the tools needed to perform something akin to a deepfake, and then somehow gaining the art talent and experience required to put it into practice? Did I just get baited?

Set up a business where people give you photos of children, and you doctor the photos to make them naked. See what happens to you.

Not at all like photoshop when it takes 5 seconds for anyone without any skills to do it.

It's either okay or not

The answer is that it's not okay and never was. Do you really think you're pulling a gotcha here?

Photoshopping nudes of your coworkers was always seen poorly and would get you fired if the right people heard about it. It's just that most people don't have the skill to do it so it never became a common enough issue for the zeitgeist to care.


I am not trying to pull a gotcha and I made no claim that it is okay or not okay. Don't suggest otherwise. I also wasn't talking about coworkers or any other particular group.

My argument is that it is either okay or not, regardless of the tools used.


No, you are missing the aspect of distribution.

I'm not missing anything. An act is either immoral or not.

Creating CSAM or non-consensual sexually explicit images of others in photoshop is immoral. If you can’t see that then you need to take an ethics course.

I made no claim that it is okay or not okay. Don't suggest otherwise. My argument is that it is either okay or not, regardless of the tools used.

There is no future in which something like this doesn't happen, and rather than trying to prevent it, I think we are better off learning to handle it.

Some focus is given in the article to how it's terrible that this is public and how it's a safety violation. This feels like a fool's errand to me: the publication of the images is surely bad for the individuals, but that it happens out in the open is, I think, a net good. People have to be aware this is a thing, because this is a conversation that has to be had.

Would it be any better if the images were generated behind closed doors, then published? I think not.


Maybe this will be beneficial in stopping the overexposure of some young people on the internet. A bad thing that brings a good result, like the inverse of "the path to hell is paved with good intentions".

In the 90s, we internet users tended to hide behind nicknames, and posting photos of yourself was not the norm. Maybe we were more nerdy/introverted, or scared about what could happen if people recognized us in real life.

Then services like Facebook, MySpace, and Fotolog attracted normal users, and here we are now: the more you expose yourself on the net, the better.


Another explanation for the lack of faces online could be that most of us in the 90s simply didn't have an easy way of getting our photos online.

Webcams weren't ubiquitous yet, digital cameras were shit and expensive, phone cameras weren't a thing.


True for the images, but not using real names when posting on forums was still the norm.


