
Okay then keep the AI in boxes and out of society.

Notice that people perked up and started caring about AI development primarily when these systems were introduced to the public? Could it be that the public has a legitimate stake in new technologies that are introduced to it?

"It seems like these days there are a hundred people dumping chemicals into rivers and millions more who are saying 'don't dump chemicals into the rivers.' We don't need em! If you aren't dumping chemicals into rivers then it's best not to shove yourself into the conversation at all."



As far as the public goes, their primary concern would be how LLMs could put a lot of them out of work in the artistic fields. AI doomsaying is primarily a hobby for nerdy scientists who believe too much in the singularity, not the average Joe.


I don't understand the relevance of this comment.


Your comment framed AI doomsaying as being a concern of the public, when it is pretty far from being something the public cares about.


Paxys' initial claim was "there are millions of people uninvolved in making AI who are commenting on AI and they should shut up."

Take it up with them if you disagree that there are millions of people who care. It's not relevant to my point, which is that you need not be directly involved in the development of a technology to be a stakeholder in its development, especially when it's being deployed to the public at massive scale and breakneck speed.


None of these think tank types represent society.

Let society decide what they want to do with AI, not a bunch of compromised hype chasers.


It sure is easy to disregard critique when one can pick from such a bottomless barrel of ad hominem.

Do I work for a think tank and didn't know it?


I wasn't talking about you. I was talking about the AI talking heads in the media today.


You don't represent society, certainly not more than the millions of people with ChatGPT subscriptions.


I didn’t claim to. Paxys' claim is that if you are not involved in the development of AI, you are not a stakeholder in its development. My claim is that as soon as something begins affecting the public (e.g. by mass deployment), the public becomes a stakeholder.

As a general rule, as soon as you start doing something that affects other people, they have a stake in what you’re doing. This is a basic, basic tenet of a society. It applies to listening to loud music, driving fast, and yes, developing AI.

Pretty straightforward.




