
But that's always been the case? Since we basically discovered... Fire? Tools?


Yes but the narrative tries to make it about the tools.

"AI is going to take all the jobs".

Instead of:

"Rich guys will try to delete a bunch of jobs using AI in order to get even more rich".


I thought anyone aware of the current AI landscape sees those two statements as the same.


One implies "we should regulate AI" and the other implies "we should regulate the wealthy"


Should we regulate guns or dangerous people using them?


Yes, this shouldn't be controversial


Why not both?


Both of them


Well, it tells you whose narrative it is, if nothing else.


Those are examples that are discussed in the article, yes.


The difference in my mind is scale, reach, and time. Fire, tools, and war are localized. AGI could have global, instant, and complete control.


Can you lay out a way that could happen?

Say the AI is in a Google research data centre. What can it do if countries cut off their internet connections at national borders? What can it do if people shut off their computers and phones? Instant and complete control over what, specifically? And what can it do about strong encryption? If TLS 1.3 can't be easily broken, only brute-forced given enough time, what does it do about that?

And why would it want complete control? It's effectively an alien: it doesn't have the built-in human drive to gain power over others, because it didn't evolve in a dog-eat-dog environment. Superman doesn't worry, because nothing can harm Superman; likewise, an AI never evolved watching things die and fearing its own death.



