
The moment a properly self-improving AI (one that doesn't hit some logistic upper bound on performance) is released, the economy breaks.

The AI, having in theory the capacity to do anything better than everyone else, will not need support (in resources or otherwise) from any other business, except perhaps once, to kickstart its exponential growth. If it's guarded, every other company instantly becomes worthless in the long term; if it isn't, anyone with a bootstrap level of compute will eventually be able to do anything at all, given a long enough time frame.

It's not a race for ROI; it's a race to have your name go down in the books as one of the people who first obsoleted the relationship between effort, willpower, intelligence, etc., and the ability to bring arbitrary change to the world.



The machine god would still need resources provided by humans, on their terms, to run. The AI wouldn't sweat spending, say, 5 years of its immortality working out a 10-year plan that eventually lets it run on 5% less power than it does now, but humans may not be willing to foot the bill for that.
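
To make the mismatch concrete, here's a toy break-even calculation in Python. All numbers are illustrative assumptions, including treating the 5 planning years as costing the equivalent of a full power bill each:

    # Toy break-even arithmetic; hypothetical numbers throughout.
    annual_power_cost = 1.0   # normalize the yearly power bill to 1 unit
    planning_years = 5        # compute spent purely on finding the plan
    savings_rate = 0.05       # 5% of the bill saved each year afterwards

    planning_cost = planning_years * annual_power_cost    # 5.0 units sunk
    annual_savings = savings_rate * annual_power_cost     # 0.05 units/year
    breakeven_years = planning_cost / annual_savings      # 100 years

    # The 10-year rollout of the plan only pushes payback further out.
    print(f"savings repay the planning cost after {breakeven_years:.0f} years")

A century-scale payback is a fine trade for an immortal agent and a nonstarter for whoever pays the power bill.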

There’s no guarantee that the singularity makes economic sense for humans.


Presuming the kind of runaway superintelligence people usually discuss, the sort with agency, this just reduces to the AI-boxing problem.

Are we /confident/ a machine god with `curl` can't gain its own resilient foothold in the world?



