It has been a decade or so since I read The Singularity is Near. From what I remember, Kurzweil argued that technologies start improving exponentially once they become linked to the exponential growth in computing power.
Now that Moore's Law has leveled off, I do wonder what will keep the exponential increase going. For example, LLMs are increasing exponentially in size each generation, but that also involves an exponential increase in cost, and will only be sustainable up to a certain point.
> but that also involves an exponential increase in cost
Not necessarily. For training, yes. But there will surely be custom hardware specialized for inference of "1-bit" quantized models, which could be orders of magnitude more energy-efficient than GPUs.
Hell, you could probably even bake trained models directly into silicon. Maybe instead of digital logic, use analog circuits. Photonics?
I believe there's so much room for improvement and I feel that GPUs are just the "training wheels" (lol, quite literally) — there will be exponentially more effective hardware for running LLMs.
Even ASICs will, IMO, only give us one step-function jump: there will be a huge jump and then it's mostly back to Moore's-law-ish growth. Maybe the gradient will be a bit steeper, but I doubt it.
If blockchain is any indication, ASICs give two jumps (2 orders of magnitude) over GPUs. But yeah, then further jumps will have to be via parallelism; can we put 100s or 1000s of these ASICs in a single computer?
In Kurzweil's version, Moore's law is not a metric of transistor capacity per dollar, but rather an exponential growth in the amount of information that humans can leverage. He claims this has been going on for thousands of years and continues (quite vertically) now.
This ex-OpenAI engineer’s recent essay makes the case that LLMs are progressing at 5x the rate of Moore’s Law across a number of dimensions (compute, efficiency, and capability): https://situational-awareness.ai/from-gpt-4-to-agi/
It's surprising that compute/$ has increased in a fairly smooth exponential way for more than a century, long before Moore. I guess if the financial incentives are there, people find a way.
I just downloaded it, which was a bit of a job as it said it's not yet available in the UK/US, but Kobo with the VPN set to Singapore worked. It seems quite jolly. Maybe a bit more philosophical than the last one.
Every day that passes brings the singularity one day closer. However ... there is no way that the singularity as promised will occur within our lifetimes.
It won't hurt you to dream though. It might keep you off the streets.
My recollection is he always says it happens around 2045, so some of us may live to see it. I'm not sure he defines what it is very clearly, though.
I've got a personal definition involving the output you can get per hour of human labour. When robots can do it all with zero human labour, that ratio goes infinite, so you get a proper mathematical singularity.
His original prediction of 2029 (or say 2039, or whatever near term) doesn't seem impossible anymore. I'd rate the prediction as impressive, given that it was made well before the current AI boom.
Note also that Gibson semi-predicted 2035 [0] and Ghost in the Shell also predicted 2029 [1] (maybe basing their number off of Kurzweil, I don't know).
I think there are many ways to get at a concrete prediction of 2029, or thereabouts, by noticing a halving of cost for the same compute and/or storage every 1.5 to 3 years. This is a sort of generalized Moore's law or, maybe, just Wright's law [2]. Run this calculation for 2.3 PB of storage (the estimated storage capacity of the human brain) at under $5k and you land within range of 2030, give or take a decade.
$5k is my arbitrary cutoff price point as this puts it in the "home computer" category and is, in my opinion, the price point that set off the home computer revolution.
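To make that concrete, here's a rough sketch of the extrapolation. The halving periods and the 2.3 PB / $5k targets come from the comment above; the current $/TB baseline and 2024 start year are my own assumptions, so treat the output as illustrative only.

```python
import math

# Rough extrapolation: when does ~2.3 PB of storage fall under the $5k
# "home computer" price point, if cost per byte halves every 1.5 to 3 years?
# The $/TB baseline is an assumed figure, not one from the comment above.

TARGET_TB = 2.3 * 1000     # ~2.3 PB, the brain-scale storage estimate
BUDGET_USD = 5_000         # arbitrary "home computer" cutoff
COST_PER_TB_2024 = 15.0    # assumed 2024 bulk HDD price in $/TB

cost_today = TARGET_TB * COST_PER_TB_2024             # ~$34,500 at these numbers
halvings_needed = math.log2(cost_today / BUDGET_USD)  # ~2.8 halvings

for halving_period in (1.5, 2.0, 3.0):
    year = 2024 + halvings_needed * halving_period
    print(f"halving every {halving_period} years -> affordable around {year:.0f}")
# Prints roughly 2028, 2030 and 2032, i.e. "2030, give or take".
```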
Go read the science a bit better maybe. Nobody's predicting the extinction of humanity or even sophisticated civilization through human-caused climate change in the next century or any remotely prediction-tolerant time span. This kind of nihilistic fatalism is not only disconnected from any scientific evidence or plausible scenarios, it also pushes away anyone on the fence for an assortment of reasons.
He does quite a lot of dates and some are pretty accurate. Computers winning at chess he predicted one year later than it actually happened.
The Turing test he predicted for 2029, and while it's not very clearly defined, I'd say that is happening about now.
Predicting those things is kind of easy if you have the background in the technology that he does. But the singularity? How can you compare that to computers winning chess?
I just looked at what he says in the new book and he has:
>Eventually nanotechnology will enable these trends to culminate in directly expanding our brains with layers of virtual neurons in the cloud. In this way we will merge with AI and augment ourselves with millions of times the computational power that our biology gave us. This will expand our intelligence and consciousness so profoundly that it’s difficult to comprehend. This event is what I mean by the Singularity.
Seems kind of wacky to me. I've always believed in Moore's law type stuff, but not the nanobots, which have long been hypothesised but remain pretty much nonexistent.
Near came out in 2005 and Nearer in 2024 so if we assume a halving of new edition times we should have Nearerer in 2033 and the Nearererererererere... explosion by 2043.
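(The arithmetic behind that date, using only the numbers above: the 2005-to-2024 gap is 19 years, so with halving gaps the edition dates converge to 2024 + 19/2 + 19/4 + 19/8 + ... = 2024 + 19 = 2043.)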
Kurzweil's vision of the technological singularity may be more distant than his optimistic predictions suggest.
The concept of a singularity - a point of rapid, transformative change beyond which predictions become unreliable - is inherently challenging to analyze. However, examining historical examples of singularity-like events can provide valuable insights.
- The Big Bang: Often regarded as the ultimate singularity, this event marked the birth of our universe and the beginning of space, time, and matter as we understand them.
- The emergence of organic life: The transition from inorganic matter to self-replicating, complex organisms represents another fundamental shift in the nature of our world.
- The evolution of human consciousness: The development of our species, with its unprecedented cognitive abilities and capacity for abstract thought, can be seen as a singularity in the progression of life on Earth.
These examples share a common characteristic: while their impacts are immeasurably significant, the full realization of their consequences unfolds over vast periods of time. The Big Bang occurred approximately 13.8 billion years ago, yet the universe continues to evolve.
Life on Earth emerged roughly 3.5 billion years ago, with complex multicellular organisms appearing much later.
Human beings have existed for a mere fraction of that time, yet our impact on the planet is still unfolding.
Drawing parallels to these historical singularities, it's reasonable to infer that even if a technological singularity were to occur within the next few decades, its effects would likely manifest gradually rather than instantaneously.
The idea that such an event would fundamentally alter every aspect of our existence "in the blink of an eye" may be an oversimplification.
Consider, for example, the Industrial Revolution. While it dramatically transformed society, economy, and technology, these changes occurred over decades and centuries, not overnight.
Similarly, the ongoing Digital Revolution has been reshaping our world for several decades, with its full impact still unfolding.
2^9 lifetimes ago humans expand through Africa
2^8 lifetimes ago hunting revolution
2^7 lifetimes ago neolithic revolution
2^6 lifetimes ago first civilizations
2^5 lifetimes ago ancient Greece
2^4 lifetimes ago book printing
2^3 lifetimes ago renaissance
2^2 lifetimes ago enlightenment
2^1 lifetimes ago industrial revolution
2^0 lifetimes ago first digital computer
1/2 lifetimes ago personal computers
1/4 lifetimes ago personal computers with GHz processors and GPUs
1/8 lifetimes ago AI spring
1/16 lifetimes ago transformers
1/32 lifetimes ago GPT-3
1/64 lifetimes ago GPT-4
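For concreteness, a small sketch converting those lifetimes to years, assuming one lifetime is roughly 80 years (my figure; the list above doesn't state one):

```python
# Convert the "lifetimes ago" framing to years, assuming a lifetime of ~80 years.
# The 80-year figure is an assumption, not something stated in the list above.

LIFETIME_YEARS = 80

for exponent in (9, 7, 5, 3, 0, -5, -6):
    lifetimes = 2 ** exponent
    print(f"2^{exponent} lifetimes ≈ {lifetimes * LIFETIME_YEARS:,.1f} years ago")
# 2^9  -> ~41,000 years (expansion through Africa)
# 2^7  -> ~10,000 years (neolithic revolution)
# 2^0  -> ~80 years (first digital computer)
# 2^-6 -> ~1.25 years (GPT-4, as of when the list was written)
```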
What stands out to me in your recounting is: the duration between change and full impact keeps getting smaller.
Broad strokes:
Consciousness -> 100,000 years
Civilization -> 10,000 years
Industrial Revolution -> 100 years
Digital Revolution -> 50 years
AI Revolution -> 10 years
Singularity -> 1 year
Kurzweil’s main point that I recall from his book is that the interval between changes keeps shrinking and eventually approaches zero, so that change happens so fast there is no non-change normal.
There was no big bang. It's just a local recycling operation being confused with something much grander due to human ego (the same ego that dreamed up gods to amplify human importance). The universe is trillions of times larger than we currently think it is, humanity is comically misjudging the macro scale (just as it comically misjudged the micro scale previously).