> Foundation-wise, observed improvements are incremental, not exponential.
Incremental gains are fine. I suspect the capability of models scales roughly as the logarithm of their training effort.
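A toy way to see what that claim implies (the base-2 log and the `capability` function here are arbitrary choices for the sketch, not anyone's actual scaling law): if capability grows logarithmically in compute, then each fixed capability gain costs a constant *multiplier* on compute.

```python
import math

# Sketch of "capability scales as the log of training effort".
# Base 2 is an arbitrary illustrative choice.
def capability(compute):
    return math.log2(compute)

for compute in [1, 2, 4, 8, 16]:
    print(compute, capability(compute))
# Each doubling of compute adds the same constant +1 to capability,
# i.e., linear capability gains require exponential compute gains.
```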
> (read: drinking water and energy)
Water is not much of a concern in most of the world. And you can cool without using water, if you need to. (And it doesn't have to be drinking water anyway.)
Yes, energy is a limiting factor. But the big sink is in training. And we are still getting more energy efficient. At least to reach any given capability level; of course in total we will be spending more and more energy to reach ever higher levels.
Incremental gains in output seem - so far - to require exponential gains in input. That is not fine.
Water is a concern in huge parts of the world, as is energy consumption.
And if the big sink is “just” in training, why is there so much money being thrown at inference capacity?
I thought it was mad when I read that Bitcoin uses more energy than the country of Austria, but learning that AI inference uses more energy than all the homes in the USA is so, so, so much worse, given that the quality of the outputs is so mediocre.