> closed circuit cooling systems don't evaporate any fucking water.
Nobody uses closed circuit cooling systems anymore. Pretty much every datacenter is cooled by evaporative (open loop) cooling.
The enthalpy of vaporization of water is quite high, so even a relatively large heat load only requires evaporating a fairly small amount of water. A single household water supply (10 GPM), on full blast, could cool about 1.5 megawatts of computers - a small datacenter.
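A rough sanity check of that figure, as a back-of-envelope sketch (the 10 GPM flow rate and the latent-heat value are illustrative assumptions, not measured data):

```python
# Back-of-envelope: how much compute heat could a 10 GPM water supply
# absorb if all of it were evaporated? Assumed values, not measurements.

LATENT_HEAT_KJ_PER_KG = 2257        # enthalpy of vaporization of water
GALLON_LITRES = 3.785               # 1 US gallon; ~1 kg of water per litre

flow_kg_per_s = 10 * GALLON_LITRES / 60          # 10 GPM -> ~0.63 kg/s
heat_rejected_kw = flow_kg_per_s * LATENT_HEAT_KJ_PER_KG

print(f"{heat_rejected_kw / 1000:.2f} MW")       # ~1.4 MW, close to the 1.5 MW claim
```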
Thanks to most of the world being "GPU poor", there is a lot of research and engineering effort going into making models much more compute efficient. Another way that OpenAI gets to benefit from the world of open source/weight models.
I think there are still _a lot_ of use cases that are currently prohibitively expensive for which increased efficiency will immediately induce matching demand.
Sure, but we know it’s possible in practice, since we each carry one of these around with us and it runs on almost no energy, so it at least gives us the inspiration to try to get a little closer. There is a very large gap…
Maybe. But there is no maybe about the mess being left for the future.
If future people are forfeit as we refuse to sacrifice today, why preserve people today? I say bring chaos now so the mess makers deal with their externalities for a change instead of waving them off to make “line go up”.
What you are saying amounts to people never learning anything new or different.
There is a lot that only humans can do, including things that a sentence predictor presented as AI can't imagine. It's getting better at reasoning for sure, but net new, novel stuff is, I think, still mostly our domain.
It's true, though, that GPT may push a lot of people to have to change and grow, and it might be hardest for the people who got away with BSing. I've never had that luxury and have always had to deliver, so I guess it feels a little less frightening.
Either way, I figure it's easier to learn which way the currents in this realm are flowing, so as to recognize them better, than to sit on the side of the pool looking at it with disdain.
We stop growing not only when we stop learning, but when we stop creating.
> We stop growing not only when we stop learning, but when we stop creating.
This is generic enough to leave open to debate what we should create.
It waves off the impact that creating has on the future.
It’s cute and poetic, but as usual it ignores externalities, since the economic models we are taught inherently rely on ignoring externalities. You are refusing to engage in that discussion because we don’t socially normalize an obligation to do so.
Why must we create what we currently create?
Can you not create new skills and awareness any other way? Are you so unimaginative that you must simulate the galactic differential manifold inside a machine because you cannot imagine it and draw it for yourself? At the cost of what for the species over the next decades, for us?
And I say “for us” because you cannot guarantee there are Lindy effects, measured in centuries, from the outputs of this work. It’s not unreasonable to find all this high-minded talk nearly equivalent to hearsay and religious belief in the potential for forever growth and expansion.
It’s not unreasonable to assume this is just conviction in economic memes you memorized, wrapped in overly reductive poetry to obfuscate.
We didn’t need to build ChatGPT for that. The physical evidence, and the awareness that resources are finite and humans require resources, has been in front of human eyeballs for centuries.
You’re just a contemporary worker bee following orders. The economy is built on academic BS: the older, less numerate financier generation had no ability to falsify that which they had no education in, so they handed their pensions and post-world-war welfare-job money to their kids, who made this up and think the future just has to keep honoring decades-old contracts while ignoring that they don’t honor their own elders’ past.
Entropy will attenuate the past. Our achievements are meaningless to the next centuries as physics will force them to be rebuilt.
The CO2 footprint of training GPT-3, 502 tons (est.), is approximately the carbon output of the air travel industry every 5 seconds. Anyone who writes about the carbon footprint of machine learning is being paid to mislead you.
>However, ChatGPT consumes a lot of energy in the process, up to 25 times more than a Google search.
That headline is misleading (and should probably be updated): 25x more energy than a Google search, not 25x more than the company. Anyone who knows anything about the two things could tell you that - one is running inference, the other is a database lookup. In fact, I'd be surprised if it's only 25x in that case.
Not really? Google's sustainability report gives datacenter energy usage and PUE, and you can approximately apportion energy usage to each site based on its size. David Patterson and his team have published a number of papers and talks with precise figures for various ML tasks, such as "Carbon Emissions and Large Neural Network Training", "The carbon footprint of machine learning training will plateau, then shrink", and "GLaM: Efficient Scaling of Language Models with Mixture-of-Experts".
AI consumes energy, but it also gives some back by helping find answers quickly that might otherwise take multiple searches, page loads, and ad auctions. It can also save time.
Overall, it's the source of energy that matters, not the consumption.
It seems extremely unlikely that the total energy cost of GPT models is dominated by training. The training of GPT-3 was estimated at 1,287 MWh, which is less than 1 day of consumption for a medium-sized cloud datacenter.
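For context, a minimal sketch of that comparison; the 60 MW facility size is an assumed, illustrative figure for a "medium-sized" datacenter, not a sourced number:

```python
# Compare GPT-3's estimated training energy to the daily draw of a
# hypothetical medium-sized cloud datacenter (60 MW is an assumption).

training_energy_mwh = 1287                 # widely cited GPT-3 training estimate
datacenter_power_mw = 60                   # assumed facility power draw
daily_energy_mwh = datacenter_power_mw * 24

print(training_energy_mwh / daily_energy_mwh)   # ~0.9 -> a bit less than one day
```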
We shouldn't flag this. It's terrible. It's innumerate. But it is what's being written, and it probably is a good idea to see these things to see how people will play the game.
Note some interesting things about this article. Lack of sources, obviously, but the byline itself is sus.
What's even scarier is that people on HN actually accept this uncritically. We're better than that! This is a perfect article to train your media literacy and critical thinking skills on.
Just so I’m clear, we shouldn’t flag it so that we read garbage and recognize it for garbage?
No thanks, I flagged it.
It was a stupid article when someone posted it on the GPT-4o story (and I said as much in the comments then) and it’s a stupid article now that has no business being on HN.
HN regularly flags trash that climbs up the ranks, the argument that we shouldn’t do that so that people can train their “media literacy” is just absurd. Go read The Sun or any number of bad publications if you want to train your media literacy.
> ChatGPT consumes a lot of energy in the process, up to 25 times more "than a" Google "search".
The website removes three key words from the title versus what's in their actual article. And "25 times more than a Google search" is incredibly vague - it's not clear how they came to that number.
> Additionally, a lot of water is also used in cooling for the servers that run all that software. Per conversation of about 20 to 50 queries, half a litre of water evaporates – a small bottle, in other words.
What? I have no idea about server farm cooling, can anyone explain this to me?
Alternatively, you can cycle closed-loop water through a chiller to a hot server and back, or just cool the air around the server (CRACs and CRAHs) using either water or refrigerant as the heat-carrying material.
For datacentres that require air conditioning as opposed to natural ventilation (most of them), a very popular approach is to use evaporative cooling towers [1] in combination with W2W chiller units [2]. The chillers cool the internal water circuit and heat the external water circuit; the excess heat is dumped to the environment by evaporating water in the cooling towers.
Of course it's possible to use air-cooled equipment and this is more common in cooler climates or smaller data centres, so it's not a rule of nature that cooling servers wastes water but it's certainly a very common outcome.
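To get a feel for the scale, here is a quick idealized estimate of the water an evaporative tower consumes per unit of heat rejected (it assumes all heat leaves as latent heat and ignores blowdown and drift, so the real figure is somewhat higher):

```python
# Idealized water use of an evaporative cooling tower per kWh of heat rejected.
# Assumes every joule leaves as latent heat; real towers also lose water to
# blowdown and drift, so treat this as a lower bound.

LATENT_HEAT_KJ_PER_KG = 2257     # enthalpy of vaporization of water
KJ_PER_KWH = 3600

litres_per_kwh = KJ_PER_KWH / LATENT_HEAT_KJ_PER_KG   # ~1.6 L per kWh
print(f"~{litres_per_kwh:.1f} L of water evaporated per kWh of heat rejected")
```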
I really don't think that is how that works? Water cooling in servers works the same as in desktops, just with way bigger radiators. Maybe they are tapping into some other available way of cooling.
> Water cooling in servers works the same as in desktops, just with way bigger radiators.
Data centers have separate systems to remove heat from the entire datacenter. These are often evaporative coolers, which means the water is evaporated away.
A better analogy would be the HVAC system for your house. Your computer dumps heat into the house, the HVAC system removes the heat from the house to the environment. It's the latter part that uses evaporative cooling in many data centers.
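Taking the article's half-litre-per-conversation figure at face value, here's a quick sanity check of what it implies, assuming all of that water is evaporated to carry away server heat (an idealization, not the article's stated method):

```python
# What does "0.5 L evaporated per 20-50 queries" imply per query,
# if all of that water removes heat as latent heat of vaporization?

LATENT_HEAT_KJ_PER_KG = 2257     # enthalpy of vaporization of water
KJ_PER_KWH = 3600

heat_kwh = 0.5 * LATENT_HEAT_KJ_PER_KG / KJ_PER_KWH   # ~0.31 kWh per conversation
for queries in (20, 50):
    print(f"~{heat_kwh * 1000 / queries:.0f} Wh of heat rejected per query "
          f"at {queries} queries per conversation")
# -> roughly 6-16 Wh of heat per query
```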
The whole point of water cooling on electronics is that the closed loop cycles the vapor away from the heat-generating part; it is then cooled by a radiator, condensing back to a liquid and flowing back to the hot thing. So whoever wrote that line is severely misinformed.
This is obviously not looking good, but if AGI is achieved, then cracking the problem of abundant energy might be done sooner rather than later, which will be a net positive. So, I don't know, in the grand scheme of things, it might be a small price to pay.
> ChatGPT consumes a lot of energy in the process, up to 25 times more than a Google search