One really interesting strategy the US could pursue here would be to heavily tariff solar[1] and just randomly attack wind projects[2]. Just completely self-own on the two cheapest energy sources.
It wouldn't make any sense, but it would be provocative, really drive engagement.
Maybe the tariff could encourage local solar manufacturing? I have no idea, but I suppose our local manufacturing could be getting killed by economies of scale abroad.
Or I am overthinking it and solar is something that (D) politicians support, so the (R) president tautologically must oppose it. Therefore we must not have nice things.
My guess is that the US is sliding into the pattern of ‘corrupt, resource-cursed petrostate’, with all the dysfunction that comes with that. What will happen is whatever maximizes fossil fuel revenues, plus lots of shiny distractions to keep that from being scrutinized.
I worked for a company making GPU clouds. The biggest problem we had for deployments was not getting GPUs -- we had plenty of those sitting in warehouses. The biggest issue was finding data center space with sufficient power and cooling. There was plenty of square footage, just not enough power for it all.
They're now building gigawatt datacenters to handle all the GPUs.
The big question is where to build them. There are only a few places with cheap and plentiful power. One of them is Quebec (but it's not that big, and there is a lot of regulation). Another is Texas (except their grid isn't very stable). And the last is China, and you can't build a datacenter in China unless you're Chinese.
It'll be interesting to see how this pans out. Maybe the current admin (which is big on deregulation) will make it easier to build power plants, especially nuclear ones.
Edit: I wrote this comment four days ago. I couldn't figure out why I was suddenly getting a bunch of replies to it. Apparently when HN does a second chance, they just reset the time on all the comments. Odd, but I guess it makes sense knowing what I know about how the sorting is calculated. It's probably the easiest way.
Ok, but telling someone they can have GPUs online in a mere 5 years if they build as fast as SK is still going to be a very painful pill to swallow. How do we get a DC in a year?
> The company has apparently been thinking outside the box to meet its power needs, with Musk stating a couple of weeks ago that it intends to buy a power plant from abroad and import it into the US to provide energy for its data centers.
> Nuclear power plants take something like a decade to build
The most recently completed fission power station on the planet spent 23 years under construction and is still in testing. A recent American one took 15 years.
OK, but people who want these actually have to build them in the countries that exist. This isn't Civilization. You cannot switch to a civic that gets more hammers at the beginning of your next turn.
Sure, but the argument against nuclear is always the same: it's hard right now.
The problem is that if you don't start correcting that hardness now, the next time you think you might need a nuclear industry, it's still hard.
You need to train, grant experience to, and fund professionals so that they exist to build stuff. It took the UK ~15 years to get started; then they were cranking them out like nobody's business.
Think of it this way: in 15 years you can have something you might need, or you can guarantee you won't have it whether you need it or not.
The thing is, it's the same for all large infrastructure projects. But only in nuclear do we have people actively trying to prevent the industry from being created and maintained, so they can use the unreadiness as an excuse.
It's why the Vogtle plant is on track to lead to the worst possible outcome: yes, it was very expensive, so nobody wants to start another. But the best thing to do is to start building the next one right away, while all those lessons and skills are still current.
In my part of PA there are 3 in the process of going in nearby. I think the largest of the 3 is "only" 828 megawatts, though. One of the others is supposed to be 300 MW, and I'm not sure about the 3rd. Another group is talking about 3 more campuses with a combined power budget of 1.3 GW about 55 miles from here. And while we don't have cheap land, we do have nuclear and hydroelectric in the area, so I guess that makes it attractive.
Places further north are great contenders because of the free cooling. Also, many of them have cheap electricity from hydro or even geothermal, like Iceland.
Quebec has a lot of empty land. But it does not have a lot of buildable land near plentiful power, super fast internet, and highly skilled technical workers, which are all things you need to build a datacenter.
There are tons of big power-producing hydroelectric dams in the middle of nowhere (the north of Québec province). It's also cold most of the year. As far as I know, a datacenter doesn't require tons of technical workers (except during the building phase). Bringing in fiber would not be impossible.
I think Texas (ERCOT) is a terrible option these days. Meta recently made a fantastic choice by picking Louisiana for their new monster. The MISO grid tends to be cheaper and less volatile than ERCOT.
I got downvoted on another thread when I said China is likely to win the AI race, as they are also targeting the other big cost of computing: power/energy. Today solar + BESS is cheaper than coal, and costs for both keep decreasing each year.
Remove restrictions on solar imports from China. 62 GW may sound like a large number, but China added 277 GW of solar in 2024 alone. They have the surplus capacity and hence the cheapest prices.
This is honestly one of the stupidest things about this sort of policy: solar panels last 20 years and China can't take them away once you have them.
If you're doing something much more valuable with the power, then buying a lot of PV from China makes sense. If you think the panels are being unfairly subsidized then buying a lot of PV from China is effectively having the Chinese government pay you to have cheap power.
There's an enormous difference between being dependent on short-term consumable resources and acquiring multidecadal productive assets.
The US seems to not understand its relationship with China at all.
Most people are economically illiterate and don't understand what subsidies actually imply.
They literally only "hurt" you if you have a local industry to harm in the first place. Otherwise, if someone else is paying the subsidy, it gives you a good for cheaper than you could have had it otherwise.
The current admin just turns this up to 11 with their ideologically driven nonsense.
China installed 3 gigawatts of solar power every day in May: at roughly 1 GW per plant, that's the equivalent of building one coal-fired power plant every 8 hours. They are so far ahead of the US on renewables now that even if Trump had not sold out the future of the country to the fossil fuel industry, the US would have been hard-pressed to catch up.
My hunch: we’ll see three things happen in parallel:
- AI backend providers vertically integrating into energy production (like xAI’s gas plants, or Meta’s local generation experiments),
- renewed interest in genuinely efficient computing paradigms (e.g. reversible/approximate computing, analog accelerators),
- a political battle over whether AI workloads deserve priority access to power vs. EVs, homes, or manufacturing, alongside an increase in energy prices.
You need cheap, reliable power + political/regulatory willingness + cooling. That’s a very short list of geographies. And even then, power buildout timelines (whether nuclear, gas, or grid-scale solar+batteries) move at "utility speed", which is decades, not quarters. That doesn’t match the cadence of GPU product launches.
Energy efficient computing is a very exciting field. I hope it will get more attention driven by these economic constraints.
As a short teaser:
Landauer's principle says that the energy required to erase one bit of information is bounded from below by k_B T ln(2). This could lead us down a path toward reversible computing, which avoids the energy cost of deleting information.
The work started around 10-15 years ago and is now largely done. Many people confuse large absolute numbers like 1 kW with inefficiency, but today's GPUs/TPUs are close to the practical efficiency limit of 3 nm technology.
I think it's mostly because laypeople think that all heat is wasted heat. Most people are pretty surprised to learn that there's a fundamental energy cost to flip a bit and that there even is a lower limit to the amount of heat generated by a bit flip.
The timing mismatch is crucial - data centers can be built in 12-18 months, but new power generation takes 5-10 years minimum. We're essentially trying to scale AI demand faster than energy infrastructure can physically respond. This creates interesting arbitrage opportunities in power-rich but compute-poor regions.
Ah, so finally an acknowledgement that the melting 12VHPWR connectors short-circuiting Nvidia's top-of-the-line hardware, the RTX 4090 and 5090, may have economic ramifications. Heh.
Even if that were true, it wouldn't be a useful justification. People buy drugs because they are addictive; it's not a rational decision about costs and benefits. And externalities are almost never considered in the contract here.
[1] https://seia.org/news/solar-tariff-impacts/
[2] https://www.npr.org/2025/08/31/nx-s1-5522943/trump-offshore-...