Fiber is a decades-long investment in hardware, one that I would argue we hardly needed. Google Fiber started with the question: what would people do with super high speeds? The answer was stream higher-quality video, and that's about it. In fact, by the time fiber became widespread, many people had moved off of PCs and did the majority of their Internet use on cell phones.
With that said, the fiber will be good for many years. None of the LLM models or hardware will be useful in more than a few years; everything is being replaced by something newer and better on a continual basis. They're stepping stones, not infrastructure.
We replaced a technology that was used by literally the whole world, twisted-pair copper wires, with something orders of magnitude better and future-proof. My PC literally can't handle the bandwidth of my fiber connection.
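As a rough illustration of that last point (with purely hypothetical numbers, since the comment doesn't say which fiber tier or which PC is involved): usable throughput is capped by the slowest link in the chain, and on many PCs that's a gigabit Ethernet port or a SATA SSD rather than the fiber line itself.

    # Bottleneck sketch with assumed, illustrative figures; none of
    # these numbers come from the comment above.
    fiber_plan = 10.0           # Gbit/s, e.g. a 10 Gbit/s tier (assumption)
    nic = 1.0                   # Gbit/s, common onboard gigabit Ethernet
    ssd_write = 550 * 8 / 1000  # ~550 MB/s SATA SSD write ≈ 4.4 Gbit/s

    bottleneck = min(fiber_plan, nic, ssd_write)
    print(f"usable ≈ {bottleneck:.1f} Gbit/s of a {fiber_plan:.0f} Gbit/s line")
    # -> usable ≈ 1.0 Gbit/s of a 10 Gbit/s line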
Where I live (Germany), lots of people have VDSL at advertised speeds of 100 Mbit/s, running over twisted-pair copper wires. I'm not saying that fiber isn't better; it obviously is, and hence the government is subsidizing large-scale fiber buildouts. But as it stands right now, I'm confident that for 99% of consumers, VDSL is indeed enough.
In the 90s and 2000s, I remember our (as in: tech nerds') argument to policy-makers being "just give people more bandwidth and they will find a way to use it", and in that period, that was absolutely true. In the 2000s, lots of people got access to broadband internet, and approximately five milliseconds later, YouTube launched.
But the same argument now falls apart, because we have the hindsight of seeing lots of people with hundreds of megabits or even gigabit connections... and yet the most bandwidth-demanding thing most of them do is video streaming. I looked at the specs for GeForce NOW, and they say that to stream the highest quality tier (a 5K video feed at 120 Hz), you should have 65 Mbit/s downstream. You can literally do that on a VDSL line. [1] Sure, there are always people with special use cases, but I don't recall any tech trend in the last 10 years that was stunted because not enough consumers had the bandwidth required to adopt it.
[1] Arguably, a 100 Mbit/s line might end up delivering less than that, but I believe Nvidia has already factored this into the advertised requirements. They say that you need 25 Mbit/s to sustain a 1080p 60fps stream, but my own stream recordings in the same format are only about 5 Mbit/s. They might encode with higher quality than I do, but I doubt it's five times the bitrate.
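To make the footnote's arithmetic concrete, here's a quick sketch; the one-hour recording size is a hypothetical example chosen to be consistent with the ~5 Mbit/s figure above, not a measured value.

    # Back-of-the-envelope bitrate check. The recording size is a
    # hypothetical example matching the ~5 Mbit/s figure above.
    def avg_mbit_per_s(size_gb, duration_s):
        # 1 GB = 8 Gbit = 8000 Mbit (decimal units, as ISPs advertise)
        return size_gb * 8000 / duration_s

    observed = avg_mbit_per_s(2.25, 3600)  # ~2.25 GB for an hour of 1080p60
    print(observed)                        # -> 5.0 Mbit/s

    spec = 25                              # Nvidia's stated 1080p60 requirement
    print(spec / observed)                 # -> 5.0x the observed bitrate

    # Headroom on a 100 Mbit/s VDSL line for the top-tier 65 Mbit/s stream:
    print(100 - 65)                        # -> 35 Mbit/s to spare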
In that respect, it's closest to the semiconductor situation.
Few companies and very few countries have the bleeding-edge frontier capabilities. A few more have "good enough to be useful in some niches" capabilities. The rest of the world has to buy and use what they make - or do without, which isn't a real option.