I'm still a heavy advocate for requiring second/dual-sourcing in govt contracts... literally for anything that can be considered essential infrastructure, communications technology, or medicine. A role of govt in a capitalist society is to ensure competition and domestic availability/production as much as possible.
While my PoV is US centered, I feel that other nations should largely optimize for the same as much as possible. Many of today's issues stem from too much centralization of commercial/corporatist power as opposed to fostering competition. This shouldn't be in the absence of a baseline of reasonable regulation, just optimizing towards what is best for the most people.
Suppose we got nuked or some calamity interrupted all the fancy x-nanometer processes. I don't know what the latest process nodes we have stateside are, but let's say we could produce 2005-era CPUs here. What would we actually miss out on? I don't think it would affect anything important; you could do everything we do today, just slower. I think the real advancement is in software, programming languages, and libraries.
Software is much, much more bloated today than it was in 2005.
64-bit CPUs were available, but not quite mainstream yet. A "high end" consumer system had a couple gigabytes of RAM and chipset limitations generally capped you out at 4 or 8 gigs. You were lucky to have two CPU cores.
If you took today's software and tried running it on a memory constrained, slow, 2005 era system, you'd be in for some pain.
I used to daily-drive a Thinkpad X200 from 2008. As soon as you touch the modern (i.e. bloated) web, you feel the slowness. Other than that and gaming, it ran fine.
I'm talking about way more than just CPUs... And for your question, we'd pretty much miss out on modern-like mobile phones entirely. 90nm -> 18A/1.8nm is a LOT of reduction in size and energy... not to mention the evolution in battery and display technology over the same period.
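Just to put a rough number on "a LOT" (a back-of-envelope sketch; node names haven't tracked real feature dimensions for years, so treat this as an upper-bound illustration rather than actual transistor density):

    # naive scaling by node name only -- illustrative, not real density figures
    old_nm, new_nm = 90.0, 1.8
    linear = old_nm / new_nm      # ~50x linear shrink by name
    area = linear ** 2            # ~2500x nominal area scaling
    print(linear, area)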
Now apply that to weapons systems in conflict against an enemy that DOES have modern production that you (no longer) have... it's a recipe for disaster/enslavement/death.
China, though largely hamstrung, is already well ahead of your hypothetical 2005 tech breakpoint.
Beyond all this, it's not even a matter of just slower, it's a matter of what's even practical... You couldn't viably build a lot of the websites that actually exist today on 2005-era technology; the performance and memory headroom just weren't there yet. Not that a lot of things weren't possible... I remember Windows 2000 pretty fondly, and you could do a LOT if you had 4-8x the RAM most people were buying.
> Now apply that to weapons systems in conflict against an enemy that DOES have modern production that you (no longer) have... it's a recipe for disaster/enslavement/death.
How do you maintain this production with a sudden influx of ballistic missiles at the production facility - or a complete naval blockade of all food calories to your country?
See Ukraine drone warfare... there's a lot going on there beyond just miniaturized motors, etc. A lot of it is efficient power use in the semiconductors on those drones, the image processors attached to the cameras, etc., which I suspect relies on newer processes.
A fully loaded one-way FPV drone peaks at over 1 kW. Electronics is maybe 1% (excluding transmitters) of what the drone uses for lift; it's insignificant. It's all about availability and price, and lower nodes do not give you any price advantage.
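Quick back-of-envelope with those same claimed (not measured) numbers:

    peak_power_w = 1000        # fully loaded one-way FPV drone at peak
    electronics_share = 0.01   # claimed ~1%, transmitters excluded
    print(peak_power_w * electronics_share)  # ~10 W -- even if a newer node
                                             # shaved a few of those watts,
                                             # flight time barely moves;
                                             # the motors dominate the budget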
The Iron Dome... Being able to use AI to track incoming missiles, compute trajectories, and launch intercepting missiles effectively. It might be possible to do some of that, but significantly less effectively.
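For a flavor of the trajectory math involved, here's a toy constant-velocity intercept solver; it's only an illustration of the kind of computation, not how Iron Dome actually works (real tracking, classification, and guidance are far more involved):

    import math

    def intercept_time(target_pos, target_vel, interceptor_speed):
        # Solve |target_pos + target_vel*t| = interceptor_speed*t for the
        # earliest t > 0 (2D, constant velocities, interceptor from origin).
        px, py = target_pos
        vx, vy = target_vel
        a = vx*vx + vy*vy - interceptor_speed**2
        b = 2 * (px*vx + py*vy)
        c = px*px + py*py
        if abs(a) < 1e-9:                     # speeds match: linear case
            return -c / b if b < 0 else None
        disc = b*b - 4*a*c
        if disc < 0:                          # interceptor can't catch it
            return None
        roots = [(-b - math.sqrt(disc)) / (2*a), (-b + math.sqrt(disc)) / (2*a)]
        hits = [t for t in roots if t > 0]
        return min(hits) if hits else None

    # hypothetical numbers: incoming threat ~8.5 km out, interceptor at 850 m/s
    print(intercept_time((8000.0, 3000.0), (-550.0, -200.0), 850.0))  # ~6 s

Real systems solve much harder versions of this under sensor noise, in real time, for many targets at once, which is where faster, lower-power silicon earns its keep.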
Like the person below said, I assume the drone/AI warfare of the present and near future, along with IoT-integrated warfare, sensors, and communications, functions better, cheaper, and faster with modern silicon.
I have an unpopular pet theory: the exponentially growing software bloat actually exists to slow computers back down to bearable levels for common folks, and that's why the most bloated frameworks have consistently replaced obsolete, less bloated ones, throughout the last decade.
Why else does everything now seem to be wrappers for wrappers? What if the bloat was, subconsciously or whatever, the point?
More bloated frameworks and the wide variety of packages built on them allow for faster feature development from less skilled and less experienced developers... though plenty of experienced developers implement Enterprise patterns that bring plenty of bloat with them.
Electron, as bad as it can be, has allowed for a level of cross-platform application development in practice that has never existed... It's bloated on several levels.
Most of that ease in delivering software that works well enough, and delivering it quickly, wouldn't be possible without the improvements in technology.