I think you're missing the point. Everyone is free to learn real analysis, and it's ok if it's a part of the standard curriculum.
But if you're aiming to provide the best education to future professionals so that they are able to excel in their field, you'd serve them far better by teaching them how to use Wolfram Alpha effectively.
The truth of the matter is that the bulk of the work in software development, and what makes more economic sense, lies in growing a system through accretion and adopting higher-level components developed and maintained by third parties.
Nonsense. MIT engineers don't waste their time thinking about assembly instructions when they are tasked to design a distributed system. Competent engineers know how to switch to high-level concepts where they make sense.
> The difference between trade schools and universities is in the depth of fundamentals.
It's definitely nonsense. You're failing to understand that today's "fundamentals" are not the parlour tricks you have in mind. You're in a better position to get ahead in providing software development services if you're able to put together a customized script that launches traefik, nginx, rabbitmq and a few nodejs services, fully dockerized, than if you're able to tell which opcodes your compiler generated from your C code.
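Just to make concrete the kind of "customized script" I mean, here's a minimal sketch. The image tags, ports, and the throwaway node command are purely illustrative, and it assumes Docker with the compose plugin is installed:

```python
# Hypothetical sketch: generate a docker-compose.yml for a small stack
# (traefik, nginx, rabbitmq, a node service) and bring it up in the background.
# Assumes Docker with the "docker compose" plugin is on PATH; image tags and
# ports are illustrative, not prescriptive.
import subprocess
import textwrap

COMPOSE = textwrap.dedent("""\
    services:
      traefik:
        image: traefik:v2.10
        command: ["--providers.docker=true", "--entrypoints.web.address=:80"]
        ports: ["80:80"]
        volumes: ["/var/run/docker.sock:/var/run/docker.sock:ro"]
      nginx:
        image: nginx:alpine
      rabbitmq:
        image: rabbitmq:3-management
        ports: ["5672:5672", "15672:15672"]
      api:
        image: node:20-alpine
        command: ["node", "-e", "require('http').createServer((q,s)=>s.end('ok')).listen(3000)"]
""")

def main() -> None:
    with open("docker-compose.yml", "w") as fh:
        fh.write(COMPOSE)
    # Start everything detached; older setups may need "docker-compose" instead.
    subprocess.run(["docker", "compose", "up", "-d"], check=True)

if __name__ == "__main__":
    main()
```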
Now you're describing sysadmin work. You don't need university for that. University is for getting a universe of exposure to ideas, for learning fundamentals in depth. You need to study those fundamentals to be able to create programs like nginx or rabbitmq in the first place.
But there are still highly specific tasks out there that require knowing stuff like what opcodes your compiler generated out of C code. Not every job or team is building a web app with nodejs and the like.
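For what it's worth, checking what the compiler emitted isn't exotic. Here's a rough sketch of that workflow (it assumes gcc and objdump are on PATH, as on a typical Linux box; the file name and flags are just for illustration):

```python
# Hypothetical sketch of the "what did my compiler emit" workflow: compile a
# trivial C function and dump the generated machine instructions.
# Assumes gcc and objdump are installed; flags and file names are illustrative.
import pathlib
import subprocess

C_SOURCE = """
int add(int a, int b) { return a + b; }
"""

def main() -> None:
    src = pathlib.Path("add.c")
    src.write_text(C_SOURCE)
    # Compile to an object file with optimizations on.
    subprocess.run(["gcc", "-O2", "-c", "add.c", "-o", "add.o"], check=True)
    # Disassemble to see the actual opcodes the compiler chose.
    subprocess.run(["objdump", "-d", "add.o"], check=True)

if __name__ == "__main__":
    main()
```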
> Excel at what? You need to build a strong foundation in order to innovate and to discover new stuff.
I agree. The problem is that you have a gross misconception about what a "strong foundation" means in this day and age.
I've known guys who fail to stop and think about what "foundations" actually means and instead use the word as cover for gatekeeping and ladder-pulling. It's the kind of character who thinks playing trivia games is a good way of assessing whether a candidate is a competent engineer.
Obviously the AI researchers at MIT need a strong foundation in GPU and parallel computing design and semiconductor technologies in order to innovate. Only those in trade schools `import torch` /s