Just google "yarn vs npm" and you'll know the answer...
The tools change because the technology is evolving. Despite Node, the front end is still the driver, and mobile phones, browsers, etc. are constantly being improved, and the ecosystem keeps up with them. Would you rather it all stayed static like in the C world, so you'd be programming a 2016 smartphone with a 2006 stack?
The answer to your question is yes. I would rather the stack, or at least most of it, stayed stable for 10 years. The problem seems to be that the JS community keeps reinventing the wheel at every step, instead of actually looking at other ecosystems and learning from them.
How long did it take for a sane build system to show up? Or a sane package manager (with dependency resolution, upgrades, checksums, signature verification, etc.)? Why is Grunt/Gulp a thing when we have things like make/CMake?
The technology is not changing, in the sense that the state of the art is not really changing. In the past 15 years the biggest paradigm shifts have really been single-page apps, ES6-7, and TDD. The rest has existed as state of the art in many other ecosystems for years, but the JS community is notorious for having a bad case of NIH syndrome, so it's taking forever to catch up.
The good news is that it will slow down at some point. The bad news is that it probably won't be for a few years.
> Why is Grunt/Gulp a thing when we have things like make/CMake?
What a ridiculous statement. It's like asking why .NET doesn't use the JVM or why Haskell doesn't use the C compiler. Grunt/Gulp solve web-dev-specific problems; make is a different kettle of fish.
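To be concrete, here's a minimal gulpfile sketch (gulp 3.x era; the task names and globs are made up for illustration). Tasks operate on streams of in-memory files, written in the same JavaScript as the rest of the project, with watch mode and npm plugins built in. make's model of file timestamps and shell commands just doesn't map onto that:

    // assumes gulp and gulp-uglify are installed from npm
    var gulp = require('gulp');
    var uglify = require('gulp-uglify');

    gulp.task('scripts', function () {
      return gulp.src('src/**/*.js')   // glob in, stream of virtual files out
        .pipe(uglify())                // minify in memory, no temp files
        .pipe(gulp.dest('dist/'));     // write the results in one pass
    });

    gulp.task('watch', function () {
      gulp.watch('src/**/*.js', ['scripts']);  // rebuild on change
    });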
No, the problem is never improving the existing software but rewriting a new solution from scratch every 3 months [1][2] that, while it solves the shortcomings of the previous tool, introduces a whole new set of problems. And people who actually have stuff to deliver are always playing catch-up and fighting the bugs of alpha-quality software.
I feel strongly about this, especially when reading a step-by-step tutorial, most probably aimed at frontend "newbies", that is pushing Yarn, for example: I played with it the other day on a working VueJS application and got hit by a bug that caused yarn to delete important files in node_modules. My code wouldn't work anymore. Really sorry that I can't find the bug report anymore to prove my point. The workaround was going back to npm.
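For what it's worth, the rollback was just a clean reinstall; nothing project-specific about these commands:

    rm -rf node_modules yarn.lock   # clear out whatever yarn left behind
    npm install                     # reinstall from package.json with npm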
Teaching people alpha-quality software is irresponsible. If you want to teach, or have stuff to deliver, use battle-tested tech. Experiment when you know how the pieces work together and have lots of free time.
2: EDIT: this is a big problem of our industry; it happens in all environments, but the JS community is the worst offender IMO. Desktop Linux is the second worst.
The problem with continuously changing technologies is that they don't allow you to become a master of the stack, so you're constantly relearning how to accomplish the same thing. It's inefficient. If I build my applications with well-established technologies, then I can do it quickly and accurately. Sure, I might miss out on some cutting-edge benefits, but my shit works, always.
However, if I switch to JS, I feel like I will not only have to relearn how to do things I already know how to do, but I will be stuck in a cycle of continuing education and won't ever be able to expand my skillset beyond those things.
That's the state of the computing industry. 80% of the continuing education is about learning to do what you already knew how to do in a newer, hotter stack (but not necessarily better). 20% is about learning new paradigms. And I'm being too generous. It's probably about 5%.
Suddenly being a medical doctor sounds much less stressful. I really feel some of us chose the wrong career path. Imagine having high social status and almost always meaningful work, all for debugging meat machines with lots of failure modes, most of them well understood.
Having experience in both worlds, I can assure you medicine isn't what you seem to think it is. Social status, and income aren't so grand these days, and working conditions are bad for a lot of practitioners. And "debugging meat machines" is anything but well-understood in the majority of serious cases.
I know working with Javascript can be an annoying proposition. I'm maintaining a web app I wrote a couple of years ago in vanilla JS, and it's gotten to be kind of a hairy beast. It would probably be better to rewrite it, but it works, and for all the warts, at least I know where they are. Maybe I'm just too lazy to take the time and effort to recreate it from scratch. I figure by the time I got it working, I'd always be lagging the newer, "better" stuff anyway.
Interestingly, your comments about education have a parallel in CME (continuing medical education). There are real paradigm switches that appear from time to time, but not as much as you'd think. Most changes have to do with small refinements like new medicines that are variations on the current ones, theoretical or research findings, administrative or procedural issues.
So yeah, the really important information approximates the 5% figure you cite, and a significant amount of "updated" info is only marginally different or new, more or less equivalent to the "newer, hotter stack" idea you mention. Sure seems all that greener grass still needs to be mowed about the same way.