
Tend to agree. All the OS attempts that seriously tried to innovate on the desktop failed. Add WinFS and OpenDoc to that list.


And OS/2.


It's kind of sad that we are so stuck in the current paradigms. The 90s were definitely much more exciting in terms of innovation.


It's because the only innovations that "count" in the mainstream marketplace are the ones that take us from "not good enough" to "good enough", not those that take us from "good enough" to "excellent". In other words - as a startup, you get customers by taking a non-consumer and turning them into a consumer. It's very hard to take a consumer and turn them into a consumer of something else.

In the 70s we went from "not good enough" to "good enough" in price, but once we got to the PC clone wars of the mid-80s it was hard to go much lower. Then in the 80s and early 90s we went from "not good enough" to "good enough" in user experience, with MacOS 7 and Win 95. The 90s OSes were all attempting to take a "good enough" user experience and make it excellent, and that's where they failed - most mainstream consumers don't care enough about excellence to make it worth learning a new OS. Instead, the late 90s and early 00s took us from "not good enough" to "good enough" in information, with a big cost in user experience. The web sucked as a UI and still does, but it opened up literally billions of sites' worth of content that a desktop user could only dream about. Now the web has created this whole new problem of trust, which cryptocurrencies solve, but at the cost of regressing 30 years in performance and usability.


But one could ask: what is the current paradigm?

For example, you can view it as the desktop having stopped evolving, or as a shift to the browser as a virtual machine/hypervisor that runs whatever environment you choose (particularly true with WebAssembly).
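
To make the "browser as runtime" point concrete, here is a minimal sketch, assuming a hypothetical compiled module at /guest.wasm that exports an add function; the page fetches it and calls into it, with the browser acting as the host:

    // A minimal sketch of the "browser as runtime" idea (TypeScript).
    // The module path "/guest.wasm" and its exported "add" function are
    // hypothetical placeholders, not anything from this thread.
    async function runGuestModule(): Promise<void> {
      // Fetch a compiled WebAssembly module and let the browser host it.
      const response = fetch("/guest.wasm");
      const { instance } = await WebAssembly.instantiateStreaming(response, {});
      // Call into the guest code; the browser mediates everything it can do.
      const add = instance.exports.add as (a: number, b: number) => number;
      console.log(add(2, 3));
    }

    runGuestModule();

In that sense the browser plays the role the desktop OS used to: it loads the binary, sandboxes it, and mediates its access to the outside world.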

The 90s were exciting because the desktop was seen as the center of the computing universe; these days the browser is the center and the desktop is a "me too!" paradigm.

One could argue we got "stuck", or one could argue that evolution shifted to an internet-first paradigm with more security in mind.


I prefer the '90s overall. There are good points to the modern web/internet world, but given a choice the '90s design is a better paradigm. Of course this assumes that one does their respective paradigm well; there were a lot of bad designs in the '90s that are worse than today's equivalents. However, I contend that if effort had continued in the '90s direction the result would be better.

Again, there are some major points in favor of the current internet/web world. For any "program" you will use only rarely, it isn't worth the cost of installing it.


We have two dominant mainstream paradigms.

We have the browser. And we have the app store.

Arguably, for better or worse, the desktop is shifting to a combination of these and will probably converge with mobile for mainstream users.


No it wasn't. The 90s were just Intel and Microsoft eating the market and killing innovation from competitors. Wintel easily set us back 20 years.


Indeed. Looks aside (although I liked them too), OS/2 was such an unbelievably good user OS. I wasn't programming enough at the time to know if it had a good developer story or not, although I kind of dug REXX.


Workplace Shell was pretty innovative, and the only firm I know of that took advantage of it was Stardock.

https://en.wikipedia.org/wiki/Workplace_Shell

https://www.stardock.com/stardock/articles/article_sdos2.htm...


And its follow-up, OS/2 Warp.

Which I believe is the only operating system advertised in the Super Bowl.

IBM's Windows replacement wasn't so bad in my very limited usage of it. (I only used it because the only scanner driver we had for a scanner at IBM was for an OS/2 Warp machine...)



