
Couple more:

(1)

Garbage collection in every high-level language: Java was the first mainstream language to do it -- people were seriously using C++ for high-level business logic at the time, and were suspicious of GC for its performance.

But Java itself got it from Lisp, which had introduced GC decades prior without ever going mainstream.

(2)

NoSQL had already been tried in the form of hierarchical databases in the 70s or 80s, IIRC. The relational model won because it was far more powerful. Then in the early 2010s, due to a sudden influx of fresh grads, boot campers, etc., who often had a poor grasp of SQL, schemaless stuff became very popular... And thankfully the trend died back down as people rediscovered why the relational model had won. Today's highly scalable databases like Spanner and Cassandra don't ostentatiously abandon relational calculus; they reimplement a similar model even if it isn't officially SQL.

(3)

And then there's the entire cycle that has gone back and forth several times between client-based and server-based computing:

First there were early ENIAC-type computers that were big single units. I would consider that similar to a thick client.

Then as those developed, we had a long era of something more similar to the cloud, in that a single computer was partitioned among many users who submitted punch-card batches.

That developed even further into what was then the apex of cloud-style computing: time-sharing terminal systems like ITS, Multics, and finally, in the 70s, UNIX.

Then the PC revolution of the 80s turned that totally on its head and we went back to a very, very thick client -- in fact often no servers at all (a modem was an optional accessory).

We stuck with that through the 90s, the golden age of desktop software.

A lot of attempts were made to go back to thinner clients, but the tech wasn't there yet.

Then of course came the webapp revolution, started by Gmail's decision to exploit a weird, little-used API called XMLHttpRequest. The PC rapidly transformed over the next decade from a thick client into a thin vessel for a web browser, best exemplified by the Chromebook, where everything happens in the cloud -- just like it did in the mainframe and terminal days 50 years ago...
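For the curious, the trick amounted to just a few lines. Here's a minimal TypeScript sketch of the Gmail-style pattern: fetch data in the background with XMLHttpRequest and patch the page instead of reloading it. The endpoint and renderInbox are my own hypothetical stand-ins, not Gmail's actual code.

    // Placeholder for whatever DOM update the app actually does.
    function renderInbox(messages: unknown): void {
      console.log("re-render inbox with", messages);
    }

    const xhr = new XMLHttpRequest();
    xhr.open("GET", "/api/inbox?label=unread"); // async by default
    xhr.onload = () => {
      if (xhr.status === 200) {
        // Partial page update -- no full reload, hence "webapp".
        renderInbox(JSON.parse(xhr.responseText));
      }
    };
    xhr.send();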

The trend could stay that way or turn around -- it has always depended on shifts in the hardware performance balance.



To be honest, NoSQL makes sense where the stream of writes is very intense, so ACID guarantees become impossible to enforce alongside relational guarantees like referential integrity. See stuff like Cassandra.
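To make that concrete, here's a hedged TypeScript sketch using the DataStax Node.js driver (cassandra-driver); the keyspace and table are hypothetical. Note what's missing: there is no foreign key, so nothing stops you from recording an event for a user_id no other table has ever seen -- referential integrity becomes the application's job in exchange for cheap, partition-local writes.

    import { Client } from "cassandra-driver";

    const client = new Client({
      contactPoints: ["127.0.0.1"],
      localDataCenter: "datacenter1",
      keyspace: "metrics", // hypothetical keyspace
    });

    // Assumes a table like:
    //   CREATE TABLE events (user_id text, at timestamp, kind text,
    //                        PRIMARY KEY (user_id, at));
    async function recordEvent(userId: string, kind: string): Promise<void> {
      // Each write lands on a single partition and stays fast at high
      // volume; Cassandra never checks that user_id exists anywhere else.
      await client.execute(
        "INSERT INTO events (user_id, at, kind) VALUES (?, toTimestamp(now()), ?)",
        [userId, kind],
        { prepare: true }
      );
    }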

Schemaless has its place for document storage and the like, but it requires a much more careful approach, or else it can devolve into insanity.
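One version of that "careful approach", sketched in TypeScript: since the store enforces nothing, validate every document at the application boundary. The Invoice shape here is hypothetical.

    interface Invoice {
      id: string;
      amountCents: number;
      currency: string;
    }

    // Type guard: the only schema a schemaless store gives you is the
    // one you check yourself, on every read.
    function isInvoice(doc: unknown): doc is Invoice {
      if (typeof doc !== "object" || doc === null) return false;
      const d = doc as Partial<Invoice>;
      return (
        typeof d.id === "string" &&
        Number.isInteger(d.amountCents) &&
        typeof d.currency === "string"
      );
    }

    function loadInvoice(doc: unknown): Invoice {
      if (!isInvoice(doc)) throw new Error("malformed invoice document");
      return doc; // narrowed to Invoice; malformed docs stop here
    }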



