Hacker News | new | past | comments | ask | show | jobs | submit | hackthemack's comments

That type of webpage style was quite common in the late 90s. Compare it to

https://www.circlemud.org/

I think the HTML editors of the time defaulted to some of the styling we now find quaint/quirky.


It was just the style at the time. There weren't a lot of HTML editors, even in 2001, and those that existed typically defaulted to an entirely blank page. People mostly wrote web pages in something like emacs, vim, or Notepad. Dreamweaver and FrontPage existed back then, but DW was only really popular with professionals, and nobody ever really used FP.

This style was a popular choice because it was easy to write and could be displayed by just about any web browser. Compatibility and low resource usage were important back then.


Dreamweaver was extremely popular with amateurs too - they just didn't pay for it

lol maybe it just wasn't popular in my neck of the woods. I never knew anyone who even bothered with it. Now Flash, on the other hand...

Anything more complicated than this was just too difficult with the early HTML standards (there was no CSS).

Something is seriously wrong with the US justice system. Some links to bolster your point.

https://waldenconsultants.com/2020/04/13/yet-another-study-s...

https://en.wikipedia.org/wiki/High-Tech_Employee_Antitrust_L...


I noticed the Hall of Fame grading of predictive comments has a quirk. It grades some comments on whether they came true or not, but in the grading of a comment on the article

https://news.ycombinator.com/item?id=10654216

The Cannons on the B-29 Bomber "accurate account of LeMay stripping turrets and shifting to incendiary area bombing; matches mainstream history"

It gave a good grade to user cstross, but to my reading of the comment, cstross just recounted a bit of old history. Did the evaluation give cstross a good grade just for giving a history lesson, or no?


Yes I noticed a few of these around. The LLM is a little too willing to give out grades for comments that were good/bad in a bit more general sense, even if they weren't making strong predictions specifically. Another thing I noticed is that the LLM has a very impressive recognition of the various usernames and who they belong to, and I think shows a little bit of a bias in its evaluations based on the identity of the person. I tuned the prompt a little bit based on some low-hanging fruit mistakes but I think one can most likely iterate it quite a bit further.

I think you were getting at this, but in case others didn't know: cstross is a famous sci-fi author and futurist :)

Something that really sticks out to me after reading the article is how Sun had all the hype in the world when Java was released. There were internet discussions about Java replacing Windows, about Java being the future of program development. It was going to run everything. It was going to run everywhere.

And now, 30 years later, Javascript is the programming language that does what Java set out to do.


Java wasn't the only one. Universal platform hype was widespread in the 90s (CHRP anyone?). Around that time OPENSTEP pretty much did run everywhere. Yellow Box ran everywhere too, even on Windows NT.

Some other recent JavaScript discussions:

30 year anniversary since the announcement of JavaScript https://news.ycombinator.com/item?id=46146406

It’s time to free JavaScript (2024) https://news.ycombinator.com/item?id=46145365


It's a dupe, you've submitted a dupe.

Discussion over here as you mention: https://news.ycombinator.com/item?id=46146406

Share your comment in the thread!


I wish people were not so inclined to reply with "Ad Hominem Ridicule" one liners. I like a good joke, but such replies lack a certain level of content that addresses the point and feel "low effort".

I do agree that comparing the past with the present is fraught with complicated nuances, and people do tend to see the past with rose-tinted glasses. But I read Talwar's blog post more as a personal reflection on the experiences they are facing and not some kind of scientific treatise on what went wrong.


fair criticism; I didn’t mean this as an ad hominem but rather a summarization of (as the comment I replied to points out) this genre of article that keeps coming up (and not just for programming); it’s an exhausting mindset to see repeatedly, and breaking it down into its core argument (“I liked things better when I was younger”) does have some value IMO

if this were titled “Java/JavaScript peaked” or “my reflections on XYZ” and written like that, I wouldn’t have given it a second thought. but claiming programming peaked 15 years ago leads me to not feel bad about my summarization


I agree. I got really tired of hearing "tables are for tabular data!" for 20+ years. My reply was always: who cares, if it accomplishes the layout you want? If the meaning of a word is what got people so hung up, why not go and make a new CSS term that did what tables did but improved on it? Now, 20+ years later, that is pretty much what they did.


Screen readers do care. A lot. Grid and subgrid solve the problem without breaking DOM and semantics, which is a huge concern in accessibility.


I have a theory that the churn in technology is by design. If a new paradigm, new language, new framework comes out every so many years, it allows the tech sector to always want to hire new graduates for lower salaries. It gives a thin veneer of we want to always hire the person who has X when really they just do not want to hire someone with 10 years of experience in tech but who may not have picked up X yet.

I do not think it is the only reason. The world is complex, but I do think it factors into why software is not treated like other engineering fields.


Constantly rewriting the same stuff in endless cycles of new frameworks and languages gives an artificial sense of productivity and justifies its own existence.

If we took the same approach to other engineering, we'd be constantly tearing down houses and rebuilding them just because we have better nails now. It sure would keep a lot of builders employed though.


We do take down a lot of old buildings (or renovate them thoroughly) because the old buildings contain asbestos, are not properly insulated, ...


> If we took the same approach to other engineering, we'd be constantly tearing down houses and rebuilding them just because we have better nails now. It sure would keep a lot of builders employed though.

This is almost exactly what happens in some countries.


which one(s)?


Pretty common in Australia. There are heritage laws to try to prevent replacing all the old buildings, but often they are so undesirable that the owner just leaves them vacant until trespassers manage to burn them down.


We have it in Japan too. You can clearly see eras in house design. Pre-1960, almost everything is wood. Then you have wood and plaster until the 2000s or so, and after that it is plastic on wood. You can see the age of a neighborhood and its residents based on what the houses are made of.

If the residents die and someone new purchases the land, the old house is (generally) torn down and a new one built.


Japan, famously. Oddly enough it actually works very well in keeping buildings cheap.


I agree. But, I think the execs just say, "How can we get the most bang for our buck? If we use X, Y, Z technologies, that are the new hotness, then we will get all the new hordes of hires out there, which will make them happy, and has the added benefit of paying them less"


The problem with that is that it would require a huge amount of coordination for it to be by design. I think it's better to look on it as systemic. Which isn't to say there aren't malign forces contributing.


I agree. Perhaps "by design" is not the correct phrasing. Many decisions and effects go through a multi-weighted graph of complexity (sort of like machine learning).


Indeed. How does that saying go? Don’t attribute to malice what can be explained by stupidity?

On the other hand, Microsoft and Facebook did collude to keep salaries low. So who knows.


Anyone in tech should read up on https://en.wikipedia.org/wiki/High-Tech_Employee_Antitrust_L...

There were more tech companies in collusion than many people realize: (1) Apple and Google, (2) Apple and Adobe, (3) Apple and Pixar, (4) Google and Intel, (5) Google and Intuit, and (6) Lucasfilm and Pixar.

It was settled out of court. One of the plaintiffs was very vocal that the settlement was a travesty of justice. The companies paid less in the settlement than the amount they saved by colluding to keep wages down.

https://www.mercurynews.com/2014/06/19/judge-questions-settl...


> It's unclear to me what the author thinks OOP is

I rather liked the old post "Object Oriented Programming is an Expensive Disaster that Must End" written over 10 years ago.

https://medium.com/@jacobfriedman/object-oriented-programmin...

Many complained the post was too long, and then debated all kinds of things brought up in the article (such is the way of the internet).

But the one thing I really liked is how it laid out that everyone has a different definition of what OOP is and so it is difficult to talk about.


I initially agreed with that article, but then it focused further and further on criticizing cargo culting in specific OOP languages (mostly Java). The actual problem is that an abstraction boundary is introduced at every tiny thing. The fact that these abstraction boundaries are structured with OOP is only incidental. I see that this is relevant to OOP, but only insofar as this seems to be a disease of OOP-only/first languages and OOP cultures. What I am saying is that what he criticizes is indeed OOP, but OOP doesn't require any of that, and the actual problem is the existence of the abstraction boundaries, not their shape.

You can't make the all-generic implementation; then you just get a more complicated formulation of a Turing machine. Software is useful in that it narrows down the expressiveness of computation to a single problem. A generic implementation is able to express anything and thus nothing, i.e. it doesn't contain the information of the problem anymore.


I'd settle for just getting inheritance to fade away.
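In that spirit, a small sketch of the usual alternative: composition instead of inheritance, in plain JavaScript (the canWalk/canSwim/makeDuck names are just made up for illustration):

```javascript
// Behaviors as small factory functions closing over shared state.
const canWalk = (state) => ({
  walk: () => `${state.name} walks`,
});
const canSwim = (state) => ({
  swim: () => `${state.name} swims`,
});

// An object is assembled from exactly the behaviors it needs,
// with no class hierarchy to inherit from.
function makeDuck(name) {
  const state = { name };
  return Object.assign({}, canWalk(state), canSwim(state));
}

const d = makeDuck('Donald');
console.log(d.walk()); // "Donald walks"
console.log(d.swim()); // "Donald swims"
```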


I am a fan of immutability. I was toying around with JavaScript, making copies of arguments (even when they are complex arrays or objects). But, strangely, when I made a comment about it, it just got voted down.

https://news.ycombinator.com/item?id=45771794

I made a little function to do deep copies but am still experimenting with it.

  function deepCopy(value) {
    // Prefer structuredClone: handles Dates, Maps, Sets, cycles, etc.
    if (typeof structuredClone === 'function') {
      try { return structuredClone(value); } catch (_) {}
    }
    // JSON round-trip fallback: drops functions and undefined,
    // and turns Dates into strings, but works for plain data.
    try {
      return JSON.parse(JSON.stringify(value));
    } catch (_) {
      // Last fallback: return the original (shallow, not a copy)
      return value;
    }
  }
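For anyone who wants to sanity-check it, here is a quick demonstration that mutating the copy leaves the original untouched (the function is repeated inside the snippet so it runs on its own; the sample data is made up):

```javascript
// Deep copy with graceful fallbacks, as in the comment above.
function deepCopy(value) {
  if (typeof structuredClone === 'function') {
    try { return structuredClone(value); } catch (_) {}
  }
  try {
    return JSON.parse(JSON.stringify(value));
  } catch (_) {
    return value; // last resort: hand back the original (shallow)
  }
}

// Illustrative data: nested array and nested object.
const original = { user: 'alice', tags: ['a', 'b'], meta: { seen: 1 } };
const copy = deepCopy(original);

// Mutate the copy, including its nested structures.
copy.tags.push('c');
copy.meta.seen = 2;

console.log(original.tags.length); // still 2: nested array untouched
console.log(original.meta.seen);   // still 1: nested object untouched
```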


There's something in here for sure; switch over to TS with strict typing and you've got generics to help you out more, at least for validation.

A deep clone isn't a bad approach, but given TS's typing, I don't know if they allow a pure 'eval' by default. Still playing with this in my free time, though, and it's still tricky.


One thought I recently had, since using deepCopy is going to slow things down: could the source code for QuickJS be changed to just make copies? Then load up QuickJS as a replacement for the browser's JavaScript by invoking it as wasm.

