
> from the days when spinning rust was the limiting factor

How did we get back to this, though? We have gigabytes per second from NVMe and stupidly fast CPUs with at least 4 cores even in low-end models. Yet a text editor takes so long to load that we need to load it up on boot... Such a frustrating field to work in.



I know this is such a stereotypical "get off my lawn" statement but we've lost the art of software engineering. It's all about stuffing as many features in as quickly as we can and pushing it out to as many people as possible. Performance is always secondary.

Not that I'm that nostalgic for the old days; we would have been doing the exact same thing back then if we could have gotten away with it. But performance restrictions meant you had no choice but to care. Modern tech has "freed" us from that concern.


Niklaus Wirth wrote about this in 1995, in his essay A Plea for Lean Software.

About 25 years ago, an interactive text editor could be designed with as little as 8,000 bytes of storage. (Modern program editors request 100 times that much). An operating system had to manage with 8,000 bytes, and a compiler had to fit into 32 Kbytes, whereas their modern descendants require megabytes. Has all this inflated software become any faster? On the contrary, were it not for a thousand times faster hardware, modern software would be utterly unusable.

https://www.computer.org/csdl/magazine/co/1995/02/r2064/13rR...

That said, as someone fairly young, I still don't think that makes it wrong or something only an old man would think. Software seems to perform exactly as well as it needs to and no more, which is why hardware advances don't make our computers run software much faster.


Aside from slowness, feature creep leads to poor quality, i.e. tons of bugs and user confusion with ever-changing graphical interfaces.

If software were simpler, we could afford to offer some formal guarantees of correctness: model-check protocols, verify pre- and postconditions à la Dafny, etc.
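To make the pre/postcondition idea concrete, here is a minimal sketch in plain C, using runtime asserts as a stand-in for the compile-time contracts a tool like Dafny would actually verify (the function and its checks are hypothetical, not from any particular codebase):

    #include <assert.h>
    #include <stddef.h>

    /* Binary search over a sorted array.
       Precondition: v is sorted ascending.
       Postcondition: if an index is returned, it really holds the key. */
    static long find_sorted(const int *v, size_t n, int key) {
        for (size_t i = 1; i < n; i++)
            assert(v[i - 1] <= v[i]);              /* precondition check */

        long result = -1;
        size_t lo = 0, hi = n;
        while (lo < hi) {
            size_t mid = lo + (hi - lo) / 2;
            if (v[mid] < key) lo = mid + 1;
            else hi = mid;
        }
        if (lo < n && v[lo] == key) result = (long)lo;

        assert(result == -1 || v[result] == key);  /* postcondition check */
        return result;
    }

Dafny would discharge those checks statically rather than at runtime, but either way, simpler software is what makes the exercise affordable.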

There's too much change for the sake of change.


> There's too much change for the sake of change.

+1 to this. Like a lot of issues, I think the root is ideological, but this one in particular very clearly manifests organizationally.

The companies building everyday software are ever bigger: full of software engineers, designers, and various kinds of managers who are asked to justify their generous salaries. At an individual level I'm sure there are all sorts of cases, but at a general level there's often almost no other option but to introduce change for the sake of change.


I once asked a man who worked in marketing why Oreo keeps making crazy new flavors like "Sour Patch Kids Oreos" when the normal kind is great and clearly has no issues being sold. I could see some upside - it gets people talking about them, it's fun, it reinforces the normal flavor as the best chocolate cookie, etc. - but I was still dubious that those benefits outweighed the cost of developing new flavors in a lab, paperwork for food safety, a new manufacturing process, new ads, new packaging, etc., especially for something temporary.

He said it's often just that some new marketing exec wants to put something on their resume, and they have certain metrics they target that don't necessarily align with the long-term profits of the company.

I'm sure software has a similar problem.


This is exactly what I see as well.

At a general level, I believe there are other options - changes/features need to meet some threshold of usage or they get scrapped, out of recognition that supporting all these features makes bugs more likely, degrades performance, makes it harder to add new features, makes the product more difficult to use, etc.


> Software seems to perform exactly as well as it needs to and no more

The cynical spin I would put on this idea is that software performs as poorly as it can get away with. MSFT is feeling the need/pressure to have Office load faster, and they will try to get away with preloading it.

Otherwise, there is a strong pull towards bloat that different people will try to take credit for as features even if the cumulative experience of all these "features" is actually a worse user-experience.


Software authors who don't care about performance annoy me (and I am an old man).

The amount of work a computer can do on a single thread is amazing, and computers now have a dozen or more threads to do work. If developers cared about performance, things would easily be 20x as performant as they are today.

I'm not talking about "write it in assembly, duh"; I'm talking about just doing things intelligently instead of naively. The developers I support often simply are not thinking about the problem they're solving, and they solve the problem in the simplest way for them, not the simplest way for the computer.

Software is an inefficiency amplifier, because the number of developers for a piece of code is much smaller than the number of computers that run that code; how much coal has been burned solely because of shitty implementations? I'd wager that the answer is "a LOT!"

Even if you don't care about coal usage, think about how much happier your users would be if your application were suddenly 5x faster than it was previously. Now think of how many customers want their software to be slow (outside of TheDailyWTF): zero.

Languages like JavaScript and Python remove you so far from the CPU and the cache that even if you were thinking about those things, you couldn't do anything about them. JS and Electron are great for developers and horrible for users, because of that amplification I described above.

I am dead tired of seeing hustle culture overtake everything in this field, and important things, to me, like quality and performance and support all fall straight down the toilet simply because executives want to release features faster.

Things like Copilot could help with this, I hope. Presumably Copilot will introduce better code into applications than a daydreaming developer would, though the existence of vibe coding probably nulls that out.

One thing that AI will do quite soon is increase the amount of software that exists quite dramatically, and I am kinda concerned about the possibility that it's all going to suck horribly.


I share your frustration with developers writing things suboptimally all too often. However, I disagree with the assumption that it's a JS/Python vs. C issue.

Example: when VS Code came out, it was much, much faster, more responsive and stable than Visual Studio at the time. Despite being based on Electron, it apparently was much better on architecture, algorithms and multithreading than VS with its C++ and .NET legacy codebase. That really impressed me, as a C++ programmer.

Overall, it feels like folks who idealize bygone eras of computing didn't witness or have forgotten how slow Windows, VS, Office etc. used to feel in the 90s.


> Overall, it feels like folks who idealize bygone eras of computing didn't witness or have forgotten how slow Windows, VS, Office etc. used to feel in the 90s.

Let’s normalize speed over time like we do dollars, so we are talking about the same thing.

Given the enormous multiplier in CPU and storage hardware speeds and parallelism today vs. say 1995, any “slow” application then should be indistinguishable from instant today.

“Slow” in the ’90s vs. “slow” in 2025 are essentially different words. Using them interchangeably without clarification sweeps several orders of magnitude of speed or inefficiency difference under the rug.


“Slow” is when the human waits on the computer.

The promise of computing is that what was slow in the 1960s and 1970s would be instant in 1990. And those things were instant, but those things aren’t what people did with computers anymore.

New software that did more than before, but less efficiently, came around, so everything felt the same. Developers didn’t have to focus on performance so much, so they didn’t.

Developers are lazy sacks who are held skyward because of hardware designers alone. And software developers are just getting heavier and heavier all the time, but the hardware people can’t hold them forever.

This cannot continue forever. Run software from the 1990s or 2000s on modern hardware. It is unbelievably fast.

Maybe it was slow in the 1990s, sure. I ask why we can’t (or won’t) write software that performs like that today.

The Turbo Pascal compiler could compile something like a million lines per minute in 1990. We have regressed to waiting for 60+ minute C++ compile times today, on even moderate project sizes.

Debugging in visual studio used to be instant when you did things like Step Over. You could hold the Function key down and just eyeball your watch variables to see what was going on. The UI would update at 60FPS the entire time. Now if I hold down that key, the UI freezes and when I let go of the key it takes time to catch up. Useless. All so Microsoft could write the front end in dotnet. Ruin a product so it is easier to write… absolute nonsense decision.

All software is like that today. It’s all slow because developers are lazy sacks who will only do the minimum necessary so they can proceed to the next thing. I am ashamed of my industry because of things like this.


“Developers are lazy sacks who are held skyward because of hardware designers alone”

As a programmer who studied computer and electrical engineering in university, never before have I been so offended by something I one hundred percent agree with


Counterpoint: single threaded performance hasn't improved much in the past 20 years. Maybe 5x at best. And virtually every UI programming environment still has problems with work done on the main thread.


Single-thread performance has increased with every processor generation, and it still is today.


Increased yes, but not by a whole lot. See https://cdn.arstechnica.net/wp-content/uploads/2020/11/CPU-p... and https://cdn.arstechnica.net/wp-content/uploads/2020/11/CPU-p...

Source: https://arstechnica.com/gadgets/2020/11/a-history-of-intel-v...

(I'm sure someone could dig up more recent graphs, but you get the idea).

In order to get more performance, your app needs to use multithreading.


Too true!

Parallel RAM bandwidth, more cache levels and larger caches, better caching policies, instruction reordering, branch prediction, register optimization, vector instructions... there have been many advances in single-thread execution since the '90s, beyond any clock speed gains.


Office 4.3 loading on Win3.1 was glacial. I haven't forgotten.


> The amount of things a computer can do in a single thread are amazing, and computers now have a dozen or more threads to do work. If developers cared about performance, things would easily be 20x as performant as they are today.

Why? A good portion of programs are still single-threaded, and often that's the correct choice. Even in games, a single main thread or logic thread may be the only choice. Where multithreading makes sense it should be employed, but it's difficult to do well.

Otherwise, it's up to the OS to balance threads appropriately. All major OSes do this well today.


I think what the author wanted to say is that because computers are very fast today, developers have no incentive to write optimized code.

Nowadays you just "scale horizontally" by the magic of whatever orchestration platform you happen to use, which is the modern version of throwing hardware at the problem, as in the vertical-scaling days.


It’s not about programs being multithreaded. It’s about computers running multiple programs at once on different threads and they all perform well.

One can write software that uses the CPU cache in non-dumb ways no matter how many threads your program has. You can craft your structs so that they take less space in RAM, meaning you can fit more in cache at once. You can have structs of arrays instead of arrays of structs if that helps your application. Few people think of things like this today; they just go for the most naive implementation possible, so the branch predictor can't work well and everything needs to be fetched from RAM every time, instead of building things so that the branch predictor and the cache are helping you rather than impeding you. People just do the bare minimum so the PM says the card is complete, and they never think of it again. It's depressing.
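A minimal sketch of that struct-of-arrays idea in C (the particle type and field names are made up purely for illustration): summing one field drags whole records through the cache in the AoS layout, but only a dense run of floats in the SoA layout.

    #include <stddef.h>

    /* Array-of-structs: each element drags its rarely-used fields into
       cache even when a loop only wants the x coordinates. */
    struct ParticleAoS { float x, y, z; int id; char tag[16]; };

    /* Struct-of-arrays: each field is packed contiguously, so a loop
       over x touches far fewer cache lines. */
    struct ParticlesSoA { float *x, *y, *z; int *id; };

    float sum_x_aos(const struct ParticleAoS *p, size_t n) {
        float s = 0;
        for (size_t i = 0; i < n; i++) s += p[i].x;   /* strided loads   */
        return s;
    }

    float sum_x_soa(const struct ParticlesSoA *p, size_t n) {
        float s = 0;
        for (size_t i = 0; i < n; i++) s += p->x[i];  /* sequential loads */
        return s;
    }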

The tools to write fast software are at our fingertips, already installed on our computers. And I have had zero success in getting people to believe that they should develop with performance in mind.


So your assertion is that developers should get in a big huddle to decide how they’re going to consume L1 between applications? Which of course no dev has control over since the OS determines what runs and when.


You can make your time in the CPU more efficient by thinking of the cache and the branch predictor, or you can say “nothing I do matters because the OS schedules things how it wants.” Up to you I guess, but I know which of those approaches performs significantly better.


My standard is that software should appear to work instantly to me, a human. Then it is fast enough. No pressing a button and waiting. That would be great.


That is probably the correct measure. If “The Promise of Computing” is ever to come true, people must never wait on computers when interacting with them.

Waiting is ok when it comes to sending batches of data to be transformed or rendered or processed or whatever. I’m talking about synchronous stuff; when I push a key on my keyboard the computer should be done with what I told it to do before I finish pushing the button all the way down. Anything less is me waiting on the computer and that slows the user down.

Businesses should be foaming at the mouth about performance; every second spent by a user waiting on a computer to do work locally, multiplied by the number of users who wait, multiplied by the number of times this happens per day, multiplied by the number of work days in a year… it’s not a small amount of money lost. Every more efficient piece of code means lighter devices are needed by users. Lambda is billed by CPU and RAM usage, and inefficient code there directly translates into higher bills. But everyone still writes code which stores a Boolean value as a 32-bit integer, and where all numbers are always 8-bytes wide.
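To put rough numbers on that last complaint, here is a hypothetical C record (field names invented) stored the naive way versus sized to the data; on a typical 64-bit target the naive layout is 24 bytes and the sized-down one is 4, so six times fewer records fit in a cache line:

    #include <stdbool.h>
    #include <stdint.h>
    #include <stdio.h>

    /* Naive layout: every flag a 32-bit int, every number 8 bytes wide. */
    struct WideRecord  { int32_t active; double count; double kind; };

    /* Sized to what the values actually need, largest field first. */
    struct TightRecord { uint16_t count; uint8_t kind; bool active; };

    int main(void) {
        printf("wide: %zu bytes, tight: %zu bytes\n",
               sizeof(struct WideRecord), sizeof(struct TightRecord));
        return 0;
    }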

What. The. Fuck.

People already go on smoke breaks and long lunches and come in late and leave early; do we want them waiting on their computers all of the time, too? Apparently so, because I’ve never once heard anyone complain to a vendor that their software is so slow that it’s costing money, but almost all of those vendor products are that slow.

I’m old enough that I’m almost completely sick of the industry I once loved.

Software developers used to be people who really wanted to write software, and wanted to write it well. Now, it’s just a stepping stone on the way to a few VP positions at a dozen failed startups and thousands of needlessly optimistic posts on LinkedIn. There’s almost no craft here anymore. Businessmen have taken everything good about this career and flushed it down the toilet and turned teams into very unhappy machines. And if you don’t pretend you’re happy, you’re “not a good fit” anymore and you’re fired. All because you want to do your job well and it’s been made too difficult to reliably do anything well.


> languages like javascript and python remove you so much from the CPU and the cache that even if you were thinking of those things, you can't do anything about it.

Even operating systems don't get direct access to the hardware these days. Instead a bunch of SoC middlemen handle everything however they like.


Wait…those dastardly systems architecture engineers with their decadent trusted platform modules, each with an outrageous number of kilobytes of ROM. They are the true villains of software performance?


That doesn't matter; if you make your cache usage smart and your branches predictable, the CPU will take advantage of that and your program will run faster. It is in the interest of the system and CPU designers to make sure this is the case, and it is.

If you do the things which make your code friendly to the CPU cache and the branch predictor, when it comes time for your code to run on the CPU, it will run faster than it would if you did not do those things.
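A toy C sketch of the branch-predictor half of that claim (array size and threshold are arbitrary): the second call does the same work on the same data, but on most CPUs it runs noticeably faster simply because sorting makes the branch predictable.

    #include <stdint.h>
    #include <stdio.h>
    #include <stdlib.h>

    /* Sum the "large" values; on random data the if is taken ~50% of the time. */
    static int64_t sum_big(const int *v, size_t n) {
        int64_t s = 0;
        for (size_t i = 0; i < n; i++)
            if (v[i] >= 128) s += v[i];
        return s;
    }

    static int cmp_int(const void *a, const void *b) {
        return *(const int *)a - *(const int *)b;
    }

    int main(void) {
        enum { N = 1 << 20 };
        int *v = malloc(N * sizeof *v);
        if (!v) return 1;
        for (size_t i = 0; i < N; i++) v[i] = rand() % 256;

        int64_t unpredictable = sum_big(v, N);   /* branch outcome is random        */
        qsort(v, N, sizeof *v, cmp_int);
        int64_t predictable   = sum_big(v, N);   /* same data, sorted: predictable  */

        printf("%lld %lld\n", (long long)unpredictable, (long long)predictable);
        free(v);
        return 0;
    }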


What's your proposal for a "compromise" language between programmer productivity and performance, especially for multiple threads and CPUs? Go, Rust, a BEAM language?


I don't think the tools are the issue here; they are tools, and you can do good and bad jobs with all of them. What is lacking are the right incentives. The tech market has never been as anti-competitive as it is today. Let's repeal DMCA 1201 and go from there.


Jai seems to be an excellent start. Possibly Zig as well.

Both are written/designed by people who care a lot about application performance and developer experience.


This is an unserious take. Jai doesn't even have official documentation, and Zig hasn't reached a 1.0 release.


I wasn't asked for examples of software that is congruent to whatever definition you want. I was asked for a proposal of a "compromise" language, and I answered that question.


> presumably copilot will help introduce better code into applications than a daydreaming developer would

Copilot is trained on GitHub (and probably other Git forges without permission, because OpenAI and Microsoft are run by greedy sociopaths).

I'd wager that the majority of fleshed-out repositories on these sites contain projects written at the "too-high level" you describe. This certainly seems to be true based on how these models perform ("good" results for web development and scripting, awful results for C/C++/Rust/assembly...) - so I wouldn't get your hopes up, unfortunately.


I don't know if it's just the training data, or whether CRUD and web apps are more inherently easy to parrot away.

Low-level programming means actual *thinking* about the system, resources, language decisions, etc.

Even humans struggle with it. It's much easier to build a website than, say, a compiler - for anyone, humans and LLMs included.


That probably plays into it as well. I have yet to see any convincing evidence that contradicts LLMs being mere pattern parrots.

My personal benchmark for these models is writing a simple socket BPF in a Rust program. Even the latest and greatest hosted frontier models (with web search and reasoning enabled!) can only ape the structure. The substance is inevitably wanting, with invalid BPF instructions and hallucinated/missing imports.


IMHO these tools are great if you know what you're doing, because you know how to smell-test the output, but a footgun otherwise.

It works great for me, but it is necessarily a learning aid more than a full-on replacement; someone's still gotta do the thinking part, even if the LLMs can cosplay *reasoning* now.


I'm also young and heavily favor simple, quality software. Age is a highly coarse proxy for one's experiences, but in this case I think it has more to do with my personality. I have enough experience in computing that I don't think I'm making demands that are unrealistic, although they are certainly unrealistic if we maintain current incentives and motives.


“What Andy giveth, Bill taketh away”


I saw the writing on the wall when I had to install a couple of 150MB IDEs to run 101-level Java programs in the mid 2000's. 150 megabytes. MEGABYTES. I could consume about 1 kilobyte per minute of fantasy novel text in uncompressed ASCII, call it 1/8th that fully compressed. That means this compressed binary you're handing me represents around 1.2 million minutes of work (more if ingesting a novel is faster than writing/testing/debugging a program) for what is functionally a text editor, file manager, syntax library, compiler, and debugger. Pretty sure that got done in 150 kilobytes a generation earlier. A generation later, maybe it will be 150 gigabytes.


I looked it up, you want an illegal taxi? 168 MB: https://apkcombo.com/uber/com.ubercab/


> install a couple 150MB IDEs

Not Java, but an IDE in 4K: https://en.wikipedia.org/wiki/BASIC_Programming

Having used it quite extensively (well, five solid days over two weeks, which is about 1000x longer than most people gargling on the internet), it's surprisingly capable.

Imagine if someone with the same talent and motivation was working on today's hardware.

(Aside: someone on AtariAge wrote a LISP for the same machine.)


> I know this is such a stereotypical "get off my lawn" statement but we've lost the art of software engineering.

Indeed. I am sure many of us here are burnt out on bloat. I am also sure many of us want to move to smaller stuff but cant simply because of industry momentum. BUT that doesn't mean the dream is dead, only that we must work towards those goals on our own time. I found Plan 9 and haven't looked back. I can rebuild the entire OS in seconds on a fast machine. Even my little Celeron J1900 can rebuild the OS for several supported architectures in minutes. I can share a USB device seamlessly across my network, PXE booted from a single disk without installing anything. Cat(1) is just 36 lines of C.

There's still hope. We just have to ignore the industry hype noise and put in the effort ourselves.


And just when we think we can't make software any more inefficient, slow, and bloated, they release things like Electron, where you ship an entire browser with your app! And then when we think it can't get any worse, we have Docker and containers, where we ship an entire OS with the application.

I'm looking forward to when app developers ship you an entire computer in the mail to run their text editor.


The problem with Electron is that business-wise it is an excellent decision. You can get by with a few people to wrap the web app and integrate it with the OS, and then get updates pretty much for free.

Yet for the user it is bad -- bloated, slow, feels non-native, has specific bugs which are hard to address for the devs, etc.

I don't see any light for the desktop UI development unless there is some lightweight universal rendering engine. Tauri with WebView is somewhat promising, but it has problems on Linux and it is hard to target older systems.


It's a pretty OK example of a negative externality. A little like polluting: Just dumping your waste into the environment is business-wise an excellent decision. You avoid the cost and everyone else has to deal with the downsides.


Polluting is indeed an excellent business decision. The thing about apps is that all of them are polluting, just some of them are worse than others. And we tend to fill all available resources, so over time it only gets worse.


It's an excellent business decision... right up until your customers abandon you because you make bad quality software. Like many businesses have found time and again, deliberately sacrificing quality for profit is a short term gain for a long term loss.


There are quite a few examples of software built with Electron that have very large user bases. This sounds like a personal vendetta against Electron rather than meaningful insight.


Electron is horrid, but as a user, I prefer bloated "apps" to no support at all.

As for your second point: [1]

1: https://snapcraft.io/


> Cat(1) is just 36 lines of C.

Correct me if I'm wrong, but isn't it around 800 lines[1]?

1. https://github.com/coreutils/coreutils/blob/master/src/cat.c


I love that I work at a place (Row Zero) where caring about performance is baked into the culture, and I can spend days fixing weird perf edge cases our users discover without management asking why I'm wasting my time. And it's office software, no less!


>> It's all about stuffing as many features in as quickly as we can...

The problem isn't "engineering" the problem is the culture of product management. (Note: NOT product managers per se).

I ask this basic question: how many Directors, VPs or CPOs do you know who got their job based on "cutting out unused features"? If you can find one, it will end up being the exception that proves the rule. The culture of "add", "new" and "shiny" doesn't reward keeping things lean and effective.

In the tangible world we look to accountants for this sort of thing (because they tend to have costs). Think cheap Costco hotdogs and free cookies at DoubleTree. No one in product, dev and accounting is going to sit down and try to justify losing some code, features and maybe a few customers to make it faster when you can just "engineer" your way out of it and not have to sell less is more.


> I ask this basic question, how many Directors, VP's or CPO's do you know who got their job based on "cutting out unused features"?

Google goes a step further and kills entire apps


Moment of silence for Play Music. YT Music isn't any less awful now than it was a decade ago.


Ford does. They look at connected vehicle telemetry to strip out features nobody uses in order to save software maintenance, hardware and third party IP licensing costs.

https://www.slashgear.com/1513242/ford-gets-rid-of-self-park...


My personal theory is that there is a threshold of performance. Below the threshold, the experience is bad enough that it affects revenue, so getting the program up to speed becomes a priority. Above the threshold, only features are prioritized, to drive more revenue. That's why, despite computers getting orders of magnitude faster, programs seem to run at about the same speed.


I think the threshold is more about how much more rent can be collected from users, and making things more performant or ergonomic doesn't do anything to let sales add another 10% to the per-user subscription pricing (I assume this is a product with per-user subscriptions, even though it's almost certainly unnecessary).

But adding yet another gateway to ChatGPT’s API…that’s a $15/mo/user add-on right there, and not just a monkey, but one of the slower intern monkeys, could write the code for such an incredibly innovative and, there’s no other word for it, powerful new feature that everyone should be thrilled to pay for at a 200-300% (or more!) markup. So guess who gets that VP of New Bullshit Development position, and guess what kind of choices get reinforced?

(EDIT: grammar)


How does someone get promoted at Microsoft? How do they avoid being seen as a low performer?

Performance just isn’t on that list, and it’s often more and harder work than a given new feature took to create. Office users are getting what Microsoft is designed to deliver.


I really don't think we've "lost it", I think performance has just not been a consideration in the engineering of Office for a long time, if ever.


> I know this is such a stereotypical "get off my lawn" statement but we've lost the art of software engineering.

Absolutely this. I think this is evidence that points to modern civilization starting to collapse. When we can no longer engineer correctly, we're fucked.


> we've lost the art of software engineering

Yes! This is what all my projects are geared towards restoring. The big one is not quite ready to announce yet, but I am very proud of it, and extremely excited to release it, to solve exactly that: it makes engineering fun again!


Well that username matches


We don't have software engineering any more than the Romans had civil engineering.

We now DO have civil engineering, but that is it.


It was always like that


It's a matter of resource allocation. Lowering your design requirements for performance can save significant developer cost. Also, Word in 2025 is doing a lot more under the hood than Word 97 did.


I'll take the hate for this, but I have been using Gemini to build narrow-scope apps, and they are extremely fucking fast compared to their bloated, $200/user/month software-suite counterparts. It's amazing how fast and efficient programs can be when they're not trying to cover every use case for every possible user at every possible moment, on top of a sea of tech-debt programming.

While it's true that LLMs fall flat on their face when fed massive codebases, the fact of the matter is that I don't need a 200k-LOC program to accomplish a single task that an LLM can do in 2k LOC.

To give an example, we have a proprietary piece of software that is used to make (physical) product test systems using flow charts and menus. It's expansive and complex. But we don't need it when we can just spend 30 minutes prompting our way to working test code, and it produces way faster and more robust systems.

Maybe the devs of that software package cannot dump that whole codebase into an LLM and work on it. But they are completely missing the forest for the trees.


I will make this analogy:

Take a large ZIP file, preferably a few gigs with lots of (small) files.

Try to open it with the built-in Windows 11 tooling from Microsoft. It's going to be super slow to even show anything, never mind unpack it.

Now install, say, 7-Zip and do the exact same things: it opens instantly, and unpacking takes a much, much smaller amount of time (limited only by disk speed).

Turns out optimizations / not doing stupid things is still a thing even with all this raw power we now have.


Because an entire generation of developers and their managers believe the hardware is so fast there's no point trying to optimize the software.

Besides, the only thing that matters is getting tickets "done" before the arbitrary sprint deadline in 2 weeks, so best not to spend any extra time cleaning up or optimizing the first thing that works. You can't think about performance until the sprint dedicated to performance.


100%. The thought process is: why waste internal resources on speeding up software when the user has enough hardware to manage the workload?


Battery use is a pretty big concern these days; also, some users like running several things at the same time.


For the local OS and sustained workloads like video playback, yes, battery optimization is huge. For an individual app with bursty compute, less so, plus some of that inefficient code can run in the cloud instead, which is costly, but premium subscriptions can pay for it, and power plants are now colocated with data centers so power transmission cost is negligible. The incentive to be efficient is insufficient.


I think people forget that some of this software may be relatively fast. The problem is, most corporate environments are loaded up with EDRs and other strange anti-malware software that impede quick startup or speedy library calls. I've seen a misconfigured Forcepoint EDR rule block a window for 5 seconds on copy and paste from Chrome to Word.

Another example: it takes ~2 seconds to run git on my work machine

    (Measure-Command { git status | Out-Null }).TotalSeconds
while running the same command on my personal Windows 11 virtual machine is near instant: ~0.1 seconds. Still slower than Linux, but not nearly as bad as my work machine.


Telemetry, syncing to the cloud by default…


Neither of which contribute significantly to size though. The size aspect is what these new preloaders would help with.


We stopped developing for users/customers and instead added layers to make developer lives easier.

Why the hell are all my desktop apps written in JS now?!


> Why the hell are all my desktop apps written in JS now?!

Have you seen the state of pretty much every non-js UX framework?

That's why.

JS/CSS/HTML won the UX domain in a way that no other stack comes close to. If you look at the most recent, most modern UX frameworks, they are often just half-implemented, poor mimics of the JS/CSS/HTML stack, with approximately zero devs writing new third-party extensions.

IntelliJ uses Swing, SWING, as its UI: a framework written in the '90s and filled with warts. Yet it's still a better experience than the more recent JavaFX. Swing simply has more support.


Call me an idiot, but I still gladly take Swing and JavaFX over JS and monstrosities like React. The state of Qt is also very good. The web won because the distribution model is easier on the user, and because managers thought UX designers would be making whole apps now, saving on rates - not because it's technically superior.


You're not an idiot for liking the Swing/JavaFX/Qt way of doing things. Or even for thinking they are technically superior.

The bigger issue isn't the tech, it's the ecosystem. While you might like swing, you simply are never going to find the swing version of Material UI or D3.js. That's more the problem that you'll run into.

For some of our apps, because we need charting, we are using GraalJS just to run the JS charting library and export an image that we ultimately put on some of our downloadable reports. It's a huge pain, but really the only way to do that.


> you simply are never going to find the swing version of Material UI or D3.js. That's more the problem that you'll run into.

I remember a time when having your application look "out of place" was undesired, and the ultimate goal was to be "as native as possible". If you are running a website selling something, I agree that you want a brand identity and a unique look. But productivity software shouldn't require users to adapt to new UX paradigms (guessing whether the cancel button comes on the left or on the right, dealing with slightly different input methods and entry shortcuts…).

Anyhow, I think things could be worse, since, as you say, we can embed a webview into any JavaFX/Qt/… app and get the best of both worlds.


It's quite something that there hasn't been any real successor living up to Delphi or even goddamn Visual Basic for modern desktops.


It takes skilled programmers working on boring stuff like Office. Most programmers today don't have the skills they think they do, and would find working on something like Office boring.


Ironically, at the start of my career, working on something like Office was my dream, and it actually still would be. I reserve the right to change my mind once I've seen the code base, though. ;)


Cruft built on frameworks using libraries with a zillion dependencies, some of which are cruft built on frameworks...


So, we often look back on the old days with rose-tinted glasses. But let me recount my IT classes from the 90s.

We'd sometimes go to the library to write something up in MS Word. We always liked this because it would be a good 5-10 minutes to boot up some kind of basic Unix menu. You'd then select Windows 3.1 and wait another 10-15 minutes for that to load. Then you could fire up Word and wait another 5 minutes. Then you could do 5 minutes of work before the class was over!


Well, we networked all the computers together around that time, and it turned out that all the 1337 performance hacking that people did back then had severe unintended consequences. You’re talking about an era in which the NX bit would not be utilized by Windows for another ~7 years. “Smashing the Stack for Fun and Profit” was contemporary work.

It’s not rocket science to eke out oodles of performance out of a potato if you don’t care about correctness or memory safety.

Word 97 will only delight you if you use it on an airgapped computer, as a glorified typewriter, never open anyone else’s documents with it, and are diligent about manually saving the doc to multiple places for when it inevitably self-corrupts.

But at that point, why not be like GRRM and write on a DOS word processor? Those were historically a lot more reliable than these second-generation GUI apps.


> How did we get back to this though?

By piling up nonzero-cost abstractions left and right.


And it's easy to understand how we get into that trap. Each one of those abstractions is very low-cost, so it seems harmless.

And getting out of the trap is hard too, because no single abstraction is to blame - you can't just hit things with your profiler and find the hot spot. It's all of them. So now you either live with it or rewrite an entire stack of abstractions.


"How did we get back to this though?"

Probably because Windows needs to make a connection somewhere else for every file first and wait for the reply, before granting you the advanced software-as-a-service feature called text editing.

It definitely feels like this at times and I fear there is too much truth in my statement.

But it is not just Windows. My old Chromebook took seconds to open a folder in the file browser (even if it was already open), while an "ls" in the terminal was instant for any folder. So getting the information was not the problem; from there to displaying it in a GUI, there seem to be myriad important (tracking?) layers involved.


My guess is that we're already seeing the consequences of "AI-assisted programming". Just yesterday, Microsoft's CEO revealed that 30% of their code is written by AI.


Given the game of telephone that would have had to occur for that 30% figure to travel from developers up to the CEO, it's probably including things like autocomplete...

The Plan

In the beginning, there was a plan, And then came the assumptions, And the assumptions were without form, And the plan without substance,

And the darkness was upon the face of the workers, And they spoke among themselves saying, "It is a crock of shit and it stinks."

And the workers went unto their Supervisors and said, "It is a pile of dung, and we cannot live with the smell."

And the Supervisors went unto their Managers saying, "It is a container of excrement, and it is very strong, Such that none may abide by it."

And the Managers went unto their Directors saying, "It is a vessel of fertilizer, and none may abide by its strength."

And the Directors spoke among themselves saying to one another, "It contains that which aids plants growth, and it is very strong."

And the Directors went to the Vice Presidents saying unto them, "It promotes growth, and it is very powerful."

And the Vice Presidents went to the President, saying unto him, "This new plan will actively promote the growth and vigor Of the company With very powerful effects."

And the President looked upon the Plan And saw that it was good, And the Plan became Policy.

And this, my friend, is how shit happens.

from anonymous email


Interesting that this story is not really a game of telephone, but instead a single layer replaced the meaning on purpose.


By software, most of it is probably generated scaffolding.


There are no financial or career incentives to optimise existing, working things in these institutions. The structures these large companies use are built for people working on new things, so that's what we get. Making Office faster doesn't create headlines, but a new product that does the same thing somehow will.


Ever notice how Windows 7 and 10 and 11 have basically the same features and benchmark the same on performance tests yet 10 and especially 11 completely shit the bed if you try to run them off a hard drive? Like they might all boot in twenty seconds from an SSD but booting from a hard disk might take W7 two minutes and W11 ten minutes to a stable desktop.


Software 'engineering' is too abstract, yet imagine the outrage if every new highway had a 20km/h limit...


In the old days there was the saying 'What Intel Giveth, Microsoft Taketh Away'


How much engineering time do you think is spent optimizing startup time on most modern editors? I’m guessing next to nothing.


"eight megabytes and constantly swapping"



