How to Know When It's Time to Go (thecodist.com)
354 points by kiyanwang on July 14, 2024 | 217 comments


Retired from Apple a few years ago (at age 57).

I was not obsolete. At a big company like Apple, there are always things that need taking care of.

I assumed with iOS, Swift, etc., maybe the guys on the Cocoa team were obsolete? Of course not. That code is still there, still needs maintaining, interoperability with the new languages, frameworks, etc.

I'm more surprised they want to stay on.

And that is in fact why I left Apple: the job had changed, the "career" had changed. The engineers were no longer steering the ship. They had been when I started in 1995 though. A "team", let's say the graphics team, would figure out what API to revisit, what new ones to add — perhaps how to refactor the entire underlying workflow. The "tech lead" (who would regularly attend Siggraph since we're talking about the graphics team) would make the call as to what got priority. Marketing would come around after the fact to "create a narrative" around all the changes to the OS. I hate to say it, but man, those were the good ole' days.

(And let's be clear, in the 90's, Apple's customers were more or less like the engineers, we also loved the machine for the same reasons they did — so we did right by them, made changes they would like because we wanted them too. You can't say that as convincingly for the phone, being a mass consumer device.)

Marketing took the reins long ago though — especially as Apple began to succeed with the iPhone (which, someone can correct me if I am wrong, but I think was an engineer driven project initially — I mean most things were up to that point).

I stuck around nonetheless though because there was money to be made and kids still to raise.

When the last daughter flew the coop though, so did I.


I want to create a company that is like Apple of the 90's and resurrect the "bicycle for the mind". Is there any chance you would consider mentoring?


There's plenty of people who want to build that thing, and most of us have a good idea how it probably should be, at least at first. The problem isn't building it, it's funding it without selling out the user.


If you have to make a grand reveal to the world and change all of it a la Steve Jobs with a single product, you're dead in the water. But we have the Internet and crowd funding and open source these days. Which part of the thing are you most interested in building? Go build that on top of whatever's closest, open source it (yes, even hardware), and iterate from there. It's a failure of the imagination (or perhaps the wisdom of age, or the lack of the foolishness of youth) if you're blocked by the fear of selling out the user.


I'm too cynical to be a mentor, ha ha. I'm too aware of all the engineering-led companies that ultimately tanked. General Magic is the first one that comes to mind.

But no reason for me to mentor — really it's just: let the engineers drive. Sometimes it works (or used to in another era), sometimes it does not. Either way, it will be a fun place to work. ;-)


Do not discount the cynic. I persist. Here is my question, for example. How would Apple and its ecosystem have evolved differently if they never adopted Objective-C? Would it be possible to accomplish everything with C and better tooling?


I like programming. I can still do it.

What I don't like is all the bullshit around it. The main barrier now is that I don't have to work, so why would I put up with abusive hazing? I mean, of course, hiring processes, which have only gotten worse over time (a hallmark of the cycle of hazing).

I'm not doing on-call rotations anymore. Either allow us to engineer the thing to be resilient, or pay off-duty people (a wonderful opportunity for offshore people that management is so desperate to use).

Finally, I don't want to code in Python or JavaScript. As a long time programmer, it is annoying that we keep going backwards and wasting more and more hardware power.

Nobody is producing anything exciting in software anymore. I can't think of a pure software company doing anything I would be excited about, because Google and Facebook and the like control the internet.

It doesn't even pay that well anymore, and AI is just another huge excuse to drop wages by management.

Apple is a fantastic example: operating systems are stagnant, and hardware outside of the architecture switch is stagnant (and how much of that was simply priority access to state-of-the-art TSMC node tech?).

Nobody makes good solutions for anything anymore.


>Nobody is producing anything exciting in software anymore.

They're not? Some of the newer programming languages seem very interesting, attempting to fix some of the mistakes made in older languages. Of course, most of the really interesting problems are already solved by existing solutions, but perhaps there's room for improved solutions instead of just using the incumbent.

>I like programming. I can still do it. What I don't like is all the bullshit around it.

If you have spare time (you sound like you might be retired), perhaps you should try getting involved in an open-source project that interests you (and isn't in Python or JS of course).

>I can't think of a pure software company doing anything I would be excited for, because google and facebook and the like control the internet.

Personally, I work in robotics and find it quite interesting. I would also find writing software for spacecraft interesting. Neither of those are "pure software", but still I think they're applications that will change the world, hopefully for the better, and don't already have some huge incumbent dominating the market.


I think robotics is going to be the next thing that I would push programmers to concentrate on.

Political shifts probably mean that manufacturing will be onshoring and automating. Software in the last 3 decades has basically mirrored, like Conway's law, management structures with outsourced manufacturing elsewhere.

The next economy will have a lot more near-shored manufacturing. And that means hardware, robotics, and hardware/software boundaries.

The real world and physical materials are also a little bit more resistant to AI replacement.


Robotics has been around forever; it's only by being AI-driven that it will gain interest and funding. No one wants to laboriously hand-write G-code to actuate a motor to move an arm. People want to ask the robot maid to make coffee for them in a regular human coffee-making fashion. A French press or a pour-over or whatever. Traditional programming hasn't been able to achieve that in the decades it's been around, so the only way robotics is getting interesting (which it is) is the addition of AI/ML models.


I agree, because one thing AI is good at is breadth of knowledge of APIs, if it's properly trained.

What is robotics but a whole wide range of APIs to do all kinds of different things with different devices?

But the point is you're going to need a human to verify that the robot is doing what the AI generated code is intended to do.


> It doesn't even pay that well anymore, and AI is just another huge excuse to drop wages by management.

Can you expand on this? I haven’t noticed salaries going down where I’m at (only fewer open positions the last ~1.5 years due to the global economic climate - but I’m sure this is cyclical and will swing back soon enough).


Salaries are not going down, but costs of living have skyrocketed. I am in the top 3% earning bracket as a dev in my country (Poland), I live in a relatively cheap area (south-east), and when I reached my 30s I could afford to buy just a ~100sqm city apartment that cost exactly 5x my parents' 200+sqm house, a house they bought without a mortgage as factory workers without higher education.

And each year I can afford less and less with my relatively huge salary.


When your parents bought, your country had few highly-paid jobs, and places like that have low housing costs: the presence of many highly-paid workers is what causes housing prices to be high.


But “not as well paid as it used to be” is relative to other wage earners, not to cost of living. Two uneducated Polish factory workers today won’t be able to buy your parents’ house either.


Idk, maybe it's not the exact book definition, but I always thought about how well I'm paid in terms of purchasing power, not the arbitrary value, and especially not by comparison with others (strictly speaking, costs of living are kind of that, aren't they?).

The software jobs slowly go into the direction of not being worth the effort (I don't really believe that we will reach that point but that's the current direction).

> Two uneducated polish factory workers today won’t be able to buy your parents’ house either.

Of course - because from my point of view factory jobs are currently paid terribly. They used to be paid better (worth the effort due to being able to buy more).


Did you hear that, bruh, you should be totes cool with not being able to afford what your parents could because if they were where you are they wouldn't be able to afford shit either. You just need to understand economics is all. That makes it all ok.


I'm not saying we should be fine with it, just trying to understand what "doesn't even pay that well anymore" mean practically. E.g. if other careers have gotten better compensated while programming has gotten worse, it would have meant they'd be incentivized to change careers.

What I see locally is that cost of living has gone up but no other jobs (aside maybe real estate?) have improved compensation relatively to programming in that time.


There is a lot more value being produced, but the capitalist class has managed to capture a much higher percentage of it, leaving the worker bees with less and less. At least we have lots of toys to distract us!


Outsourcing everything is what's really destroying the salaries in "advanced" countries. And high inflation seems to go hand in hand with the domestic economic shutdown and trade imbalances.


Thing is, Poland IS the country a lot of stuff is being outsourced into :)


Worry not, we started to outsource to India as well... My wife's company (creators of low-code domain-specific software) laid off 80% of their workforce and contracted a smaller number of people from India.

It's not going well but I bet it's gonna take at least a year before anyone notices.


Well, I'm pretty well paid for Denmark, but my wages haven't really gone up in the last few years, and when I look at pay rates in Denmark I'm still pretty much at the top for programmers who don't consult - but my wage used to equal the average two-person household income in Denmark, and now it is a couple hundred dollars less.


It could be a nod to a shift into ‘do more with less’. In Sweden, hospitals and schools have basically lost all their administrative staff and class sizes have grown. Teacher salaries might not have gone down, but expectations are much higher, as teachers have less support from non-teaching staff.


Assistants, like Copilot, substitute for junior devs, making it possible or more likely that a team will put off hiring. I have seen this silent killer in action.


I'm much earlier in my career than you are but have had serious thoughts about leaving the industry altogether for similar reasons. The interview processes are absolutely toxic these days, much of the industry seems like an outright scam (crypto, AI, etc.) and the trend is to casually waste resources.

Part of my motivation to go into writing code in the first place was that I noticed software getting worse and more user-hostile in the 2010s and I wanted to change that. Turns out the people making software worse think the stuff that makes it worse are "best practices" so you're fighting an impossible battle and nobody is going to dare allowing you to advance into a leadership position or often even get a job in the first place unless they think you're a true believer in the BS.

I also have no interest, at this point, in writing code unless I'm paid to do it. It's hard to find motivation to write code when I mentally associate it with all of the corporate BS and the grifting con artists of the tech industry. The one saving grace was that the money was good and it was possible to switch jobs for more money or because you're tired of 1 particular company's BS. Now, even that isn't possible anymore so what's the point?


> Part of my motivation to go into writing code in the first place was that I noticed software getting worse and more user-hostile in the 2010s and I wanted to change that. Turns out the people making software worse think the stuff that makes it worse are "best practices"

Yea, this has been the biggest change I've witnessed during my career. When I got into computers and programming, it was all about empowering the user, helping the user solve their problems, and providing the user with the tools that make the computer do what he wants it to do.

Now, the software industry is mostly about empowering the software company (and its "partners"), solving the company's problems, and making the user's computer do what the company wants it to do. From the software company's point of view, the user is seen as either 1. an annoying middle-man who just happens to (for now) possess the computer and/or 2. a cow to be milked for money, attention, engagement or time.


I wonder how many other people were at Apple from the mid-90's until a few years ago, that's an incredibly long tenure. It seems like one of the more interesting places to be during several very interesting transformations.


May you live in "interesting" times.

I've had 3 jobs in the ~decade-long range. I was really ready to move on in each case. Partly I was ready for a change and partly the company had changed.


To separate the decision process from the outcome, is wisdom always.


I started programming at 10 and now I'm 50, and right now it feels like I've reached this point -- it's boring, I have trouble keeping up, I feel the things work lets me work on are not important. Interesting work goes to younger colleagues.

The problem is, I have a family and finding fulfilling work that you have no experience in, in this country, at 50, is close to impossible.

So for now I consider myself lucky and try to rediscover the fun things in programming.


I'm a bit older, I don't really feel trouble keeping up but looking at the landscape it's just not that interesting anymore. So many "new" ideas are actually old ideas but the people pushing them are too young to know that.

I don't have any doubt in my ability to learn new languages and frameworks, but running in that hamster wheel just gets boring after a while.


What are some examples of new ideas that are old?


Lambdas (in the cloud): see CGI scripts and inetd.

Containers: see BSD jails, Solaris zones.

WASM: see JVM and Smalltalk VM.

Async / futures / actors: see Erlang, Lua, Oz.

The cool type system of Typescript: see OCaml and Haskell.

Numpy: see APL.

Throughout the list above, there's usually a 20-to-40-year gap between first availability and turning into the "new hotness".
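To make the first item concrete, a CGI-era handler already had the "lambda" shape: a stateless request in, a response out, one process spawned per invocation by the web server (or inetd). A minimal Python sketch — the function name and query parameter are illustrative, not from any real deployment:

```python
import os
from urllib.parse import parse_qs

def handle(environ: dict) -> str:
    """One CGI invocation: the server sets environment variables and runs
    this once per request -- the same stateless request-in/response-out
    shape as a modern cloud function."""
    params = parse_qs(environ.get("QUERY_STRING", ""))
    name = params.get("name", ["world"])[0]
    return "Content-Type: text/plain\r\n\r\nHello, " + name + "!"

if __name__ == "__main__":
    # A CGI server would capture this stdout as the HTTP response.
    print(handle(dict(os.environ)))
```

Swap "web server forks a process" for "cloud provider spins up a micro-VM" and the programming model is essentially unchanged.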


It's not every day that we see Oz mentioned here! I was very involved in writing the Mozart/Oz 2.0 VM.

I also wrote a "toy" (read: for school) dialect of Scala compiling to Oz and therefore turning every local variable or field into what Scala calls a Future, for free. Performance was abysmal, though! But in terms of language idioms, it was quite nice.

---

Unrelated: about Wasm, none of what it does is new, obviously. What's interesting about it is that

a) browser vendors agree to do it together, and

b) the design grows to accommodate many source languages. This used not to be the case, but the eventual arrival of WasmGC significantly reshuffled the deck.

Relevant background here: I'm the author of the Scala to JavaScript compiler, and now co-author of the Scala to Wasm compiler.


I'm a bit hesitant to describe $NEW_CONCEPT/TECH as just $OLD_CONCEPT/TECH. Echoes of older things in a new context can really amount to something different. Yes, VMware didn't create the idea of virtualization and Docker et al didn't create containerization but the results were pretty novel.


I'd rather say that good ideas keep on returning, no matter whether they are remembered or getting reinvented.

It's not that those who reapplied the old concept in new circumstances are not innovators; they are! Much like the guy who rearranged the well known thread, needle, and needle eye and invented the sewing machine, completely transforming the whole industry.

But seeing the old idea resurfacing again (and again) in a new form gives you that feeling of a wheel being reinvented, in a newer and usually better form, but still very recognizable.


The plumbing behind Docker is not particularly novel but the porcelain was imho a major advance.

There were plenty of ways to do "containers" (via vservers, jails, zones etc) but the concept of image never caught on before Docker.

You could sling tarballs of chroots around and at times this did happen but it was a sort of sysadmin thing to do, there was no coherent "devex".


Generally agreed.

About WASM, it is not the first sandboxed bytecode interpreter but the first that runs in a browser and that has usable toolchains to compile not “browsers first” languages into it. I’d argue that that’s where the novelty is.


Did Java applets arguably not do this 20+ years ago?


Maybe you know this better than me. Were non-JVM native languages available 20 years ago for Java applets?

My conception of it is that they were pretty much Java only (with Clojure and Scala also available in the later years before they got deprecated?). Is this conception wrong?


Yes, you could write applets in other languages. The choice was rather narrow, but you could use [Python], [Scala], or [JRuby].

[Python]: https://www.jython.org/jython-old-sites/archive/21/applets/i...

[Scala]: https://cs.trinity.edu/~mlewis/ScalaApplet/scalaWebApplet/We...

[JRuby]: https://www.jruby.org/getting-started — offers to run as an applet in the first few lines.



Some other >20-year-old JVM languages not yet mentioned in sibling comments: Kawa (a Scheme) and Groovy (a JVM-native dynamic language).


Java, Flash, Silverlight, ActiveX? There were loads of technologies to run different languages in a browser, but they were all proprietary to a point; none of them were a web standard, they all needed separate installation or a specific browser, and they were all basically black boxes in the browser. Whereas (from what I understand) wasm is a built-in browser standard.

There was (is?) also asm.js, which IIRC was a subset of JS that removed any dynamicness so it would be a lot faster than vanilla JS. But again, no broadly carried / w3c standard.


I thought those were interpreted by the JVM, which was subject to security issues. WASM faces no such security issues, no?


WASM also has potential for security exploits, but those selling it are quite silent on those.

Everything Old is New Again: Binary Security of WebAssembly

https://www.usenix.org/conference/usenixsecurity20/presentat...

Just one of the many articles that are slowly surfacing, now that WebAssembly is interesting enough as possible attack vector.

While there is a sandbox, you can attack WASM modules the same way as a traditional process via OS IPC: by misusing the public API in a way that corrupts internal memory state (accesses within linear memory aren't bounds-checked), thus fooling future calls into following execution paths that they shouldn't. With enough luck, one gets an execution path that e.g. validates an invalid credential as good.
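Here is a toy Python simulation of that linear-memory point (not real Wasm; the offsets and field names are invented for illustration): the module's memory is one flat byte array, so an exported function with a missing bounds check can clobber unrelated internal state that a later call trusts.

```python
# Toy model of a Wasm module's linear memory: one flat, unprotected
# byte array shared by everything inside the module.
memory = bytearray(16)
NAME_OFF, NAME_LEN = 0, 8   # 8-byte "name" buffer
FLAG_OFF = 8                # adjacent "credential validated" flag

def set_name(data: bytes) -> None:
    # "Exported" function with a missing bounds check: writes past the
    # 8-byte field spill into whatever lies next in linear memory.
    memory[NAME_OFF:NAME_OFF + len(data)] = data

def is_validated() -> bool:
    # A later call trusts the flag without re-checking anything.
    return memory[FLAG_OFF] == 1

set_name(b"A" * NAME_LEN + b"\x01")  # 9 bytes: overflows into the flag
```

Note the sandbox stays intact throughout — the module only corrupts its own state, which is exactly the class of attack described above.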


The JVM implemented properly should not have security issues. The class library however... (i.e. it's a lot easier to sandbox things if you start without any classes that interact outside the sandbox).


The JVM is fairly good at sandboxing, as these things go. Turns out sandboxing arbitrary software is an extremely hard problem (as the WASM folks are starting to encounter in the wild)


At least in so far as the higher level (DOM, browser runtime) and lower level (memory access, to the extent that it's mediated by the WASM VM) have no security issues...

The VM itself is pretty tight, but abstractions have a nasty habit of being leaky.


Oh, sweet summer child


Well yeah, but the problem was that you still needed that runtime; WASM should solve this.


Couple more:

(1)

Garbage collection in every high-level language: Java was the first mainstream language to do it -- people were seriously using C++ for high-level business logic at the time, and were suspicious of GC for its performance.

But Java itself got it from LISP, which had introduced GC decades prior without it ever going mainstream.

(2)

NoSQL had already been tried as hierarchical databases in the 70s or 80s, iirc. The relational model won because it was far more powerful. Then in the early 2010s, due to a sudden influx of fresh grads and boot campers etc., who often had a poor grasp on SQL, schemaless stuff became very popular... And thankfully the trend died back down as people rediscovered the same thing. Today's highly scalable databases like Spanner and Cassandra don't ostentatiously abandon relational calculus; they reimplement a similar model even if it isn't officially SQL.

(3)

And then there's the entire cycle that's gone back and forth several times of client based vs server based:

First there were early ENIAC-type computers that were big single units. I would consider that similar to a thick client.

Then as those developed we had a long era of something more similar to cloud, in that a single computer developed processes to support many partitioned users who submitted punch card batches.

That developed even further into the apex at the time of cloud-style computing: terminal systems like ITS, Multics, and finally in the 70s, UNIX.

Then the PC revolution of the 80s turned that totally on its head and we went back to very very thick client, in fact often no servers at all (having a modem was an optional accessory)

We stuck with that through the 90s, the golden age of desktop software.

A lot of attempts were made to go back to thinner clients but the tech wasn't there yet.

Then of course came the webapp revolution, started by Gmail's decision to exploit a weird, little-used API called XMLHttpRequest. The PC rapidly transformed over the next decade from a thick client to a thin vessel for a web browser, as best exemplified by the Chromebook, where everything happens in the cloud -- just like it did in the mainframe and terminal days 50 years ago...

The trend could stay that way or turn around -- it's always depended on shifts in the hardware performance balance.


To be honest, NoSQL makes sense where the stream of writes is very intense, so ACID guarantees are impossible to enforce along with relational guarantees, like referential integrity. See stuff like Cassandra.

Schemaless has its place for document storage and the like, but it requires a much more careful approach, else it can devolve into insanity.


I have to say that all your “old” ideas (they are all from the 90s AFAICT) seem new to me ;)

For example, for Haskell [1990] (ok, not so much the type system bits, but…), see FP [1977] (https://en.m.wikipedia.org/wiki/FP_(programming_language))


1990s happened three decades ago. 80486, Doom, Win 95, JavaScript did not exist half of that decade, Linux was a hot new thing, etc. Remember these?

It's like referring to ideas from 1960s as "old" in 1994. Not ancient, but already not really recent.


I think you missed my point that what you're describing as the "old" idea were actually the "new" idea with a corresponding "old" idea. For instance, you mention OCaml's typing, but OCaml is from 1996 and Milner's type-inference work (which was for an early version of ML) is from 1982. And ML itself is from 1973 according to https://en.wikipedia.org/wiki/ML_(programming_language) ...

[My personal experience from doing related-work searches for research papers is that there was often an at least somewhat relevant reference from the 60s...]


about 15 years ago the joke was, `cat /etc/services | mail apply@ycombinator` as at the time it seemed like startups were just doing file transfer, email, network file systems, etc. it wasn't far off, as unix is file based, and the internet is also file based.


And to a point they were correct; file transfer 15 years ago was closely linked to piracy and dodgy websites that scam you into clicking an ad instead of the download button. It's only thanks to e.g. Dropbox / cloud file storage suppliers, WeTransfer, etc. that that bit has been resolved.

Dunno about email though, the last real innovation in that space that I can remember with lasting impact was gmail. There were a few more tidbits like inbox (RIP), the inbox zero methodology, and Airmail (?) but none of them really took off.


There's always a push and pull between old and new tech and I agree some of the hot new tech is regurgitated old tech, but most of your examples aren't really comparable.


I would say that my examples are rhymes, different developments of the same theme. They are not literal repetitions, of course; comparable, not identical.


Cloud: IBM mainframes and Service Bureaus.


Basically all of Tailwind CSS. Inline styles are nothing new, neither are utility classes, or the scalability issues of inline styles that led to Tailwind reinventing classes with their `@apply` macro for creating component classes.

Edit for another: RPC calls are really old and went out of style maybe 15 or 20 years ago in most codebases. Most of the modern JavaScript metaframeworks are now using RPC calls obscured by the build/bundling process.
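RPC's age is easy to demonstrate: XML-RPC (1998) has shipped in Python's standard library for decades. A self-contained sketch — the `add` function and localhost setup are illustrative, and the OS picks the port:

```python
import threading
from xmlrpc.server import SimpleXMLRPCServer
from xmlrpc.client import ServerProxy

def add(a: int, b: int) -> int:
    return a + b

# Bind to port 0 so the OS picks a free port.
server = SimpleXMLRPCServer(("127.0.0.1", 0), logRequests=False)
server.register_function(add, "add")
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

# The proxy makes the remote call look like a local one -- the same
# trick modern metaframeworks hide behind their build step.
proxy = ServerProxy(f"http://127.0.0.1:{port}")
result = proxy.add(2, 3)
server.shutdown()
```

Marshal arguments, ship them over HTTP, pretend it was a function call: the shape is the same whether the wire format is XML, protobuf, or whatever a bundler generates.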


Thank you for mentioning Tailwind. Every time some young dev talks about how Tailwind is "forward thinking" I just want to scream into a pillow. This is also the case now that SSR is becoming popular again.


I can deal with the SSR becoming hip again, but can we please settle on either back or front-end rendering? Either was good, but trying to combine the two is evil.


SSR is the most mindblowing of the lot, it's gone full circle.

I mean granted, I've worked with e.g. Gatsby for a while which is SSR on the one side but a hydrated SPA with preloading etc on the other making for really fast and low bandwidth websites, but still.


RPC calls a la SOAP may have been obsoleted, but things like gRPC were and are the building blocks of many large companies.


Sure, I'm not saying RPC isn't used today or that it doesn't solve specific problems.

It is a reinvention of an old idea, though. There were around 15 years where RPC rotted on the vine until Google brought it back for (mostly) enterprise scale, and another 6 or 7 years before JavaScript frameworks rediscovered it again for fullstack web applications.


… Eh? The predecessor to gRPC seems to have started internally at Google in 2001, and Google open-sourced it in 2015. In 2001, CORBA was all the rage; by the mid-noughties this had been replaced with SOAP, and maybe Thrift rpc in trendier places. I gather there was a whole parallel Microsoft ecosystem with DCOM and things, though that wasn’t my world and I don’t know much about it. But the point is that there hasn’t been a time where some form of RPC wasn’t in fairly common use since at least the early 90s.

The details change, and each one tries to solve the problems of the past (typically by inventing exciting new problems), but conceptually none of these things are _that_ different.


I may have completely missed a generation of RPC tooling. I was thinking specifically about web development in this context, but in general I don't remember hearing anything about RPC use between the early 2000s and mid to late teens (other than legacy systems maybe).


Watching web tech evolve is a good example. So much churn rebuilding the same thing over and over.


And never once reaching parity with desktop UI frameworks. Not even close.


Web frameworks barely even abstract much. You still spend so much time marshaling things in and out of strings everywhere, and cramming information into URLs.

Mind-numbing makework, really.


Mono-repos are now coming back with a "hipster" shine to them, with fancy in-repo build systems and what not.

What's funny about this example is that it's arguably not even that much of a time-difference between the two epochs of forgetting and re-learning. It's just that everyone jumped on the microservices bandwagon so much that they couldn't deal with it in a mono-repo context, so they dumped it and convinced the world that many smaller repos was "better". Then they learnt the hard lessons of distributed and complicated version dependencies and coordinating that across many teams and deployments. Their answer to this? Not back to mono-repos, no no no, semantic versioning dude, it's the hip new thing! When that was a bust and no one could get around to being convinced of using it "the right way", they were forced to begrudgingly acknowledge the value of mono-repos. But not before they made a whole little mini-industry of new build or dependency systems to "support" mono-repos as if they're just lots of little repos all under a single version-controlled repo.

These days I get this kind of stuff: "Hey you guys wrote this neat module as part of your project, can you separate it out and we can both share it as a dependency? Because, you know, it's a separate little mini-something inside of their codebase." ...Only to then be told that separating it out would "ruin" their "developer experience" and people would have to, gasp, manage it as a dependency instead of having it in their repo.

/rant. It's really hard not to be shocked and disgusted at this level of industry-level brain rot. I never thought I'd be "that guy" complaining about my lawn, but seriously, our industry is messed up and driven by way too many JS hipsters and their github-resume-based-development.


This is kinda why I really, really dislike the "social coding" meme that went around in the 2010s.

I get it, it's a team sport. It's just that the more people you put on your "team" the less agency everyone feels because responsibility gets diffused and it becomes more about about the "team" and less about actually doing the thing.


It's not the cycling that's the problem but that one can do nothing to stop it. People (perhaps including me) are dumb and insist on learning by making the mistake.


> Only to then be told that separating it out would "ruin" their "developer experience" and people would have to, gasp, manage it as a dependency instead of having it in their repo.

I hate having to do this, because then I have to get Nexus working with whatever the package manager in question is (Maven, npm, pip, NuGet all have different ways of publishing packages), setup CI for the publishing and god forbid I also need to manage the Nexus credentials for local installs and possibly even might have a Git submodule somewhere in specific cases, which also confuses some tooling like GitKraken sometimes.

It does prove your point, but honestly dependency management is a pain and I wish it wasn’t so; separating a module from your main codebase and publishing it as a package should be no harder than renaming a class file.


AI. A lot of the things that are "new" were just waiting on hardware advances and cost reductions.


Write-ahead logs seem to be rediscovered every 3-5 years.
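For anyone who hasn't met one, the core of a write-ahead log fits in a few lines: append and fsync the intended change before applying it, then replay the log on startup. A minimal Python sketch — the class name and one-JSON-record-per-line file format are invented for illustration:

```python
import json
import os

class TinyWAL:
    """Minimal write-ahead log: the intent record hits stable storage
    before the in-memory state changes, so a crash in between can be
    recovered by replaying the log."""

    def __init__(self, path: str):
        self.path = path
        self.state: dict = {}
        self._replay()

    def _replay(self) -> None:
        # Recovery: re-apply every logged mutation in order.
        if not os.path.exists(self.path):
            return
        with open(self.path) as f:
            for line in f:
                rec = json.loads(line)
                self.state[rec["k"]] = rec["v"]

    def put(self, key, value) -> None:
        with open(self.path, "a") as f:
            f.write(json.dumps({"k": key, "v": value}) + "\n")
            f.flush()
            os.fsync(f.fileno())  # durable *before* the mutation
        self.state[key] = value
```

Databases, journaling filesystems, and message brokers all keep rediscovering this same append-then-apply discipline; only the record format changes.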


The thing is ... The industry needs us. It's making a mess all over and valorizing complexity and novelty. Constantly. Programmers with experience in our age range have, I think, a better sense of how to manage this and encourage simplicity (partially out of necessity). But age and novelty bias in our industry means this knowledge doesn't pass on.

It's tough to tell younger engineers that have cut their teeth swimming in intricacies and edge cases and integration nightmares and constantly surfing on the edge of chaos, and managing it, that they're likely contributing to the problem, not fixing it. But someone needs to.

I can't remember details like I used to, things mark&sweep out of my brain much faster than they used to. (Probably not just because I'm older but because as a parent, home owner, and spouse... I just have a lot to manage on top of it.) But.. really... a good system, a well-built system ... should be resilient to that, and people with experience.. that's hopefully what we build.


I have been lucky enough to have been the youngest person on every team until my mid 30s. Over the course of my career I worked with some truly gifted engineers who had almost no ego; they just were much older than me.

When I reflect I do cringe a bit at what I was zealous about and things I took way too far. But, I do think the discussion, sometimes debate, around the fancy/new vs tried/true resulted in much better results.

Now that I am old, but not that old, the younger engineers who are passionately discovering new tools and “new” design patterns keep me interested in software development. Being able to share where things come from means we can compare/contrast together. It is rarely a straight copy and it’s fun to see how things get better/worse with reinvention.

So, I think trying to get a mix of ages on a team is really beneficial. Passionate young engineers help prevent the old engineers from getting too jaded.


I too was always the youngest person on every team... until I wasn't, and it seemed like I went from youngest to oldest in a blink and I still can't figure out how that can happen.

I got into the industry during the .com boom with no degree, without finishing university, so kind of jumped the queue, age-wise, I guess.

And yes, I often cringe in remembrance of past-self. I cringe at present self, too, though :-)


On the other hand, you have to guard against being that person who is in a perpetual state of "Been there, done that. Didn't work the last 5 times we tried it." Because sometimes the circumstances/market/tech ecosystem genuinely are different.


The key is in understanding why the previous times failed. What constraints existed then, which possibly no longer exist now.

Projects fail for many reasons. Technical, market, capital, time and so on. But things change. Building an add-on for electric cars would likely fail 20 years ago, again 10 years ago. But now? Or 10 years from now?

Only by -really- understanding what caused a project to fail can you determine if that barrier is no longer in place. Which means you can try again, and potentially find the next barrier or success.


I'm in the retirement age bracket. My last experience as a consultant led to disgust with valorizing quantity of work as measured by an arbitrary metric susceptible to simple gaming.

That, and prioritizing "new" features over maintenance because the former were booked as CapEx work, thus amortizable, while maintenance was booked as OpEx, combined with companies wanting to minimize CapEx ratio for accounting purposes.

Notice how little of what is measured and managed has anything to do with building working software to satisfy user needs?


I'm at a similar age point; the problem is not keeping up, it's fighting the continuous push into management, which I don't plan to ever do unless forced by life circumstances.

It appears that the only path left for us in many European countries is to go freelancer, and I can vouch for the same problem regarding skills: forget about having GitHub repos or open source contributions if the technology company X is looking for isn't the one we have used in the last 5 years or so on the day job.


I'm a European in my late forties and have been a freelancer for the last five years but I find it harder and harder to motivate myself to continue working. What really takes all joy out of working as a software engineer these days for me are the endless Scrum ceremonies almost all companies in my area have embraced.

In the old days (say until ~7-8 yrs ago) I didn't have to attend very many meetings, and most of those I did attend were useful/necessary. These days I could probably count the useful meetings I attend in a year on one hand but the amount of Scrum-worship-meetings per week requires two hands.

The same amount of actual work I could do in a week in the old days would now take several months because it needs to be planned in detail. And no, not any technical detail, but rather discussions on how to divide it into stories but without doing any proper technical analysis and then straight ahead to story point guesstimates, yay! Then after a brief period of actual coding it's stuck in code review for weeks because no one will look at a PR unless prodded with a stick.

While I do think that code reviews can sometimes be beneficial, most of the time they are (in my experience, unfortunately) pretty useless. Most comments (and I have to admit I'm guilty of this as well) are more bike-shedding than bug-preventing. Complex bugs are rarely found in code reviews.

While these are my experiences during the last 7-8 years or so, it's been more or less the same at all of the half dozen or so companies I've worked for during that period (which is also a very big reason why I've worked at half a dozen companies in that period).


Doing software right will require a lot of planning, irrespective of whether that planning occurs up front or as you go. If you plan more up front, that will eliminate a whole lot of guesswork when the time to do the programming comes. You need systems analysts -- generalists who understand the business and work well with people -- to come in and characterize, in detail, how the business currently works in terms of systems and subsystems, and then propose and design new systems, again to a high level of detail. Once that's done, inasmuch as you need software, producing the software is a simple matter of translating the detailed requirements into language for the machine.

Unfortunately, modern methods are basically just institutionalized guesswork: this is what Agile is all about. It's a methodology designed by programmers for programmers, in order to bamboozle management and inflate the programmers' own sense of self-importance. The correct way to design a business's internal systems, including but not limited to its software, appears to have been forgotten, except a pastiche of it lives on as a strawman called "Waterfall" for Agilistas to take down.


> is a simple matter of translating the detailed requirements into language for the machine.

This is actually the hardest part. I can write detailed requirements about the car I need. Create a PowerPoint presentation that shows a schema of the system and subsystems; the engine block, transmission and steering wheel etc. with lines how they are connected.

That's the easy part. Now you need the team of skilled engineers developing the actual car. And you need them to be experienced and good at it.

You need at least one guy who is able to load a complete mental map of everything that needs to be engineered. Who understands the business requirements and is able to create a vision for the product and technical solution. He needs to understand databases, web services, authentication, authorization, security, performance, web standards, back- and front-end solutions. Be smart about what logical components are needed and have a high-level idea of how they could be implemented technically. Ideally that guy can also open a repository and read what's going on.

Especially with larger corporations there's still so much potential for automation. Yet what we see is a big fragmented mess. Systems and subsystems that are poorly integrated. Exactly the car you'd expect that was designed in PowerPoint by non-engineers.


> This is actually the hardest part. I can write detailed requirements about the car I need. Create a PowerPoint presentation that shows a schema of the system and subsystems; the engine block, transmission and steering wheel etc. with lines how they are connected.

> That's the easy part. Now you need the team of skilled engineers developing the actual car. And you need them to be experienced and good at it.

In this analogy, the engineers who design the car are the equivalent of the systems analysts. The programmers are the machinists on the shop floor actually building the car.

> You need at least one guy who is able to load a complete mental map of everything that's needed to be engineered. Who understands the business requirements and is able to create a vision for the product and technical solution. He needs to understand databases, web services, authentication, authorization, security, performance, web standards back- and front-end solutions. Be smart about what logical components are needed and have an high level idea how they could be implemented technically. Ideally that guy can also open a repository and read what's going on.

Yes -- that's your systems analyst! More importantly, they need to understand the business and the information needs of the people involved. A high-level, 10,000-foot understanding of technical requirements is important, but the details should be left to the programmers. That's what programmers are good at. It's the big-picture, business-centric, people-oriented view that's missing in today's culture, and prevents us from "building the right thing right".


> The programmers are the machinists on the shop floor actually building the car.

No, because with software there's no human execution. It's the computers that execute the design. The developers design the blueprints of what the computers need to execute. They are the architects.

For an analogy you can probably best compare this with 3D printed houses.

> A high-level, 10,000-foot understanding of technical requirements is important, but the details should be left to the programmers.

But why leave the details to the programmers? Why doesn't the systems analyst produce a proper CAD-like blueprint that leaves no room for interpretation? His system design should produce the exact same result regardless which contractor implements it. Yet that's never the case.

The reason is because he can't. The systems analyst doesn't have a clue what he's designing. If he would be able to write a proper blueprint we could just hand it off to the computer and have it executed. No need for programmers. But now the systems analyst has become a developer.


> Doing software right will require a lot of planning, irrespective of whether that planning occurs up front or as you go.

I'm not opposed to planning but I'm opposed to the kind of meta-planning game that is typical when Scrum is involved. I've been in meetings where the thing we're planning is literally a one-line code change and we say as much, but the PO still insistently asks if it shouldn't be multiple stories. The whole thing eventually took man-days in meetings even though we insisted it was extremely quick. Turns out the whole thing was sold upstream to management as a big feature, so a single 1-point story wouldn't cut it.

As a contractor I can at least remind myself that I'm getting paid for sitting through all those meetings but as someone who likes to actually do things I feel like I slowly die inside.


I've had similar experiences with Scrum. In the worst case there are one or more developers, usually junior, in the team who are very eager to improve processes. Eventually it's the tenth time you are forced to discuss the optimal way to define story points.


Tech became too profitable to be left to "those nerds" so now you have very bloated orgs. Though a freelancer should be able to sidestep the grifters unless you're selling yourself as an employee for some reason.


Yeah, my first year as a freelancer was quite sweet actually. Then came the pandemic and me and my spouse got ourselves a vacation house as we couldn't travel any more. While this was a great relief for our mental health during the pandemic, it meant a much higher mortgage so I needed the more stable income.


> freelancer

I get that they're depressing if you let them get to you, but I've always looked at it as I bill the same whether the client wants me in a useless meeting or a useful one. Helps me through.


I am 63. Canada based. Have no problems keeping up with tech. I have been on my own since 2000 and mostly develop new products for various clients. Have a couple of products of my own that bring in some dosh. The range is very wide. Microcontrollers, Enterprise Backends, Desktop, Multimedia, Browser based, etc. etc. It is not programming per se that keeps me going (I find it boring enough) but designing systems and interactions from scratch and then watching them work.


I’m in a similar position (in my 50s with a family to support). For the most part I can get my boring corporate work done fairly quickly. Then I spend some time each day working on personal programming projects where I get my true satisfaction.


I’m 55. Started as a 6502 cracker on the C64.

I still get enjoyment out of some coding - C++ on Linux for enterprise applications - but I do miss the “magic”.


These two perspectives are not incompatible. 49 here and still love programming. But only discovered that after quitting my Google job and spending a year working on my own things. Then housework wasn't getting done because I was writing code instead, and I realized I just love doing it, still, and I'm a far far better programmer than I ever was 20 years ago. I can do things I only dreamt of back then. And faster!

but that's not the same thing as enjoying writing the dreck that many employers want, and keeping up with their endless stack of messy JIRA issues, planning meetings, poor design docs, and management shenanigans....


I sometimes think that big corporations pay more because the actual work there sucks more for an engineer (and likely for managers and salespeople, too).


Are you involved in the still-thriving C64 demo scene at all? Possible way to reconnect with the magic if not. Especially by attending (in-person, ideally) one of the many demo parties around the world.

There are also parallels with embedded device and FPGA work that I personally find thrilling.

Plus we on the VICE (open source Commodore emulator) team are always looking for devs.


I bought a C64 and SD card a few years ago. I enjoyed running up a few technical masterpieces - like DropZone - but the gaming interest has waned.

I don’t code 6502 nowadays but I’m active on r/c64.


FWIW I find the vscode + kickassembler + VICE toolchain a pretty fun way to iterate on C64 code.


Same age. Got started on a ZX-81 and a university mainframe.

I still enjoy writing code or shall we say solving problems via code. I still get excited about new things. I'm also a manager and I enjoy helping others. What I enjoy less is the politics.

Building things is fun, I don't think this goes away, it was always fun and is still fun.


Good point. It’s the problem solving.

Thing is I’ve solved so many problems, over the years at different companies, that there aren’t many new ones. Obviously, I can knock out the code quickly to the surprise of many. It’s just experience and I’m not a magician.

Like yourself, I enjoy helping others, younger coders in my case, work through their problems.

I guess that’s why I keep having to switch teams to pull them out of the quagmire they’ve gotten themselves into.


For personal projects I make heavy use of LLMs now and coding is still fun when I do it with the latest and greatest tools. I'm about 5X as productive as I would be if I had to crank out code myself.

I'm used to verifying code with a compiler/interpreter and a unit test -- not by going through my code line by line and declaring to an interviewer "yes I think it's correct". My way of doing things is to just run the damn thing with the right tests and it will tell me if something is wrong.
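That "run it and let the tests tell you" workflow, as opposed to eyeballing code on a whiteboard, can be as small as this (a made-up function and test of my own, just to illustrate the habit):

```python
# Verify by running, not by staring: write the function, write the
# test, execute it, and let the assertion tell you if you're wrong.

def dedupe_keep_order(items):
    """Remove duplicates while preserving first-seen order."""
    seen = set()
    out = []
    for item in items:
        if item not in seen:
            seen.add(item)
            out.append(item)
    return out

def test_dedupe_keep_order():
    assert dedupe_keep_order([3, 1, 3, 2, 1]) == [3, 1, 2]
    assert dedupe_keep_order([]) == []

test_dedupe_keep_order()  # run the check instead of eyeballing the code
```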

Unfortunately job interviews these days are still hellbent on whiteboarding Leetcode problems. I'm past that. Unfortunately they aren't. It's this kind of BS -- not being allowed to use the best tools that exist -- that makes me not want to code for work anymore.


I respect the OP's vulnerability and the advice. I've felt like it was "time to go" before, but as a young man I just assumed it was burnout, treated it that way, and got back in the game once I had renewed desire.

Right now I feel like I'll never want to stop making things, but that if I were rich enough and good enough at creating in a different medium other than code, I completely understand the desire to walk away from the terminal and never look back. Few things have been as frustrating to me as programming. Yet since few things have been so rewarding, I persist.

It's a great article because it's making me think about my own life. I'll keep pondering. Thanks for posting it.


> walk away from the terminal

What do you mean, this is the best part of the job, the part I look forward to most each day.


I mean that sometimes I get so sick of the constant problems and churn and breakages and errors and mistakes and bugs and weird behaviors that I feel like doing something completely different.

If you never feel that way, I'm very happy for you!


It does happen, but I find it overall more preferable than documents, tickets, meetings, meetings and meetings.

Spending time in the terminal with vim and shell is healing.


Kinda similar to how Kobe Bryant knew it was time to retire from the game of basketball. He said in an interview (https://youtu.be/Ya8hY0S-8t0?t=54) that he knew it was time when, during his morning meditations, his mind would no longer drift to basketball.


What I wonder is how did Kobe know this was a sign that he was done. Why didn't he think he was just burned out and maybe just needed a break before returning to the game? How can you tell when you're burned out vs. just being completely done with it all?


I think because he was already 37 or 38 at that time, and was suffering from injuries, too. There was no point continuing if both his body and mind were no longer the way they used to be.


This sounds about right for most pro sports; few continue as players in their 40's, especially in the higher impact / energy sports like basketball.


I meant applying this mentality to programming - how can one differentiate between burnout and being done with the field?


> how can one differentiate between burnout and being done with the field?

Money. If you are financially independent, you are done with the field. If you are financially dependent, you are burnt out.


Time and wisdom.

I think most devs can delay burnout/leaving the field if they begin viewing programming as a means to an end, e.g. "programming as a way to build their own business," or "using programming knowledge to mentor others," or "using programming knowledge in another domain they're interested in to great effect."


Thank you, this was a very interesting interview.


I retired in 2017 and sometimes I think I got out at just the right time, or close to it. In the past 7 years we’ve had the pandemic, remote work then the clawback to the office, and so. many. JavaScript frameworks and changes. And now layoffs all over the place and having to keep up with AI to stay relevant, AI and LLMs are changing faster than React and its ilk.

Instead I now program in a great language, Elixir, working on projects that I want, and reading books that I’ve been putting off for decades.


Huh, that's about the same time I became a full-time freelance frontend dev. I've primarily used Vue that whole time and it's been pretty good.

AI has mostly been a nice benefit too - Copilot really makes writing code more pleasant. I haven't really seen any downsides to AI in my work.

I'm almost always remote, and mostly like that, too.


> having to keep up with AI to stay relevant

This, at least for the time being, seems more a thing that people worry about than a real phenomenon.


> I still write code every day in support of my generative art practice. The code is much more complex than anything I did previously, and much of it does not have anyone else doing it, so it's a lot of invention, which is fun.

Can relate. I've been "retired" since I was 55, and SV was nice enough to let me know that I was too old to play in their pool.

Pissed me off, something fierce, but, in the long run, it's the best thing that ever happened to me.

I could have made millions -for other people- maybe for me, as well, but I have never really been interested in that kind of thing. The work and the technology has always fascinated me.

I've found that what I really enjoy, is making UI tools for nontechnical folks. That's what I do, these days. I make free software for folks that can't afford the kind of stuff I do.


I’m in my late 50’s, and I still love making software, maybe even more now than when I was younger. What’s happened to me over decades as a professional is that I’ve totally lost any interest in “career” or the large corporate entity that employs me. Once any organization grows beyond about 20 people, it starts to become dysfunctional, so I’ll be retiring the day I can convince my spouse we have enough money. That will give me more time to work on things I care about, including software.


Call it the blackjack rule ... once you cross 21 it's a bust.


>Once any organization grows beyond about 20 people, it starts to become dysfunctional

I guess if that's the way absolutely everything functions then perhaps dysfunction is actually just function.


A large org is a totally different beast from a small org, and yet they seamlessly transition into each other. So, a large org is dysfunctional when viewed through the lens of a small org. A large org is also dysfunctional when it is run like a small org, as is a small org when it's run like a large org.


I think people just have unrealistic expectations about humans ability to manage anything. We label things as dysfunctional, yet dysfunction is the norm. I think it gives a slightly warped sense of what the norm is since it implies that things ought to be functioning properly but they aren't. When really it's usually just par for the course.


Hopefully people will tell me why I'm wrong, but right now programming is just feeling like a bit of a dead end in general? The demand seems to be for AWS gurus, data analysts, low-code, prompt engineering etc. I'm not against learning new things to stay employable, but the new things that are in demand don't really seem to be programming. I learned a bit of Rust because it's kind of new(ish) and exciting, but apparently there's a massive glut of Rust devs. Whereas 15 years ago I learned Python and my employment prospects rocketed.


The nature of the majority of programming work now is not enjoyable IME.

Seems the majority of work is overly complex (in terms of "system design") CRUD stuff that uses whatever constellation of "services" is cool this month, "solving problems" that Ruby on Rails or ExpressJS solved like ten years ago, but now with way more YAML configuration and other imposed complexity for dubious gain and benefit.

The new hyper focus of LLM chatbot hype isn't helping either.


There's two things going on. One is, yes, I think the quality of work mostly sucks all over.

But the other is it's a down part of the cycle and there's just a glut of us all, and a bit of disrespect from employers as well.

It's been a long time since we had one, and many people either didn't work through one before, or have forgotten.

That part will bounce back. In 5 years it'll be a crazy job market again, and having Rust on your resume will be valuable.

(To put it in perspective, I learned and wrote Python in 1996-1997. And I really liked it. But nobody even knew what it was, and nobody would hire for it. I moved on, and lost my taste for dynamically typed languages, and then all of a sudden Python was huge; if I'd stuck with it, that would have been a big thing for me, I guess. I suspect a similar thing will happen with Rust, etc. At least I hope so, since Rust is my day job :-) )


Work is work and always has its plusses and minuses.

But, yes, even if the tech cycle isn't terrible at the moment (e.g. dot-bomb nuclear winter) it's definitely down. I somewhat regret effort and money I put in a couple of years ago to get myself setup to do various stuff post "retirement" because, while I haven't exactly been beating the bushes, opportunities haven't been falling off trees either.


What is it that you want to build? I mean if you frown upon AWS gurus, analysts and low code?

Programming for the sake of "writing code" is probably going to miss the target.

For example, "analyst". My take is that that's where it all started: someone looking at numbers and needing computers to help make sense of them.


> Programming for the sake of "writing code" is probably going to miss the target.

Why do you have to be so demeaning?

I'd argue almost nobody is "writing code for the sake of writing code". In my case I love solving problems with code. Not by clicking through AWS' terrible website. Not through taking a deep breath and trying to reformulate a ChatGPT prompt for the 17th time.


> Not by clicking through AWS' terrible website.

Most places with a decent level of engineering maturity are using some form of infrastructure-as-code (Terraform/OpenTofu, Cloudformation, etc). Though more broadly speaking, it's true that software developers are now frequently expected to move beyond just compiling a JAR file and calling it a day. Expectations of knowledge of the underlying infrastructure that's running your code and how to operate it is more common than it was 15 years ago. I consider this a good thing overall though.

> Not through taking a deep breath and trying to reformulate a ChatGPT prompt for the 17th time.

I don't know anyone who's doing this at their programming job. GenAI is really good at 1) acting as an enhanced, customizable StackOverflow replacement for specific one-shot algorithms ("given a pandas dataframe with these columns, write code that groups by X and gets the median of the top 3 values"), and 2) pumping out boilerplate code that wasn't interesting to write anyway, like object mappers and certain unit tests. The tougher problems around software architecture, class design, and the trade-offs are still fully in the realm of humans, for now.
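The comment doesn't include the actual code, but the pandas prompt it quotes would come back as something like this (column names and data are mine, purely illustrative):

```python
import pandas as pd

# The kind of one-shot snippet the parent comment describes:
# group by a column, then take the median of each group's top 3 values.
df = pd.DataFrame({
    "group": ["a", "a", "a", "a", "b", "b", "b"],
    "value": [10, 7, 3, 1, 5, 9, 2],
})

top3_median = (
    df.groupby("group")["value"]
      .apply(lambda s: s.nlargest(3).median())
)
# group "a": top 3 are 10, 7, 3 -> median 7.0
# group "b": top 3 are 9, 5, 2  -> median 5.0
```

Which is exactly the sort of specific, self-contained request where a model saves a StackOverflow round-trip.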


> Though more broadly speaking, it's true that software developers are now frequently expected to move beyond just compiling a JAR file and calling it a day.

And you are demeaning as well for no reason. I even went out of my way to clarify I like SOLVING PROBLEMS WITH CODE, not "just compile a JAR file" which you conveniently ignored and pushed your narrative. Not cool, dude.

> Expectations of knowledge of the underlying infrastructure that's running your code and how to operate it is more common than it was 15 years ago. I consider this a good thing overall though.

I don't deny it on the premise but again, most vendors want to lock you in so their UX is terrible and specific. I had much more fun making scripts and cookbooks that setup a VPS for my customer's app. Nowadays this has been mostly remedied by Dockerfiles though integrating with k8s and its 5000+ friends is making me want to retire for the next 3 lives.

> I don't know anyone who's doing this at their programming job.

I don't do it either but I've met plenty of "programmers" who do, and swear by it, even though they had to chase 2-3 subtle bugs that took them 12+ hours to find and correct... whereas just writing those 150-200 coding lines would have taken them 4 hours tops, tests included. It's quite funny.

---

My bigger comment here was to criticize the very weird direction the area is trying to go to. It will fail btw. Marketing people are pushy and get their way... INITIALLY. Sooner or later reason prevails.


> Not by clicking through AWS' terrible website

Look into IaC (infrastructure as code) which all major clouds, and even smaller clouds support. Much more sane way of managing resources.

> trying to reformulate a ChatGPT prompt for the 17th time

Is the company mandating you use AI to solve problems... or? Anecdotally I don't use AI very much at $DAYJOB, nor do any of my co-workers.


> Look into IaC (infrastructure as code) which all major clouds, and even smaller clouds support. Much more sane way of managing resources.

No, sane way is automating it 100% with zero UI required. But you do you.

> Is the company mandating you use AI to solve problems... or? Anecdotally I don't use AI very much at $DAYJOB, nor do any of my co-workers.

As mentioned in a reply to your sibling comment, I don't do it because I was sure from the get go that it will only get some algorithms right, and only for the most popular languages, and I was on point. But I had fun watching colleagues banging their heads against the wall many times.

And again as per the reply to the sibling comment, I was commenting on the general "future" state of the area.


> sane way is automating it 100% with zero UI required

Umm... that's what IaC is for, you write resource blocks in a file then use a command to deploy said resources.
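Since the allowed tooling varies by cloud, here's a vendor-neutral sketch of that "resources in a file, deployed by a command" idea — a small Python program generating a CloudFormation-style JSON template (the resource names and the deploy command in the comment are illustrative, not a specific recommendation):

```python
import json

# Infrastructure as code, minimally: describe resources as plain data,
# render a CloudFormation-style template, keep it in version control,
# and deploy with a CLI command instead of clicking through a console.
def make_template(bucket_name: str) -> dict:
    return {
        "AWSTemplateFormatVersion": "2010-09-09",
        "Resources": {
            "AppBucket": {
                "Type": "AWS::S3::Bucket",
                "Properties": {"BucketName": bucket_name},
            }
        },
    }

template = json.dumps(make_template("example-app-assets"), indent=2)
print(template)
# Deploy step (not run here) would be something like:
#   aws cloudformation deploy --template-file template.json --stack-name app
```

Terraform/OpenTofu express the same idea in HCL rather than generated JSON, but the workflow — text in, `plan`/`deploy` command out — is the same.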


>No, sane way is automating it 100% with zero UI required. But you do you.

Infrastructure as code means text, not a UI. Please Google the things people are suggesting before getting confrontational


Don't you still have to prepare stuff in the vendor-locked UI beforehand? Or is it much better these days?


I'm turning 50 in a few months. I still enjoy coding; and I expect to be coding for another few decades.

One thing I figured out early on is that your choices in languages and tech really matter. There are only so many things you can learn and you have to make some educated bets on things getting traction or not. And if you make the right bets, it's easier to keep your skills fresh and relevant.

Some things look fancy and nice and then five years later it's all outdated and obsolete. And some other things go big. Java was one of those things and in 1995, when I was in university, they decided to use it for teaching programming to first year students. So I ended up being a teaching assistant and now have nearly thirty years of experience with the JVM ecosystem.

I recognized the signs of the platform and language (especially) going a bit stale about fifteen years ago. It's becoming the Cobol of my generation (plenty of work but not the kind that gets me excited). I realized I needed to move on if I wanted to stay relevant. Since then I've touched a lot of languages. Right now, I do a lot of Kotlin but I keep an eye out for new things. Kotlin was a bit of a bet ten years ago. I fully committed to mastering it six years ago. And at this point it's starting to feel like a good bet. The language is modern, has a lot of momentum and there's lots of interesting stuff happening with the language, compiler, tools, etc. Particularly multi-platform is opening up a lot of possibilities.

I've dabbled with other things along the way but never really got the feeling that mastering that stuff was worth my time. E.g. Ruby was interesting but it's now mainly used by people in their forties (i.e. my age). Younger generations seem to not be interested in it. Same with things like Scala. Lots of stuff still happening with both of course but it seems that they are both a bit past their peak.

Python on the other hand keeps surprising me by not getting replaced with something else. I kind of like the language and have done some things with it over the years. And I like that they are clearing out technical debt (like the GIL) and keeping the language fresh. I work with some twenty year old interns that know and love it. People will be doing Python long after I die. That's a bet I didn't make but it would have been a good one. And not too late obviously. I know enough python to be able to jump in a project and use it. I've done so on a few projects in recent years. It's a very approachable language; kind of by design.


> [Java] I recognized the signs of the platform and language (especially) going a bit stale about fifteen years ago.

Java has become a lot better. It was always a boring but easy-to-write, easy-to-debug, easy-to-build language, on purpose. That was the selling feature, so that finding developers was easy. It purposely didn't move quickly or adopt the latest fashions, waiting them out to see what stuck and adopting later.

OK, so what? Well, Java has started to evolve more quickly with the new release cycles. I still write Java. We deal easily with null safety, we have a good build system, we have a monorepo, we don't (often) have dependency hell. Most of our code runs on ZGC and we don't think too hard about garbage, because ZGC is so damn fast. Our hot path is different: there we think very carefully about GC, memory access, etc. And we have one language covering both types of coding, in the same repo, the same build, the same tooling, the same monitoring, the same deployments, etc.

Java is a real workhorse and is becoming nice to work with too.


Have you played with Go?


Yes, IMHO a bit of a downgrade relative to Kotlin.


I always refer people to the Doris Day performance of Carl Sigman and Herb Magidson's hit song "Enjoy Yourself (It's Later Than You Think)".

https://www.youtube.com/watch?v=nQxsG9Vcndw

There's also a Guy Lombardo and a Louis Prima version but I like this one.

I've been singing this at work for a year or so, trying to give people a gentle hint about my future.


This is a great retrospective. Thanks for sharing.

> It's not worth working and being miserable.

Agree 100%. I've quit several jobs after the environment becomes more stressful than fun. Over the years my tolerance for BS has lowered, possibly to the detriment of my bank account. But I've never regretted my decision to leave. The weight off my shoulders is priceless.

> Age and ability are not correlated.

I wonder how subjective this is. Cognitive decline with age is real, but maybe keeping the brain active with programming can help keep it at bay. A study about this would be interesting.


>> Age and ability are not correlated

> ... cognitive decline..

I know these are not the age groups you were thinking of, but on the topic: I think they ARE correlated, but the other way. At 40 I have had time to get to know so ridiculously much more than someone starting out in their early 20s. And I see its effect very concretely: people in their early 20s (generalizing, of course) can spend so long on things I have seen so many times, or spend more time making lots of bugs and finding them than just writing the code with fewer bugs.

Or spend their brain cycles on the "how to code" part of the job, instead of that just being second nature and focusing on the underlying ideas.

Or young people may be competent coders, but completely baffled when it comes to reading and really grasping the underlying ideas in existing codebases (this especially is something I know I have progressed at with training over the years).

I feel experience can be undervalued in our industry in a way it is not in others. It is valued... but not as much as I feel it should be..

Of course this effect drowns a bit in the noise of all the programmers the OP talks about that barely get by, in all age groups. But within the set of skilled coders... from what I have seen, I would always prefer working with the older to the younger to get a project done.

(Ofc there may be a point where this turns. I lack personal experience with coders 20 years older than myself.)


> But within the set of skilled coders...

If you ever manage to come up with a reliable way of identifying skilled coders, you'll be very, very rich.

At my previous job we had 20 interns and one senior who was like 50 and his attitude boiled down to "why can't we just keep doing things the way I was taught when I was a student".

At my current job there's me, another Junior, and a Senior. The other Junior works very fast, very well, always has valuable input in discussions. With the senior I need to work carefully, because while the guy has knowledge in certain areas, he misidentifies priorities, makes mistakes, doesn't communicate shit, while at the same time demands things to be his way because he is the senior so he has authority. On top of that his English sucks so every meeting in which he's involved takes three times as much time as it could.


> Cognitive decline with age is real

My father worked as a consultant designing analog-style ICs until his mid 70s - his customers were therefore presumably happy to pay his consulting rate. I’m going to vote for “early” cognitive decline being overrated…


To prioritize well-being over enduring a toxic or stressful work environment


I consider myself retired, but I have a different take on the concept.

I like programming computers, just not 2,000 hours a year. I can afford not to do that, so I don't.

I hit a point in my 30s where I could sock away a year's worth of savings in 3-6 months of contracting, so that's pretty much where my full-time phase ended. I came back "out of retirement" when the first kid was born and worked 5 years semi-fulltime to save up enough for houses, college, etc., ramping down to 4 day weeks for the last few years because I really value my free time.

Since then, I've done the odd 3-6 month/year stint (since programming and working on a good team that can ship is still pretty fun.) Recently I've been doing that part time, 2-3 days a week, a few months a year.

I don't know what most people would call my situation. I call it Retired as I want to be at any given moment. I expect I'll keep doing it for the dozen-odd years between now and when I hit "Retirement Age". But maybe not. It's almost more of a hobby at this point.

I guess the point is that it seems like a silly idea to do something all day every day for most of your life, then suddenly drop it completely. If it was fun, do more of it. But on your own terms, and only enough that it's still fun.


Your story resonates with me. How did you get into contracting?


I've been happily programming since 1965, and I've been doing C++ since 1995. There is still so much to learn, it's never stopped being fun. However, Apple Notarization may just be the straw that breaks the camel's back.


I really thought this was going to be a post on The Go language when I clicked the link.


Me too!

The problem with Golang is that it has the same name as a common verb instead of a noun ("go" is used as a verb like 99.99% of the time).

I remember coming across a thread maybe 10-14 years ago where the Golang creators were asked to change the name of the language. They declined. If I recall, one of the arguments was that the name would naturally become associated with Golang. Here we are in 2024, and the confusion still happens.

TFA was great, by the way, even though it was not about Golang as I had expected.


Some comments here: "I'm at age XX, thought about quitting, but now am rediscovering fun in programming". Go is a good fit here.

I discovered Go around 10 years ago, and it was a point in my career where I was fed up with the overgrowing complexity of the mainstream languages and cultures around them. I was seriously considering switching to other fields. Go has changed that direction 180 degrees.


I was with you on this


> You probably don't know any retired programmers

Ha I know lots of retired programmers. I was one for a while, but like most I really wanted to get back to work


How long were you retired for, and what brought you back?


5-6 years, then I started another company


I'm nostalgic about the good old days prior to 2016 when every technology change meant improvement. Nowadays, tech doesn't change as much, but when it does, it's for the worse.

It's been a challenge for me to adapt to the new reality of coding as a game of busy-work and lock-in through complexity.

It has become a bit of a theatre for me, unfortunately. I know I could do something in a way that's 100x more efficient, but it would negatively impact my job security, so no thanks. Also, if I do the right thing, taking all the risk upon myself, nobody will appreciate it. I'll stick to inefficient popular tools and methodologies. I'll play the game of Whac-A-Mole... like a bad gardener who pulls the weeds out by the leaves and leaves the roots behind. That's the smart move.

I tried the other approach, doing my very best and outperforming, and it couldn't have worked out worse. The manager class feels nothing but contempt for people who outperform. "Good boy! Here, have a pat on the back... Sucker."


I've only been programming for 6 years, and I don't feel the same burning passion as I did when I first started coding. I'm a frontend developer, and I've made a lot of lateral switches into DevOps, backend, leadership, etc., but I prefer just building what I'm good at.

But I'm basically semi-retired to a degree in my field. I'm doing the bare minimal to get by at this point. I ultimately would love to quit some day, and pivot into a different career, not entirely related to coding. I'm not at that point yet financially though, and am spending energy elsewhere

I would love to start a non-coding related business one day though.


I turn 70 tomorrow, and have been programming for about 53 years. Started a new job about six months ago. I immensely enjoy banging out the code every day. It still feels like a guilty pleasure! "Shouldn't I be doing homework right now rather than creating 3-D worlds algorithmically?"

My first language was APL, which I learned in a college computer science course. This hard-wired me to think in functional terms; I personally think Iverson and Dijkstra were saying the same thing, but Iverson said it better: reason about your code from an "outside of time" perspective rather than mentally imitating the fetch-execute cycle of the machine. I view software development as a form of discrete mathematics; inductive reasoning for sequential blocks of code, Pnueli-style temporal logic for concurrent and parallel code.

I've learned from some wonderful people how powerful it can be when a team likes each other and gets into a collective flow state. It is a bit like a mental version of quantum entanglement, and it is a very satisfying and meaningful thing when you get there. I've benefited from friends who helped me get that next job, and I've helped friends get their next jobs.

About 20 years ago I made a switch to medical device software development. That is a domain that requires dedication to learn relevant mathematics, it is not going to go away, and you become a valued commodity when you have specialized skills and a talent pool that is not too large. And, you get to do things like visit your grandchildren in a NICU and see neonatal ventilators that you helped develop.

So, I've been lucky, being able to play all day and do something I love! There are a million different paths through the space of software development; I've tended to traverse the space using the "what would be fun to goof around with on a Saturday" metric.


I'm still going, but the biggest issue I'm finding is that knowledge keeps turning over. I have made big efforts in the past to learn technologies, languages and libraries that became obsolete, then had to start all over again. Software engineering now is completely different from what I started with.

I have more project experience, but technically I don't know much more about cloud, JS frameworks, modern DBs, etc. than someone 30 years old. Ironically my main advantage seems to be that I can focus more and work longer hours than younger people, who seem to value WLB much more than we used to.


I have been programming professionally since 1989. I think my strength is finding straightforward solutions that let you focus on delivering value reliably. But nowadays I feel there are so many smart people jumping from project to project and leaving mountains of technical debt behind. We spend a lot of time managing accidental complexity, and I no longer enjoy my job like I used to.


Possibly tangential, but this article and some of the discussion here reminded me of this HN thread from 2020: "Ask HN: Former software engineers, what are you doing now?" - https://news.ycombinator.com/item?id=23951850

A former boss of mine retired early once he decided he was done with the politics required to navigate his work. He fully unplugged from corporate life and got comfortable delving into his passions... cycling, woodworking (so much woodworking), etc. He seemed to be having a blast from what I could see. I recently caught up with him, and it turned out after about a year off, he'd just accepted a job at a fintech that his former boss reached out about. I guess for him it was time to go from that specific job but maybe not quite from engineering in general.


I’ve been in programming for two decades and one of the things I enjoy about it is that things change. I did my stint in both architecture and management because I thought what you were supposed to do, but I went back to programming because I like programming. I’ve worked on so many different technologies that I’ve probably forgotten more than some people even learn yet I’ve always liked it.

I do get how you can burn out, especially on the business side of things. A lot of jobs just aren’t important. The trick is to avoid them if you can and leave them as soon as possible if you can’t. Every non-startup / non-economic boom job comes with some degree of Kafka, and you’re either going to learn to not care about it or go crazy. I’m not sure that is especially unique for programmers though, this seems to be most things. Unless you’re extremely talented at the HR part of organisational politics (which most programmers aren’t) you’re also going to have to build some really stupid stuff during your career because change management is hard. So hard that it’s virtually impossible for talented HR staff to do when the direction is upwards, which it’ll always be for programmers. Again, it’s something you either learn to laugh about or burn out on.

The change in technology, however? Isn’t that part of the fun? If it isn’t, is that because you don’t have the time for it? Because if don’t (and a lot of jobs won’t give you this) then you’re frankly in one of those “leave as soon as possible” positions. Even so, niche work rarely dies. The author mentions mainframe work, but mainframe work is still some of the highest paid work in the world because those grey beards who actually know and want to do it are so retired that a lot of them are frankly dead. I’m not sure how you could ever work on mainframes for 40+ years and then not be able to get paid handsomely by banks.

Anyway to each their own. It’s a nice perspective, and it offers you a few insights into just how much of a cog in the machine you’re going to be in virtually any job. Even one where you’re extremely well liked and rewarded. I think the best thing I learned from my stint in management is how everyone, and I do mean everyone, is replaceable. It’s just a matter of cost. Which can sound depressing, but it’s also very liberating because it teaches you to not get overly attached to jobs or employers.


Two decades is only half a software career. I've been coding for 40 and it's different. See how you feel in another 20 years. Even in niche domains, almost all software is a massive tangle of unnecessary complexity. At some point it becomes like another game of Magic (The Gathering) with millions of cards and twisty little rules all alike. It's some teenager's idea of fun, but if you take it home to gramma, she doesn't understand why you can't just sit around the table and talk.


I've been coding professionally for 40 years and especially don't care for how the management of software projects has evolved. Rather than design, I've seen the emphasis shift to process and testing, necessarily in the guise of 'agile' even when the design goals don't fit well with the agile manifesto (like ML or scientific programming). The inability of IT management to imagine an approach to software other than "one size fits all" has left me increasingly disengaged in work that used to be fun every day.

Of course, the shift in the past few decades from coding-from-scratch to cut-and-paste has also pushed me out of the 'flow' of slinging code that I long enjoyed. So maybe the insistent adoption by management of agile-oriented groupthink and processes should be a clear message to me that resistance is futile, and it's time for me to make way for devs who are happy to play a game whose rules have changed, one that I no longer enjoy as I once did.


For me, it was very obvious at the end. Technical but not programming. Felt like I was winding down. Circumstances were such there wasn't a lot of mobility within the company. Was somewhat disappointed that I didn't get a package as part of some layoffs but I assume powers that be didn't want to voluntarily lose headcount.

Ended up hanging around for a year effectively working part time. Not sure that was the right idea or not (had lots of vacation which I pretty much all took) but year+ passed by and it was pretty obvious at that point I couldn't drag my feet any longer and didn't have the interest or need to do a job search.


I am repurposing the life lesson my grandfather imparted to me before passing. It was meant about one's sex life, but for me it's applicable also here:

"If the struggle outweighs the pleasure, you should stop doing it."


> "If the struggle outweighs the pleasure, you should stop doing it."

Kind of bleak when it comes to romantic relationships


I recently put in a word for a senior programmer I worked with in a previous job and he got hired. Well, it's really clear he doesn't care anymore and doesn't find anything about software development interesting. Now I'm in a tough spot because he's a major burden and my manager wants to give it some more time but I don't see it working out.

I heavily relate to this line in the article:

> Some time ago, I knew a programmer with the same number of years of experience as me. Yet he seemed unable to comprehend what was required of him, and I had to review everything he wrote because it rarely worked


It sounds like you're describing two different things though, of course, they can look somewhat similar from the outside.

There's unable and there's not caring. I can imagine not having some specific skill sets and I can imagine just not having the interest in putting in the effort and learning what's needed. The results may look somewhat similar but they're different situations.


Echoing some other posts a bit: my problem is that I see silly things happening over and over again and I find that telling people doesn't help. Every generation appears to have to learn the same lessons again the hard way. It's a bit depressing.

I made some of these mistakes. I didn't have anyone to help me avoid them. Now I can help but .... I cannot help. Only after the balls up and even then often not.


> They couldn't comprehend why anyone would retire. One of them, whom I had worked for for two of those jobs and always made his life easier, never spoke to me again or even said goodbye.

If I had a nickel for every leader I've worked for who didn't know when it was time to go... I'd have 3 nickels, which is still a surprising amount


Right now I'm in this goofy spot where I'm probably walking away from it. Nobody wants backend and API-layer Java devs anymore. They probably need them, but they don't know it in the midst of the AI bubble.


It's actually still super popular at many or most large companies. Not small companies.


A lot of those companies are either looking like they'll never change (e.g. banks) or are actively transitioning to things like Kotlin. I've made that transition myself and at this point I'm seeing a lot of signs that this was the right move.

E.g. Facebook, Google, Amazon, and others have used a lot of Java in the last twenty years and many of their teams are transitioning to Kotlin at this point (each of them have talked about this in public). With many millions of lines of code that's a slow transition obviously but Kotlin is apparently the goto choice for a lot of new stuff.

Java is turning into the Cobol of our generation. People will still be doing this for a long time. But not a lot of young people are likely to want to do that. It's not an obvious choice for new projects at this point.


Kotlin isn't really used as much as people think outside Android; besides, it is useless without the Java ecosystem it depends on, on a platform written in a mix of Java and C++.

https://learnhub.top/the-most-in-demand-programming-language...

https://newrelic.com/resources/report/2024-state-of-the-java...

https://www.itpro.com/careers/29133/the-top-programming-lang...


Actually, I use Kotlin in the browser via Kotlin Multiplatform. Things are changing rapidly, and you can do a lot without any Java libraries whatsoever at this point. The multiplatform ecosystem is growing very rapidly now.

Also, you might want to read up on how Amazon, Google, Facebook and other companies are moving to Kotlin internally at scale for server-side use. They've been pretty vocal about that. A lot of Java shops are of course a bit glacial in their adoption of new technology. That's why I refer to it as the new Cobol. That has less to do with the language and more to do with the fact that these kinds of companies simply don't change very easily. You'll probably find some actual Cobol lurking in a lot of these companies as well.

Kotlin was originally developed to be a drop-in replacement for Java in any Java project. Android developers embraced it in a hurry because they were stuck with a relatively old and crappy version of Java due to the Oracle lawsuit against Google. Google made that official shortly after Kotlin 1.0 was released, acknowledging that many Android developers were already voting with their feet. They also embraced Kotlin Multiplatform at Google I/O a few months ago.

On the server side, people have been using Kotlin for about as long, but things move more slowly there. Spring launched their Kotlin support around Spring Boot 2.0; that's six years ago already. It worked fine before that, but that's when they started shipping lots of out-of-the-box Kotlin support. Frankly, if you are using Spring and not using Kotlin, you're missing out big time. Lots of companies of course insist on doing things the verbose and hard way with Java.
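To make "the verbose and hard way" concrete, here is a minimal, hypothetical sketch (the `Customer` type is made up): one Kotlin data class declaration replaces the constructor, getters, `equals`, `hashCode` and `toString` that a classic pre-records Java bean spelled out by hand.

```kotlin
// Hypothetical DTO: one declaration buys equals/hashCode/toString/copy,
// which a pre-records Java bean needed dozens of boilerplate lines for.
data class Customer(val id: Long, val name: String, val email: String? = null)

fun main() {
    val c = Customer(1, "Ada")
    val updated = c.copy(email = "ada@example.com")
    println(updated)                 // value-based toString for free
    println(c == Customer(1, "Ada")) // structural equality for free
}
```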


Snowflake hyperscalers with a vested interest in the Android ecosystem aren't the same as everyone else.

The Oracle excuse is bonkers: everything on Android depends on the JVM, written in Java, and on Maven Central libraries, written in Java.

If Oracle actually played a role they would have switched Android to Dart and Flutter.

Kotlin was adopted because some management folks on the Android team are Kotlin heads, and to this day most Android documentation samples use Java 8 as the counterexample to show how Kotlin makes things better, which is completely dishonest.

Looking forward to seeing JetBrains show off their ecosystem rewritten in Kotlin Native, showing the rest of the world how Java is so passé.


I'm not sure what you are going on about here. Around the time Kotlin launched, Java was on version 8 and most of the Android world was still stuck with version 6, missing out on quite a bit of nice stuff that came with newer versions. The Oracle dispute was specifically about Google's implementation of Java on top of Apache Harmony (originally developed by IBM), which was an alternative implementation of the Java standard library. Since the dispute was resolved they have transitioned to basing their libraries on OpenJDK and these days are a bit more up to date.

Anyway, Kotlin came along as that was still in the courts. Google understandably wasn't putting a lot of effort in updating the compiler or the core Java libraries. With Kotlin, developers gained access to a lot of modern language features.

So, not bonkers but well-documented history. If you want to try your Compose apps on iOS, you can now; it's currently in alpha release. Zero Java libraries running on that platform: all Kotlin Multiplatform backed by iOS native code.

As for Kotlin getting endorsed by Google: that wasn't management doing anything other than responding to a lot of Android developers enthusiastically adopting Kotlin, long before it was even released properly, and getting good results. App development is super competitive and developers aren't afraid to try out new stuff if it gives them an edge. And Kotlin did exactly that.

Between that and the Oracle dispute, it was a logical move for Google to make it official. In the same way, JetBrains' push for Kotlin Multiplatform and Compose Multiplatform originated outside of Google and is now getting endorsed by Google as well. I guess the Compose team and the Flutter team are in different parts of the org chart.


Yet in 2024 the anti-Java rhetoric with Java 8 samples continues, and they only updated ART to Java 17 LTS because Android was losing out on the modern Java ecosystem.

While using Android Studio, Gradle, Android SDK, Kotlin compiler, D8, R8, all running on top of Java Virtual Machine, using Java libraries from Maven Central.

So much for breaking free from Oracle.


> Java is turning into the Cobol of our generation

Despite the speed of Java evolution accelerating and Java being the best it has ever been, I agree with this. Everything is JavaScript or Python these days. Java is seen as old, boring, hard, inconvenient: you need to compile it and often it doesn't compile. JavaScript/Python happily run until they don't and fail at runtime, which feels like it just works (until it randomly doesn't), and you can pick up a teach-yourself-JavaScript/Python-in-24-hours style book and feel like a competent developer very quickly.

A lot of Java positions I see now are maintenance roles, which isn't a great thing. Maintenance roles are never great, and are often seen as bottom tier. Unlike with Cobol, there are likely many more people who have programmed Java than ever programmed Cobol, so even if Java development becomes a niche, I suspect there will still be enough of them around that compensation won't be anything significant.

Kotlin is nice; let's see where it is in ten years. Clojure, Groovy, JRuby and Scala are also nice, but ultimately Java won as the modern Java, and all those other JVM languages that once had interest and promise are now niche or completely dead.


That's what I'd have thought, but it's looking bleak. Who do you have in mind?


Look for corporate jobs outside of California/SV-type places. Financial firms, insurance companies, also universities. You won't be blazing any new trails and you might be shocked by the pay difference but you'll work 9-5 with your weekends free and probably good benefits.

You'll have to be able to tolerate some level of Initech-style management but if you just accept that and play along the work pace is pretty relaxed.


I’ve reached the point where every time I see a comment about “nobody wants to hire X programmers”, in my mind I append “… at hot SV startups” because that must be the only way it’s true.


Heh, good for you! I can't wait til I'm at that point too.


Google does.


I thought this was going to be about Go the language. Damn title case


Interesting insights. Thank you for sharing.


[flagged]


Never been so bored and frustrated that you just came in to work, dropped off your laptop, picked up your coffee mug, and walked out without a word?


I make $300K/year. I’ll leave when I’m ready to retire.

And if I’m lucky enough to get laid off (around retirement time), that would be a huge windfall to move me toward retirement.

Programmers planning to work after 50 are fools.


You're in like, 0.1% of programmers financially. I know living in certain bubbles it feels like everyone makes that kind of money, but it's absolutely not. It's like a lottery winner telling people that if they plan to work over 50 they are fools.


I'm sure it's my FAANG bias, but 0.1%? I would have guessed 5%.


By freelancing you can save a nice nest egg in most places. I did that for 11 years in Europe, and now I work because I want to, not because I must. Disclaimer: I'm not consulting anymore; I moved back to the startup grind once more because I feel more connected to the work than to what you do as a consultant.


Freelancing is another bubble I think.

I don’t think most developers have the skills necessary to freelance and/or most enterprises are not setup to work with freelancers.


I don't feel like being my own boss. Shrug emoji


Even if you make half that, which is a common offer all over America, you can retire early.


Americans are their own little bubble again - I'd wager that most of the world's programmers don't live in US. Globally your average programmer will be someone writing utterly boring code making same salary as a teacher, maybe slightly above that if lucky. Happy to be proven wrong, but the whole "If you aren't making 6 figures as a programmer you failed at life" meme needs to go away - it's a tiny tiny sliver of all programmers that actually manage to achieve that.


Retiring early is more about expenses than income. If you can't retire early with $100k income, you probably can't do it with $300k either.

Most people can't do it, because their expenses grow to match their income. They want bigger and better everything, and they always find new "mandatory" expenses. Especially if they have kids.


My spouse and I make just a touch over the GP, combined of course. Our mortgage is 1.3k/month, and our daycare costs are 3k/month. In a year daycare goes down by half, and two years after that we're going to have money coming out our ears.

I am fortunate my spouse was on board with this plan, which was mine. I’m 38, spouse is 36, for reference. Assuming we have college saved for, I don’t want to work a day past 55 if I can manage it.


I probably dragged things out a year or three past where I prudently could have. Not sure if I made the right call or not. COVID messed up a lot of things (and obviously sadly killed a lot of people) and I might have made different decisions on a more normal timeline.


With age, your ability to spend money greatly diminishes. Even driving to expensive restaurants is tiresome. The need to impress other people goes away, and rocking chair on the porch doesn't cost much. I'm struggling to spend 1/5 of my investment income. And the kids have forbidden me from giving them more money because they want to 'make it themselves'.


Did you factor kids in?


Programmers haven't always made this much.


During the .com boom (1990s) an intermediate programmer could easily command $100k USD/year in salary. Many made millions on stocks. Later, how many programmer millionaires did the big tech companies make? Tens of thousands?

25 years ago. Adjusted to today that's ~190k USD in salary. Add equity, bonus, ESPP etc. There's ups and downs but I think there are lots of US software jobs that pay this today.


And a lot of those programmers in the .com boom ended up with stock that was worth zilch, and a ton of people ended up leaving the industry around 2001. I was incredibly lucky to get laid off a couple weeks after 9/11 and to get another tech-adjacent job about a month later, based on a lunch I had with someone I knew a few days after I was laid off. That was not a common experience in that period.

(Mind you, that wasn't a great stretch for me in terms of compensation. But a later job made up for it at least by my standards.)


There's always a range of experiences. Most of the people I know did OK through the .com bust and are still in tech. Some people had millions on paper, didn't sell, held on to it, and not only was it worth zilch, they also owed taxes; but many really did make millions. Pretty much anyone who had a job at FAANG or similar companies did really well from 2001 to today, and that's a lot of people. Tech grew like crazy and took a lot of people along for the ride.


Certainly the last 15 years in particular have been very good for a lot of people in tech even if they didn't hit a big jackpot in FAANG or wherever. I had a pretty ordinary tech industry job in that period and it set me up better than my whole prior career did.


But they have, plus or minus 40% adjusted for inflation. Soul sucking corporate jobs have always paid well.


There are way, way, way more jobs that make 300k+ in our industry than there ever have been.

It's not that these jobs have never existed, it's that they are in greater quantities.


40% is a big fraction


How much do you need to accumulate before you retire?

I'm guessing retiring just means working on your own things with enough runway til you drop dead.


At my current rate of spend, about $3M-3.5M. (Spending 100k/year)

I’m working to bring down the spend rate, but even if I don’t I should be good well before 50. I’m saving/investing about 160k/year, plus or minus. Some of the money saved is pre tax in 401k and some isn’t, so it’s not exactly dollar for dollar equivalent.



