Hacker News

"as tech jobs evaporate"

sooooo... you want us to be in the office but even if we're there it's not good enough. So now we're forced away, and since there won't be remote jobs either... :shrug:

I think this is going to be a lot worse than the 2000 bust, especially with AI coming into the picture. There is a feeling that AI will be writing ALL the code soon.

Is there something to look forward to in this specific market or at this point should I just look at carpentry jobs?



Idiots writing tons of code with AI will create employment for armies of people who actually know how to debug it into working.

Remember Knuth's witticism about debugging being twice as hard as writing? So if you're as clever as you can be when writing a program then you might not be smart enough to debug it?

Now suppose you're as clever as you can be, plus armed with AI ...


It was Brian Kernighan and P. J. Plauger (The Elements of Programming Style (1974, 2nd edition 1978), chapter 2, page 10):

> Everyone knows that debugging is twice as hard as writing a program in the first place. So if you're as clever as you can be when you write it, how will you ever debug it?


This is why you should never be clever when writing code.
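The Kernighan point above is easy to see in practice. Here's a toy illustration (the data and names are invented): two ways to count word frequencies, one "clever" and dense, one boring but easy to step through in a debugger.

```python
# A "clever" one-liner vs. the plain version. Both count word
# frequencies; only one is pleasant to debug at 2 a.m.
text = "the cat sat on the mat"

# Clever: dense, quadratic, hard to step through.
clever = {w: text.split().count(w) for w in set(text.split())}

# Plain: obvious control flow, easy to inspect mid-loop.
plain = {}
for word in text.split():
    plain[word] = plain.get(word, 0) + 1

print(clever == plain)  # both give the same counts
```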


I would actually bet that AI will prove to be particularly good at classical debugging - i.e. the effort of changing the code to make a given failing test pass while keeping all the other tests green too. This is pretty much what SWE-Bench is judging, and the progress there has been phenomenal.
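The loop SWE-Bench measures looks roughly like this in miniature (the `median` function and its tests are invented for illustration, not from the benchmark): given one failing test, change the code so it passes while the existing tests stay green.

```python
# Sketch of the SWE-Bench-style task: patch a function so a
# previously failing test passes without breaking the others.
def median(xs):
    xs = sorted(xs)
    n = len(xs)
    if n % 2 == 1:
        return xs[n // 2]
    # The "fix": the original buggy version ignored the even case.
    return (xs[n // 2 - 1] + xs[n // 2]) / 2

# Existing tests (must stay green):
assert median([3, 1, 2]) == 2
# Previously failing test (now passes):
assert median([1, 2, 3, 4]) == 2.5
```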

We will still probably need humans to do validation (vs. verification), but I don't think it'll take "armies of people".


Classical debugging is looking at a dump of memory in search of the bit that is flipped wrong. :)
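For anyone who never had the pleasure: finding that flipped bit amounts to diffing the dump against what memory was supposed to contain. A toy sketch (the byte values here are made up):

```python
# Toy "classical debugging": XOR a memory dump against the
# expected image to locate a single flipped bit.
expected = bytes([0b10110010, 0b01000001, 0b11111111])
dump     = bytes([0b10110010, 0b01000101, 0b11111111])

for offset, (want, got) in enumerate(zip(expected, dump)):
    diff = want ^ got
    if diff:
        # bit_length() - 1 gives the index of the highest set bit;
        # here exactly one bit differs.
        print(f"byte {offset}: bit {diff.bit_length() - 1} flipped")
```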


never had to do that in 25 years of software dev


24 years would only put you back to 1999, and by that time GUIs had largely taken over, with Win 95 having been released and the Pentium just hitting its stride. There's still a need to pore over memory dumps if you're doing reverse engineering or early-stage hardware engineering. What really killed it, though, was the Internet and the cloud. The expertise to debug a system and get to know it intimately just isn't as necessary these days: just fire up an ASG and cycle the instances before the problem happens. Software that runs on your servers instead of on customer computers means you can debug those systems directly and don't have to rely on customer reports.


never had to analyse a memory dump. I tried to, sometimes, just for fun, but nothing good came of it. Crash dumps are symbolicated, so you never work at the bit level.


I can't help but recall a certain former coworker who apparently believed that he was getting paid by the word, the most committed user of autocomplete and IDE code-generation features I've ever seen, and imagine what he must be doing now with the help of AI. At the time, I felt like simply following him around and tidying up his sprawling, repetitive, bug-ridden verbiage could have been someone's full-time job; with his output accelerated by AI, I imagine he could keep a whole team busy.


Or organizations stuck with that code will be locked into impenetrable tech debt, while newer companies who didn't do this idiocy murder them.

Or, probably, those larger companies force regulatory lock-in, and everyone is miserable.


"Code expands to fill the maintenance capacity of the available team".

There, Parkinson's law updated.


Why are they idiots?


What is currently advertised as AI (LLMs) cannot replace all software developers. It is a tool that can make software developers more productive, so there will be less need for junior positions, but there will still be a substantial need for senior developers.

LLMs, if tuned for software development, present an opportunity for a company that can plan beyond the next couple of quarters to produce software of unprecedented quality and with as yet unseen features and usability.

Of course, there will be companies that use LLMs incorrectly, replace most developers (and other people), and create products even worse than what is currently available, because they don't understand what LLMs are or how to utilise them correctly. Those companies will either realize and fix their mistakes or go bankrupt.


> LLMs, if tuned for software development, present an opportunity … to produce software of unprecedented quality and with as yet unseen features and usability.

Given that I can barely get LLMs to produce an error-free couple of functions, I doubt that. Though I suspect it'll boost the median quality of software.

IMHO software quality requires care and skill along with a management that values quality.

Take Meta: they had billions of dollars and spent them freely. Yet the Metaverse software is mediocre at best. They had John Carmack on their roster, and the Meta developers reportedly ignored him. Giving those developers the ability to produce more code won't make the results magically good.


Otherwise you make good points. Companies like Intuit using AI as a justification to lay off lots of devs will face issues later, when it becomes clear that it doesn't scale.



