
Adding to this: it's not just that the apprenticeship ladder is gone—it's that nobody wants to deal with juniors who spit out AI code they don't really understand.

In the past, a junior would write bad code and you'd work with them to make it better. Now I just assume they're taking my feedback and feeding it right back to the LLM. Ends up taking more of my time than if I'd done it myself. The whole mentorship thing breaks down when you're basically collaborating with a model through a proxy.

I think highly motivated juniors who actually want to learn are still valuable. But it's hard to get past "why bother mentoring when I could just use AI directly?"

I don't have answers here. Just thinking maybe we're not seeing the end of software engineering for those of us already in it—but the door might be closing for anyone trying to come up behind us.





> Now I just assume they're taking my feedback and feeding it right back to the LLM.

This is especially annoying when you get back a response in a PR "Yes, you're right. I have pushed the fixes you suggested."

Part of the challenge (and I don't have an answer either) is there are some juniors who use AI to assist... and some who use it to delegate all of their work to.

It is especially frustrating that the second group doesn't become much more than a proxy for an LLM.

New juniors can progress in software engineering - but they have to take the road of disciplined use of AI and make sure that they're learning the material rather than delegating all their work to it... and that delegating work is very tempting... especially if that's what they did in college.


I must ask once again why we are having these 5+ round interview cycles and we aren't able to filter for qualities that the work requires of its talent. What are all those rounds for if we're getting engineers who aren't as valued for the team's needs at the end of the pipeline?

> I must ask once again why we are having these 5+ round interview cycles and we aren't able to filter for qualities that the work requires of its talent.

Hiring well is hard, especially if compensation isn't competitive enough to attract talented individuals who have a choice. It's also hard to change institutional hiring practices. People don't get fired for buying IBM, and they also don't get fired for following the same hiring practices that were in place in 2016.

> What are all those rounds for if we're getting engineers who aren't as valued for the team's needs at the end of the pipeline?

Software development is a multidisciplinary field. It involves multiple non-overlapping skill sets, both hard skills and soft skills. Also, you need multiple people vetting a candidate to eliminate corruption and help weed out candidates who outright clash with company culture. You need to understand that hiring someone is a disruptive activity, one that impacts not only what skill sets are available in your organization but also the current team dynamics. If you read around, you'll stumble upon stories of people who switch roles in reaction to new arrivals. It's important to get this sort of stuff right.


>It's important to get this sort of stuff right.

Well I'm still waiting. Your second paragraph seems to contradict the first. Which perfectly encapsulates the issue with hiring. Too afraid to try new things, so instead add bureaucracy to lessen accountability.


> Well I'm still waiting. Your second paragraph seems to contradict the first. Which perfectly encapsulates the issue with hiring. Too afraid to try new things, so instead add bureaucracy to lessen accountability.

I think you haven't spent much time thinking about the issue. Changing hiring practices does not mean they improve. It only means they changed. You are still faced with the task of hiring adequate talent, but if you change processes then you don't have baselines and past experiences to guide you. If you keep your hiring practices, you keep those baselines: you stick with something that is proven to work, albeit with debatable optimality, and you mitigate risks because your experience with the process helps you be aware of some red flags. The worst-case scenario is that you repeat old errors, but those will be systematic errors which are downplayed by the fact that your whole organization is proof that your hiring practices are effective.


>Changing hiring practices does not mean they improve.

No, but I'd like to at least see conversation on how to improve the process. We aren't even at that point. We're just barely past acknowledging that it's even an issue.

>but if you change processes then you don't have baselines and past experiences to guide you.

I argue we're already at this point. The reason we got past the above point of "acknowledging problem" (a decade too late, arguably) is that the baselines are failing in the face of new technology, which is increasing false positives.

You have a point, but why does tech pick this point to finally decide not to "move fast and break things"? Not when it comes to law and ethics, but for acquiring new talent (which meanwhile is already disrupting their teams with this AI slop?)

>those will be systematic errors which are downplayed by the fact that your whole organization is proof that your hiring practices are effective.

okay, so back to step zero then. Do we have a hiring problem? The thesis of this article says yes.

"it worked before" seems to be the antipattern the tech industry tried to fight back against for decades.


> No, but I'd like to at least see conversation on how to improve the process. We aren't even at that point. We're just barely past acknowledging that it's even an issue.

The current hiring practices are a result of acknowledging that what they did before didn't work. The current ones work well enough that people don't wanna change them; the only ones who wanna change them are engineers, not the companies.


Nit (not directed at you) : I don't appreciate being flagged for pointing out the exact issue of the article and someone just dismissing it as "well companies are making money, clearly it's not a crisis"

This goes beyond destructive thinking. Again, I hope the companies reap what they sow.


>What dumpster fire?

If you're not going to even acknowledge the issue in the article, there's no point in discussing the issue in a forum. Good day.


There's no fix for this problem in hiring upfront. Anyone can cram and fake if they expect a gravy train on the other end. If you want people to work after they're hired, you have to be able to give direct negative feedback, and if that doesn't work, fire quickly and easily.

>Anyone can cram and fake if they expect a gravy train on the other end.

If you're still asking trivia, yes. Maybe it's time to shift from the old filter and update the process?

If you can see on the job that a 30-minute PR is where the problem shows up, then maybe replace that 3rd leetcode round with 30 minutes of pair programming. Hard to ChatGPT in real time without sounding suspicious.


That approach to interviewing will cause a lot of false negatives. Many developers, especially juniors, get anxious when thrown into a pair programming task with someone they don't know and will perform badly regardless of their actual skills.

I understand that and had some hard anxiety myself back then. Even these days I may be a bit shaky when live coding in an interview setting.

But is the false negative for a nervous pair programmer worse than a false positive for a leetcode question? Ideally a good interviewer would be able to separate the anxiety from the actual thinking and see that this person can actually think, but that's another skill undervalued in the industry.


I don’t know why people are so hesitant to just fire bad people. It’s pretty obvious when someone starts actually working if they’re going to be a net positive. On the order of weeks, not months.

Given how much these orgs pay, both direct to head hunters and indirect in interview time, might as well probationally hire whoever passes the initial sniff test.

That also lets you evaluate longer term habits like punctuality, irritability, and overall not-being-a-jerkness.


Not so fast. I "saved" guys from being fired by asking for more patience with them. The last one was not on my team, as I had moved out to lead another team. Turned out the guy did not please an influential team member, who then complained about him. What I saw instead was a young, quiet guy who had been given boring work and was longing for more interesting work. A tad later he took ownership of a neglected project, completed it and made a name for himself.

It takes considerably more effort and skill to treat colleagues as humans rather than "outputs" or ticket processing nodes.

Most (middle) management is an exercise in ass-covering, rather than creating healthy teams. They get easily scared when "Jira isn't green", and look for someone else to blame rather than doing the managing part correctly.


Sunk cost. You've spent... 20 to 100 hours on interviews. Maybe more. Doing it again is another expense.

Onboarding. Even with good employees, it can take a few months to get the flow of the organization, understand the code base, and understand the domain. Maybe a bit of a technology shift too. Firing a person who doesn't appear to be performing in the first week or two or three would be churning through that too fast.

Provisional hiring with "maybe we'll hire you after you move here and work for us for a month" is a non-starter for many candidates.

At my current job and the one before it, it took two or three weeks to get things fully set up, be it equipment, provisioning permissions, accounts, or training. The retail company I worked at from '10 to '14 sent every new hire out to a retail store to learn how the store runs, to get a better idea of how to build things for them and support their processes.

... and not every company pays Big Tech compensation. Sometimes it's "this is the only person who didn't say «I've got an offer with someone else that pays 50% more»". Sometimes a warm body that you can delegate QA testing and pager duty to (rather than software development tasks) is still a warm body.


It's really not obvious how to calculate the output of any employee even with years of data, and it's way harder for a software engineer or any other job with that many facets. If you've found a proven and reliable way to evaluate someone in the first 2 weeks, you just solved one of the biggest HR problems ever.

What if, and hear me out, we asked the people a new employee has been onboarding with? I know, trusting people to make a fair judgment lacks the ass-covering desired by most legal departments but actually listening to the people who have to work with a new hire is an idea so crazy it might just work.

> I don’t know why people are so hesitant to just fire bad people.

"Bad" is vague, subjective moralist judgement. It's also easily manipulated and distorted to justify firing competent people who did no wrong.

> It’s pretty obvious when someone starts actually working if they’re going to be a net positive. On the order of weeks, not months.

I feel your opinion is rather simplistic and ungrounded. Only the most egregious cases become apparent in a few weeks' worth of work. In software engineering positions, you don't have the chance to let your talents shine through in the span of a few weeks. The cases where incompetence is rendered obvious in the span of a few weeks actually spell gross failures in the whole hiring process, which failed to verify that the candidate even met the hiring bar.

> (...) might as well probationally hire whoever passes the initial sniff test.

This is a colossal mistake, and one which disrupts a company's operations and the candidates' lives. Moreover, it has a chilling effect on the whole workforce, because no one wants to work for a company run by sociopaths that toy with people's lives and livelihood as if it was nothing.


> manipulated and distorted to justify firing competent people

If you have that kind of office politics going on, that's the issue to be solved.

>toy with people's lives and livelihood as if it was nothing.

If the employee lies about their skills, it is on them.


Every style of interview will cause anxiety, that's just a common denominator for interviews.

The same could be said for leetcode. Except leetcode doesn't test actual skills in 2025.

The bar for “junior” has quietly turned into “mid-level with 3 years of production experience, a couple of open-source contributions, and perfect LeetCode” while still paying junior money. Companies list “0-2 years” but then grill candidates on system design, distributed tracing, and k8s internals like they’re hiring for staff roles. No wonder the pipeline looks broken. I’ve interviewed dozens of actual juniors in the last six months. Most can ship features, write clean code, and learn fast, but they get rejected for not knowing the exact failure modes of Raft or how to tune JVM garbage collection on day one. The same companies then complain they “can’t find talent” and keep raising the bar instead of actually training people.

Real junior hiring used to mean taking someone raw, pairing them heavily for six months, and turning them into a solid mid. Now the default is “we’ll only hire someone who needs zero ramp-up” and then wonder why the market feels empty.


It's the cargo cult kayfabe of it all. People do it because Google used to do it, now it's just spread like a folk religion. But nobody wants guilds or licensure, so we have to make everyone do a week-long take-home and then FizzBuzz in front of a very awkward committee. Might as well just read chicken bones, at least that would be less humiliating.

And who would write the guild membership or licensure criteria? How much should those focus on ReactJS versus validation criteria for cruise missile flight control software?

Guild members? Who else?

You’re asking these rhetorical questions as if we haven’t had centuries of precedent here, both bad and good. How does the AMA balance between neurosurgeons and optometrists? Bar associations between corporate litigators and family estate lawyers? Professional engineering associations between civil engineers and chemical engineers?


> Professional engineering associations between civil engineers and chemical engineers?

One takes the FE exam ( https://ncees.org/exams/fe-exam/ ). You will note at the bottom of the page "FE Chemical" and "FE Civil" which are two different exams.

Then you have an apprenticeship for four years as an Engineer in Training (EIT).

Following that, you take the PE exam. https://ncees.org/exams/pe-exam/ You will note that the PE exams are even more specialized to the field.

It also depends on the state you are licensed in (states tend to have reciprocal licensing - but not necessarily, and not necessarily for all fields). For example, if you were licensed in Washington, you would need to pass another exam specific to California to work for a California firm.

Furthermore, there are the continuing education requirements (which are different for each state). https://www.pdhengineer.com/pe-continuing-education-requirem...

You have to take 30 hours of certified study in your field every two years. This isn't a lot, but people tend to fuss about "why do CS people keep being expected to learn on our own?" ... Well, if we were Professional Engineers it wouldn't just be an expectation - it would be a requirement to maintain the license. You will again note that the domain of the professional development is different - so civil and mechanical engineers aren't necessarily taking the same types of classes.

These requirements are set as part of state licensure and the legislative process.


So what you’re saying is that it’s a solved problem. If we can figure out how to safely certify both bridge builders and chemical engineers working with explosives, we can figure out a way to certify both React developers and those working on cruise missile flight control software.

I'm saying the idea that you can do one test for software engineering and never have to study again or be tested on a different domain in the future isn't something that professional engineering licensure solves.

Furthermore, licensure requires state level legislation and makes it harder for employees (especially the EIT) to change jobs or move to other states for work there.

Licensure, the way that people often point to it as a way to solve the credentials problem vs interviews, isn't going to solve the problems that people think it would.

Furthermore, it is only worth something if there is a reason to do it. If there isn't a reason to have a licensed engineer signing off on designs and code, there isn't a reason for a company to hire one.

Why should a company pay more for someone with a license to design their website when they could hire someone more cheaply who doesn't have a license? What penalties would a company have for having a secretary do some vbscripting in excel or a manager use Access rather than hiring a licensed developer?


You seem to be confused. The AMA doesn't control physician licensing. That's done by state medical boards.

But are you suggesting we have separate licenses for every different type of developer? We have new types coming up every few years.

The whole idea of guilds for developers is just stupid and impractical. It could never work on any long term or large scale basis.


Good catch on the AMA. I should have said medical licensing boards.

> But are you suggesting we have separate licenses for every different type of developer? We have new types coming up every few years.

I didn’t suggest that at all and I honestly can’t figure out how you came to that interpretation unless you are hallucinating.

> The whole idea of guilds for developers is just stupid and impractical. It could never work on any long term or large scale basis.

What a convincing argument! You should get a cabinet post.


Guilds and licensure perform gatekeeping, by definition, and the more useful they are at providing a good hiring signal, the more people get filtered out by the gatekeeping. So there's no support for it because everyone is afraid that effective guilds or licensing would leave them out in the cold.

Yeah, I'd be more than fine with licensing if I didn't have to keep going through 5 rounds of trivia only to be ghosted. Let me do that once and show I can code my way out of a paper bag.

I can understand such a process for a freshman, but for an industry veteran with 10+ years of experience, with recommendations from multiple senior managers?

And yet welcome to leetcode grind.


Yeah, I was told I'd get less of this as I got real experience. More additions to the pile of lies and misconceptions.

If you need to fizzbuzz me, fine. But why am I still making a word-search-solver project in my free time as if I'm applying for a college internship?


I’ve started using ChatGPT for their take home projects, with only minor edits or refactors myself. If they’re upset I saved a couple hours of tedium, they’re the wrong employer for me.

And I’m being an accelerationist hoping the whole thing collapses under its own ridiculousness.


Also they explicitly say to not use AI assistance for such assignments.

Recruitment is broken even more than before chatgpt.


> there are some juniors who use AI to assist... and some who use it to delegate all of their work to.

Hmmm. Is there any way to distinguish between these two categories? Because I agree, if someone is delegating all their work to an LLM or similar tool, cut out the middleman. Same as if someone just copy/pasted from Stackoverflow 5 years ago.

I think it is also important to think about incentives. What incentive does the newer developer have to understand the LLM output? There's the long term incentive, but is there a short term one?


Dealing with an intern at work who I suspect is doing exactly this, I discussed this with a colleague. One way seems to be to organize a face to face meeting where you test their problem solving skills without AI use, the other may be to question them about their thought process as you review a PR.

Unfortunately, the use of LLMs has brought about a lot of mistrust in the workplace. Earlier you’d simply assume that a junior making mistakes is simply part of being a junior and can be coached; whereas nowadays said junior may not be willing to take your advice as they see it as sermonizing when an “easy” process to get “acceptable” results exists.


The intern is not producing code that is up to the standard you expect, and will not change it?

I saw a situation like this many years ago. The newly hired midlevel engineer thought he was smarter than the supervisor. Kept on arguing about code style, system design etc. He was fired after 6 months.

But I was friendly with him, so we kept in touch. He ended up working at MSFT for 3 times the salary.


    > Earlier you’d simply assume that a junior making mistakes is simply part of being a junior and can be coached; whereas nowadays said junior may not be willing to take your advice
Hot take: This reads like an old person looking down upon young people. Can you explain why it isn't? Else, this reads like: "When I was young, we worked hard and listened to our elders. These days, young people ignore our advice." Every time I see inter-generational commentary like this (which is inevitably from personal experience), I am immediately suspicious. I can assure you that when I was young, I did not listen to older people's advice and I tried to do everything my own way. Why would this be any different in the current generation? In my experience, it isn't.

On a positive note: I can remember mentoring some young people and watching them comb through blogs to learn about programming. I am so old that my shelf is/was full of O'Reilly books. By the time I was mentoring them, few people under 25 were reading O'Reilly books. It opened my eyes to the fact that how people learn changes more than what they learn. Example: Someone is trying to learn about access control modifiers for classes/methods in a programming language. Old days: Get the O'Reilly book for that programming language. Look up access modifiers in the index. 10 years ago: Google for a blog with an intro to the programming language. There will be a tip about what access modifiers can do. Today: Ask ChatGPT. In my (somewhat contrived) example, the how is changing, but not the what.


> Old days: Get the O'Reilly book for that programming language. Look up access modifiers in the index. 10 years ago: Google for a blog with an intro to the programming language. There will be a tip about what access modifiers can do. Today: Ask ChatGPT.

The answer to this (throughout the ages) should be the same: read the authoritative source of information. The official API docs, the official language specification, the man page, the textbook, the published paper, and so on.

Maybe I am showing my age, but one of the more frustrating parts of being a senior mentoring a junior is when they come with a question or problem, and when I ask: “what does the official documentation say?” I get a blank stare. We have moved from consulting the primary source of information to using secondary sources (like O’Reilly, blogs and tutorials), now to tertiary sources like LLMs.


[Disclaimer: I'm a Gen Xer. Insert meme of Grandpa Simpson shouting at clouds.]

I think this is undoubtedly true from my observations. Recently, I got together over drinks with a group of young devs (most around half my age) from another country I was visiting.

One of the things I said, very casually, was, "Hey, don't sleep on good programming books. O'Reilly. Wiley. Addison-Wesley. MIT Press. No Starch Press. Stuff like that."

Well, you should've seen the looks on their faces. It was obvious that advice went over very poorly. "Ha, read books? That's hard. We'd rather just watch a YouTube video about how to make a JS dropdown menu."

So yeah, I get that "showing my age" remark. Used to be the discipline in this industry was that you shouldn't ask a question of a senior before you'd read the documentation. If you had read the documentation, man pages, googled, etc., and still couldn't come up with an answer, then you could legitimately ask for a senior mentor's time. Otherwise, the answer from the greybeards would have been "Get out of my face, kid. Go RTFM."

That system that used to exist is totally broken now. When reading and understanding technical documentation is viewed as "old school", then you know we have a big problem.


I like your sentiment about "first principles" of documents -- go to the root source. But for most young technologists (myself included, long long ago), the official docs (man pages for POSIX, MSDN for Win32, etc.) are way too complex. For years, when I was in university, I tried to grasp GUI programming by writing C and using the Win32 API. It was insane, and I did little more than type in code from a "big book of Win32 programming". Only when I finally tried Qt with C++ did the door of understanding finally open. Why? It was the number of simple examples that the Qt docs provided; they really helped me understand GUI (event-driven) programming. Another 10 years went by before I knew enough about Win32 to write small but useful GUIs in pure C using the Win32 API. This is the very reason that StackOverflow was so popular: people read the official docs and still don't understand... so they ask a question. The best questions include a snip of code and ask about it.

To this day, I normally search on Google first, then try an LLM... the last place that I look is the official docs if my question is about POSIX or Win32. They are just too complex and require too much base knowledge about the ecosystem. As an interesting aside, when I first learned Python, Java, and C#, I thought their docs were as approachable as Qt. It was very easy to get started with "console" programming and later expand to GUI programming.


Despite my pro-documentation comment above, I think there is a legit criticism that a lot of official documentation is a mess. Take man pages, for instance. I don't think it's a good look for greybeards to say "just go read the man page, kid." Many of those man pages are so out of date. You can't legitimately adopt a position of smug superiority by pointing juniors to outdated docs.

No. Just no.

If I have a problem with a USB datastream, the last place I'm going to look is the official USB spec. I'll be buried for weeks. The information may be there, but it will take me so long to find it that it might as well not be.

The first place to look is a high quality source that has digested the official spec and regurgitated it into something more comprehensible.

[shudder] the amount of life that I've wasted discussing the meaning of some random phrase in IEC-62304 is time I will never get back!


> I can assure you that when I was young, I did not listen to older people's advice and I tried to do everything my own way.

Hot take: This reads like a person who was difficult to work with.

Senior people have responsibility, therefore in a business situation they have authority. Junior people who think they know it all don't like this. If there's a disagreement between a senior person and a junior person about something, they should, of course, listen to each other respectfully. If that's not happening, then one of them is not being a good employee. But if they are, then the supervisor makes the final call.


> Old days: Get the O'Reilly book for that programming language. Look up access modifiers in the index. 10 years ago: Google for a blog with an intro to the programming language. There will be a tip about what access modifiers can do. Today: Ask ChatGPT. In my (somewhat contrived) example, the how is changing, but not the what.

The tangent to that is it is also changing with the how much one internalizes about the problem domain and is able to apply that knowledge later. Hard fought knowledge from the old days is something that shapes how I design systems today.

However, the tendency of people today to reach for ChatGPT to solve a problem results in them making the same mistakes again the next time, since the information is so easy to access. It also makes larger things more difficult... "how do you architect this larger system" is something you learn by building the smaller systems and learning about them, so that their advantages and disadvantages become an inherent part of how you conceive of the system as a whole. ... Being able to have ChatGPT do it means people often don't think about the larger problem or how it fits together.

I believe it is harder for a junior who is using ChatGPT to advance to being a mid-level or senior developer than it was for a junior from the old days, because of the lack of retention of the knowledge of the problems and solutions.


They’re going to get promoted anyway. The “senior” title will simply (continue to) lose meaning to inflation.

Yeah, I've got to agree with this hot take. Put yourself in the junior's shoes: if s/he wasn't there you'd be pulling it out of Claude Code yourself, until you're satisfied enough with what comes out to start adding your "senior" touches. The fact is the way code is written has changed fundamentally, especially for kids straight out of college, and the answer is to embrace that everyone is using it, not all this shaming. If you're so senior, why not show the kid how to use the LLM right, so the work product is right from the start? It seems part of the problem is dinosaurs are suspicious of the tech, and so don't know how to mentor for it. That being said, I'm a machine learning engineer, not a developer, and these LLMs have been a godsend. Assuming I do it correctly, there's just no way I could write a whole 10,000-line pipeline in under a week without it. While output-driven and error-driven coding is the wrong way for software juniors, it's fine by me for my AI work. It comes down to knowing when there's a silent error, if you haven't been through everything line by line. I've been caught before, I'm not immune, it's embarrassing, but ever since GPT was in preview I have made it my business to master it.

I have a friend who is a dev, a very senior one at that, who spins up 4 Claudes at once and does the whole enterprise's work. He's a "Senior AI Director" with nobody beneath him, not a single direct report, and NO knowledge of AI or ML, to my chagrin.

So now I'm whining too...


This isn’t a question of the senior teaching the junior how to use the LLM correctly.

Once you’re a senior you can exercise judgement on when/how to use LLMs.

When you’re a junior you haven’t developed that judgement yet. That judgement comes from consulting documentation, actually writing code by hand, seeing how you can write a small program just fine, but noticing that some things need to change when the code gets a lot bigger.

A junior without judgement isn’t very valuable unless he/she is working hard to develop that judgement. Passing assignments through to the LLM does not build judgement, so it’s not a winning strategy.


There are some definite signs of over-reliance on AI: emojis in comments, updates completely unrelated to the task at hand, and if you ask "why did you make this change?", you'll typically get no answer.

I don't mind if AI is used as a tool, but the output needs to be vetted.


What is wrong with emojis in comments? I see no issue with it. Do I do it myself? No. Would I push back if a young person added emojis to comments? No. I am looking at "the content, not the colour".

I think GP may be thinking that emojis in PR comments (plus the other red flags they mentioned) are the result of copy/paste from LLM output, which might imply that the person who does mindless copy/pasting is not adding anything and could be replaced by LLM automation.

The point is that heavy emoji use means AI was likely used to produce a changeset, not that emojis are inherently bad.

The emojis are not a problem themselves. They're a warning sign: slop is (probably) present, look deeper.

Exactly. Use LLMs as a tutor, a tool, and make sure you understand the output.

My favorite prompt is "your goal is to retire yourself"

Just like anything, anyone who did the work themself should be able to speak intelligently about the work and the decisions behind its idiosyncrasies.

For software, I can imagine a process where junior developers create a PR and then run through it with another engineer side by side. The short-term incentive would be that they can do it, else they'd get exposed.


Is/was copy/pasting from Stackoverflow considered harmful? You have a problem, you do a web search and you find someone who asked the same question on SO, and there's often a solution.

You might be specifically talking about people who copy/paste without understanding, but I think it's still OK-ish to do that, since you can't make an entire [whatever you're coding up] by copy/pasting snippets from SO like you're cutting words out of a magazine for a ransom note. There's still thought involved, so it's more like training wheels that you eventually outgrow as you get more understanding.


> Is/was copy/pasting from Stackoverflow considered harmful?

It at least forces you to tinker with whatever you copied over.


Pair programming! Get hands-on with your junior engineers and their development process. Push them to think through things and not just ask the LLM everything.

I've seen some overly excessive pair programming initiatives out there, but it does baffle me that so few people who struggle with this do it. Take even just 30 minutes to pair program on a problem and see their process, and you can reveal so much.

But I suppose my question is rhetorical. We're laying off hundreds of thousands of engineers and making existing ones do the work of 3-4 engineers. Not much time to help the juniors.


Having dealt with a few people who just copy/pasted Stackoverflow, I really feel that using an LLM is an improvement.

That is, at least for the people who don't understand what they're doing, the LLM tends to come out with something I can at least turn into something useful.

It might be reversed, though, for people who know what they're doing. If they know what they're doing they might theoretically be able to put together some Stackoverflow results that make sense, and build something up from that better than what gets generated by an LLM (I am not asserting this would happen, just thinking it might be the case).

However I don't know as I've never known anyone who knew what they were doing who also just copy/pasted some stackoverflow or delegated to LLM significantly.


> Is there any way to distinguish between these two categories?

Yes, it should be obvious. At least at the current state of LLMs.

> There's the long term incentive, but is there a short term one?

The short term incentive is keeping their job.


> This is especially annoying when you get back a response in a PR "Yes, you're right. I have pushed the fixes you suggested."

I've learnt that saying this exact phrase does wonders when it comes to advancing your career. I used to argue against stupid ideas, but not only did I achieve nothing, I was also labelled uncooperative and technically incompetent. Then I became a "yes-man" and all problems went away.


I was attempting to mock Claude's "You are absolutely right" style of response when corrected.

I have seen responses to PRs that appear to be a copy and paste of my feedback into the LLM and a copy and paste of the response and fixes back into the PR.

It may be that the developer is incorporating the mannerisms of Claude into their own speech... that would be something to delve into (that was intentional). However, more often than not in today's world of software development such responses are more likely to indicate a copy and paste of LLM generated content.


> However, more often than not in today's world of software development such responses are more likely to indicate a copy and paste of LLM generated content.

This is nothing new. People rarely have independent thoughts, usually they just parrot whatever they've been told to parrot. LLMs created common world-wide standard on this parroting, which makes the phenomenon more evident, but it doesn't change the fact that it existed before LLMs.

Have you ever had a conversation with an intelligent person and thought "wow that's refreshing"? Yeah. There's a reason why it feels so good.


This. May you have great success! The PR comments that I get are so dumb. I can put the most obvious bugs in my code, but people are focused on the colour of the bike shed. I am happy to repaint the bike shed whatever colour they need it to be!

> Part of the challenge (and I don't have an answer either) is there are some juniors who use AI to assist... and some who use it to delegate all of their work to.

This is not limited to junior devs. I had the displeasure of working with a guy who was hired as a senior dev and heavily delegated any work he did. He failed to even do the faintest review of what the coding agent produced and of course did zero testing. At one point these stunts resulted in a major incident where one of these glorious PRs pushed code that completely inverted a key business rule and resulted in paying customers being denied access to a paid product.

Sometimes people are slackers with little to no ownership or pride in their craftsmanship, and just stumbled upon a career path they are not very good at. They start as juniors but they can idle long enough to waddle their way into senior positions. This is not an LLM problem, nor is it caused by one.


> This is especially annoying when you get back a response in a PR "Yes, you're right. I have pushed the fixes you suggested."

And then in the next PR, you have to request the exact same changes


I get that. I think that getting to know juniors outside of work, at a recurring meetup or event, in a setting where you can suss out their motivation level and teachability level, is _a_ way of going about it. That way, if your team is hiring juniors, you have people you have already vetted at the ready.

IMO teachability/curiosity is ultimately orthogonal to the more base question of money-motivation.

In a previous role I was a principal IC trying to mentor someone who had somehow been promoted up to senior but was still regularly turning in code for review that I wouldn't have expected from an intern. It was an exhausting, mind-numbing process trying to develop some sense of engineering taste in this person, and all of this was before LLMs. This person was definitely not just there for the money; they really looked up to the top-level engineers at our org and aspired to be there, but everything just came across as extremely shallow, like engineering cosplay: every design review or bit of feedback was soundbites from a how-to-code TED talk or something. Lots of regurgitated phrases about writing code to be "maintainable" or "elegant" but no in-the-bones feeling about what any of that actually meant.

Anyway, I think a person like this is probably maximally susceptible to the fawning ego-strokes that an AI companion delivers alongside its suggestions; I think I ultimately fear that combination more than I fear a straight up mercenary for whom it's a clear transaction of money -> code.


I had one fairly-junior teammate at Google (had been promoted once) who was a competent engineer but just refused to make any choices about what to work on. I was his TL and I gave him a choice of 3 different parts of the system to work on, and I was planning to be building the other two. He got his work done adequately, but his lack of interest / curiosity meant that he never really got to know how the rest of the system operated, and got frustrated when he didn't advance further in his career.

Very odd. It was like he only had ever worked on school projects assigned to him, and had no actual interest in exploring the problems we were working on.


In my experience, curiosity is the #1 predictor of the kind of passionate, high-level engineer that I'm most interested in working with. And it's generally not that hard to evaluate this in a free-form interview context where you listen to how a person talks about their past projects, how they learn a new system or advocated/onboarded a tool at their company.

But it can be tricky to evaluate this in the kind of structured, disciplined way that big-company HR departments like to see, where all interviewees get a consistent set of questions and are "scored" on their responses according to a fixed rubric.


That does not even sound like a problem? Like, when people are that picky about what exact personality the junior must have, such that good work is not enough... then there is something wrong with us.

When presenting the three projects, I gave pros and cons about each one, like "you'll get to learn this new piece of technology" or "a lot of people will be happy if we can get this working". Absolutely no reaction, just "I don't care, pick one".

This guy claimed to want to get promoted to Senior, but didn't do anything Senior-shaped. If you're going to own a component of a system, I should be able to ask you intelligent questions about how you might evolve it, and you should be able to tell me why someone cares about it.


I am honestly totally fine with a person like that. Sounds like someone easy to work with. I dunno, not having a preference between working on three parts of the system is not abnormal. Most people choose randomly anyway.

Just pick the two you like the most.


>not having a preference between working on three parts of the system is not abnormal.

I suppose it depends on the team and industry. This would be unheard of behavior for games, for example. Why are you taking a pay cut and likely working more hours just to say "I don't know, whatever works"? You'd ideally be working towards some sort of goal. Management, domain knowledge, just being able to solve hard problems.

Welp, to each their own I suppose.


Yeah, a lot of software developers I’ve worked with, across the full spectrum of skill levels, didn’t have a strong preference about what code they were writing. If there is a preference, it’s usually the parts they’ve already worked on, because they’re already ramped up. Strong desire to work on a specific piece of the code (or to not work on one) might even in some cases be a red flag.

What I'm talking about is like asking "do you want a turkey sandwich or a ham sandwich" and getting the response "I don't care" - about everything. Pick something! Make a choice! Take some ownership of the work you're doing!

Why would having an idea of where to direct their career be a red flag?

I didn’t say anything about career direction. I’m talking about what project or part of the project. I have worked with developers who insist that they only want to work on this very narrow section of the code, and won’t consider branching out somewhere else, and that kind of attitude often comes from people who are difficult in other ways to work with.

You implied it here:

>Strong desire to work on a specific piece of the code (or to not work on one) might even in some cases be a red flag.

I understand an engineer should compromise. But if you want to specialize in high performance computing and you're pigeonholed into 6 months of front end web, I can understand the frustration. They need to consider their career too. It's too easy for the manager to ignore you if you don't stand up for yourself. Some even count on it and plan around the turnover.

Of course, if they want nothing other than kernel programming as a junior and you simply need some easy but important work done for a month, it can be unreasonable. There needs to be a balance as a team.


I don't think it's beyond the call of duty to expect someone to acquire context beyond their immediate assignments, especially if they have ambitions to advance. It's kind of a key prerequisite to the kind of bigger-picture thinking that says "hey I noticed my component is duplicating some functionality that's over there, maybe there's an opportunity to harmonize these, etc"

> Just thinking maybe we're not seeing the end of software engineering for those of us already in it—but the door might be closing for anyone trying to come up behind us.

It's worth considering how aggressively open the door has been for the last decade. Each new generation of engineers increasingly disappointed me with how much more motivated they were by a big paycheck than by anything remotely related to engineering. There's nothing wrong with choosing a career for money, but there's also nothing wrong with missing a time when most people chose it because they were interested in it.

However I have noticed a shift: while half the juniors I work with are just churning out AI slop, the other half are really interested in the craft of software engineering and understanding computer science better.

We'll need new senior engineers in a few years, and I suspect they will come from a smaller pool of truly engaged juniors today.


This is what I see. Less the door slamming completely shut, more like the door was enormous and maybe a little too open. We forget, the 6-month coding bootcamp to 6-figure salary pipeline was a real thing for a while at the ZIRP apex.

There are still junior engineers out there who have experiments on their githubs, who build weird little things because they can. Those people were the best engineers anyway. The last decade of "money falls from the sky and anyone can learn to code" brought in a bunch of people who were interested in it for the money, and those people were hard to work with anyway. I'd lump the sidehustle "ship 30 projects in 30 days" crowd in here too. I think AI will effectively eliminate junior engineers in the second camp, but absolutely will not eliminate those in the first camp. It will certainly make it harder for those junior engineers at the margins between those two extremes.

There's nothing more discouraging than trying to guide a junior engineer who is just typing what you say into cursor. Like clearly you don't want to absorb this, and I can also type stuff into an AI, so why are you here?

The best engineers I've worked with build things because they are truly interested in them, not because they're trying to get rich. This is true of literally all creative pursuits.


I love building software because it's extremely gratifying to a) solve puzzles and b) see things actually working when I've built them from literally nothing. I've never been great at coming up with projects to work on, but I love working on solving problems that other people are passionate about.

If software were "just" a job without any of the gratifying aspects, I wouldn't do nearly as good a job.


heh. i have been making software for 40 years, more-or-less.

The last re-engineering project was mostly done when they fired me as the probationary period was almost over, and it seems they did not want me further - too expensive? - and anyone can finish it, right? Well...

So i am finishing it for them, one more month, without a contract, for my own sake. Maybe they pay, maybe they don't - this is reality. But I want to see this thing working live.. i have been through maybe 20-30 projects/products of such size and bigger, and only 3-4 had flown. The rest did not - and never for technical reasons.

Then/now i'll be back to the job-search. Ah. Long lists of crypto-or-adtech-or-ai-dreams, mostly..

Mentoring, juniors? i have not seen anything even faintly smelling of that for a decade..


> I think highly motivated juniors who actually want to learn are still valuable.

But it's hard to know if a candidate is one of those when hiring, which also means that if you are one of those juniors it is hard for you to prove it to a prospective employer.


> Adding to this: it's not just that the apprenticeship ladder is gone—it's that nobody wants to deal with juniors who spit out AI code they don't really understand.

I keep hearing this and find it utterly perplexing.

As a junior, desperate to prove that I could hang in this world, I'd comb over my PRs obsessively. I viewed each one as a showcase of my abilities. If a senior had ever pointed at a line of code and asked "what does this do?" If I'd ever answered "I don't know," I would've been mortified.

I don't want to shake my fist at a cloud, but I have to ask genuinely (not rhetorically): do these kids not have any shame at all? Are they not the slightest bit embarrassed to check in a pile of slop? I just want to understand.


> If I'd ever answered "I don't know," I would've been mortified.

I'm approaching 30 years of professional work and still feel this way. I've found some people are like this, and others aren't. Those who aren't tend to not progress as far.


  > embarrassed to check in a pile of slop
Part of being a true junior, especially nowadays, is not being able to recognize the difference between a pile of slop and useful, elegant code.

It seems so obvious now, but it does make me thankful that my training drilled into my head to constantly ask "what is the problem I am trying to solve?". Communication in a team on what's going on (both in your head and the overall problem space) is just as important as the mechanical process of coding it.

I feel that's the bare minimum a junior should be asking. The "this is useful" or "this is slop" judgment will come with experience, but you need to at least be able to explain what's going on.

The transition to mid and senior comes when you can start to quantify other aspects of the code: performance, how widely a change affects the codebase at large, the inputs/outputs expected, and the overall correctness based on the language. Balancing those parameters and using them to accurately estimate a project's scope is when you're really thinking like a senior.


More to the point, I think part of being a senior is being able to dig up code you wrote a few years ago and say “how awful”

I do think that there's a meaningful difference between writing code that was bad (which I definitely did and do) and writing code where I didn't know what each line did.

Early on, when I was doing iOS development, I learned that "m34" was the magic trick to make flipping a view around have a nice perspective effect, and I didn't know what "m34" actually meant, but I definitely knew what the effect of the line of code that mutated it was (rough sketch below)...

Googling on it now seems like a common experience for early iOS developers :)

https://stackoverflow.com/questions/14261180/need-better-and...

https://stackoverflow.com/questions/3881446/meaning-of-m34-o...

https://khanlou.com/2012/09/catransform3d-and-perspectives/
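For anyone who never ran into it: the "m34 trick" is just setting one field of a CALayer's CATransform3D so a rotation gets a vanishing point instead of rendering flat. A minimal sketch, assuming a UIKit context; the function name and the conventional -1/500 value here are illustrative, not anything official:

    import UIKit

    // Without the m34 line, a Y-axis rotation looks like the view squashes flat.
    // With it, Core Animation applies perspective and the layer appears to flip in 3D.
    func applyPerspectiveFlip(to layer: CALayer, angle: CGFloat) {
        var transform = CATransform3DIdentity
        transform.m34 = -1.0 / 500.0  // perspective term; larger magnitude = stronger depth effect
        transform = CATransform3DRotate(transform, angle, 0, 1, 0)  // rotate around the Y axis
        layer.transform = transform
    }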


Senior level. Still can't sometimes. Just the other day I looked over some code I wrote and realized what a pile of slop it was. I kept wondering "What was I thinking when I wrote this? And why couldn't I see how bad it is till now?" My impostor syndrome is triggered hard now.

>Now I just assume they're taking my feedback and feeding it right back to the LLM.

Seems like something a work policy can fix quickly, if not something filtered in the interview pipeline. I wouldn't just let juniors go around copy-pasting non-compilable Stackoverflow code, so why would I allow it here?


New students are presented with agentic coding now, so it's possible that CS will become a more abstract spec + refine + verify loop. Although I can't make it work in my head, that's what I took from speaking with a young college student.

Some juniors are even using AI for communication in Slack channels or even DMs. It's so uncanny.

I'm staff and that is probably the main thing I use AI for. It's maybe a bit ironic that AI is a lot better at sounding like an empathetic human being than I am, but I'm still better at writing code.

I don't know what world you're living in but software development has always been a cut throat business. I've never seen true mentoring. Maybe a code review where some a-hole of a "senior" developer would come in having just read "clean code" and use some stupid stylistic preferences as a cudgel and go to town on the juniors. I'm cynical enough to believe that this, "AI is going to take your programming job!" is just a ploy to thin out the applicant pool.

Wow, you must have worked in some REALLY toxic places. I had one toxic senior teammate when I first started out - he mocked me when I was having trouble with some of the dev environment he had created - but he got fired shortly thereafter for being bad at his job.

Everybody else through my 21-year career has almost universally either been helpful or neutral (mostly just busy). If you think code reviews are just for bikeshedding about style minutia, then you're really missing out. I personally have found it extremely rewarding to invest in junior SWEs and see them progress in their careers.


Sure have. Finance, research labs, government contracting. Can't wait for people to chime in with their horror stories. I've seen some of the most dysfunctional crap you can imagine.

You seem to have chosen the most toxic (and famous for it) workplaces, and now you're misleadingly claiming that's the whole industry.

It is not.


People usually are not leaving good workplaces, therefore there are many more open positions on toxic teams than on good ones.

Nice attitude. Thank you for proving my point. If someone works in a toxic environment it must be their fault for "choosing it".

Toxicity is spread out and touching most of the industry. Is it fully toxic? Absolutely not. But I found some level of toxicity everywhere I worked for the past 20+ years in this industry.

Sorry you've worked for such nightmare places, but it's far from universal. There are LOTS of good companies and teams out there.

In my experience the style bikeshedding comes about when PRs are not properly scoped. At least I have learned to just say TLDR.

Seriously. I guess I wouldn’t describe it as a “cut throat” thing, but absolutely nobody in 20 years of working has ever given a shit. The idea of being “mentored” is ridiculous. It doesn’t happen.


