Notes from an Interviewer (devwithahammer.wordpress.com)
127 points by thunderbong on Feb 24, 2024 | 190 comments


I am also doing lots of interviews, and I try to stay away from theoretical questions about SOLID, OOP, the difference between a GET and a POST, what is an inner join, or similar.

Most of the seniors I interviewed were acing questions like these (they had probably answered them dozens of times before), and there's no way to differentiate them once you get perfect answers. As the author of the blog post mentioned, some of them don't even listen to the question to the end before reciting stock phrases about Single Responsibility.

Some people will disagree, but for us there are a few factors that predict whether a candidate will be a good fit:

* The ability to solve easy and medium leetcode-like challenges and explain his choices. Nothing fancy, no dynamic programming or graph theory. This tells us if the candidate knows how to write code, which in most cases is at least 50% of the job.

* His academical record. It's a good predictor, because people who actually did good in school and finished their homeworks have a good work ethic. Of course, we are not recruiting PHDs to do backend work, but people who did decently well in school are usually nice to work with. There are brilliant dropouts, of course, but they are not the norm.

* We test how opinionated the candidate is when it comes to technology by asking outrageous questions (relative to the status quo): Why is dependency injection bad? We like people with opinions, but we find it difficult to work with "evangelists".

The rest is commentary (at least in our case).


> “His academical record”

Here at the General 1950s Engineering Corporation, we hire clean-shaven men who did well in college, are dedicated breadwinners for their family, and killed at least five enemies in the war.


I get the joke, but it's not like that. We have a few metalheads in the office. They did kill their fair share of dragons in video games.


I appreciate the ambiguity of your reply because I was expecting it to be “It’s not like that, we do hire women” — but instead got: “It’s not like that, we do hire metalheads.”


I do get your point, but are you under the presumption there are no women metalheads?


As a woman who attends metal concerts, I can tell you the scene in the USA is 90% guys. It's significantly more gender-skewed than programming is.


> We test how opinionated the candidate is when it comes to technology by asking outrageous questions (relative to the status quo): Why is dependency injection bad? We like people with opinions, but we find it difficult to work with "evangelists".

This is a great point. The more senior someone is, the more often their answer should end up being 'it depends'. Evangelists often end up not following the standards set in an existing project, causing all sorts of maintenance issues later. They also often seem to be arguing for rewrites because the existing code doesn't fit their opinionated view of the world.


> It's a good predictor, because people who actually did good in school and finished their homeworks have a good work ethic.

Wow, you actually ask for their transcript? I don't think I've run across this after my first job (an internship), thank god. The main issue with this, of course, is that people are generally much more motivated by pay than by grades and foisted requirements, so you're going to get a very noisy signal from this.


I’d rather have people who are internally motivated to do good work. I don’t look for people who get good grades for the sake of getting good grades. But people who have good grades as a byproduct of just wanting to do well. I think it’s a pretty good signal for me.


> I don’t look for people who get good grades for the sake of getting good grades. But people who have good grades as a byproduct of just wanting to do well.

What is the distinction? Surely an "internally motivated" person would be driven to learn regardless of grades.

Anyway, I'm fairly certain nobody will ever work for you out of simply "internal motivation".


I'll use an example outside my field. I have a friend who's a mechanical engineer. He's smart and pretty talented, and is a real hard worker. But the reason he's a mechanical engineer is that he wasn't good at theoretical physics and by going "down" a step, he was able to apply all the physics ideas to mechanical engineering. So he was able to do well in the field. But it's not what he "does for fun".

There are also people at his small company who spend every weekend building race cars just for fun. They have barns full of engines and tools (I'm jealous of how much space they can afford out in the rural areas!). The reason that they are such good engineers is that they find it fun. They do it in their spare time for fun. They find solving these kinds of problems fun. So when they go to work during the week, they're having fun, which makes them really good at what they do.

My friend has said to me before that "you can't beat someone who does it for fun", and that's true for just about everything. If you look at the best hedge fund managers, sure, they had more money than I did to get started. But they also find it fun to do (what I think is) the drudgery of learning the space that they're working in.

So back to grades...if you find someone with good grades who sounds like they worked for the good grades, that's fine, they're a hard worker and you probably won't go wrong with them. But if you find someone with good grades who just was having fun solving problems and happened to get good grades, then you have a real winner. And I think you need to just talk to people and see what they say. You can tell the "grinds" from the others.

I'm also careful to keep in mind that lots of it is luck: having interest (finding it fun), talent (also being good at it), and value (someone willing to pay for it) doesn't make you a better person, it just makes you good at the one thing that someone wants. For example, the things that I find fun are related to solving certain kinds of mathematical problems with software. I was able to find a company that needs that, but there are only a very small number of places that need what I love. If I had to work for Big Software Co, I'd be ok at it, but wouldn't be a 10x developer. I guess I'm just saying that luck plays a lot into it.


How do you tell the difference?

People who excel academically (in my experience) often have a rough transition to the working world because the feedback loop is so different from what they’ve trained themselves to be successful at. They go from 9 weeks of studying, being declared excellent, moving on to something else, and repeating the process, to a world with a lot more ambiguity in both feedback and outcome.

I’ve gotten to the point where I don’t even consider education (either as a plus or minus) anymore.


I'm not sure how to tell the difference. As an interviewer, I'm decent at picking up subtle negative signals and there are people that I voted against that the rest of the team said yes to. And I have a 100% record of them quitting/being managed out in less than a year. But, there are other people I was pretty neutral on that turned out to be awesome. So that means I need to get better at interviewing and finding positive signals.

But I guess your comment about people who spend 9 weeks studying is exactly the people I'm not looking for. People who talk about how hard school was because they were "studying" all the time are the people who are working for the grade, not just doing stuff they love.

I know that talent and interest are not evenly distributed in the population. And if you actually are good at something you're interested in it's also vanishingly rare that it's something someone will pay you for. So I'm not trying to make the "if you love what you do, you won't work a day in your life" argument. I'm just saying, that there are people who are interested in something, they're also talented and do good work, and it's something that someone will pay for. Those are the people I'm looking for. It's unfair, but true.


We live in a small country with few decent high schools and universities.


> theoretical questions about SOLID, OOP, the difference between a GET and a POST, what is an inner join, or similar.

The last two of those don’t strike me as particularly theoretical.


“Theoretical” only applies to the first item in the list. The list is generally about questions related to recalling the meaning of terms.


Academic record? I’ve worked with a 4.0 from Cornell who couldn’t follow simple instructions, and watched a 3.5 high school student (who maybe quit college later) ask a critical question that shaped the end game of a research project.


It does not mean that people who did well in school or college are guaranteed to do well in a particular job. There is a correlation though, as well as many exceptions.

I’ve hired in multiple organizations. Where we could hire quickly and fire quickly, we were willing to take more risks. Where it was hard to let someone go or the work had little risk tolerance, we were a lot more cautious and depended on a lot of signals when making a hire decision, including the academic record.


> His academical record. It's a good predictor

Two things:

1) I graduated from college nearly 30 years ago. How is what I did in the 90s a predictor of what I'll do for you today?

2) It could be "Her" academic record. Just saying.


1) it's not black and white.

2) English is not my main language, hard to explain, but we use the masculine form when we refer to "the candidate" (which is a masculine noun). Sometimes I do a bad job at translating my thoughts into English and I get to sound like that.


> 1) I graduated from college nearly 30 years ago. How is what I did in the 90s a predictor of what I'll do for you today?

That's what the 30 years of history since then are for.

> 2) It could be "Her" academic record. Just saying.

It can be "His". Just saying.


“His academical record”.

I dropped out of college young, for complicated reasons. It has never, to my knowledge, been an issue with any potential employer.

And I also disagree with wanting someone with a “good work ethic”, as you say. People with a good work ethic will grind out mediocre code year after year. I want a creative lazy coder who will find a smart solution or piece of automation to save himself work down the road.


I've worked with lazy coders and it sucks. It really does. They are not diligent about listening to users, about following standards, about testing, or being thorough. It works in the short term but only for them, not for the team.

I despise this trope of lazy coders. The best engineers I've ever worked with were thoughtful, efficient, and hard working. They got things done I didn't know were possible, and then did it again the next day. They write books, develop open source libraries, and speak at conferences. Lazy people don't do that.


> * The ability to solve easy and medium leetcode-like challenges

What about Fermi estimation problems? The internet seems to have turned on them in recent years, but they are a good test of intuition and general problem solving skills.


In four of the last five jobs I have held over the last 30 years, I have been the "last resort", the one who can find a creative, elegant, cost-effective solution when everyone else is hitting dead ends or just sitting there spinning their wheels in the mud. And I go utterly blank when confronted with a Fermi problem.

People who are good at Fermi problems may be good problem solvers. But people who are good problem solvers may not be good at Fermi problems.


> medium leetcode-like challenges

People here like to hate on it, but if you can't solve a problem where a good enough solution uses standard data structures in a halfway obvious way, you're gonna have a hard time in the job.


> * His academical record. It's a good predictor, because people who actually did good in school and finished their homeworks have a good work ethic.

Way to inadvertently eliminate basically anybody who is neurodiverse. Most of the great engineers I know, myself included, had absolutely shit university records because brains don’t always do what they want to, and university is rarely a motivating factor for people. As the other commenter mentioned, rarely are the motivators the same between university and a job.

> Of course, we are not recruiting PHDs to do backend work, but people who did decently well in school are usually nice to work with.

What does someone’s grades have to do with how nice it is to work with them? I know people that aced their classes and are complete sociopaths, and I know people that dropped out of high school and are lovely to work with.

> There are brilliant dropouts, of course, but they are not the norm.

You say they exist, but you’d never consider them according to the rest of your comment.


As much as I hate looking at colleges and grades, and try not to let them influence my hiring decisions, I see a correlation between both quality of school and academic record and job performance. Correlation does not mean causation, and it’s a heuristic and not a rule. Many individuals are exceptions to this heuristic. However, when hiring and faced with a lot of applicants, we need to make decisions based on whatever signals are available.

One example is sloppy resumes with grammar and spelling mistakes. What does spelling have to do with coding? Both require care and attention, and your resume reflects how much care and attention you dedicate to things.


I do agree about the virtues of a resume that shows that it has been put together with care and attention.

But don't forget that while coding, the IDE, compiler, or whatever will correct your spelling mistakes in a very short feedback loop, with hardly any penalty for your output rate. That might mean that your dyslexic super programmer has never learned the value of carefully going over each text before submission.


> has never learned the value of carefully going over each text before submission.

Exactly. Proofreading and double checking is a valuable skill/habit. I would consider the programmers that learned this to be better programmers. I’d venture a guess that those that learned this early on got better grades. I can use the grade as an aggregate proxy score of a bunch of different abilities and habits. Many of those are helpful at work.


There's some good advice here but I take issue with this statement:

> I used to ask ‘tell me one of the SOLID principle you strongly agree or disagree with’ but I had to stop because it ended up with the interviewee listing/describing the SOLID principles rather than critiquing them.

How many people have internalized every part of SOLID such that they can immediately start critiquing one of the items?

I learned about SOLID in some book 10+ years ago and I'm going to need a minute to remind myself what the acronym stands for. It's a sign that I'm taking your question seriously rather than just giving pros and cons of the first random thing that pops into my head.

I think this is just a bad question and that it reflects poorly on the author that they automatically assumed the fault lies with the majority of interviewees.


The author seems like a reasonable person. IMO the best response is something like: "I don't remember the acronym specifically, remind me of the principles and I'll tell you which I feel are the most and least important."

Interviewers aren't perfect. Most are happy to be asked clarifying questions.


That’s exactly my answer every time.


This is a gotcha question, but all things considered, a pretty good one. "I don't remember/know what SOLID is" is the wrong answer, and an easy way to screen out candidates.

No, this isn't about rote memorization to pass an interview. It's about meeting basic qualifications to be a software developer. I can't imagine any reasonable technical manager wanting to hire an experienced dev who doesn't know SOLID.


I find that genuinely absurd. Knowing SOLID is very very far from being a basic qualification to be a software developer. It's a specific piece of jargon that's used in a particular type of place that goes for the whole Java/Agile thing for the most part.

I have been a software developer for >25 years now and I have literally never found any of the SOLID principles that useful. Barbara Liskov herself says the Liskov Substitution principle is supposed to be an informal rule, not an absolute principle [1], for example.

I would consider any place that thought knowing SOLID was an absolute requirement to joining to be a place I would not want to work.

[1] https://www.youtube.com/watch?v=-Z-17h3jG0A


> I have been a software developer for >25 years now and I have literally never found any of the SOLID principles that useful.

My guess is you actually apply them and find them useful every day that you work, you're just not thinking about them as "the SOLID principles", they're just the things you've learned through experience that make software easier to maintain.

I don't think it's a red flag if a candidate is unable to regurgitate the acronym, but it is a red flag if a candidate doesn't understand that it's generally bad to mix concerns, or generally bad to write software in a way that forces you to modify existing modules to add related functionality instead of extending them, or that it's generally bad to make your interfaces (lower case i) wider than they need to be, or that it would be generally bad to write a child class that can't be substituted for a parent class, or that it's generally bad for a unit to construct its own dependencies.
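To make one of those concrete, here's a tiny Java sketch (names entirely made up) of the "unit constructs its own dependencies" smell versus receiving the dependency behind a small interface:

    // Toy sketch, all names invented: a unit that builds its own dependency,
    // so you can't test it or swap the transport without editing it
    class Notifier {
        private final SmtpClient smtp = new SmtpClient("smtp.example.com"); // hard-wired
        void alert(String user) { smtp.send(user, "hello"); }
    }

    // The same behavior, but the dependency comes in from outside behind a
    // small interface -- which is what "depend on abstractions" ends up meaning
    interface Mailer { void send(String to, String body); }

    class SmtpClient implements Mailer {
        private final String host;
        SmtpClient(String host) { this.host = host; }
        public void send(String to, String body) { /* talk to the host */ }
    }

    class BetterNotifier {
        private final Mailer mailer;
        BetterNotifier(Mailer mailer) { this.mailer = mailer; }
        void alert(String user) { mailer.send(user, "hello"); }
    }

Nobody needs the acronym to see why the second shape is easier to test and to change.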

They became principles for a reason, they didn't just come to be so that interviewers can ask about them.


I gave up on OO programming over ten years ago (twenty maybe).

So the intellectual exercise leaves me cold. I, too, have read them (SOLID) dozens of times and can never remember them.

I just had another look and the Liskov one reminded me: class derivation is bad, one of the reasons OO is so awfully hard. So, no, I do not remember it.

Sorry Uncle Bob. Love your work, disagree with your opinions


That would ironically be a good answer to the interview question


I think application of principles and best practices is often overrated. For one, best practices change over time, which means they were never really "best". Also, principles are just principles: they are rules with exceptions, and they apply within contexts and limitations. I find the DRY principle very useful for keeping code maintainable, but if applied ad absurdum, the code becomes too abstract to even be comprehensible.
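A contrived Java sketch of what I mean (made-up example, not from any real codebase): two routines that merely look similar today get merged into one shared helper, and now two unrelated concepts are coupled:

    class DryExample {
        // Two readable methods that happen to share a formula today
        static double totalWithVat(double net, double vatRate)  { return net * (1 + vatRate); }
        static double totalWithTip(double bill, double tipRate) { return bill * (1 + tipRate); }

        // The "DRYed" version: one abstraction for two unrelated concepts. As soon as
        // tax rules diverge (rounding, thresholds), the flag sprouts branches and every
        // tip-related caller is suddenly coupled to tax logic.
        static double applyRate(double base, double rate, boolean isTax) {
            return base * (1 + rate); // isTax does nothing yet; it exists only to merge the two cases
        }
    }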


IMO an interviewer asking about SOLID is a great sign you should go work for someone else.


> I have literally never found any of the SOLID principles that useful

Single responsibility is a rock solid principle, and it's a shame it's bundled into an acronym that is almost completely about OOP ideology instead of good quality software.


> rock solid principle

I see what you did there :)


Not all of the five “principles” are created equally.

LSP is the most fundamental in my view; it applies anywhere you have a distinction between interface and implementation, and developers should be aware of it (a toy sketch of a violation is at the end of this comment).

DIP is not really a principle, but a recipe of how to invert a dependency when needed. This is also something a developer should know.

SRP is somewhat ill-defined, because there is no objective criterion of what exactly constitutes a “single” responsibility. It’s similar to Unix’ “do one job and do it well”, though. It’s at least a consideration one should be familiar with.

OCP is arguably the most vague one, and also the most impractical if interpreted literally (“don’t ever modify existing code”). In the prototypical example of the Unix file interface (open/read/write/close), it however describes a pattern of defining an interface that abstracts over different (and, importantly, future) implementations (as afforded by Unix’ “everything is a file” abstraction).

ISP is a balancing act. Taken literally, it can result in an explosion of interfaces, one for each single use, in the extreme case. However, it points to the fact that “rich” interfaces create higher coupling between components.

All of this is not specific to OOP. It applies to the design of REST interfaces and to microservices just the same, for example.

While there is no special reason to group these five considerations in particular into one thing (it’s just what occurred to Robert Martin some 30 years ago), it at least gives well-known names to them, which is a useful thing in itself. Like design patterns, the valuable thing is that it defines a shared vocabulary to refer to the respective concepts.

A software developer doesn’t necessarily need to know these considerations by those names, but he/she should have an awareness of these considerations, and knowing some names they are commonly referred to is a useful bonus.
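To make the LSP point above concrete, here's a toy Java sketch (made-up classes, not tied to any particular codebase) of a subtype that compiles fine but breaks the contract its callers rely on:

    import java.util.ArrayList;
    import java.util.List;

    // Callers of List<String> assume add() works; this subtype silently breaks that contract
    class FrozenList extends ArrayList<String> {
        @Override
        public boolean add(String s) {
            throw new UnsupportedOperationException("read-only");
        }
    }

    class Demo {
        static void appendFooter(List<String> lines) {
            lines.add("-- end --"); // fine for any well-behaved List
        }
        public static void main(String[] args) {
            appendFooter(new ArrayList<>()); // ok
            appendFooter(new FrozenList()); // throws at runtime: the substitution broke the caller
        }
    }

The type checker is happy; it's the behavioral contract that got violated, which is why the principle is about semantics rather than signatures.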


It's a specific piece of jargon that's used in a particular type of place that goes for the whole Java/Agile thing for the most part.

A lid for every pot. In such a workplace it probably is a good interview question, at least for a senior role where you’ll be expected to already know the culture, tools, and norms of that kind of place to hit the ground running.


It's possible to understand a concept without knowing what names other people have attached to it afterwards.


> It's possible to understand a concept without knowing what names other people have attached to it afterwards.

This is especially true for self-learners / autodidacts, or anybody else who's struggled with a problem, invented a solution, and only later found out that their solution is one of a few widely accepted solutions to $problem.


Being counted off for not "showing your work" even when you got the right answer was extremely frustrating.


Having a shared vocabulary is useful, though. That’s the whole point of design patterns, for example. That you don’t have to explain what you mean from scratch every time, but that you can say “decorator” and everyone knows exactly what you mean.


I pray that we never ever cross paths professionally then. This is absurd thinking that stinks of green-behind-the-ears hubris. I could easily see that one could have internalised the principles of SOLID to such a degree that they can’t just list them off. Rather, they have ingested SOLID as one of the many sources of information that comprise their own internal sense of professional competency and judgement. In fact, I’d put myself in that bucket, as I’ve proven just now. I had to go to Wikipedia to refresh my memory, but at the same time no principle was new or surprising to me. If an interviewer reacted in the way you describe when interviewing me, it would to me be an extreme red flag, an indicator that my hiring manager would be an inexperienced rote memoriser who puts Uncle Bob on a pedestal in lieu of being able to think for themself. Ridiculous.


I don't remember what it stands for myself and I conducted a good hundred interviews. All of those questions are useless to filter candidates and don't give you any insights either way in my opinion.

What I'm trying to see is the problem solving thinking flow, interpersonal skills, culture fit and the level of product skills. None of the trick questions help on that. We're living now in a world where the small details are a Google search away anyways.


I lead a team of about 10 engineers, and have also interviewed hundreds of candidates over the last few years. I can tell you with certainty that the same candidates who say they don't know what SOLID, KISS, and DRY are, are the same candidates who are poor problem solvers and communicators in the more technical portion of the interview.


I've never heard of SOLID before today. I'm a senior engineer at a FAANG company - about as senior as you get, actually. I've designed operating systems for novel hardware, implemented kernel extensions, written device drivers for hardware that didn't even exist at the time, and debugged the same. I've changed shipping applications to run literally orders of magnitude faster by adopting new technologies, and I've written code in daily active use on hundreds of millions of devices today. I tend to be given exactly the multi-functional cross-team "difficult" system-design problems that you're talking about.

I know of the KISS and DRY principles - I think they're widespread, I've just gone and looked up SOLID and it seems like "sensible object-oriented design", but it's not an acronym I've ever come across in the last 30-odd years of professional software development. I don't think it's in the same category as the first two.


I wouldn't put SOLID in the same category as KISS/DRY. The latter are pretty universally understood principles, but I've never used SOLID despite passing dozens of challenging technical interviews and leading teams across startups and Google.

It's jargon from a subculture of enterprise Java, not a universal term.


> It's jargon from a subculture of enterprise Java, not a universal term.

The name itself might be, but the principles apply to software development in general. You don't need to memorize the acronym, but saying you've "never used SOLID" isn't the brag you think it is. If you can get past the acronym and understand the actual principles I think you'll find they apply to most software you write. Unless you never have to maintain the software you write, then it's super easy to go a whole career without understanding the SOLID principles. It's the "20 years of experience" vs "20 * 1 year of experience". Folks in the former group generally understand the SOLID principles even if they can't name them. Folks in the latter group will insist they don't apply to the software they've written in their long and storied careers of job hopping.


> You don't need to memorize the acronym, but saying you've "never used SOLID" isn't the brag you think it is.

You said that the candidates who "don't know what SOLID, KISS, and DRY are, are the same candidates who are poor problem solvers."

I've written plenty of maintainable software, but there are whole realms of software outside enterprise OOP. I have 0 interest in memorizing specific jargon.


> I've written plenty of maintainable software

This may very well be true, and I'm not saying it is not true, but after working in software for multiple decades, I can tell you that many developers think they are much better than they actually are.

> I have 0 interest in memorizing specific jargon.

This isn't about memorizing jargon. It's about understanding the basics of enterprise software development. It is no more "memorizing jargon" than knowing the difference between private and public interfaces. Do you study and memorize the definitions of private and public methods in order to pass an interview? Of course you don't--this is something you know by virtue of having written production software.

I would be hard pressed to believe you would even consider spending more time interviewing a candidate who can't tell you the difference between private and public methods. "Tell me about SOLID, KISS, and DRY" is another question in the same bucket of fast screening questions.


> This isn't about memorizing jargon. It's about understanding the basics of enterprise software development.

Google3 is one of the biggest repositories of enterprise software in the world.

I've been in dozens of engineering design reviews and SOLID has literally never come up.

> Do you study and memorize the definitions of private and public methods in order to pass an interview?

There's a pretty major distinction: public and private methods are part of the language.

SOLID is a set of buzzwords invented by a random influencer and propagated through a subculture. I glanced at them quickly and the principles seem generally sound, but I wouldn't interview based on knowing them by name.


Does Linus Torvalds use SOLID?

Probably a better programmer than you. Do you know who Linus Torvalds is?

Do you even have a Putnam?


It genuinely sounds like you’ve been working in a particular sub-field for your entire career and have let yourself be convinced that its echo chamber is actually the entire field of software development. You’ve essentially touted the wisdom of one (incredibly far from perfect) tech influencer (Uncle Bob) as being some sort of sacred cow and self-evident truth. If you’ve had a successful career in software development it is in spite of your attitude, not because of it. All you’re asking of candidates is whether or not they follow the same cake-baking instructions as you. You’ve turned some guy’s list of tips and tricks into a religion. That’s not what engineering is.


Well, to each their own. I've never seen any correlation between those and actual tech, product or interpersonal skills in the recruiting I've done.

I'm not sure how knowing the SOLID acronym would make you a better communicator anyway even in theory. Maybe you could argue for tech skills even if I disagree but the other ones aren't even remotely close.

To gauge for problem solving skills, asking for problem solving on examples proved to work very well in my experience, you instantly see the candidate's priorities and thinking flow.

Edit: Also I do think KISS and DRY are much more widespread terms across the industry than SOLID is but I would not use them either in interviews personally


I’d rather see someone code, refactor, write tests, and debug than regurgitate acronyms like SOLID. In other crafts we accept that there is a difference between knowledge and experience. Software is no different.


SOLID is just some guy’s opinionated list of tips, which themselves overlap with KISS/DRY, which are much more fundamental principles. This is either blatant moving of goalposts or (more likely) a glaring sign of inexperience on your part.


I can't say I have the slightest idea what SOLID is, but it hasn't been an issue for me. I even got an offer for a fresh grad position at a Software Engineering tools company around Burbank somewhere, whose name I can't remember, but my memory could be jogged.

Whatever the principles are, I've probably got opinions about them, though, if the interviewer wants to hear my opinions to know whether I've got independent thoughts (I do!) and whether my thoughts are compatible with how the company runs their business (they might be, but I do swing hard cowboy on the process spectrum from cowboy to aerospace; I am somewhat adaptable though).


I'm almost at the point where my hiring decisions actually turn the other way - rattle off the SOLID principles and I feel like you're more occupied with rote-learning random SE bits than with internalising what they were born out of. Every time I see the list of what SOLID stands for, my eyes glaze over because, despite its name, it's not very solid (and frankly often awkwardly vague) advice on its own, in my opinion.


Could you tell me what you think about people over process?


It's become almost a cliche as an interview question. That's the main reason I've thought about it enough to be able to critique it.

I think it's not necessarily a bad question assuming that the candidate can recite it. If they can, they probably should have thought about what it means.

I've met plenty of devs who can't recite it but who do it all instinctively though. That's why I'd rather see people demonstrate these principles while coding without using the term at all.

But, coding interviews are expensive and you need some way of sorting the 100 applicants into the 90 who can be ignored and the 10 who are worth an interview. This is a hard problem I've never found a really good answer to. OP's question isn't worse than most of the attempts I've seen.


> you need some way of sorting the 100 applicants into the 90 who can be ignored and the 10 who are worth an interview. This is a hard problem I've never found a really good answer to.

It's a solved problem but you need to take your engineers offline during the hiring process. This looks expensive if you're trying to increase headcount rather than fill actual positions.


I've also seen plenty of devs who can recite them and then blatantly ignore them when writing code.


This is a bad question in that most devs will have internalized several of the terms without attaching that unconscious knowledge with the SOLID terms. I'm always reminding myself which letter means which 'thing that I know'.

The stand-out one though is (S)ingle responsibility. (All the other letters (OLID) are mostly aspects of how to do interfaces/testing.) It means more than just picking an arbitrary thing and having a class/module do only that thing. It covers the full gamut of naming things being hard. If you understand the problem, and how to decompose it effectively, then those are the things that get separated out as responsibilities. It's easier lower down when you're working bottom-up.

Adherence to any buzzword has a potential for cargo-culting so always be aware of trade-offs and exceptions to the rule(s). An experienced dev should be able to convincingly describe the pragmatic benefits of any particular application of a rule.

A different thing that's rarely discussed which I find more valuable is distinguishing policy vs mechanism. This really gets to the heart of understanding and choosing your abstractions. If you do this part well, the things just name themselves.
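For concreteness, a small Java sketch of what I mean by policy vs mechanism (names invented for the example): the mechanism knows how to do the thing, a policy decides when to stop, and the two can change independently:

    // Mechanism: knows *how* to run something with retries, but not when to give up
    class Retrier {
        interface RetryPolicy { boolean shouldRetry(int attempt); }

        static void run(Runnable action, RetryPolicy policy) {
            for (int attempt = 1; ; attempt++) {
                try { action.run(); return; }
                catch (RuntimeException e) {
                    if (!policy.shouldRetry(attempt)) throw e; // the decision is delegated
                }
            }
        }
    }

    // Policy: the decision of when to give up, kept separate so it can vary per
    // call site or per config without touching the retry mechanism
    class ThreeStrikes implements Retrier.RetryPolicy {
        public boolean shouldRetry(int attempt) { return attempt < 3; }
    }

    // usage: Retrier.run(() -> doFlakyThing(), new ThreeStrikes());  (doFlakyThing is hypothetical)

Once the policy/mechanism line is drawn well, the abstractions (and their names) tend to fall out naturally.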

I suppose then that I would respond to the question by picking one (if I have a strong opinion) and/or critiquing SOLID in general rather than letter by letter.


My answer to this question would go along these lines.

SOLID is a flawed attempt at packaging "clean code" into a formula. The military sounding acronym itself is a stretch, as only the first principle is really of importance. Producing clean code in practice is going to be leaky. That's just a fact of life. It relies heavily on experience and tacit knowledge. I think it's the reason many senior programmers tend to repudiate (consciously or unconsciously) SOLID as their craft matures, and embrace instead a set of guidelines to produce code that (in more or less that order) works as expected, is bug free, is reasonably efficient, and is maintainable (i.e. is legible, understandable, testable, etc).


The author isn’t criticizing that an interviewee might not know the SOLID principles. They are criticizing interviewees who can recite the principles but apparently are unable to express an opinion on them.

The SOLID principles are actually a good example, because some of them are so ill-defined, and they are such an uneven combination of items, that it’s hard not to have second thoughts on them.


It’s just a bit too much for the question, I think: it asks for an opinion on a subject in an already stressful situation, where the candidate is very eager to please and also has their mind in the mode of answering with well-reasoned technical arguments. I can easily imagine, and have many times seen, interviewers propose an opinion question like this only to spend time explaining to the candidate why the candidate’s answer is wrong.

It probably seems like a trap to many candidates and often it is a trap whether the interviewer meant it to be or not.

This isn’t to say that there shouldn’t be opinion questions, but I wouldn’t position it like the author did. At least as it’s presented, it seems like a fairly “big” question in the interview, which candidates will pick up on as being important, and questions of opinion can be nerve-wracking in such a situation.

I also think that what the author is trying to test can be handled better; I am pretty sure the question is a measure of whether the candidate thinks through their positions and can defend them professionally, but I think there are better approaches. For example, pick a common task with many ways to accomplish it and ask the candidate how they’d do it and why that method is useful for them. For me at least, I understand their approach a bit more, and I like to hear how they handle such situations in the way that’s most comfortable for them. That the task has many possible ways to accomplish it helps take the burden off the candidate in many cases, as they likely know there are many approaches, and it usually gets candidates to open up and talk about their projects and workflows. I often learn new perspectives on such tasks, and telling the candidate as much calms them down a lot too, since I genuinely like to hear new approaches. If there are elements I don’t understand, it’s a great opening: “oh, that’s a new approach for me. Can you explain a bit more how this method is useful for you?” (not exact words), which usually is a great way to get candidates to talk more, as they see I want to hear their thoughts, not a specific answer; they know their answer is one of many right ones, so there’s less burden there too.

I think the author’s question is well intentioned but poorly executed.


> How many people have internalized every part of SOLID such that they can immediately start critiquing one of the items?

There was a post here the other day (https://news.ycombinator.com/item?id=39460829) where someone was giving advice about interviewing at Amazon and knowing their "sixteen principles" because "they're not just words, here more than anywhere we live and breathe them". He even went on to talk about how they specifically used them all in interviews, and "bar raiser" interviewers were chosen for their "deep and profound" understanding of them all (there was more than a hint of narcissism and smug superiority in the article, if you ask me).

My thoughts were: "Quick, you work at Amazon. Name all sixteen principles that you live and breathe every day. That people are expected to have a deep and profound understanding of."

(My other favorite quote in that article: "I have certainly referenced Amazon's Principles when discussing parenting techniques".)


> How many people have internalized every part of SOLID such that they can immediately start critiquing one of the items?

You don't need to have internalized every part of SOLID to answer the interviewer's question. You need to have internalized a single part.

I can only remember two letters myself: the "S" (the Single Responsibility Principle) — which seems like the unforgettable letter in the thing — and the L. That would suffice to answer the interviewer's question.

(I'm not sure I agree with the interview question, as I'm not sure what I, as an interviewer, would hope to glean from it.)


For fun I just asked BingChat/Copilot this question and it really likes the Single Responsibility Principle - it wrote a couple of paragraphs on it and then offered a code example in PHP. I'm looking forward to the day when we can just send our AI avatar to interviews, it'll save so much time and stress. ;-)


> I'm looking forward to the day when we can just send our AI avatar to interviews, it'll save so much time and stress. ;-)

We already can! I recently went through the "can you code?" phase of the interview pipeline and they had me use a shared code IDE thing that ... had chatGPT integration.

The prompt was simple and I put a paraphrased version of it into the chatGPT pane and ... the proposed solution was _almost_ correct. The coding question had one small wrinkle that could be accommodated by making a small tweak to the proposed solution.

I don't know if this was "free" gpt or the version that has code execution; I didn't bother pushing GPT to run and evaluate the code / correct its proposed solution against the test cases that the interviewer provided... but it almost certainly would have quickly figured out the wrinkle and adapted.


My big problem with 'Know why, not (just) what' is that, as demonstrated in the responses here, the interviewers typically don't, and if you try, it will sometimes result in them failing you.

At a Google interview I tried to explain that in big-O notation the sequence can only exceed the bound in a finite prefix of the sequence.

Formally:

O(g(n)) = { f(n): there exist positive constants c and n_0 such that 0 ≤ f(n) ≤ cg(n) for all n ≥ n_0 }
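A small worked example of the "finite prefix" point (my own numbers, nothing from the interview): take f(n) = n^2 + 8 and g(n) = n^2, with c = 2 and n_0 = 3.

    f(1) = 9  > 2·g(1) = 2     (the bound is exceeded, but only on this finite prefix)
    f(2) = 12 > 2·g(2) = 8
    f(3) = 17 ≤ 2·g(3) = 18, and f(n) ≤ 2·g(n) for every n ≥ 3

So f(n) ∈ O(g(n)) even though it sits above c·g(n) for small n.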

I actually made the interviewer mad with that, when he asked if there was any reason to use non-recursive search algorithms and I was providing justification.

I still got an offer but turned it down because I need an environment where I can learn where my blind spots are and return in kind with the team.



No, never seen that channel before but if he interviewed at the Fremont campus in Seattle we probably had the same interviewer or same internal guidelines?

It looks like he took that job, I didn't.


On reflection, I am sure the fact that both he and I had backgrounds in solving performance problems, forcing us to dig into the details of big-O, is why we had similar experiences.

Mix with what is probably some standard questions inside Google and it illustrates the problem I was thinking about.

That is one of the better videos I have seen on the topic, so thanks for sharing it.


You describe how it successfully prevented you ending up in the wrong environment, so I don’t understand why you have a problem with it.


The problem is that I hold back now, and when I conduct interviews myself I also have to spend time making sure it is clear that I would love to learn from the candidate.

The trivia game without context of assumptions and constraints is problematic for everyone.


I have been interviewing technical candidates for decades. I have found the most effective approach is to deep dive into systems the interviewee has built. This puts them comfortably in areas they should know, and very quickly reveals how shallow or deep their knowledge is.

Asking random questions about technology mostly tests their ability to memorize facts. I want people who know how to build things in the real world, not those who excel at adding certifications to their LinkedIn.

A parting word - if you are leaning towards Uncle Bob for wisdom, you are heading in the wrong direction.


Definitely agree with the first part. The most important thing you interview for is passion and any discussion that lets the candidate show that more is great.

Hard disagree on the bit about Uncle Bob. Not everything he says is gold, but his take on clean architecture and use-case-driven development is pure magic.


What has he built, other than peddling his books and courses, that we use on a daily basis? I don’t know if he meets the bar of an exemplary engineer by that standard.

Coders at Work was a far better resource in that regard; every person interviewed there has made a significant impact in the world of computing and discusses hard-earned lessons about building software.


> What has he built other than peddling his books

He wrote the books. That's what he did.

What was in the books let me build things faster and with more malleability, which has been a game changer in my career.


This is so rare. I thought that the GitHub repo trend would lead to more of this type of interview, but so far I've hit none. Either people say "cool" and never mention your repo ever again, or they straight up say "we don't care about repos". So strange.


I'd love to know a little bit more about how you do that. I'd typically feel wrong asking a candidate to show code they've written at a previous employer.

Do you just ask them to walk you through the architecture from memory? If so, how do you come up with questions about their choices without the specifics of the actual way they wrote their code?


No, I don’t ask them to show me any code or artifacts, but to simply describe the architecture, how it’s put together. Any novel code or algorithms they are proud of. Areas of the code that were problematic, and how they would fix them if they had the chance. Talk about non-functional requirements and whether the architecture addressed them or not.

Basically see if they can communicate how the system worked, its good points and bad points, and how deep they can go. This quickly separates the good developers from the cargo culters and pretenders. It is also a good gauge of how much real experience they have had.

Generally, if someone can’t describe at least the rough outline of the architecture of an application they have worked on, that’s a strong signal they are going to struggle as a developer.


> Do you just ask them to walk you through the architecture by memory?

Depends on the role, e.g. for infra work I can ask "if you had complete autonomy to stand up this project again, what would be your first step?" and then we go from there.


Don’t believe GP was talking about previous code, rather previous design. Perhaps diagramming on a whiteboard.


You don't need a whiteboard to talk.


My experience with tech interviews is that the number one most important thing by far is to be a leetcode master while being friendly and personable. Everything else is secondary or tertiary to the ability to regurgitate perfect hyperoptimal solutions to leetcode mediums and hards in twenty minutes.

This won’t cover every single company as there are edge cases, but it’s the most scalable approach covering a huge swathe of companies, including many of the most competitive ones.


I had to look up what SOLID is, and it kinda goes against most of what we’ve learned about software development since the 90s. When people say they don’t want algorithmic “leetcode” interviews, I wonder if this is the alternative they want? 1999 style OOP Java trivia?

Is the idea that you’re supposed to provide good discussion of the pros and cons, and so show some experience? That sounds like you’ll just hire people who had similar experience to you.


SOLID is a meaningless term in my corner of the industry (firmware, where we follow the SPAGHETTI method), so I read SOLID as a placeholder for whatever the acronym-of-the-week is. With that in mind, I came to a totally different conclusion. I want to ask questions that most interviewees get wrong, because that's how you narrow down a large pool of candidates to a small pool of finalists. Augmented by a reasonable quantity of questions, pointed questions like these can serve to isolate a signal from noise.


> firmware, where we follow the SPAGHETTI method

I used to write GPU device drivers. Electronic engineers had a great understanding of the hardware, but tended to write sloppy spaghetti code. CS graduates, on the other hand, had a loose grasp of the hardware but were better at writing solid, maintainable code. Most teams ended up with a healthy mixture of the two, which worked great.

I once ran into a hiring manager who only wanted to hire people like himself, with the same skills and philosophy. I tried explaining why we needed people with a more diverse set of skills and strengths, but he wouldn't budge. Unsurprisingly his team didn't deliver much value to the company.


It's a weird field in that regard. I'm a CompE, but I started college as an EE and switched pretty late. My skillset is more closely aligned with the CS folks, but my formal education and training was by EEs, for EEs. I still do "hardware stuff" for fun.

The best firmware teams I've worked on have always had a mix of disciplines. Heck, I mentored a mechanical engineer for a while (he became a very good firmware dev). You have to have good programmers who manage to write modular, maintainable code even if the language and hardware limitations fight them the whole way. You need folks who understand Electrical Engineer-speak. Depending on what you're doing, you might need HDL people, and that stuff might as well be a completely separate discipline from regular software. You need people who are comfortable debugging with nothing more than log messages, and if you're lucky, a coredump. You need a manager who can wrangle the occasional "it's a hardware problem" vs "it's a firmware problem" debate.


Yep, that matches my experience. There was a lot of Feynman debugging as well. It's a good job for somebody who is eager to learn new stuff and get out of their comfort zone. Another great thing about it is that people tend to stick around for a long time, so you build good friendly relationships with your coworkers.

The biggest downside of driver development is that it becomes very repetitive after a while: you are the maintainer of this immense, complex piece of clockwork (the hardware) that you must keep ticking reliably at all times by working around all the bugs in it. It sometimes feels like your beautifully crafted software contains as much code to track and work around hardware bugs as it contains code for the idealized hardware described in the original documentation.


My first job was on working on drivers for a proprietary OS. Every time a new board came out, the protocol was to copy the code for the most similar board and start making modifications. It works, but there's a whole lot of repeated logic for common init sequences and logical mapping of memory addresses to what they control. The "simulator" that was widely praised for cutting down on spin-up time for a new board was nothing more than stub code. It all worked and was maintainable, but everything was harder than it needed to be.


> Every time a new board came out, the protocol was to copy the code for the most similar board and start making modifications

I only saw that happen at one place. It was a bit of a shitshow.

Big corps don't do that, they have a clear internal division between various levels of hardware-independent and hardware-specific components, down to workarounds for specific bugs in specific releases.


This is almost a word-for-word description of the dev workflow for a proprietary embedded OS I worked on as an intern. Even the "simulator" was what you describe - some #ifdefs that turn the program into a cli program using Win32.

This worked for us because our approach was to "make it right the first time, update only when necessary."


It's not completely meaningless but it does tend to be something people recite and then never think about - even when they're inadvertently following it. It's performative - a bit like reciting the bible.

Take I as an example - "don't depend upon interfaces which you do not use". I've seen good developers cut out dependencies hundreds of times for this reason - not because they've trained themselves on SOLID but because it just feels right after years of experience. If I said "this is part of SOLID" many of them would go "... is it?" "hmmm. I guess it is..."
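A tiny made-up Java version of that instinct, for anyone who wants the concrete shape of it:

    // A "rich" interface: anything that only needs to read users is still coupled
    // to save/delete and to every future change to them
    interface UserStore extends UserReader {
        void save(String id, String data);
        void delete(String id);
    }

    // The narrow dependency that actually gets used
    interface UserReader {
        String load(String id);
    }

    class ReportGenerator {
        private final UserReader users;            // not UserStore
        ReportGenerator(UserReader users) { this.users = users; }
        String report(String id) { return "report for " + users.load(id); }
    }

Most experienced devs would trim ReportGenerator's dependency down to UserReader without ever thinking of the word "ISP".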

As with reciting the bible, just because somebody recites it doesn't mean that they took it to heart and people who take it to heart can't necessarily recite it.

Being able to recite it is just a ritual used as a social signaling mechanism. It's a bit like leetcode, an ability to recite big O notation or knowledge of git's internals - symbols of "developerness" that don't necessarily align with skill.


I'm not crazy about SOLID principles either. They don't feel like useful, practical ideas to me. They were born to be memorized for a classroom test or a job interview. SOLID does come up in interviews, though.

Most companies don't really invest in trying to do good interviews. Rarely is there anyone motivated (or permitted) to think about the process and make it a lot better or different. Doing what they've always done, or what everyone else seems to be doing, is good enough.


DDD, Microservices, Clean, Hex, Onion, TDD, XP, EDA.....

All of those follow the basic SOLID principles.

The problem is the cargo culting and blog posts vs actually reading what the stuff means.

S, or the single responsibility principle, as an example, is always explained poorly by the same people who complain about it.

SRP means that "A module should be responsible to one, and only one, actor."

An 'actor' being a group that requires a change in the module.

Making sure you decouple persistence code from domain code is an example. Almost universally, the people who complain about SOLID think it means that you have to have trivial functions with lots of sprawl.

Really it is about allowing you to make changes without stepping on another's toes or trying to coordinate with them.
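To put the persistence/domain example in code (a made-up Java sketch, nothing canonical about the names):

    // Domain code: changes when the business rule changes (the "pricing actor")
    class Invoice {
        final double net;
        Invoice(double net) { this.net = net; }
        double totalWithVat(double rate) { return net * (1 + rate); }
    }

    // Persistence code: changes when the DBA or the schema changes (a different actor),
    // so it lives behind its own boundary instead of inside Invoice
    interface InvoiceRepository {
        void save(Invoice invoice);
    }

    class JdbcInvoiceRepository implements InvoiceRepository {
        public void save(Invoice invoice) {
            // INSERT INTO invoices ... (schema details stay here, out of the domain class)
        }
    }

The pricing people and the schema people can now each change "their" module without coordinating every edit.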


Mantras like these are always a response to something. Someone in charge of technical culture at $PLACE diagnosed specific anti-patterns in their Java code, came up with a set of rules for the noobs and a catchy acronym. Things got better. Great!

But then those rules got transposed into other situations, possibly by somebody else and things went downhill. Also see "Agile".

Our industry, because of its huge growth, is filled with inexperienced people who learned from inexperienced people. We all know software is kind of trash, and we're coming up with ways to improve it, but the fact of the matter is we don't know what works. We know some things that don't work, and many of those were previously on a list of things that might work and had catchy acronyms of their own. A few people have enough experience to give good advice, but what they have to sell isn't a magical silver bullet with a neat acronym, and they don't use twitter, so nobody listens to them.

The only thing I personally believe makes you write better code is long and varied experience with different domains, fueled by a desire to always be learning. Throw away mantras when they're no longer useful. Work like this for 10-20 years, and you'll start writing sort of OK code. Everything else ends badly.


> Is the idea that you’re supposed to provide good discussion of the pros and cons, and so show some experience? That sounds like you’ll just hire people who had similar experience to you.

This depends far more on the interviewer's listening skills, open-mindedness and maturity. Obviously everyone has their biases, but good interviewers (especially for senior positions) need to be able to evaluate how someone with different skills and perspectives will be additive to the team.


Your comment is involuntarily funny in the sense that 1999 style OOP Java is a serial offender against SOLID, not a prime example of it - mainly against the Liskov substitution and dependency inversion principles. If you have deep class inheritance hierarchies, you are not doing SOLID, sorry.


I’m not sure I agree encouraging inversion of control, more indirection and behavior inheritance is an improvement. In the recommended style of most languages, these are probably big anti-patterns.

On the other hand, I never spent that much time working on old Java stuff, so maybe it is a step in the right direction.


Any time you inherit from a concrete class (vs just implement an interface) you depend on something concrete and not an abstraction.


Hmm, SOLID is very well-known and the principles are ubiquitous good practices for object-oriented programming and also useful for software dev in general.

Certainly, asking about them and, especially, asking the candidate to critique them is a very good question.


Perhaps there’s a difference between knowing and having internalized these concepts, versus having heard of them via a particular acronym, and being expected to remember that acronym and these specific names.

“Liskov substitution principle” for instance is something I had to look up. I can read it and say, “oh, duh, yes that’s a key part of what interfaces are even for.” I’ve built with it for decades now.

The term SOLID was apparently introduced in 2004. It would be a shame to reject good programmers who can opine intelligently on these principles, but who have not internalized this particular jargon.


Yes, there is quite a difference in my opinion. What does SQL stand for? No clue, but I can write you an SQL query. Same thing with HTML, CSS, PHP, etc.


That's funny. I wrote SQL daily for 10 years, read the ISO standards when working on SQL transpilers, persistence frameworks for multiple databases, etc. Even wrote a new storage engine for MySQL once.

But I just had to look up what the acronym stands for! Apparently it's "structured".


Isn't that the point?

Even if the candidate doesn't know the acronym, you can probe whether they know and understand, and can be critical of, what the principles are.


It’s generally not what it claims. Single purpose is a good idea, but the rest…

The idea that one class changing won’t change others is a pipe dream.

Substitution only matters if you’re using inheritance, but if you’re inheriting, you likely have bigger problems as few real world problems are naturally represented by inheritance.

The interface segregation rule is redundant with the single responsibility idea. Maybe it should have been SOLD instead.

Dependency injection is a guideline at best. If you actually do what it says, you get enterprise fizzbuzz. It should only be used occasionally and sparingly at specific boundaries otherwise the cure becomes worse than the disease.
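
For what it's worth, "occasionally and sparingly at specific boundaries" can be as simple as plain constructor injection at the composition root, no container required. A rough sketch, with invented names:

    // The only injected dependency is the one at an awkward boundary (here, time).
    interface Clock {
        long nowMillis();
    }

    class InvoiceService {
        private final Clock clock;

        InvoiceService(Clock clock) {   // plain constructor injection, no framework
            this.clock = clock;
        }

        boolean isOverdue(long dueMillis) {
            return clock.nowMillis() > dueMillis;
        }
    }

    class Main {
        public static void main(String[] args) {
            // Composition root: the real clock in production, a fixed clock in tests.
            InvoiceService service = new InvoiceService(System::currentTimeMillis);
            System.out.println(service.isOverdue(0L));
        }
    }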

More controversially, when you avoid inheritance in favor of composition in a language like Java, you lose polymorphism while keeping a leaky abstraction and really bad encapsulation, and the language still retains all the mental and syntactic complexity of the inheritance machinery.

https://github.com/EnterpriseQualityCoding/FizzBuzzEnterpris...


> The idea that one class changing won’t change others is a pipe dream

Reality is never black and white but that's a good principle for good software design and worth at least aiming for.

If anything, that idea has been adopted and generalised in microservices, etc., because encapsulation, modularity, and design by contract all hinge on this same idea, which is powerful and useful.

That's why, when discussing any "principles", the important thing is to get at the core idea behind them instead of sticking to the letter and discarding them as useless.


Encapsulation is extremely important and pairs with single responsibility, but that’s completely orthogonal to OOP.

Once you are solidly in the composition camp, functions are immediately superior. They tend to be naturally single purpose and have mathematical rules that make them easy to compose.
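
A tiny Java illustration of that composability, using nothing but java.util.function:

    import java.util.function.Function;

    class ComposeDemo {
        public static void main(String[] args) {
            // Each function does one thing; composition gives the pipeline for free.
            Function<String, String> trim = String::trim;
            Function<String, Integer> parse = Integer::parseInt;
            Function<Integer, Integer> square = n -> n * n;

            Function<String, Integer> pipeline = trim.andThen(parse).andThen(square);

            System.out.println(pipeline.apply("  7 ")); // 49
        }
    }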

Most function-heavy languages also adopted modules, which provide a better, less leaky abstraction; and that's without getting into all the other advantages in the type system or the better syntactic ergonomics from dropping all the OOP baggage.


Even in Uncle Bob's version of the single responsibility principle, 'one reason to change' is about people.

He words it poorly, but it is more about not commingling DBA, UX, core domain logic in a module.

People miss that and end up having low cohesion.

While SRP and interfaces do have an intersection, they are not the same.

Interfaces leaking implementation details is an example.

Composition, dependency inversion, and dependency injection are all forms of L.

Polymorphism is problematic in most cases but thinking it is the only way to accomplish that looser coupling goal is a misunderstanding of where we have grown with the concepts.

I tend to prefer the ports and adapters pattern, which ensures client components that invoke an operation on a server component will meet the preconditions specified as required for that operation.

That is equivalent to LSP or design by contract.
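
As a rough Java sketch of that shape (all names hypothetical): the domain owns the port and its preconditions, and the adapter at the edge has to honour that contract.

    // Port: an interface owned by the core domain, with a documented precondition.
    interface PaymentGateway {
        // Precondition: amountCents must be positive.
        void charge(String customerId, long amountCents);
    }

    // Core domain logic depends only on the port.
    class CheckoutService {
        private final PaymentGateway gateway;

        CheckoutService(PaymentGateway gateway) { this.gateway = gateway; }

        void checkout(String customerId, long amountCents) {
            if (amountCents <= 0) throw new IllegalArgumentException("amount must be positive");
            gateway.charge(customerId, amountCents);
        }
    }

    // Adapter: lives at the edge and maps the port's contract onto a concrete technology.
    class LoggingGatewayAdapter implements PaymentGateway {
        public void charge(String customerId, long amountCents) {
            System.out.printf("charging %s: %d cents%n", customerId, amountCents);
        }
    }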

SOLID is just a set of principles that are intended to make loose coupling and high cohesion the ideal default.

This is in contrast to procedural code that we know makes unmaintainable systems.

The problem with SOLID is the same as Agile, XP etc...

People ignore the core ideas that are descriptive in nature and try to use them as oversimplified prescriptivist rules.

To be fair, Robert Martin wasn't great at selling this, and that is why I tend to point people to authors like Martin Fowler, who are more careful not to sound like a preacher handing out laws.

But Robert Martin developed SOLID in the context of cohesion, and if you read it in that context it is valuable.

He didn't even invent the SOLID acronym as a mnemonic for the rules, but it seems that blog posts and Wikipedia pages are as far as people get before trying to implement it or dismiss it.


Take a look at Uncle Bob’s code some day. You will then realize that all of his writing was synthesized solely to make him consulting money. His actual code is rubbish. This has been true all the way back to the 90’s with his inane questions to the OOP Usenet groups.


That's also my unpopular opinion about him, I thought I was alone. I watched some of his clean code videos in the past and thought his advice looked dated and very "enterprise" styled.


Back in the 1800's and early 1900's medicine was a very new science (in a modern sense), and snake oil salesmen abounded. Laymen had very little information to discern legitimate medical advice from quacks hawking piss and ink as a cure-all.

We have the same problem now with software systems, although at the least we are advancing beyond the primitive stages and the charlatans are becoming more apparent.


I already stated he isn't like his books personally.

But, tu quoque fallacies aside, can you explain why this is a bad idea:

"A module should be responsible to one, and only one, actor."


The problem is that Uncle Bob made up SOLID and other ideas more or less by cobbling random ideas from other people and selling them on the market as his own. As many others have mentioned here, pieces of SOLID kinda-sorta make sense, but it doesn't hang together cohesively because it was never created in a cohesive manner.

ALWAYS look behind the curtain and compare what software consultants are selling with what they have actually built; then you can ascertain whether they are selling you hard-won engineering knowledge or a bunch of marketing BS.


As I want to learn: outside of the problems with cargo culting, why does having the SOLID principles as the default _ideal_ when programming cause problems?

Sure, he was pulling from structured design ideas, which even Tom DeMarco, the author of the canonical book on SD, now says should have been iterative and Agile-like, but he was also pulling from papers like the following from 1972 about modularity.

https://dl.acm.org/doi/10.1145/361598.361623

The SOLID principles are a specific application of computer science principles to OO.

Separation of concerns is a specific example of the core CS principles that he pulled from, but it comes from Dijkstra.

https://www.cs.utexas.edu/users/EWD/ewd04xx/EWD447.PDF

So is your reason for discrediting him that he aspired to be a professional writer, or that he tried to collect concepts from the wider tent into a more domain specific presentation?

Because structured design is an example where the founders realized that not paying attention to that first paper was problematic.

"We have tried to demonstrate by these examples that it is almost always incorrect to begin the decomposition of a system into modules on the basis of a flowchart.

We propose instead that one begins with a list of difficult design decisions or design decisions which are likely to change. Each module is then designed to hide such a decision from the others. Since, in most cases, design decisions transcend time of execution, modules will not correspond to steps in the processing. To achieve an efficient implementation we must abandon the assumption that a module is one or more subroutines, and instead allow subroutines and programs to be assembled collections of code from various modules."

How about addressing Robert Martin's position, instead of irrelevantly attacking random aspects of the person making the argument, a.k.a. ad hominem fallacies?

In the context of this thread, if a candidate could only express their objections to _ideals_ that are intended to be the defaults, and couldn't express their objections in a way that related to business and customer needs, I would be a hard no on the hiring position.

It is OK to not agree with these ideals, but offering only personal attacks to dismiss them indicates someone who will probably be a difficult team member and lack customer focus.

Ad hominems never address the real concerns or needs of a project and are simply not constructive.

We know that EA failed, we know SOA works better if imperfect, we know that aiming for loosely coupled and highly cohesive code results in systems that are easier to maintain and scale, and we know that with the rise of the cloud most needs will be distributed. All of those apply the same principles at a different scale or domain.

Hand waving away those well known concepts because you have a distaste for an author won't change that. But if you share ideas that address the limits of those ideas perhaps it can help move us further.


>The SOLID principles are a specific application of computer science principles to OO.

There's little to no hard science and serious research behind them, aside from Uncle Bob writing about them and some hazy papers.

The Liskov substitution principle was developed by an actual computer scientist working on such research, but the general field of programming methodology Liskov worked in remains scientifically something like what psychology was before Freud.

Not saying Freud is scientific - saying that programming methodology in 2024 is even less well established and scientifically researched than psychology was, not just at the time of Freud, but even before Freud.

Dijkstra himself, in many of his notes regarding methodology, was basically 90% opinion, even when he was right.


Here is a meta-study showing that complexity, coupling, and size were found to be strongly related to fault-proneness.

https://arxiv.org/abs/1601.01447

There are lots of studies showing that modularity is one of the most significant factors when evaluating the maintainability of OO systems.


Note that even before EA was dead, modularity was still valued as captured by this best practices guide that I am pretty sure is older than 1991.

https://www.washington.edu/is/fin/fas/standards/cobol.html

R.Martin had nothing to do with those rules related to procedural code.


> Substitution only matters if you’re using inheritance, but if you’re inheriting, you likely have bigger problems as few real world problems are naturally represented by inheritance.

No, it matters when you implement interfaces (in the Java sense) too.

Tons of real world problems are naturally solved by having multiple implementations of interfaces. Also, Dependency Inversion doesn't even make sense without it.


It’s well known in a specific setting, possibly? I’ve managed to never hear of it in 18 years.

If asking “well known to some people” stuff is kosher, should I be able to quiz candidates on FFP design principles in Haskell, or maybe how to implement collision detection in a video game? Both of those are well known in swaths of the industry.

I guess if you’re hiring for a specific role, writing OOP business applications, you will have no shortage of candidates who know about SOLID, but you’re probably passing on everyone who is changing domains.


Only ones I ever found worthwhile are D and maybe I.


The only part of SOLID that is unquestionably correct is "L" - it is the only sane way to use subclassing in OO languages.

"D", when used judiciously, can be useful because it can make unit tests much easier to write. But when used too much, it results in incomprehensible code.

But as for "I", in my humble opinion, when you find yourself reaching for it, it's a sign that the code base is already poorly organized.


But the L part is only correct in so far as it repeats the definition of what it means to be a subtype, i.e. it is vacuously true.

By the way, does any OO language apart from OCaml get this right?


The L part was apparently not so obvious to the authors of the Java standard library since it has violations all over it.

Readonly implementations of java.util.List come to mind.
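
For example, the following compiles cleanly but fails at runtime, because the "read-only" view still advertises the full List contract:

    import java.util.ArrayList;
    import java.util.Collections;
    import java.util.List;

    class LspDemo {
        public static void main(String[] args) {
            List<String> readOnly = Collections.unmodifiableList(new ArrayList<>(List.of("a", "b")));

            // Statically this is just List.add(); at runtime it throws
            // UnsupportedOperationException, so substituting the read-only view breaks callers.
            readOnly.add("c");
        }
    }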


S (Single responsibility principle) is very important, not only for OOP but for good, maintainable software design in general (yes, even if you do firmware stuff).


Absolutely. I just find it rather inactionable, as it can be subjective.


Odd because that's arguably the easiest to apply when designing software.


>Certainly, asking about them and, especially, asking the candidate to critique them is a very good question

...if you're looking to hire a code quality consultant that has not written any code since the 90s


If you think those principles got out of date then indeed I would not hire you.

Something like the "single responsibility principle" is just good software design to anyone who ever had to maintain any piece of software.

Quite extraordinary how commenters in this thread discard things as old without even understanding that core principles are as valid now as they were then.


>If you think those principles got out of date then indeed I would not hire you

If you think your hiring criteria are some kind of general yardstick, I wouldn't want to be hired by you in the first place. What's with people using their position of power (hiring people) as some kind of argument - or perhaps threat?

>Something like the "single responsibility principle" is just good software design to anyone who ever had to maintain any piece of software.

The "single responsibility principle" is an ad hoc, ill defined idea. A bad idea for many real world cases, that has to be applied with discretion - not taken as a core guiding principle. Not to mention that neither "reason to change", nor "change" is well defined by Martin.

It's as bad a piece of one-size-fits-all advice as the claim that the ideal function is "two to four lines of code long" (also from Martin) - leading to crappy, hard-to-follow code and needless abstraction.

KISS - now that's a principle I would get behind.


A principle always has to be applied with discretion.

Hence why I think the OP's idea of asking candidates to critique them is good. It shows whether the candidate blindly follows things as gospel or is able to articulate the limitations, but also the reasons behind a "principle", because there are very good reasons behind them.


Sounds like the author really likes their OOP. Nothing wrong with that but I hope the trivia is reflective of the job. I find programming to be mostly manual labor where common sense trumps many clever paradigms.


Follow instructions. I like to say "please respond with short, focused answers. Two or three sentences per question max." The interviewee will inevitably ramble on with unfocused stream-of-consciousness word vomit until I have to cut them off.


Thank you for sharing this and for providing explicit instructions during your interviews.

It's confusing and awkward when the interviewers set it up as a casual conversation. There are no natural conversational cues when you're talking to muted participants who are looking down writing notes and continue to do so for a few seconds even after you stop talking. It feels rude and uncomfortable to just deliver a short statement and stop but I suppose that's probably always going to be the best strategy. I'm new to interviewing and suspect I just failed a final round due to word vomit. I would have much preferred to be working through a coding problem instead of having a "conversation".


One big point I'm missing from this post is "work on your communication skills".

People shit on STAR method or variants, but the fact is that unless you are a great communicator (and most engineers I interview aren't), you will do better by adding some structure to your answers.

And then there are people who just ignore your questions and ramble on and on. Listen to your interviewer and answer their questions directly.


I'd add:

* Do some research on the company and ask a question showing you did so

* Prep your coding skills with koans or similar, depending on the job and how recent your coding experience is

* Have a good reason why you are looking at this company, instead of any other


> * Have a good reason why you are looking at this company, instead of any other

I don't like this sentiment. Are people not allowed to be looking at other companies at the same time? Your company is probably not so unique that people will have legitimate reasons to significantly prefer it over other ones (salary aside). I feel it's out of line for the interviewer to expect interviewees to have an answer for that, when it's not like the interviewer is looking at a specific interviewee instead of any other.


> Are people not allowed to be looking at other companies at the same time?

Of course. But I think if you can think of some reason that you are interested in this company (beyond "I need a job") that is going to help you stand out.

You don't need to pretend that working for this company is your life's dream, but a reason will show you have done some work and are excited to be a team member.

Examples include:

* I like the variety of consulting (for a consulting company)

* I really want to work more with Ruby on rails, because I have done so in the past and enjoyed it.

* What you are doing in the industry is exciting because I think webhooks are foundational and undifferentiated, but hard to get right (for a webhooks company)

Etc etc.

Don't gush, just show a modicum of interest and research.


I didn't read it that way. You could absolutely be interested in other companies but you should be interested in this one as well. Don't show up not knowing what the company does or the basics of the product. We hire people we want to work with, that want to work with us. Knowing who we are shows you want to work with us.


This is canonical advice, but I think it's often taken too far, or over-interpreted.

The vast majority of companies are not special or interesting. And the vast majority of candidates are the same. Making the candidate pretend to think there's something special about your company makes you look foolish.

If you water it down to "You need someone with my skills, you're located conveniently (or remote), you pay reasonably well, and (so far at least) I don't hate you.", then sure. That's fine, but wanting more is often self-deluding.

OTOH, showing some initiative and a vague understanding of what the company does? OK sure. Candidate should have read the home page and maybe About Us. If only to establish that they have determined the company passes the first couple stages of their filter and that the conversation is not a total waste of time.

Addendum: The above is for staff positions, where I interview most candidates. I also do peer interviews at the leadership level, and expectations at that level are definitely higher and include the desire to help set the tone for the organization -- which should trend toward specialness, but presuming that you've succeeded in the eyes of a candidate is, well, presumptuous. :-)


> I used to ask ‘tell me one of the SOLID principle you strongly agree or disagree with’ but I had to stop because it ended up with the interviewee listing/describing the SOLID principles rather than critiquing them.

It sounds to me like this is a good question that you should keep asking. You're typically hiring one person out of a large pool. If only one out of 15 interviewees does well on a question, that's a helpful signal.


As a good test for how important a signal this is, the interviewer should ask it to a few of their top performers.


This is the gem comment in this thread. If all these interviewers are such well-rounded and knowledgeable engineers, how much testing and effort do they put into their questions? Or is this just "cowboy interviewing" based on pet questions that scratch their own itch?


Just because a fraction of people can answer something to your satisfaction, it doesn’t mean it’s a good signal. Eg I don’t think baseball trivia would be a great way to narrow down candidates.

That said, if you need to narrow down the pool and you like the question then go for it. I think it’s important to keep your ultimate goal in mind, which is probably:

- find the desired number of acceptably competent candidates

- who are likely to accept your offers

- without needing to make very large offers

- or spending a lot on finding and interviewing unsuitable candidates

- while complying with any local laws

These can lead to a bunch of different things you see in hiring processes (eg you prefer referrals because of 2 and 4, and candidates may be rejected for being overqualified for reasons 2, 3, and 4)

As a candidate, this often feels very frustrating: you would rather companies follow processes that are maximally fair where so long as you demonstrate sufficient merit, you get an offer. Or even better, processes that do a good job of uncovering the value you bring. I guess one can try to look at the hiring process through this lens to not take rejections so personally, and one can hope that different companies will have different processes (to hopefully reduce how much they compete with each other for the same subset of candidates) though I don’t think they end up being very different in practice.


You're absolutely right - my remark was made with consideration of the specific question at hand. Instead of asking them to regurgitate an acronym, the given question asks the interviewee to provide analysis of an engineering technique. The question gauges an interviewee's reading/verbal comprehension, while also giving the interviewee the opportunity to display analytical thinking skills and an intimate understanding of the topic.


Agreed.

Anyone can learn acronyms, principles, etc by heart and repeat in an interview. The real useful questions are about reasons for them, purposes, limitations, etc. I.e. to test the candidate's critical thinking and experience.


I’ve written a lot of software without knowing SOLID. Maybe I did it all wrong.


IME, SOLID is only an important idea because there are so many ways to do OOP poorly and you need a set of guidelines to help you not turn your OOP codebase into a mess.

You get most of SOLID automatically if you just don’t do implementation inheritance. Many great languages don’t even support it in the first place, and that’s a good thing as it’s a massive footgun. That obviates 3 of the 5 SOLID principles right there: just don’t do implementation inheritance. (Interfaces are ok, just not base classes, ie. Use “implements” but not “extends” in Java, etc.) The other two principles are mostly just common sense anyway.
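
A small Java sketch of the "implements, not extends" point (types invented for the example): composition plus an interface keeps the polymorphism without inheriting anyone's implementation.

    interface Notifier {
        void send(String message);
    }

    class EmailNotifier implements Notifier {
        public void send(String message) { System.out.println("email: " + message); }
    }

    // Composition: reuse EmailNotifier's behaviour without extending it.
    class RetryingNotifier implements Notifier {
        private final Notifier delegate;

        RetryingNotifier(Notifier delegate) { this.delegate = delegate; }

        public void send(String message) {
            for (int attempt = 0; attempt < 3; attempt++) {
                try {
                    delegate.send(message);
                    return;
                } catch (RuntimeException e) { /* retry */ }
            }
        }
    }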

Or heck, maybe just don’t do OOP, as it’s not clear to me that it’s the right approach in the vast majority of circumstances. Make your data first class and design algorithms that work on it, keeping your logic and data firmly separated, and you sidestep the whole issue.


Don't worry!


"Study basic interview questions, listen actively, and have a good attitude."

Why is this on the frontpage.. ?


I've found that interviewers who often give expert advice are the problem, not the interviewee.


Can you elaborate?


My #1 interview tip would be, have a decent mic / internet setup so I have a shot at understanding what you're saying.


Yes, this! In this day and age, if you’re applying for a remote position but you’re still fumbling around, unable to smoothly join a zoom meeting with a working microphone, it doesn’t bode well.


I’ve never understood this issue. We get paid well. A $400-450 Shure sm7b or Electrovoice re20 will last you decades and is an incredible return on investment — way better than your electronics and monitor investments and probably better than your chair and desk investments for most people.

Even a $30-50 USB mic will last a few years and blow away your laptop or built-in camera mic (though the dynamic Shure/EV mics don’t pick up everything like capacitive mics).


For me at least it's about the ergonomics of the microphone form factor. I'm not comfortable with a standalone mic in front of my face. I don't like having to remember to keep the proper distance.

I got a nice high quality broadcast headset that sounded amazing but then I couldn't get the monitoring levels to work well with any combination of audio interfaces. On any given call I might have to boost my headphone volume considerably depending on the other person's audio quality but that would invariably throw off my own monitoring levels.

In the end I got a high quality gaming headset that doesn't require monitoring due to the open-back design. The microphone sounds fine but not nearly as good as the other options unfortunately.

All that to say I spent a few thousand dollars over a few months trying to solve this problem and couldn't find a great solution!


ElectroVoice re20. The "variable-D" design helps you sound decent if you're not keeping exact distance from the mic. The builtin pop blocker is also quite effective.

It was designed so that if your interview target didn't have good mic discipline (most people don't have any), you'd still get a usable recording. As long as you're in a roughly 12" cube in front of the mic, you'll basically be fine.

If you're spending that much money and really want a headset mic, get a countryman and wireless set. You only need to reach a few feet, so a cheap analog system should work perfectly and provide way better sound quality.


Anything is better than a laptop mic. It's amazing how some people don't seem to get the hint after years of remote work. I bought a $30 Razer desktop mic and it's already an order of magnitude better, and pretty sturdy, compared to the $10 lapel mics or laptop mics.


> Anything is better than a laptop mic.

This is not true if you have a fairly recent Apple MacBook. The mic quality especially on the Pro models from the last few years is actually pretty good. I often ask people to switch to that mic in our Google Meets if they are using almost any bluetooth headphones (even high end Bose or Sony ones).


The M-series? My company doesn’t have those so I can’t attest.

My Bose Bluetooth headphones are great except for the mic. I think it's optimized for noise cancellation, so it kind of sucks for voice - and for some reason it forces noise cancellation to high when using the mic, so it feels really weird in my ears - even other people's voices become more tinny somehow.


By the way, is there any easy to use software to make your voice sound better, e.g., crisper or slightly lower, during Zoom/Teams calls? I don't know enough about audio to change equalizer values to something that would sound good.

(I already have a stand-alone mic.)


Serious suggestion: If interviewers want thoughtful answers, they should try sending the questions in advance.


They don’t necessarily want thoughtful answers, they want answers that provide them information about the candidate. Sending the questions in advance would ruin that.


Maybe I don't know enough, but I believe that having a person talk about a few random projects from their CV, with me asking questions relevant to their position, would give me a pretty good understanding of how knowledgeable they are in their field, as well as their confidence and general curiosity (it's OK if they just know their bit, but plus points if they can also talk about design, business, and/or other parts outside of their role).

Of course one could learn that by heart and fool me but I'm also pretty sure people are currently wasting/spending their time memorising acronyms and leetcode just to get someone to give them a chance, so I'd rather have both enjoy the process.


Interviewing is a terrible way to screen candidates, but if you're literally asking questions that appear on "top 10 interview questions" pages, you're not even trying.

This whole thing really comes across as an interview process that is about soothing the interviewer more than it is about collecting empirical data that can be used (and iterated on) to qualify candidates. Be upbeat! Show that you've read the same Uncle Bob posts I have! Question me when it's important that something I said be questioned, but listen carefully and do what I tell you to in other cases!


> Interviewing is a terrible way to screen candidates

so.... how do you suggest companies hire candidates? purely through referrals, without any further screening?


Thomas is a long-time and outspoken proponent of work-sample hiring. Here is his current company's process: https://fly.io/docs/hiring/hiring/


This is not a work sample - Oxide do work samples, where they can take work you already completed. This is a take-home test, at least as described.


Using OPM.gov's definition:

Work sample tests require applicants to perform tasks or work activities that mirror the tasks employees perform on the job.

You can call them "take-home tests" if you want, but that term is more general; lots of take-homes aren't work samples (they're not work samples if they're abstract coding puzzles that have nothing to do with the real work, for instance).


OK sure, semantics aside, and not that I've seen Fly's take-home challenges, but how is it actually meaningfully different from Zoom/in-person interviews in terms of being totally synthetic? Unless you're hiring for a junior role, it seems like an isolated, pre-defined task is a trivial part of the job...

I like your "work day" challenge though.


Honestly, why not?

Obviously there needs to be SOME degree of vetting, but the way senior+ engineers are handled in the market is just absurd. The entire software engineering interview process is still closely tied to college classes, plus maybe a few years of professional work. All with the idea that "hiring the wrong person is terrible", but not considering that there may be thousands of developers who can effectively do the work.

If software engineering were handled like most other jobs, your work experience and references would be like 70% of what is considered. Instead, resumes serve as little more than initial screens at best, with all the weight given to leetcode results, pet questions, and system design questions about how you'd build Uber in 10 minutes.

We act like its the 90s and literally everyone is new to the game. It's time to recognize the job has become a mature profession and we should treat it as such.


“Listen to the interviewer” should also be read as “don’t talk over the interviewer.”

I used to chalk that up to nervousness of the candidate. I’ve since learned it’s a very bad sign.


> If I disagree with you, that’s not an issue. Just like the songs Lennon and McCartney wrote together were better than the ones they wrote after, contrasting opinions on a team bring new insights and better decisions.

This is great, but please tell people this upfront. Not everyone has an overabundance of confidence - especially candidates who come from traditionally marginalized communities.


> I’m sometimes in a practical part of the interview where a certain part isn’t going well. ... At this point I’ll try to offer the interviewee some hints or maybe even just tell them what to move them along. However, on quite a number of occasions this advice has been flat out ignored and the candidates have kept ploughing on with whatever failing approach they were using. To put it bluntly, the initial inability may not have been a deal-breaker but ignoring advice is.

Yes! If I set a problem for you and you're not getting it, it turns into a new interview challenge: can you work with me to get back on track? I'm not a ghoul looking for any chance to hang you. I'm your potential future colleague, and I'm looking to see if we can work together. Not knowing something is fine. Not listening to me as I try to get you unstuck and work with you to get moving again is unforgivable.


Overly rigorous interviews are a problem in the IT industry, articles like this are a proof.

> If you google ‘top 10 X interview questions’ where X is your technology, and a question is on that list you should probably know the answer.

Even if the company is not using that aspect of the technology in their work? What if the candidate has referrals and experience that already presents them as proficient?


On interviewers who are asking about "recommended practices", SOLID, etc...:

Had an interview a while back with a hiring manager for an "Agile Coach" position. He couldn't answer basic questions about what goal he wanted to achieve - that seemed like a weird question to him - yet he was grilling me on SAFe processes.

This is a "normal" case.


I suppose that not knowing what goal they want to achieve could reasonably lead to them wanting to really scale up agility.


The most useful thing about skimming articles and HN comments about interviewing is: Some people think X about interviews, so I need to be prepared for X.

As you pour your energy into simply getting past the ridiculous nonsense gatekeeping.

Once that misdirected energy gets you that job, you're now safe to pour your energy into the real business... of resume-driven development.

In 12-18 months, repeat. It gets easier each time.


SOLID, TDD, etc. is noise from a simpleton uncle. I tried hard to see if he has anything of value to say. He doesn't, really. That doesn't stop him from selling simplistic slogans to get attention.


Another one or two that I feel are very important: 1. Be invested and show you give a damn. 2. (Related to 1) Ask good questions that show you care and are invested.


You should always understand that the reason they are interviewing you is because they are unable to solve their own problem.



