[dupe] How to interview engineers (2020) (spakhm.com)
38 points by telotortium on Sept 28, 2021 | 59 comments


How to Interview Engineers - https://news.ycombinator.com/item?id=24754748 - Oct 2020 (178 comments)


Thanks! I didn't realize the URL had changed, otherwise I would have found it.


I’m 75% sure this is a joke. If it’s not, this is the most ridiculous interview process I’ve ever heard of.

Despite what the author says, this type of interview is definitely something you can train for. If it's real and successful, the author has essentially created a test for people who have spent a lot of time in competitive programming, or who spent a lot of time on interview prep.


> has essentially created a test for people who have spent a lot of time in competitive programming, or who spent a lot of time on interview prep.

He's created a test for people like this, or for really smart people, and in fact both tend to be fairly good hires.

If they're smart, they're smart, and if not, they've demonstrated high conscientiousness (the psych term for being organized and hardworking).

I'm not saying 8-hour leetcode interviews are a good idea, but one challenging toy problem like this gives a decent signal that the candidate is at least smart or hardworking (maybe both), which is more than many candidates can demonstrate.


Coding interviews should not test whether you can write a correct program, but whether you can design and break down a problem and discuss possible solutions with the interviewers.

That's what programmers do, and that's what you should check they can do. Not whether you can one-liner some assembly to write a hash into a db that triggers an event in the main application, all in one go.


Does the author claim this isn't something you can train for?

I've given hundreds of coding interviews. I always timed the candidate (with their knowledge), so I have a general idea of the distribution of programming speed, and my guess for this particular problem is that to solve it in 10 minutes, you need both talent and practice... just one isn't enough. So asking people to solve it in 10 minutes ends up filtering out untalented people who have practiced a lot (and talented people who haven't practiced as much).


https://twitter.com/mxcl/status/608682016205344768?lang=en

Rigid interviewing techniques like that can easily weed out great candidates. Your confidence in your own process leads me to believe you may have missed a few great ones yourself.


There are a lot more bad candidates than great ones, and time is limited, so it's usually more important to be able to weed out the bad ones fast.

Max gets brought up in these discussions a lot, but honestly I think the no-hire was a good decision from what I know. Homebrew was a triumph of product design more than technical prowess; I suspect Google might have hired him as a product manager. Max, by his own admission, was not a great programmer[0] but interviewed solely for a software role. I don't have a CS degree either, but I wouldn't hire someone to work on Google-scale software if they couldn't invert a binary tree.

https://www.quora.com/Whats-the-logic-behind-Google-rejectin...


I don't feel very confident in my process overall and I agree this test will produce a lot of false negatives (and said as much in my comment). But, I've never seen someone who had that sort of extraordinary performance on a leetcode style problem go on to bomb the rest of the interview. These extraordinary performers have pretty much always been solid in other areas as well.


I'd much rather my engineers have spent their time practicing their actual jobs and not brainteaser interviews. To wit, the interview questions I ask are usually in line with the role they're assuming.


The author wrote this post in the context of hiring at a "hard technology startup". I interpret that as a place where engineers are expected to be able to do cutting edge computer science research (it's not a "well-established [role] to do specialized tasks"). It's not clear to me what sort of questions you would ask "in line with the role they're assuming" for this scenario.

One idea here is to give candidates an open problem in CS and ask them to solve it. Which is a nice idea, but one of the other things I learned through administering hundreds of interviews is that open-ended questions aren't great interview problems because they depend a lot on creativity/lateral thinking/insight, which highly benefits from being in a relaxed frame of mind. So these questions end up being a test of how relaxed the candidate is.

Stress creates tunnel vision, which is terrible for generating interesting research ideas but OK for solving these kinds of leetcode problems. So checking for a very high level of programming aptitude, as a proxy for CS research ability, is an approach that is a bit more fair to candidates who are stressed out by interviews. (I also think it is a pretty decent proxy, because doing great CS research requires you to quickly & fluidly generate & evaluate algorithms / data structures which might solve your problem, which is a big part of what a great leetcoder does.)

If you're targeting a very high level of generic programming aptitude, it is arguably most fair to make use of a standard method of measuring it. Leetcode problems are the industry standard for measuring programming aptitude. People know to practice them a lot and they know what to expect. If you came up with your own unique way to measure programming aptitude, that would create an even greater burden on candidates to practice (they'd have to do a whole different sort of practice in order to succeed in your interview), and also create anxiety due to an unexpected interview format.

I think a lot of readers are overreacting because they didn't pay enough attention to this bit: "It's applicable if you're building an extraordinary team at a hard technology startup." The vast majority of companies in SV are not doing hard technology and don't need an extraordinary team. People should not feel inadequate if they aren't capable of improving the state of the art in technical areas of CS such as databases. This is ordinarily the domain of PhDs, and filtering for demonstrated algorithmic aptitude (as opposed to academic credentials) is actually a pretty egalitarian approach.


So let's be clear. You're homing in on this block:

> "It's applicable if you're building an extraordinary team at a hard technology startup."

And you're accepting that a timed question about tic tac toe is enough to prove you're capable of being on an "extraordinary team" at a "hard technology startup"?

Really?


Yes, I think to a large degree being able to build hard technology is a matter of being extremely fluent with writing code and reasoning about data structures and algorithms, as I stated. Being able to solve this problem in 10 minutes seems like a hard-to-fake demonstration of such fluency. When I think about CS open problems and when I solve leetcode problems with algorithmic content to them like this one, it feels like I'm using the same part of my brain (or at least there is significant overlap).

If you don't agree with me that the fluency I described is a significant asset to advancing the state of the art in CS, what do you think a significant asset is?


> Yes, I think to a large degree being able to build hard technology is a matter of being extremely fluent with writing code and reasoning about data structures and algorithms

Sure. Again, I don't really think an algorithm that shows up in an introduction to algorithms proves much more than that a person has read "Intro to Algorithms". So again, a timed introductory problem proves some elite technical skill?

> If you don't agree with me that the fluency I described is a significant asset to advancing the state of the art in CS

You're building a cute lil strawman. I think the question is totally out of line with the stated goal. If a college sophomore can answer a question, you're not really assessing much of anything. Also, working at a "hard startup" has nothing to do with "advancing the state of the art in CS".

> what do you think a significant asset is?

If I'm handling hiring for a "hard startup" and am in search of engineers fit for an "extraordinary team", I'm probably going to spend more time finding applicable skills than opening up to Chapter 1 in the closest algorithms book.


Perhaps you and I just have different views about what the pool of programming talent looks like. I think of your average CS sophomore as someone who will most likely struggle with Fizzbuzz in an interview setting. I'd say that if someone is a college sophomore who just read Intro to Algorithms for the first time, and they can solve this problem in 10 minutes, then they have an extraordinary ability to deeply and rapidly master material, and they'll be able to handle whatever challenge you throw at them. (Either that or they started programming way before they started college.)

In my observation the default state of a student reading a textbook is it goes in one ear and out the other. Most students temporarily acquire a superficial understanding of the concepts which allows them to answer test questions and get a decent grade. To see something in the wild and instantly recognize that it's isomorphic to a concept you studied years ago requires a level of mastery/passion well beyond what it takes to get an A. (I'm talking about school in general here, of course the fact that interviews index so heavily on data structures/algorithms ends up distorting things a lot from the baseline. Still, if you solve this problem in 10 minutes you're one helluva sophomore.)

>Also, working at a "hard startup" has nothing to do with "advancing the state of the art in CS".

I think of "hard technology" like rethinkdb as being exactly equivalent to cutting edge stuff that advances the state of the art in some way... again, maybe there's just been a misunderstanding/miscommunication here


I honestly cannot tell if this is satire


It's quite bizarre IMO. It's ridiculous, but not so far-fetched as to be amusing.

I don't know what to make of it.


I certainly hope it is.


If the limiting factor for a candidate who is "passing" your bar is typing speed, then nearly by definition you have asked the wrong question to evaluate the candidate.

Also in the "not sure if this is bad satire or just bad" camp


> The moment the program outputs the correct answer, take note of the time again. That's all you have to do to evaluate how talented the candidate is. The technical aspect of the interview is over. (Yes, you read that right!)

In my opinion that is complete BS (as is much of the rest of the article) - the candidate could have seen the problem or something similar before, or used a particular technique or data structure that suits the problem. While that may indicate general technical ability, it could have been something they spent weeks figuring out last month but can now roll out again quickly.

Never base hiring decisions on a single task or question. Even if they did dreadfully on what you think is a basic topic that everyone should know, don't automatically disqualify them. Mark them down but ask other questions.


This is exactly the kind of "how to" I would expect from Slava. This mindset, paired with a number of his other online posts/interactions, makes me wonder to what degree RethinkDB's work environment negatively impacted its chances for success.


This advice is so horrible I am deep into Poe's law territory. To support my faith in humanity I am declaring it a very subtle satire.


I’m waiting for the 2021 post on software maintainability and legacy code.

For everyone’s questions about satire, the blog is called “zero credibility”


An old discussion but I still think that interview questions like the TicTacToe example completely miss the point. The vast majority of engineers are stopped by questions like: Why does this version of a library no longer work? How do I address this legacy API I need to include in my program?

These interview questions exist only for the purpose of filtering out as many as possible from an enormous number of candidates, not to find the most suitable engineers.


After having interviewed hundreds of engineers, I agree with the premise that elite TopCoder competition programmers are great at more than just brainteasers. It takes a lot of discipline to build the skill to solve problems that fast, so they are smart and hardworking and can ace the design interviews too.

However, the fact is these people are inundated with job offers and very generous swag - like laptops. Everybody wants to hire them. It's hard to compete for one of these candidates, let alone fill an engineering team. Centering hiring around that is an act few can follow.


>Slava Akhmechet Ex-product/eng at Stripe. Founded RethinkDB. Computational neuroscience PhD dropout. Care about building the future

Founded RethinkDB? So is this about how to run a tech company into the ground?


Having interviewed many, many candidates, there is only one rule that I've found consistent - hire someone who you feel is "adult" enough to handle responsibilities and who you feel comfortable being in one room with. And a quick background/reference check on top of it, just for a tick.

Everything else comes third, fourth, etc...


For tech interviews, whether someone cares about my speed is a nagging question in my mind.

I can write code that runs but is messy, uses short variable names, and is generally one big blob that doesn't handle expansion very well, or I can spend time considering future needs and writing sensible variable names.

Not sure which is correct.


It depends. I think your code should be readable - try to use good variable names if it doesn't slow you down, but as long as they're not actively misleading, even code with short names like "x" will be comprehensible at the program sizes written during interviews. Because the interviewer will need to review your code after the interview, making it comprehensible can only benefit you; especially if you don't get it quite right, it's much better if your interviewer can look at your code and see the minor bug fix needed, versus having to take your nonfunctional code as given. On the other hand, don't waste much time on "future needs", at least for hour-long interviews - YAGNI definitely applies here.


It tends to depend on how large and/or elitist the company is and, as a result, how formalized the interview process is. In very rigorous interviews people are, ironically, less likely to dock you for style points unless they significantly slow you down or reflect a generally poor mental model; it will be assumed that you're using a language you're less familiar with. Many people have probably taken the interview, so you'll be calibrated mostly on how far you got and how much help you needed to get there.

In smaller shops it’s going to be more hit and miss because you don’t know the level of the candidates you’re up against, but probably style points, comments, and good structure will be weighed more highly.


This sounds like some linear combination of mysticism, irony and stereotyping.


I find this article funny. Most FAANG interviewers have just average talent.


Offtopic, and sorry, but I have to get this off my chest: Substack's spectrum typeface has too much letter spacing, and it leads to poor legibility. Your brain recognizes word shapes as wholes, and the more space you add between glyphs (this is not kerning, which is between specific pairs of problematic glyphs), the worse the legibility. The typeface looks clean when you do this, and it's perhaps great for display use, but it's absolutely horrendous for reading long prose. Substack has one job. Any typographer worth their salt would chuck this typeface out the window. It is not a subjective matter. If the Substack design team cannot afford a typographer, they should use one of the many excellent free typefaces out there, such as Source Serif Pro.


What OS and browser are you on? It seems quite nice on macOS/Chrome.


Same


> specialization is for insects

Read this and everything else in the voice of Edna Mode from The Incredibles, and it makes sense.


This could be a great setup for a science fiction / fantasy short stories series.


> Take note of the time and let them do their thing. Answer any questions they might have as they go. The moment the program outputs the correct answer, take note of the time again.

Gross.

> If you decide you want to hire the candidate, the interview must last at least six hours (with an hour break for lunch). Have your engineers interview the person, one by one, for about 45 minutes to an hour each.

Gross. The hardest pass.

> First, the candidate needs to feel they've earned the privilege to work at your company.

Gross. Equally hard pass.

> Second, your engineers need to feel they know the measure of whoever they're going to be working with.

Gross. Random engineers aren't qualified to assess talent or fit. Random engineers _may be_ qualified to pick candidates that match their gender/race biases though.

> and more importantly spend time having a little trepidation about how they did.

I would go so far as to say "Fuck you" to this author.


Ha, I read your comment first and thought you were somehow overreacting, but then I read the piece and I agree with you. The author doesn’t seem to realize that the flavor of interview he’s running is deeply trainable, and that one can become very, very good at tic-tac-toe style questions after a year or two of intense study. But who would do that? Well, a whole lot of IOI/ICPC contestants did. So, he’s mostly just hiring for people who joined the same club as, I assume, himself.


It's funny. The older I've gotten and the more code I've written, the slower I am to write code.

When I was fresh out of college, the 'obvious' approach just appeared, and I was off to write it.

Now, I'm slow to peel any problem apart in my mind and decide what the best way to approach it is, based on the language I'm using, what I'm feeling right now, readability, maintainability, etc.

Tic Tac Toe? Interesting; should I store board state as a 2d array, or a single array? The obvious approach would be to try and play every possible game with backtracking, but perhaps instead it would be simpler to just generate every possible permutation of 5 Xs and 4 Os? Would that work; on the one hand a badly playing O might lose after just 3 Xs, but we could still fill out the board if we 'kept playing', so it maps one to one, so maybe! Of course, then you risk situations where both X and O wins, so maybe not? Also, what about rotations; we could reduce our work by ensuring we didn't try to solve for cases that are just rotations of one another, but is the book keeping of that more work than just brute forcing it, given the constrained nature of the problem? Etc etc.
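For concreteness, here's a rough Python sketch of that first backtracking idea - assuming (since the article's exact prompt isn't quoted in this thread) that the task is to count every distinct game, where play stops as soon as someone wins or the board is full. It's just an illustration of the train of thought, not the article's reference solution:

  # Winning triples on a 0-8, row-major board.
  WINS = [(0,1,2),(3,4,5),(6,7,8),(0,3,6),(1,4,7),(2,5,8),(0,4,8),(2,4,6)]

  def winner(board):
      # Return 'X' or 'O' if either has a completed line, else None.
      for a, b, c in WINS:
          if board[a] != ' ' and board[a] == board[b] == board[c]:
              return board[a]
      return None

  def count_games(board, player):
      # A finished game (someone won, or the board is full) counts as one.
      if winner(board) or ' ' not in board:
          return 1
      total = 0
      for i in range(9):
          if board[i] == ' ':
              board[i] = player  # play the move...
              total += count_games(board, 'O' if player == 'X' else 'X')
              board[i] = ' '     # ...and backtrack
      return total

  print(count_games([' '] * 9, 'X'))  # prints 255168 under these stopping rules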

Writing working code matters, yes. But if you're looking for people who think for just a few seconds before typing anything, and then seem frustrated they can't type fast enough, you're optimizing for juniors.

I hope it's all satire.


> every possible permutation of 5 Xs and 4 Os

This would not include all possible end game states. A value is either X or O or empty. It's possible to end with empties.


Which is touched on; you don't need to evaluate game states, you merely need to count them. A game that ended with X or O before the board filled out could still have additional Xs and Os written on it; while that would "lose" the winning state, it's immaterial since you aren't interested in the state, just the outcomes (or, possibly, the ways to get there).


> every possible permutation of 5 Xs and 4 Os? Would that work

  xox
  ---
  ---

corresponds to two different games depending on where you placed the first x, so no


I wasn't actually asking or pondering over it (the answer would in part be determined by what is meant by a "valid game"); I was just presenting a train of thought akin to what I would have in such a situation, examining the question in various lights as to ways to solve it, and generating clarifying questions to ask or ideas to explore.


Board state can also be stored as sets of X positions and O positions.
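For illustration (assuming the usual 0-8, row-major cell numbering), that representation turns the win check into a simple subset test against the eight winning triples:

  WINS = [{0,1,2},{3,4,5},{6,7,8},{0,3,6},{1,4,7},{2,5,8},{0,4,8},{2,4,6}]

  def has_won(positions):
      # positions: the set of cells held by one player
      return any(line <= positions for line in WINS)

  print(has_won({0, 4, 8, 5}))  # True - the 0-4-8 diagonal is covered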


IOI contestants (and people that can memorize lots of difficult algorithms) tend to have high intelligence. If you're looking to hire really smart people and you don't care about false negatives, it's a good way to approach the problem.

The entire point is to hire people that 1) are interested in exploring algorithms on their own time (i.e. they're interested in more than just getting paid for a job, they're interested in interesting problems too) 2) are competitive and want to win 3) are hard working and intelligent (i.e. they had to study for these competitions and do well in them).

You might miss a lot of smart people but it's pretty likely if you ask really hard algorithm questions that you'll filter out any not smart people. If you're reacting badly to this style of interview, that's because you're not their target talent pool.


Actually I’m a fan of algorithms questions! Of course, getting an engineering staff full of IOI medalists is a wonderful situation and will attract more talent. But I and others here are recognizing the role played by practice and grinding on this specific flavor of problem. The author states that intelligence is unchangeable and divides candidates into talent bands - but actually, his test is significantly impacted by factors other than intelligence. His framing is just weird, to be honest. “People who think faster than they can type” isn’t really a meaningful category, yet he’s turned it into a theology of interviewing.


> So, he’s mostly just hiring for people who joined the same club as, I assume, himself.

Not so. From the article:

> First, I cannot myself pass this interview. Last time I tried, I got the correct answer after about forty minutes or so. I could get it down with practice, but it doesn't matter-- I think slower than I type. That's a no hire. The point of the interview is to hire extremely talented engineers, not engineers as talented as me.


I think this is mostly a sign that he’s out of practice - he’s a former PhD student working on database tech, the man has spent plenty of time writing algorithms. I’m claiming that he’s under weighting the effects of practice on how well (specifically, how fast) candidates perform here. When people are coding at typing speed, it usually means they’re not really thinking at all - rather recalling a similar template from memory.


He even says "I could get it down with practice" - it's quite astounding that he doesn't get the importance of preparing for this specific type of test. I did competitive programming in high school and was decently good at it (got to IOI, got some medals). In my first year at uni they wouldn't allow us to compete in ACM, and by the second year I felt like I was competing with my hands tied due to the lack of practice. One year of pause is all it took.

There's _some_ truth in some of the things he says, though, even if HN doesn't appreciate it. Anecdotally, I have a friend who refused a Google offer (when Google was much smaller but still a big-ish name) and went for a startup because the interview problems at the startup were very difficult and he figured they would have interesting problems to solve. I took note of that and made sure my interviews were as difficult as the candidate could take - like, go progressively harder until the candidate is stuck, then back off. This works fairly well, especially with just-out-of-university hires; there's a certain type of person who notices it and likes being challenged. And they're often very good employees. (Of course, what the author suggests in the article is WAY over the top - I'd agree his process is broken.)


"Gross... Fuck you"

Serious question... how did this kind of fifth-grade playground insult become normalized on HN? It's a blatant violation of the guidelines https://news.ycombinator.com/newsguidelines.html and yet somehow comments like this get upvoted.


You're welcome to flag my post if you think it's inappropriate.

The author of this post is either ignorant or malicious. Some of their behavior is so ignorant or malicious that it warrants direct adversarialism. This blog post is gross.

Curious, what about my post included a "playground insult"?


That's obvious: "Gross... Fuck you".

Would you please not post like this to HN? It's not what this site is for. No matter how strongly you disagree with someone about $topic, it's not ok to dump poison into the ecosystem.

If you'd please review https://news.ycombinator.com/newsguidelines.html and stick to the rules when posting here, we'd appreciate it.


Doesn't read as fifth grade to me. It just has a little spiciness to it, and I appreciate that. Think about it: the person could have linked and linked and linked to other articles to back up some point of disagreement. But instead, they shared their own opinion - relatable, easy to follow, and refreshing.


Please don't post like this to HN. It's directly against the site guidelines, which ask you not to fulminate or call names. Maybe you don't feel you owe people who you consider wrong about engineering interviews better, but you definitely owe this community better if you're participating here.

I'm sure you can make your substantive points thoughtfully, so please do that instead.

https://news.ycombinator.com/newsguidelines.html


I would also go that far


[flagged]


Qualifications != intelligence. Everyone has pre-existing biases.


> What a nasty thing to say a random/typical engineer is too stupid to assess talent and is basically racist…

Quote where in my post I said these things. Otherwise, please don't misquote me. It's text, after all - it's pretty easy to copy and paste.


This is good. I'm reminded of my need to improve my knowledge of programming itself. I know the feeling of being held back by my typing speed, but also by the limits of my knowledge of my programming environment.

Edit: corporate bullshit aside, this is a great way to find good engineers. The manipulative tactics and social standards don't sit well.



