I feel like this is how everyone thinks they interview candidates, and that it doesn't really work. I wrote about this at length, specifically so I wouldn't write the same long HN comment every time this comes up. :)
Thanks for writing that up; really interesting stuff. I agree with many of the critiques you make. I suspect it may be easier to implement the sort of standard process you describe in a relatively small organization with a relatively consistent set of products and commensurate challenges than in a large, diverse organization like Microsoft, where I learned to interview. The Microsoft interviewer has the challenge of finding candidates who are a good fit for the specific team they're interviewing for and who also have the general skills necessary to work with other teams in the future. As you say, it's a hard problem.
But Microsoft is a large organization, with deep pockets! Is the time spent crafting meaningful challenges for each team really that much greater than, say, the man-hours each team will have to spend interviewing candidates?
I appreciate that you're tired of writing the same comment over and over, but I'd really appreciate your going into some depth about what "this" is that you're referring to and how it contrasts with your approach. It seems like the original article describes a very scripted, standardized interview that tries hard to put the candidate at ease. If you were to condense your article into a sort-of "Joel test" for interviews, I imagine that this process would score reasonably well.
It's based on the assumption that it's reasonably possible to put a candidate so at ease that they can solve interesting programming problems in front of an interviewer. But in-person interviews are inherently hostile, if for no other reason than that they're timed and adversarial. It's better that we jettison the entire notion that they can be a venue for demonstrating programming ability.
I'm not opposed to interviews. I just think you need to factor out the programming from them.
I'm currently working with a roomful of cognitive scientists on ways to unpick some of the bias from recruitment. It'd be good to have a chat at some point if you're interested.
A very major difference that I noticed is that tptacek's process doesn't turn down a candidate if they don't have the requisite domain-specific knowledge; rather, it sends them a bunch of learning resources and lets the candidate resume the process at any future time. The OP's process would just reject the candidate, forget about them, and move on.
tptacek's process is also going to work a lot better for candidates who aren't comfortable in high-stress whiteboard coding situations. The OP's process is designed around the assumption that the candidate can be made comfortable; tptacek's process is designed around the assumption that some candidates can't be made comfortable, but you shouldn't reject them (and in fact you may desperately want to hire them if you could find out how good they are), so you need to find another way.
Although the OP's article states that he is prepared to offer a computer to anyone who prefers it, how many candidates are confident enough to tell an interviewer that they don't want to do whiteboard coding? My experience is that candidates tend to be incredibly submissive during interviews, so they're not going to speak up about something like that.
Based on the OP's article, I didn't get the sense that they have a rigorous rubric or that they meticulously record objective data points for every candidate to establish a reliable dataset over time.
All that said, I do feel that the OP's example question is pretty good (for that domain of expertise, at least) and better than what I typically see in these sorts of articles.
The number of candidates who have told me that they want a computer rather than a whiteboard is small but not zero.
I do give the same few problems over and over again, and I do take extensive notes, but I am not tracking a specific set of metrics across candidates. I have a pretty good sense of where the "middle of the pack" candidate is. The process could be more scientific, yes.
Do you worry that your brain is really, really good at reassuring you that you're making careful decisions? Mine is, and it makes me worry. I tried to design a process with that in mind.
Yes. There are so many possible biases when interviewing and it is hard to keep them all in mind at once. The practice of science is essentially one long story about overcoming those human tendencies to arrive at the truth.
I started my career being pretty terrible with interviews for reasons pretty well encapsulated in Thomas' article -- general nervousness, social anxiety, lack of confidence (in retrospect, I believe this was primarily an impostor syndrome issue).
I've gotten better at it with age, but the improvement in my interviewing skill is completely unrelated to my improvement in programming skill, though I believe both are still improving into my early 40s.
Speaking on behalf of younger me, and probably others like younger me:
Doing the work on a computer is scarcely better than coding on a whiteboard, and in some ways it is worse. It isn't my computer. It is set up all different, it would take me hours to set it up like mine, the keyboard is all weird, what the fuck is the ctrl key doing over there, fucking Lenovos, all of these are trivial differences in isolation but in aggregate they make the work seem just as foreign as doing it on a whiteboard especially because I already feel like I am "under the gun".
Also (and by far most importantly and relevant even if I had brought my own computer in): I'm being "watched". Even if you aren't actively watching me (or rather, younger me), I am effectively being watched if I'm sitting in some room at your company banging out code. I'm all up in my head about how this is taking too long, how long ago did I start, how long are they expecting this to take, etc, I never get anywhere even close to the "flow" that I commonly get into when I am doing real work, the whole experience is just completely unlike the experience I have when doing actual work and feels like torture.
And then lastly (drifting somewhat out of scope of both the original blogpost and Thomas' blogpost) another issue I've always had with these types of interviews is that I'm somewhat of a subconscious worker, in the sense that when I am presented with a thorny technical problem the way I often best deal with it is by not really thinking about it (consciously), but rather going for a walk, taking a nap or just otherwise zoning out and letting the solution sort of materialize out of my subconscious. This is obviously completely incompatible with any kind of traditional programming interview, where the interviewer is almost obsessively looking to "see the work" in how you're thinking about the problem, and I can't just ask them to let me go take a walk for 15 minutes and get back to them with the solution they're looking for (generally - I'm sure some will respond saying they'd be okay with this, but practically speaking we all know it wouldn't fly in the vast majority of interviews). An extremely common occurrence for me back when I did not interview very well was that I'd be driving home from the interview and would suddenly have completely fantastic answers for all of the questions that didn't go well on-site - a technical interview version of l'esprit de l'escalier.
Having said all of this, I'm not one to suggest that anyone who interviews in any specific way is objectively wrong (for their own needs). If whatever system you use generates enough hires to get the work done, then so be it; but I can really relate to a lot of the problems highlighted in Thomas' article, so at the very least I hope people interviewing with more traditional methods than the ones he has been working toward realize they are certainly filtering out some really great hires (which is maybe okay for them, but, like Thomas, I believe it's ultimately bad for the industry if what we have now continues as the standard).
Partly it was just experience. Interviews (as done poorly in the tech industry) tend to follow certain patterns, the more you do the more you see the patterns and the more you can adjust for them.
Perhaps most importantly (and this is something that is often mentioned by people who give interview advice, but easy to ignore if you're a "meritocracy"-minded techie), I've gotten really good at just sort of taking control of the interview and leading it where I want it to go (while still being sure to display how I would be valuable to the company) rather than being a passive question-answer-er, but getting good at this is also pretty much just down to experience.
Thanks for this. Even when well-implemented, the 'standard process' for interviewing is rather unreliable and stresses the wrong skills.
Worse, it's almost never well implemented. It seems almost designed to crumble under the smallest misapplication - a badly worded question, a dry whiteboard marker, or an unsociable interviewer can destroy a candidate. None of this is conducive to good hiring.
The following is utter pedantry, but probably thousands of people are going to read that page. "Consligeri" (a few paragraphs into section 2) is plural; you meant "consigliere".
Don't forget to update us on the progress and the results after launch.
It would seriously change my worldview if it turned out that Google, Apple, Facebook, Microsoft, Dropbox, Evernote, Amazon, Airbnb, Uber, Square, PayPal and countless other firms that hire their engineers using some variation of whiteboard coding were proven wrong in their practices by one person.
I don't know how to respond to this. I was simply telling the parent commenter, if the work-sample stuff we did at Matasano sounded fun, the thing we're doing now basically extracts and amplifies all the fun we think was in that process and opens it to the public.
http://sockpuppet.org/blog/2015/03/06/the-hiring-post/