It depends on what you're looking for. If you want someone who can turn the handle on your typical daily task then, sure, test them on your typical daily task. But if you want someone capable of developing solutions to brand-new problems, then it's not so easy, and testing fundamental computer science theory is important.
It’s not. Theory can be referenced. People do not work in a vacuum.
Interviewing in eng is broken, but afaict it's a “worst solution save all others” kind of scenario.
But let's not start deeming these interviews intrinsically important.
Some of the most creative and productive coworkers I’ve had struggled with leetcode-style interviews. They’re a bad tool for anyone who isn't a new grad, and even then.
When you apply for jobs, do you simply look for "engineering" positions? Why am I always applying for software engineering and not electrical engineering? It's all engineering, and theory can be referenced, right? In fact, why doesn't everyone just buy a book and become a top engineer?
The point is not (or shouldn't be) to recite a textbook. The point is that you can navigate your way around the textbooks. I've got both The Art of Computer Programming and The Art of Electronics on my shelf. I could find the sections on sorting a list in seconds. As for the latter, I have no idea why the majority of that book even exists. I can't call myself an electrical engineer, even though all the theory I need is within arm's reach.
I assume you're arguing against the "recite the textbook" approach. I would agree that this is not the way to do things. But equally, "throw the textbooks out" is not the right way either. We should evaluate a high-level grasp of the literature/theory but not punish people for forgetting minutiae. I might ask a candidate to talk about the choice of sorting algorithms. There is, of course, no perfect answer, but what I'll be expecting is a general evaluation of algorithms: time/memory tradeoffs, probing for more domain knowledge (e.g. does the data usually arrive nearly sorted or random), platform constraints, etc. I won't even expect a name-drop of an actual sorting algorithm, as that's not really the point. What they're telling me is that they know why Knuth has a whole chapter on sorting. That's the important thing.
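To make that concrete, here's a toy Python sketch of the kind of reasoning I'd hope to hear (hypothetical function and parameter names, nothing a candidate would literally write):

    # Toy sketch only: the point is the reasoning about constraints, not the code.
    def pick_sort_strategy(n, nearly_sorted, memory_tight):
        """Pick a sorting approach from coarse constraints (hypothetical helper)."""
        if n < 64:
            return "insertion sort"               # tiny input: simplicity wins, O(n^2) is fine
        if nearly_sorted:
            return "adaptive sort, e.g. Timsort"  # exploits existing order, ~O(n) best case
        if memory_tight:
            return "in-place heapsort"            # O(1) extra space, worse cache behaviour
        return "merge sort / Timsort"             # stable, O(n log n), O(n) extra space

    print(pick_sort_strategy(1_000_000, nearly_sorted=True, memory_tight=False))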
This is a false dichotomy. Specific theory whose details are hard (and useless) to memorise can easily be referenced if you are knowledgeable enough in a field. If you know about a red-black tree and the gist of its properties (see the sketch below), you can easily Google the use cases you've forgotten, examples of it, and the algorithms related to it (rebalancing, how it relates to search, etc.). If you had never studied, used, or seen one, there is no way to reference those properties easily.
I'd much rather hire and work with someone who has the skill to quickly assess a situation and rebuild knowledge from references than someone who memorised how to implement tree balancing, so why do we test for the latter rather than the former?
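To illustrate what "the gist of its properties" buys you, a rough Python sketch (bisect here is just a stand-in for a library tree, not an actual red-black tree):

    # The gist worth keeping in your head; the exact rebalancing rules are
    # what you look up when you actually need them:
    #   - a red-black tree is a self-balancing binary search tree
    #   - insert / delete / search are O(log n) worst case
    #   - in-order traversal yields the keys in sorted order
    #   - recolouring and rotations keep the height O(log n)
    # Knowing that gist is enough to recognise when you want one and to
    # Google the rest. The stand-in below just shows the "keep keys sorted
    # with fast lookup" use case.
    import bisect

    keys = []
    for k in (5, 1, 9, 3):
        bisect.insort(keys, k)           # O(n) insert here; O(log n) in a balanced tree
    print(keys)                          # [1, 3, 5, 9]
    print(bisect.bisect_left(keys, 5))   # O(log n) lookup, like tree search -> 2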
Some companies want to test whether a person spent time preparing for the interview. So asking all those quiz questions does make sense even if they are not relevant. At least it shows that the person knows the rules of the game and is willing to invest substantial effort to follow them, even if the rules are arbitrary and irrelevant to day-to-day activities.
Okay. So arbitrary preparation - when nearly any other professional interview requires little preparation beyond updating your resume - has merit because... rules of the game?
Stop supporting baseless metrics for assessment just because some old person used them before you showed up. We can and should do better.
This is how it is with IT companies that pay well above average. Given that they are able to pay such salaries, this interview strategy is evidently compatible with big profits.
It could be that changing the interview strategy to look more like other professions' would increase profit even further, but nobody is risking it.
I can push your argument in another direction: wouldn't this strategy also reveal a huge bias that it is selecting for, one that then shows up inside tech companies? By that I mean the bias of "learning to play the game": selecting for people who will conform to arbitrary rules for the sake of their promotions, who care about playing the game instead of analysing the impact of their work.
And given the recent issues with data privacy and data abuse by the tech giants, I can ask: would we be in this place if the interview processes had selected for more holistic engineers, technically able but unwilling to play the game just for the sake of playing it, engineers who are opinionated and don't conform just for the sake of money?
I know I might be creating a false dichotomy, but I would like to think about what kind of pressure this selection process creates and what biases arise from it. How can we make it better?
Because your argument is the most conservative, pro-establishment one: it works, so don't touch it, just emulate it.
I was not arguing for these types of interviews. My point was that one can rationally explain apparently useless quiz questions. And yes, this creates a strong selection bias toward people who agree to play by arbitrary rules without questioning them.
How do you know that the person is even able to comprehend theory?
> Interviewing in eng is broken, but afaict it's a “worst solution save all others” kind of scenario.
That's your opinion.
> Some of the most creative and productive coworkers I’ve had struggled with leetcode-style interviews.
Good for you. But "slumpt_'s most creative and productive coworkers" is not a good metric for hiring.
> They’re a bad tool for anyone who isn't a new grad, and even then.
Again, that's your opinion. I'm not a pro at those interviews, but studying DS and algos opened up and pushed my mind to its limits like nothing else. Your whole thinking process changes when you start working on this: you start thinking about constraints, performance implications, pros and cons of different approaches. It is called Computer SCIENCE for a reason.
The person who's going to come up with new ideas isn't spending their time memorizing old ones. They learn to index where to retrieve knowledge when necessary, which lets them cover a wider breadth of knowledge. And that breadth lets them pick the best tool for the job at hand, as opposed to the one they happen to be an expert in. Sometimes you need a handyman instead of a master plumber, because they are better able to see the big picture beyond all the shit.
“My best coworkers” is about as good a metric as “things that stretched your mind to its limits.”
The point is that neither has been demonstrated to bear any relationship to jack shit.
Interviewing is and has been broken, even with the changes we’ve made over the years.
If you’re holding up leetcode challenges that make you think hard as representative of engineering prowess, we’re never going to have a reasonable conversation.
Again, it depends on what you're looking for. If the real-world problem is "we need a fast optimising compiler that runs on our embedded platform", then hiring someone who is great at solving problems but knows nothing of compiler theory is going to be very inefficient.