Imo, there are two kinds of programmers: people who can write code to build stuff, and people who can write code to build stuff and are also conversationally fluent in the theory behind writing code. The second group is 5x more useful than the first, and coding interviews are testing which group you're in. Often the first group doesn't think the extra skill of fluency is important, which is fine, think what you want, but they're definitely wrong, and I wouldn't want to work with those people; when there are actual problems to solve I'm going to go looking for people in the second group to figure them out. A terrible situation is to end up with a team made up entirely of people who can code but can't theorize about code, because they'll build a mountain of crap that other people have to rebuild later.
(Now it's true that some people can't theorize quickly, or in front of someone else, or especially in a stressful interview where there's a lot on the line. Those are real issues with the format that need solving. Not to mention the "esoteric trivia" sorts of questions which are pointless.
But the basic objection that "coding tests aren't testing the skills you need in your day job" is absurd to me. They're not the skills you use every day, they're the skills you need to be able to pull out when you need them, which backstop the work you do every day. Like your mechanic doesn't use their "theory of how engines work" every day to fix a car, but you wouldn't want a mechanic who doesn't know how an engine works working on your car for very long either...)
I think there’s another group: people who can come up with solid code by using search tools.
I code, sure, but I will never come up with a custom solution for any non-trivial problem. I know where to find appropriate solutions (the best ones) because I'm aware of what I don't know (I read a lot of tech books). You cannot test this in the classic tech interview (because I would be googling 75% of the time).
What matters in the end is the result: do you want good code or not? How I come up with it should be secondary.
As the problems become harder, you can’t just Google for solutions. Really great engineers often build things that nobody has ever built before — or at least nobody has publicly documented how they built it. If you don’t have fluency in the fundamentals, you won’t be able to piece together the parts that you need to build novel systems.
Second, part of hiring junior engineers is evaluating their growth prospects — e.g. new grads are often completely unproductive for up to a year, and firms make large investments when hiring them (maybe up to $200,000 in mentorship and wages). People with the attitude “I don’t need to learn/understand things, I can just Google them” are unlikely (IMO) to reach that level of seniority.
In my experience, it's very rare that you're in a job that requires you to come up with a solution to a problem no one has ever dealt with before. Custom solutions are often a sign the engineers in question didn't do the appropriate research to find the standard solution for the problem.
I've been a software developer for 10 years, and I've never worked on a problem that someone else hadn't come up with a solution for somewhere. And if they haven't, alarm bells go off as to why I'm the first to do this, and where down the pipeline did I deviate so horrifically from the norm.
I strongly agree with this. I worked on low level algorithms in bioinformatics circa 2010. Writing mapping algorithms and variant detection in C/C++. Most/all of what we did was adapt known compression and database algorithms. The "best" aligner is still BWA (Burrows-Wheeler Aligner), which uses the Burrows-Wheeler Transform, popular in a lot of compression utilities.
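(For anyone who hasn't met it: the transform itself is small enough to sketch. This is the naive rotate-and-sort version, purely illustrative — real aligners like BWA build it from a suffix array instead:)

```python
def bwt(text):
    """Naive Burrows-Wheeler Transform: sort all rotations of the text
    (with a sentinel appended) and read off the last column."""
    text = text + "$"  # sentinel, assumed to sort before every other character
    rotations = sorted(text[i:] + text[:i] for i in range(len(text)))
    return "".join(rotation[-1] for rotation in rotations)

print(bwt("banana"))  # annb$aa -- equal characters cluster together, which is why it compresses well
```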
Could you please give a firsthand account of an instance when a great engineer built a novel solution? I feel NIH syndrome is a far more common cause of building things from the ground up.
I've seen it at least ~10ish times in my pretty short career. I think you're maybe imagining someone building, like, "Linux from scratch". Novel solutions don't have to be that big; they just have to be novel.
Someone I worked with once went off on their own and implemented a test framework that solved a lot of the problems we'd been having. They could have just written tests the normal way; they did it a different way; it was great. Someone else made a debugging tool that hooked into Python's introspection abilities and made flame graphs of the stack traces. Not exactly groundbreaking science, but it was entirely "innovative" in the sense that no one expected or wanted that tool, yet it solved a real issue. Someone else made a JS library that abstracted out the problem of organizing dynamic workflows on an internal-facing tool. Small, but novel, and it organized the ideas in a way that made it possible to build on the abstractions later. For my part, we had this chunk of business logic that was a pain to debug, and I had the thought to factor it out into a standalone library that was completely testable at the interface. Not groundbreaking, but no one had thought to do it, and it immediately made the old issues moot. Etc.
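(To give a flavor of the flame-graph tool: this isn't the colleague's actual code, just a minimal sketch of the idea — sample the live stacks through Python's introspection hooks and emit folded-stack lines that flamegraph.pl or speedscope can render:)

```python
import sys, time, threading, traceback
from collections import Counter

def sample_stacks(duration=5.0, interval=0.01):
    """Periodically sample every thread's stack and count folded stacks."""
    counts = Counter()
    deadline = time.time() + duration
    while time.time() < deadline:
        for thread_id, frame in sys._current_frames().items():
            if thread_id == threading.get_ident():
                continue  # don't profile the sampler thread itself
            stack = traceback.extract_stack(frame)
            counts[";".join(f.name for f in stack)] += 1
        time.sleep(interval)
    return counts

# Each printed line ("main;handle_request;parse 42") is valid flamegraph.pl input.
for folded, n in sample_stacks().items():
    print(folded, n)
```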
If your job is anything more complicated than "feature implementation", there are chances for innovation left and right, and good engineers see and pursue them.
I’ve come up with some of the core solutions in my org to solve massive big data problems and had to depend on intuition and theory instead of the web. I still failed a merge sort whiteboard challenge in an interview. Some people just can’t deal with these inane questions in an artificial environment.
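(And the maddening part is that outside the interview room it's not even hard — the whole whiteboard answer is basically:)

```python
def merge_sort(xs):
    """Classic top-down merge sort: split in half, sort each half, merge."""
    if len(xs) <= 1:
        return xs
    mid = len(xs) // 2
    left, right = merge_sort(xs[:mid]), merge_sort(xs[mid:])
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]  # append whichever half has leftovers

print(merge_sort([5, 2, 4, 7, 1, 3, 2, 6]))  # [1, 2, 2, 3, 4, 5, 6, 7]
```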
yeah, that's wrong. I don't only want good code. I want a smart person who can write code and also do a bunch of other things, like make good decisions about code and mentor other people to write good code and fix problems before they happen and keep everything maintainable and clean. How you come up with your code per se is secondary, yes, but I'm testing for a bunch of other things that are not secondary as well.
Curious. What skills from "return all elements of a matrix in spiral order" make you a good mentor? Or say anything about your ability to keep code clean?
None, but a) if you can't write that trivial code I don't want to be on a team with you anyway because I'm going to be teaching you how to basically think, and b) the part where you talk about the code, not the part where you write it, is the part where I try to detect if you're any good at communication or abstract thought.
(disclaimer: all of this is notwithstanding the fact that some people's brains shut down specifically during interviews/places where they feel under pressure, which I have nothing but sympathy for. Afaik that's an unsolved problem with coding interviews. I would always try to lower their stress but it is not a sure thing.)
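(For reference, the "trivial code" in question is on the order of fifteen lines — one possible version, not the only one:)

```python
def spiral_order(matrix):
    """Return the elements of a 2D matrix in clockwise spiral order."""
    result = []
    if not matrix or not matrix[0]:
        return result
    top, bottom = 0, len(matrix) - 1
    left, right = 0, len(matrix[0]) - 1
    while top <= bottom and left <= right:
        for col in range(left, right + 1):              # top row, left to right
            result.append(matrix[top][col])
        for row in range(top + 1, bottom + 1):          # right column, top to bottom
            result.append(matrix[row][right])
        if top < bottom and left < right:               # avoid double-counting single rows/columns
            for col in range(right - 1, left - 1, -1):  # bottom row, right to left
                result.append(matrix[bottom][col])
            for row in range(bottom - 1, top, -1):      # left column, bottom to top
                result.append(matrix[row][left])
        top, bottom, left, right = top + 1, bottom - 1, left + 1, right - 1
    return result

print(spiral_order([[1, 2, 3], [4, 5, 6], [7, 8, 9]]))  # [1, 2, 3, 6, 9, 8, 7, 4, 5]
```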
Thanks for the link, but I don't see a problem with that question. If you find it difficult, I wouldn't want you anywhere near the code base of my (hypothetical) company. So I guess the question would be doing its job just fine, for both of us.
> conversationally fluent in the theory behind writing code
means?
It might be my insufficient command of the English language, or I might be outing myself as being outside said group, but I'm unsure what that means. Is this just referring to a vocabulary for discussing the structure and creation of software, or is there a deeper mystery I have not yet grasped?
I mean that if someone asks you questions about code, you can respond intelligently and "think on the fly" about the subject in question. For instance you haven't just memorized something like e.g. the big-O time to access a hash table, but you have reasoning behind it: you know how it works in a few cases, your knowledge about it comes from an understanding of the implementation, and you can extrapolate that knowledge to new cases or variations of the problem, etc. Maybe your knowledge ends at some point but you could keep going if you had to: like maybe you don't know how hash tables interact with page tables or CPU caches but if that starts to matter you would be able to understand it and keep going.
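(To make the hash table example concrete — a toy chained implementation, just to show where the usual O(1)-average / O(n)-worst-case numbers come from, not anything you'd actually use:)

```python
class ToyHashMap:
    """Minimal chained hash table, only here to show where the big-O comes from."""

    def __init__(self, n_buckets=8):
        self.buckets = [[] for _ in range(n_buckets)]

    def _bucket(self, key):
        return self.buckets[hash(key) % len(self.buckets)]

    def put(self, key, value):
        bucket = self._bucket(key)
        for i, (k, _) in enumerate(bucket):
            if k == key:
                bucket[i] = (key, value)  # overwrite an existing key
                return
        bucket.append((key, value))

    def get(self, key):
        # Average case: each bucket holds ~n/n_buckets entries, so this scan is O(1).
        # Worst case: every key lands in one bucket and the scan is O(n).
        for k, v in self._bucket(key):
            if k == key:
                return v
        raise KeyError(key)

m = ToyHashMap()
m.put("answer", 42)
print(m.get("answer"))  # 42
```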
The same way of thinking applies to design patterns (single responsibility principle->but why, and when is it okay to break?) or to architectures (OOP / dependency management -> yes but why? can you make a version yourself? can you work around problems with it?) or to libraries (React components->what are they trying to do? how do you keep that contract simple?) or to languages (JS->what are the tradeoffs? what features do you need? how important is upgrading or polyfilling?) etc.
All beyond-basic intelligence takes this form: not memorization, but a working understanding of how something operates that you can apply to new situations, investigate, drill into, and wield flexibly. I would call that "fluency". To be conversationally fluent in a subject is not necessarily to be an expert, but to be able to "think" in terms of its concepts, and usually it means you could become an expert if the situation demanded it.
This is much more basic than what I thought you meant. What you're outlining are critical thinking skills. And I agree, lacking them makes a programmer far less valuable.
But there's a whole other level of fluency around the theory of software development, and it comes from experience with different architectural patterns, and being able to see into the different futures of each architectural pathway, and being able to converse with other people who understand software at this level.
Although, calling it a level really undersells it. Multiply the potential capacity for this talent by every dimension of software building, and you start to see how people having even a little of this skill, but being able to work with others who have a bit of it in a related dimension can form a team that is more than the sum of its parts.
Yes, the ability to have critical thinking skills is the key differentiator between the two types of developers mentioned.
I think that is what a lot of these discussions seem to miss: the issue is not really the hard tech skills/knowledge. It is more about the softer critical thinking abilities or personality traits that allow someone to become skilled at something, or to solve a problem more easily and better.
I agree, I was quantifying with some examples off the top of my head, but I do mean 'this skill, but for everything'. Architecture is certainly a big part of it.
I’ve been having quite a bit of luck at these coding assessments by simply memorising solutions to leetcode problems. This feels not very different to studying braindumps to get a vendor certification.
And this is exactly the problem. Being a software engineer is a thousand things beyond rote-memorizing some toy problem that solves exactly one toy use case.
Oh yes, you'll do fine with that, but you'll be a bad programmer when you're done. Better to work on the art of programming, at least once you've met your immediate needs like getting a job. It will get you a lot further in the career, plus it is morally better to be good at something and contribute meaningfully compared to doing just enough for a paycheck.
Grinding leetcode doesn't convert you into a "good programmer", and conversely memorising leetcode solutions doesn't make you a "bad programmer". The companies asking for leetcode assessments are effectively inviting candidates to game them anyway.
> It will get you a lot further in the career, plus it is morally better to be good at something and contribute meaningfully compared to doing just enough for a paycheck.
I've met virtually nobody that has said leetcode has got them further in their career except for passing a gated interview. Honing your craft and being good at something has nothing to do with leetcode.
To me leetcode is simply a means to get past gated interviews; if memorising solutions does the trick then I'll continue to do that. Honing my craft as a programmer and being a "good programmer" is something I work on in which leetcode bears no relevance.
Questions like these are hit and miss tho - I can do this because I worked in a sub-field where “write a parser for that” was a common tool to reach for. In my current field I haven’t seen a single parser in any company codebase; a dev who grew up here could be deeply skilled but have a gap around parsers.
That's okay, but it is testing what it says: facility with a particular part of CS that some people have studied and some people haven't. Can't hurt, though, and it's the sort of thing that ought to be in everyone's toolbox, although it isn't.
When starting to type this comment, I was going to write that I could not do it and I think of myself as a decent coder. While typing that, I had an idea and I started a stopwatch...
Made this test set in the first minute: "1 + 1", "1 * 2", "1 + 2 * 2", "1-1", "1/2", "1+2/2" which I think should cover the requirements generally. Then I took 9m58s to come up with 77 super ugly lines that, to my surprise, after fixing that I forgot to cast the inputs to floating points (lines 67 and 76), gave the right answer straight away.
The correct answer would probably have imported ast but, while I know of the general existence of such tools, I never needed this in my life. It's not like I work on JSON parsers (a minefield) or wrote my own coding language. An old colleague did the latter for a commercial product by using str.split on operators (yes, strings were a major feature in the language), which went about as well as you expect... I know to stay away from these problems or, if tasked with it, where to look to solve it, but that doesn't mean I can do it in an interview.
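(Having looked afterwards: the ast version is indeed much shorter. Something like this, I think — assuming you only whitelist the four operators and numeric literals:)

```python
import ast
import operator

# Map the AST operator node types we allow onto plain Python functions.
OPS = {
    ast.Add: operator.add,
    ast.Sub: operator.sub,
    ast.Mult: operator.mul,
    ast.Div: operator.truediv,
}

def evaluate(expression):
    def walk(node):
        if isinstance(node, ast.Expression):
            return walk(node.body)
        if isinstance(node, ast.BinOp) and type(node.op) in OPS:
            return OPS[type(node.op)](walk(node.left), walk(node.right))
        if isinstance(node, ast.Constant) and isinstance(node.value, (int, float)):
            return float(node.value)
        raise ValueError("unsupported expression")
    # mode="eval" parses a single expression; operator precedence comes for free.
    return walk(ast.parse(expression, mode="eval"))

for expr in ["1 + 1", "1 * 2", "1 + 2 * 2", "1-1", "1/2", "1+2/2"]:
    print(expr, "=", evaluate(expr))
```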
While I'm pleasantly surprised to have gotten a crude version working and in a faster time than I expected...
...if you're not hiring specifically for a parsing job, please don't use this as a coding exercise. It could be an interview question if you just want to hear "something something AST, and not string-splitting on operators when the language has strings that may contain said operators". That would demonstrate enough knowledge that the coder would do the right thing if this task ever came up in their job.
Nice work. Yeah, writing a parser without knowing parsing idioms is really hard. I can't remember the idioms anyway so mine would look like yours. There's a reason there are whole classes on this in colleges.
Most mechanics I know have long forgotten how to "connect the dots" and troubleshoot issues. Everything has become computerized, and all they do is plug in a code reader. They literally don't do that "could it be spark, could it be fuel" kind of thing anymore. Most branded garages follow company instructions "IKEA"-style, aka use a 10 socket here.
I find the second group more often than not so pedantically afraid of writing even a few lines of any sort of "anti-pattern" that when they meet the messy qualities of reality they fail to build anything, or at least take 20x as long as the first group.
Eh. That's not inherent to the second group. I think that's what happens when the second group is disempowered, both by the organization and by the hellish landscape of technology they have to work with. It can be very paralyzing to try to do something right when the tools are all broken.
Or maybe a good team is a mix of the two. I dunno. But I know that not having theory gets you only so far, and then everything becomes awful.
plenty of mechanics, and people in plenty of other professions, get by on "mechanic" or "engineering" know-how rather than the know-what you're describing group B as having.
let's look at empirical evidence: old buildings - do you think the masons back then understood compression forces? but those buildings still stand. what they knew was simply that, as a matter of probability, doing a->b->c resulted in a predictable outcome, based on what they or others had done before.
scientism is not engineering. scientism is knowing why things work. engineering is knowing how things work or don't work.