I think the first question should be why applications play such a big part and why they take the form that they do.
I think one part of the answer is administrative bloat, where "processes are implemented" and "guidelines need to be maintained". Especially for technical jobs it is hard to judge people from the perspective of HR, which is why you have "personal statements" and such, which effectively measure only your ability to write coherently and flatteringly.
Another part is the job market, both employer and employee side. As long as only reasonably qualified people apply and companies accept reasonably qualified people, having multiple rounds of interviews is just a waste of everyone's time.
Personally I think "coding interviews" are just very insulting and speak of an awful state of the industry. Imagine giving an electrical engineer a 1st year undergrad assignment to make sure he really can calculate the properties of a basic circuit. Seems like a good reason to just walk out.
The paranoia of tech companies strikes me as a desperate attempt to filter out applicants who are completely unqualified, from the vast amounts of people applying.
Other industries manage this problem by having a professional qualification that you do the exam for once, and then you're good for years. Possibly with refresher courses to keep up with rule changes.
> filter out applicants who are completely unqualified, from the vast amounts of people applying.
It's not really surprising that industries which pay incredible amounts of money get a lot of applicants.
The "gate" has to be somewhere. At the employer, or at a professional body, or at the education system? All have advantages and disadvantages.
If people did generally put the gate only at the employer, where they randomly hire and then fire their way back down to competence, people would complain about that too. And with some reason.
A further lemma: No matter where the gate is put, a lot of people will complain.
The problem with programming is that it’s easy enough to self-teach to reasonable competency without going through the formal channels. Contrast becoming a physician or engineer.
That's not only the history of our industry (CS and CE degrees arrived much later than programming jobs), it's also a feature.
CS and CE cover some theory, but that theory is largely inapplicable to most jobs in the industry. People who don't come from formal backgrounds often work in cross-functional roles or in roles that CS and CE barely touch on. An example of the former is SREs: you can become a SWE by getting a CS degree, but an SRE job involves policy, userland software, systems, and related skills. An example of the latter is Systems Engineering; I've found very few formal schools that teach Systems Engineering the way it's practiced in industry. To make matters more complex, most systems engineers also need to be familiar with writing application code.
I don't think that's a problem, as long as exams are open. The problems would be a) blind spots among the self-taught, or b) largely useless, complex material added to the exam by the people who sell formal education, in order to force people into formal education.
edit: and only the second is a real problem, the first is a feature.
> Another part is the job market, both employer and employee side. As long as only reasonably qualified people apply and companies accept reasonably qualified people, having multiple rounds of interviews is just a waste of everyone's time.
Right, but that doesn't happen. People throw their resume at any company without a second thought, and then outright lie on it.
We have a rule that we only ask questions about things they put on their resume or submit in a questionnaire (basically "here are a few things, rate on a 1-5 scale how well you know them"), and still some people say they are experts and then fail basic, non-curveball questions.
Going back to the EE example, it would be like writing that you're an expert in amplifier design and then not being able to recall the gain equation for a non-inverting operational amplifier.
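(For the record, the textbook result being alluded to: an ideal non-inverting op-amp with feedback resistor Rf and ground-leg resistor R1 has gain Vout/Vin = 1 + Rf/R1. The resistor naming is just the usual convention, not something from the parent comment, but it's exactly the kind of one-line fact a self-described amplifier-design expert should be able to produce.)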
It absolutely is an indictment of the applicants as well. These questions get asked for a good reason.
One problem I see is that these types of interviews, which are essentially basic aptitude tests, might be regarded by the interviewers as far more than they really are. The idea of "preparing" for a coding interview is absurd. If the applicant feels it is necessary to prepare, either he is completely unqualified or the testing method is bad.
I am quite certain that in a 15 minute talk I could easily figure out if an applicant has the technical knowledge for a position similar to mine. And that, to me, is what interviews should be about.
>One problem I see is that these types of interviews, which are essentially basic aptitude tests, might be regarded by the interviewers as far more than they really are. The idea of "preparing" for a coding interview is absurd. If the applicant feels it is necessary to prepare, either he is completely unqualified or the testing method is bad.
I think it is because it started as simple "weed out the weeds" questions and somebody got the bright idea that "the harder the brain teasers, the better the candidates who are left", not realizing they are just recruiting people who are good at trivia.
Yep. I've never screwed up hiring people just by having a technical conversation. It just never goes wrong in the sense of me hiring some person who couldn't learn to do the thing I wanted. The only bad hires I've had were motivational, people who decided to do other things with their lives despite knowing how to code.
With any modern tech-heavy business, there's just so much industry specific jargon that if you hadn't done whatever was on the CV that led to the interview invitation, you would get found out in a short conversation.
I hear people sometimes run into some smooth talker who knows all the words and none of the actual skills. It has never happened to me, and it's so far from my experience that I can't really imagine it.
I strongly suspect that discrimination, affirmative action, and bureaucratic recordkeeping play at least some role here.
Informal, application-free processes are great for those for whom they work, but are also frequently a path to discrimination based on ethnicity, gender, religion, or other factors. Standardised forms and recordkeeping provide a track record that can help address any claims or investigations arising (though won't always cure the underlying problem).
That and simply increased information-handling capacity by businesses, moving from pen-and-paper to computerised formats --- initially punch-cards, now almost entirely electronic recordkeeping.
An awful state of the industry matched only by the number of applicants that fail them. I shouldn't assume 'they wouldn't waste company time on them if they weren't necessary', but they are, in fact, necessary.
> Personally I think "coding interviews" are just very insulting and speak of an awful state of the industry. Imagine giving an electrical engineer a 1st year undergrad assignment to make sure he really can calculate the properties of a basic circuit. Seems like a good reason to just walk out.
I don't want to defend whiteboard programming, not at all. But I think one can get a lot of value out of simple concrete questions.
Example: Person applying to C/C++ systems programming job -- think low-level file formats, kernel interaction, custom network protocol -- without understanding hex numbers.
Me, the interviewer, asking what 0x10 is in decimal (feel free to use pen & paper!) is a lot quicker than trying to sniff out a good bullshitter that you've allowed to be in control of the conversation.
No, I really don't think you can do the job without knowing hex.
Yes, I really think any background that would have prepared you to be successful at the job would have had you interacting with binary data in some form. (Saying you personally prefer octal would have been a plus, not a minus -- it would have communicated experience.)
The difference from the coding interview is that nobody's time is wasted. It is a very quick question, and if you get a slightly confused "16" back, you can just move on to more relevant matters.
That is different from a 30-minute question where you examine the applicant's ability to solve first-year undergrad homework.
But the fact that you have to ask the question at all speaks for itself about major problems in hiring.
The "selectopia" example is kind of how it already does work, or at least how it used to, in computer programming, where having an impressive Github account can help one get hired. But it has downsides:
- People who want to get hired put low quality projects on their Github which are optimized to look impressive at a first glance.
- People spam popular open source projects with low quality pull requests to buff their Github contribution metrics.
- People who don't have the time to create software for free are put at a disadvantage. Also people who aren't passionate about programming enough to have side projects are at a disadvantage (not necessarily bad).
I also suspect that this phenomenon is what motivates the development of a lot of the "leftpad" or "is-even" type libraries. Which could be interpreted as either good or bad.
This kind of frightens me. It likely leads towards some kind of ageism, as students and teenagers are comparatively time-rich and can more easily find the time to build up impressive GitHub profiles. As you get older, projects like building a house, raising children etc start taking up your free time and after many years as a software engineer, you do relate differently to your profession, compared to when you first started out. (You're less naive. Less capable of just having faith in some weird karma whereby hard/good work will be recognized and rewarded if you just do it).
I guess other professions have similar things going: Lawyers might take on high-profile cases on a pro-bono basis to build a name for themselves, or contribute to academic publishing even when they are not on payroll in any academic capacity, etc. But they have a different culture to go with that. As a lawyer, I guess your X-hours work week will stay an X-hours work week and your non-billable hours Y are simply subtracted from X.
But most jobs in the software engineering profession come with the expectation that you have an X-hours work week, and your employer usually does not allow you to use any of that time for personal professional status-building through open source or publishing etc. ...your only chance is to do it on weekends, which eats into your work-life balance.
It's true that the culture is different, but it's not my experience that employers are categorically against status-building, especially if it indirectly reflects well on the company.
We recently got a paper accepted at a conference cooperating with our local university and my employer was more than happy to pay for that time.
The real problem, IMO, is that studying, training, etc. are pushed into your time off. When you account for all of that, my 40-hour week is more like 50 hours.
My previous employer (the one I have been with the longest, at 3.5 years) categorically rejected paying for conference participation etc. The company would frequently get requests from organizers of conferences, networking events and the like for the company to have a presence. The CEO would then post the speaking opportunities in the company-wide group chat laying down the rule that anyone could attend in the company's name if they wished, but he personally considered it a waste of time and we'd have to do it in our own time.
I think people very seldom took him up on the offer and if they did, I would guess there was a certain probability that he would read it as a negative signal along the lines of: This person is really going out of their way to do networking, so they're probably working on finding their next gig so they can leave the company.
In one other company I worked at, the company was keen on conference participation, but the way it would work there, if you were the person doing the legwork it would be your boss presenting at the conference. So you couldn't really use it as an opportunity to make a name for yourself. You'd just make a name for your boss.
I've also done extensive freelancing. If you run a low-risk freelancing strategy, clients pretty much expect you to work for them full-time. It's already a tough sell to get them to give you a day off every quarter to do your own bookkeeping. And the hourly rates aren't high enough to allow you to spend several weeks or months in between freelancing gigs to work on status-building. ...now if you're using your status-building to get the choicest consulting opportunities where you have the bargaining power to do few hours at high rates, that's a different story, but that's a different kind of freelancing altogether. (The "high-risk" kind, which I'd distinguish from the low-risk kind that my personal experience is limited to.)
To be honest, I don't think there's anything wrong with employers preferring enthusiastic candidates over jaded ones. If you want to keep earning tech salaries, you need to keep learning.
If you lose interest, there's nothing wrong with pursuing a different career -- I know a couple of people who started over because they just couldn't bear how boring tech jobs get after 20 years.
Right, but having contributions either in open source or in your current company's public GitHub repos will look better than writing the same amount of code for the company's internal repositories.
I think that when it comes to projects on github the key is in distinguishing between the "I want to be hired" projects and the "I'm doing this because it's interesting and fun" projects. The latter tend to be much higher quality than the former and they're also more fun to work on (because you don't have the pressure of what a recruiter might think about it). Although, as you said, it's not always easy to tell which is which at a first glance.
Briefly, minimize the application process and aggressively review work. Any mistake or failure and you're fired, two weeks to prove competence or you're fired, six weeks until a significant accomplishment or you're fired. And no stigma against being fired, just keep trying until you find a fit.
I believe you're misrepresenting the speaker's position. Failed commanders were not "fired" in the way a private sector employee gets fired. They were relieved of command and given easier jobs with exactly the same pay and benefits. Also, the military command structure is intentionally redundant in order to be robust against mass casualty events. Every commander has a line of succession consisting of people who are trained and ready to take command themselves, and in fact, usually have taken command at least a few times already when the commander is intentionally absent to give them practice.
Private industry could do this if they want, but it would require cutting executive pay in half, hiring twice as many executives, and creating intentionally redundant positions that serve little purpose other than preparing the next cohort for leadership. This works for the military due to a unique combination of the organization itself not having a profit motive and the personnel valuing non-monetary accomplishment more than personal wealth. And, of course, the military is a monopoly. This also works because there isn't a different military that can offer your generals triple the salary and poach them away. If they want to be generals at all, they have no choice but to buy into the unique culture of the only military they can serve in. Theoretically, a foreign military could try to buy them away, but countries mostly seem to not allow foreigners to command their military units for security reasons and the possibility of being executed for treason and/or desertion is a pretty strong incentive not to leave even if you got an offer.
I guess I may as well add generals don't get there by being hired anyway. They're promoted from within. To get into the military officer corps in the first place, they absolutely do have to apply, many times, first to the academy, ROTC, or OTC, where they spend years being rigorously trained, tested, and evaluated before ranking the career field they want to be in, which they will only get based on where they rank compared to all other cadets in the same cohort year.
An equivalent private sector practice would be not hiring leaders at all and only appointing them via internal promotion, but to get into the organization at all, you need to apply to a training academy first, then spend several years in that academy learning the organizational doctrine and culture and being evaluated and stack ranked before you're ever given a real job. Typically, the bottom 40% or so of cadets don't even get in at all and have to become reservists.
That sounds like the "internesia" of the original article.
The problem is that, in most countries, employment law will make it pretty difficult to do this. Also, with deliberate high turnover like that, you'll find it difficult to keep trade secrets and there will be a huge drain on resources for onboarding those people who quickly come and go.
In Europe you usually have a 1-3 month probationary period that works more or less like "at-will" employment does in the USA. Either party can cancel the contract on the spot for any reason or no reason at all. There are also some limits on how much time off you can take at once and in total in that time.
Employment law is more like a grenade than a sharp-shooter rifle. It just deals out damage in all directions regardless of how it's used and where it's supposedly targeted.
In a country like Germany, if you fire an employee and they walk away with a hurt ego and start talking to a lawyer, you will end up hurting in some way or another, so the game is to manage egos and make sure that, when a company parts ways with an employee, the employee feels that it was mutual.
Actually, I think it would be a step in the right direction for both bosses and employees, if bosses could just say: Hey, I'm firing you because I just no longer want you here. You didn't do anything wrong. I just have other people who do the job better.
The perverse incentive is that it's probably better for companies to single out individuals they want to get rid of and treat them badly until they leave of their own accord, rather than do that.
I have a friend who argues that well-implemented UBI and a strong social security net would unlock the full potential of capitalism, and I tend to agree.
No one would work awful jobs for terrible pay because why? No employers would keep ineffective employees onboard because why? Unions might become obsolete because bad employers can't exist (people just quit). Anyone would be at liberty to try and start a new business because they never have to risk their personal safety.
Right now many employment situations are like an abusive relationship, staff (and bosses) are getting hurt, but they are afraid to break up. People with strong safety nets and personal wealth can leave terrible relationships much more easily than people who depend on a partner for everything or live in a society where breakups are taboo or illegal.
I feel like a bunch of things are being glossed over. Let's say UBI happens. How will people in NYC, SF, Singapore, London, and other high-priced cities afford rent? Those are also the type of places where the jobs are. Not all jobs can be work from home.
UBI might let you live in some small town in the middle of nowhere where the rent is low but it's unlikely to let you live in a high demand area. You could say "rent control" but that's not a solution. All that does is start a lottery of who gets to live in the city and who doesn't.
Further, there's a ton of jobs that need to be done but arguably no one wants: jobs like farming and shipping, stocking grocery stores, construction. Ok, so UBI lets everyone quit, and now those jobs have to pay more to get anyone to take them. But then all the costs go up and UBI is no longer enough to make it. Raise the UBI and you get a vicious circle.
I don't know all the reasons prices are out of control in the Bay Area, but for example, I ordered 3 simple Thai dishes the other night. $135! In Thailand that would have been $6. 3 burritos in the poor part of town, $60. 4 plates of Chinese dumplings, $80. My guess is this is due in part to high rents and higher wages. Sounds great, but now no low-income people can afford to eat out.
UBI would take that even further because (see previous paragraph)
Someone will now chime in how it's bad to make people work in jobs they don't want. I don't disagree. My point isn't that people should work shitty jobs. My point is UBI won't fix the problem.
Your dinner bill nearly gave me a heart attack. Dear lord!
I don't have much to say about UBI, but your mention that rent-control would simply “start a lottery of who gets to live in the city” sounds like a good thing. If you don't already have established family in a high-demand city (in which case you should be grandfathered in), isn't a lottery the fairest way to get in?
That's how it used to be, and a lot of other bad things happened. If bosses could be trusted not to discriminate, we wouldn't have such a law in the first place.
Regulations are written in blood. They are always Chesterton's fence. That doesn't mean it's still a good idea, or even that it ever was. But there's a reason it was put there, and it's wise to at least know why before removing it.
...the question is whether the historical context from which it grew is still in place today.
Today we have at least three "layers" of social security in most European countries:
The first layer is that the government makes sure everyone has employment and makes it difficult for employers to get rid of employees, even those who cost the company more than they're worth. Family law extends the coverage to everyone else: for non-bread-winning spouses (even ex-spouses), it's the bread-winner who's on the hook (so, transitively, the bread-winner's employer), and for children it's the parents (again: ultimately the parents' employers).
The second layer is mandatory social insurance.
The third layer is usually some kind of outright government handout that kicks in for people who, for some reason, fall through both the first AND the second layer of social security. (e.g. [1] in Austria)
My guess would be that each layer was brought in at a time when the layers below were not yet in place, though, maybe, someone can provide the necessary facts to the contrary here.
I just wish they'd do away with all that crap, and instead just implement universal basic income, paid for by the taxpayer.
That’s standard, but as far as I know it’s very uncommon to actually make use of that and drop somebody during probation. And employers definitely do not interview less carefully because of it.
(This is for software engineering jobs, at least. In other industries it might work differently.)
Seems like an awful idea. Especially if you are new to something, failures happen and are a good way to learn.
This just rewards the lucky and those who minimize their own work.
Officers were relieved of their positions, not fired in the sense of being removed from the military. It was not a career-ending event. In fact, the video points out that some of them went on to have successful careers.
You relieve someone from a position for not winning (and not just for failing). The theory proposed by the video is that a relief culture INCREASES (even if counterintuitively so) the willingness to take risks.
If there is no reward for winning, then people will minimize risks. If you make winning necessary for keeping the position, they will be more likely to take risks. (In a military context, officers these days, according to the video, care more about minimizing losses and not messing up than about actually winning wars.)
Translating it into civilian contexts might be a bit difficult, but one thing that I believe can work well is allowing people to move to different positions to find the one where they can be most effective inside the company.
Like sometimes people get promoted to managers and then it turns out they are actually ill-suited for that position. Firing them might be a waste, but just keeping them around will be to the detriment of the people they need to manage. So the solution is that they need to be removed, maybe even take a lower position until a better one can be found.
I think this is the main takeaway: just as you should not keep a badly performing general around, because that literally costs lives, you should also not keep bad managers around in a company to the detriment of the people they manage. It is better to pull the plug early, for both sides.
Intriguing concept, but it sounds like it doesn't leave much room to learn from your failures. Often you might not "win" a project, but you learn a lot from it that can be used to win the next one. Maybe this concept only applies to more senior roles?
Yes, the video talks about generals. So the civil equivalent would be (higher?) management positions that already require substantial experience. (Also, again, note that it is not meant to be a career-ending thing. If you fail, you lick your wounds, find another role that hopefully fits you better, and apply what you learned.)
It doesn't make sense to apply the concept to rank-and-file soldiers or to junior software engineering roles.
The higher someone is on the hierarchy tree, the more people they can influence and the more important it is to remove them when they are not a good fit.
It is very unfortunate that people think the video was arguing for hiring and firing normal employees just because they don't perform perfectly in the first few weeks. That would be absolutely psychopathic behavior. Your job as a junior is to learn and make mistakes.
Well, that worked in wartime, when any pair of hands was welcome and speed was essential.
In peacetime, people may not like to put up with the strain such an arrangement puts on them. It does improve efficiency, but it has a cost, and it works better in times of very high demand.
I guess it makes sense in the short term. Instead of an application, you work for (as quoted) 6 weeks to prove yourself and you either get fired or not.
But in the long term? A senior-level employee makes an error? Bam, fired! With all the institutional knowledge gathered over the years, the experience, etc. If it was a major screw-up, you'll be firing an employee who will not make that mistake again, and instead you'll risk it with a new hire.
“You know what Pasteur said: Chance favors the prepared mind. Take one of the chanciest things in the world, like war. Both Kitchener and Frederick the Great, when they were considering a general's qualifications, would always ask, 'Is he considered lucky?' It was a perfectly legitimate question, because if he was considered lucky, it meant he was prepared to take advantage of chance.”
Well, Marcus Clarkus was left in charge for quite a while. And MacArthur was only sacked after he wanted to nuke Korea.
The Allies were better in WW2 than the others because they had a feasible grand strategy and outperformed the Axis on operations. That, and the fact that the Allies committed their full industrial might for years, while having public support for doing so.
Yeah, we had way more stuff. We could design, iterate, and build things fast, and we could build way more copies of things than the others. The sophistication of the Japanese carrier forces and their strategies, the tanks, the way the Germans waged war, etc. were ahead of the US in some ways. But we could build way more stuff, and we could ship a ton of it to Russia to keep Germany from wiping them out. We were willing to make changes in our military strategies over time; we learned (and made mistakes). We did stupid stuff too, of course, and don't forget the endemic racism: resistance to black people having any role more important than moving supplies around (women too, other groups too, there's no end to our racism), and imprisoning Japanese people and stealing their possessions.
Different subject, but I was just thinking about what WW2 would have looked like if we'd had the slightly evolved social order, protections, and liberties we have today. We still have racism and sexism and so on, but it's different now (though still there). Most large organizations like the military still have an old boys' club, but they have also done a lot of work to improve things. We have so far to go as a society. What if we hadn't rounded up all the Japanese people and destroyed or stolen their possessions? How would those groups have dealt with the war? I think it would be like Russians living in Europe today: plenty are influenced by propaganda news orgs to think that Ukraine is somehow the bad guy, even though they are living in free, safe, and well-off societies outside Russia.
That's not a bad idea, but it would only work if people were not dependent on income from employment to meet their basic needs. Also, the way you phrase it is a bit extreme. Failure is an opportunity for learning, and it's hard to make a major accomplishment if your work is mostly routine.
I did quite a few interviews (on the hiring side) over almost a decade, and I find the current recruitment processes basically stupid and over-complicated; for both the applicant and the employer they are a huge waste of time.
Look, if you are hiring an engineer, it's easier than it seems. Just sit an hour with them and have a good talk about his career on the business domain or the technical field. At the end of that hour you should have a non-100% accurate but solid perspective on these things: "Does this person communicate effectively? Would I want to work with this applicant?" And that's it.
Yeah, people who share similar values can communicate and work together better, but this isn't discrimination.
On this basis the whole hiring field and each and every interview is discriminatory, because companies want to select who they want to work with based on the candidates' skills/values.
>Yeah, people who share similar values can communicate and work together better, but this isn't discrimination.
This isn't really my experience. Similar people will have a better initial perception of each other, as it takes less time to synchronize, but in most jobs that's of little importance. What matters is how they work together in the long term, and most people don't need clones to work well with. If anything, having too many similar people is a problem for a team, since you get overlapping blind spots, which tends to cause serious issues over time.
I consider it quite important how well people synchronize. It is much more fun to work with like-minded people where you can have good private talks as well because you’re sharing many values.
People are obviously using discriminatory as a shorthand for pointlessly discriminatory. It's an argument against using qualities other than actual productivity in a hiring process. You're on the side defending that.
Agreed. If you, as the interviewer, can't ascertain an applicant's ability to do the job, after chatting with them for an hour or two, you aren't qualified to hire for that position.
The idea that you can pinpoint someone's skill level (and estimate their performance potential) through additional hours-long exams is arrogant. Job performance has more to do with the environment than the incoming skill set.
When it comes to promotions, there was a paper this year that won an Ig Nobel prize, which made the case that promotions are awarded worse than at random: https://arxiv.org/pdf/1802.07068.pdf
I've also done a lot of interviews for over a decade. I figure I'm an average or above average interviewer. Talked to people for an hour, discussed projects, they looked good. Failed basic coding screens. Over and over. Maybe you've got an amazing sense for interviewing people. I doubt many others do and processes need to accommodate the average interviewer and not the amazing outliers.
Dude here who designed and built an early multitenant mobile app server-side support system from scratch, serving millions of daily users (the spikes in traffic when our users sent out pushes to some of those apps... yikes), among a bunch of other stuff. Actually, in fact, did that. Code-language polyglot, often considered the one to go to for low-level stuff (nothing I seek out, I just keep getting that reputation), getting paid to write code for ~22 years now, etc., etc.
I've nonetheless bombed "basic coding screens" a couple times, including after having done years of nontrivial and highly-independent coding for pay. And I don't mean leetcode shit, I mean really basic stuff.
Some percentage of those folks who seemed great in the interview but failed your basic screens are false negatives, no matter how much you believe no one who had a tenth of a clue could possibly fail them. Maybe that rate's negligible, but maybe it's not. I do think unjustified confidence that very simple tests can't have a significant false-negative rate is part of why so many people are convinced our industry is drowning in expert-level bullshit artists who can't even write a for loop.
I am not saying that all the people I hired were x10 performers, but they were certainly better than average, imho, with a leaner process. For example, I did not hire anyone who lacked basic coding skills. Perhaps I am a great interviewer or, more probably, the specific profile I was targeting (RoR developers) acted as a high-pass filter.
No idea if this could be related, but @pantulis you're in Madrid (says your profile. Well, "from"), whilst @marcinzm you are (I'm guessing) statistically in the US, maybe SF?
Is it possible that it works better to "just talk to people" in some parts of the world, than other parts? In SF, software engineer salaries can be $X00k/year, but a lot less in Madrid. I wonder if this can influence people's behavior.
(Nonetheless personally I'd never do only such unstructured interviews.)
In my experience the kind of simple interview you describe is vulnerable to BS artists. For all their flaws, the common coding challenge kind of interview at least gives you concrete evidence about whether the candidate can solve a technical problem.
Anyone who's a good enough bullshit artist to fool you while talking tech for an hour should be in a totally different field, because such extremely high levels of bullshitting ability are themselves quite lucrative.
I just can't believe there are so many people out there with the skill and will to do that, who also haven't found a better way to employ those skills directly, that our field can be awash in them. I believe there are plenty who try, but I doubt there are very many who could talk about tech and their career for an hour without tipping their hand.
> Look, if you are hiring an engineer, it's easier than it seems. Just sit an hour with them and have a good talk about his career on the business domain or the technical field.
“his”? Doesn't sound like a very fair recruitment process.
> But if you had the God’s-eye view of all your applicants, you might find out that 75% of all of applicants were good hires!
... or 5%, and the rest fail a basic litmus test of not-too-hard questions.
People will apply anywhere regardless of their own (perceived or actual) skill level. And it is no wonder, as the requirements are half lies or a wish list most of the time.
So one side throws resumes everywhere they can, and the other just tries to best guess which ones are worth the time to even interview, as there are too many to interview them all.
On the other side, testing candidates puts a "homework requirement" on them, and doing it online also makes cheating easy, so you might bore or deter the good ones (they most likely already have a decent job and don't care for a few hours of homework on the off chance they get through the filter), while the less good will just find a way to check the box and pass anyway.
In classification models, there is a concept of precision (the fraction of predicted positives that are actually positive) and recall (the fraction of all true positives that are captured from the population).
The author correctly points out that recall is very low in a typical recruitment process.
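To make the asymmetry concrete, here is a minimal sketch in Python with made-up numbers (none of these figures come from the article):

    # Hypothetical pool: 1000 applicants, of whom 300 would actually be good hires.
    good_in_pool = 300

    # Suppose the screen passes 40 people: 30 good, 10 bad.
    true_positives = 30                               # good candidates accepted
    false_positives = 10                              # bad candidates accepted
    false_negatives = good_in_pool - true_positives   # good candidates rejected

    precision = true_positives / (true_positives + false_positives)  # 0.75
    recall = true_positives / (true_positives + false_negatives)     # 0.10
    print(precision, recall)

A screen like that looks great by precision (75% of hires work out) while still rejecting 90% of the good people who applied, which is the author's point.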
As much as I agree with the sentiment, employer incentives are not to find ALL suitable candidates but to find just ONE suitable candidate. Everything beyond that is a waste of resources.
For some positions (e.g. finding a co-founder or an artist with a unique style) - yes, low recall might be a limiting factor. It is just a market for most other roles - in particular, all that can be summarized with a job title.
If you are not sure if your screening processes work, do an RCT. I have zero idea why the article raises this clearly provocative and correct approach and dismisses it immediately. Allocate a percentage of the positions you're hiring for to candidates that are hired at random. The empirical decision about what percentage depends on your primary concern: if you think your hiring process is completely useless, then you want to get this percentage close to 50% because this is what's optimal, experimentally. If you think your hiring process works OK and want to guard against downside risk on the random hires, get the percentage lower. This will cost you statistical power, but less than you expect.
No one is saying "random" means "no standards at all". If you're a graduate pure math department you don't need to admit a grad student whose background is that they have a GED and worked as a welder for 5 years before getting a 135 on the GRE Math side. Instead, split your screening into two phases: one phase that is targeted to exclude people who absolutely positively do not meet requirements and could never work out, and one phase designed to pick the best feasible candidates among those who meet requirements. Do the RCT in the latter phase.
If you have relatively continuous or regular hiring processes, you'll be well powered to assess your hiring screen in pretty short order. The larger your organization or the faster your overall turnover is the more robust you are to the risk of this going wrong. Not every company can do this -- academic departments that hire one person a year and sometimes have blown searches or people going back on the market within a year or two are clearly going to be a little risk averse. But if we're talking Deloitte hiring a cohort of 150 office admins, they can absolutely, positively make use of this strategy.
The other thing I'd add is that a lot of people are under the mistaken belief that hiring is about determining the best candidates. In fact that is not true and it has never been true. It's about maximizing the probability that a candidate you offer a job to takes it and is good... or else maximizing the frontier of some function of probability the candidate accepts, is good, and how good they are. Hiring nothing but the best candidates will increase the likelihood of a blown search, increase the rate of attrition, etc. This makes almost all of the discussion around mErIt In HiRiNg a distraction from how hiring actually works.
There are more narrowly tailored options. The more local you make the randomization, the less power and the less downside risk. Suppose you rank candidates from 1 (terrible) to 100 (ideal) and tend to hire candidates in the 80-90 range. Even a willingness to allocate some positions to candidates scored at 70-80 is going to give you lots of information about whether your ranking system is useful at all.
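For what it's worth, here is a rough sketch of the two-phase design described above; the hard-requirements filter, the scoring function, and the 10% random share are placeholders I made up, not recommendations:

    import random

    def run_hiring_round(candidates, meets_hard_requirements, score,
                         n_positions, random_share=0.1, seed=0):
        # Phase 1: exclude people who absolutely do not meet requirements.
        eligible = [c for c in candidates if meets_hard_requirements(c)]

        # Phase 2: fill most positions by the usual ranking, but reserve a
        # small arm filled at random from the rest of the eligible pool so
        # the screen itself can be evaluated later.
        rng = random.Random(seed)
        n_random = round(n_positions * random_share)
        ranked = sorted(eligible, key=score, reverse=True)
        ranked_hires = ranked[:n_positions - n_random]
        leftover = ranked[n_positions - n_random:]
        random_hires = rng.sample(leftover, min(n_random, len(leftover)))

        # Later: compare on-the-job outcomes of the two arms.
        return {"ranked": ranked_hires, "random": random_hires}

If the "random" arm ends up performing about as well on the job as the "ranked" arm, the ranking phase isn't buying you much.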
Erm, no. Applications are needed (or, more charitably to TFA, useful) precisely because qualifications can be hard to quantify. That's why they are called Applications and not Forms. And Qualifications, not Quantifications. Hence RCTs are not really a solution. Look at clinical trials: they are already heavily quantified compared to Social Science, which hiring is, and it's a consensual mess.
> Harvard, for example, charges a $75 application fee
And this is why I ended up going to a for-profit school.
Circa 2003 I was a B-level student who had taken a couple of college-level math courses. I applied to three colleges, all local, one of which my friend paid the application fee for (thanks, Paul), and was rejected from all three.
$50+ is a lot of money to a 17-year-old in a single-income household whose parents are completely uninterested in their education.
You know who didn’t charge an application fee? The for-profit college advertised on late night television.
If you try to get ready-made talent from the outside you will face problems in the selection process and, as the article points out, there really is no good solution to those problems. ...so the alternative is to create your own talent and work hard to hold on to them.
If you need a senior developer, take someone internal who has proven themselves as a junior developer and make them into a senior developer.
If you need a junior developer, hire someone straight out of college and give them all the training and mentoring to make them productive.
If you need a manager, look within your ranks for an individual contributor who wants to step up to manager, and give them a chance.
> And you did pretty good: 75% good hires!... But if you had the God’s-eye view of all your applicants, you might find out that 75% of all of applicants were good hires!
The author has clearly never been involved in hiring/selecting before, because the idea that application processes are as good as random is positively absurd.
People will apply to all sorts of things they're not even remotely qualified for. Hell, I've certainly done it in the past. A resume screen is essential to eliminate the obviously unqualified, and a first round of interviews weeds out people who also clearly don't have what's needed for the role or who throw up a gigantic red flag. (And then after that it's about trying to find the best match, not just a barely sufficient one, since suitability for a role isn't binary but exists on a continuum.)
Applying and then interviewing/screening is something we do because it's necessary. Obviously people are always looking for ways to improve the process, e.g. what types of signals actually correlate most with job performance?
But the idea that the application is no better than random flies in the face of all evidence, as well as all common sense.
This makes as much sense as saying that going on dates is useless -- just pick a partner at random and get married and be done with it, because the odds are just as good! (When of course the research shows that personality compatibility and chemistry compatibility are huge determinants of successful marriages.)
> This makes as much sense as saying that going on dates is useless -- just pick a partner at random and get married and be done with it, because the odds are just as good!
How sure are you this is false? I'm not completely convinced. Some people are very, very bad at picking compatible partners. Now I'm trying to figure out how we could ethically run this study, because I think the results might surprise you.
You might be able to compare divorce/separation/unhappiness rates between arranged and non-arranged marriages, but culture differences would be a confounding factor… maybe study the average lifespan of relationships started on reality tv haha
Also divorce rates can only be meaningfully compared within the same culture. The pressures to stay married or get divorced vary widely between cultures, separately from whether marriages are arranged or not.
Trust Windfalls sound like a terrible idea. If you allow someone to give out cash and select the next person to give out cash, the money will concentrate within a friend group. At the very least, funding will be disbursed over a much narrower set of research topics (I am most familiar with the work/people who research topics similar to my own).
> Trust Windfalls are a Rorschach test for whether you trust people or not. I’m a guy on the internet going “hey give me a bunch of money and I’ll give it to my friends and it’ll be great!”... If you don’t know me and you’re a bit of a misanthrope, this sounds like the worst idea ever.
Indeed, it does sound like the worst idea ever, because I don't know you. That means there is zero chance of me getting the funding. And when you give the money to a friend, it's likely your friend will return the favor and give it back to you. Why nominate a person you know will not return the favor? And when they do, who is your next pick? Well, their wife needs funding, and so does your wife. So you go another round. Then who? Your cousin needs funding, and so does your college roommate. You know who doesn't get funding: that person with a rival theory, and that person attempting to replicate your work.
And if the money does happen to migrate to me, nothing against you, it's just that if the money migrates out of my 'trusted network' it will likely not return for a long time. And my lab won't survive several years waiting for the funding to return.
I'm not the author. I think you've hit on the main limitations, but I don't think it's fatal. If the lay of the land in a given field is such that there are well defined camps, then you allocate the money to each camp[0] and allow the camps to allocate the money internally. If the lay of the land is that it's diffuse and has very shifting structure, then you can't follow this strategy, but also it's maybe worth raising the possibility that it's not yet time to allocate money and the first thing the field should do is try to develop some structure.
[0] Obviously the problem of what %age to give to each camp is an important second-order problem, but if you imagine a scenario like a university, I find it more likely that the Provost's office can figure out how much to give to which Dean than that they would be able to figure out how much to give to which lab.
(I also think you're right to observe that even within a particular camp of a particular field there will be relationship-driven gladhanding that helps determine the flow of money, and that's a real concern, but it's just a more localized concern than I think you give it credit for. I'm also not entirely sold on it.)
> if you imagine a scenario like a university, I find it more likely that the Provost's office can figure out how much to give to which Dean than that they would be able to figure out how much to give to which lab.
If left to the provost or dean, the money will inevitably be distributed equally between all faculty. Once the money enters the university, there is no longer a separation between funding agency and recipients. And since 'equally' is the least complex/controversial way to distribute the funds, that's where it will eventually settle. (Hm, perhaps I am a misanthrope).
In many cases a solid needs-based JD exists and must be filled in a perfunctory fashion.
But often companies don't really know what they want. Advertising a job becomes a way of seeing what's out there, and whether anyone can bring them some missing "magic".
If we think of work as being about relationships then successful ones are slow, sometimes painful routes by which we find lasting partnership. It may involve many on-and-off dates, exploration of shared values and interests, introductions to family, and so on.
OTOH, the idea of a "job market" in which human beings are rapidly traded as they "climb a career" is akin to a nightclub for picking up casual sex. Many companies experience nothing but a series of one night stands. Passionate embraces and big promises are followed by tempestuous exits.
The cost of casualisation and rapid churn isn't really appreciated by either party, or society at large. Workers, forever "dressing to impress", never really develop deep skills that come from stability. Companies spend a fortune on HR, perpetual transformation, induction, on-boarding and reconfiguration around the latest hire. Little real work gets done and eventually it becomes mutually disrespectful, or even hostile.
This gig economy is harmful in unseen ways. Now we have companies desperate for good people, but unable to recruit, and talented workers unable to find a happy workplace, because both are "playing hard-to-get" out of pride and restless, undirected ambition. Imho, it's a situation rooted in immaturity and something Adam Phillips touched on when he said "capitalism is for children".
Of course it is hostile, because of the corporate structure and owners vs. workers. Workers work for “Me Inc. currently supplying services to big co.”. We often try to negotiate to make incentives align as much as we can, but they never quite do.
I have had similar thoughts! Mine would be communitopia. Something I have seen almost done. The work Slack is mostly public (not much at work is really that secret!) and interested devs chime in, see what's up, and help out. Chuck a standard hourly rate their way, and if they seem real good, offer them a ticket to the intern island!
Wanted — An archaeologist with high academic qualifications willing to spend fifteen years in excavating the Inca tombs at Helsdump on the Alligator River. Knighthood or equivalent honor guaranteed. Pension payable but never yet claimed. Salary of £2000 (or $6000 U.S.) per year. Apply in triplicate to the Director of the Grubbenburrow Institute, Sickdale, Ill., U.S.A.
And there is a job ad for a circus acrobat before this one in the book, so I've conflated the two together. And BTW, if you google just "pension payable but never yet claimed", this quote won't come up either. But googling for "pension payable but never yet claimed parkinson" works, although only the first link is about the book and it leads to a Russian(?!) website of all things. Mysteries of the Internet never cease to amaze me.
Things get even worse where the application process is completely unaccountable to the future success of the applicant. When you hire someone for a job, you get some feedback on how that person does at said job. In Medicine, for example, this is not the case. Whether the person you admit to your university or residency program ends up being fantastic or a psychopathic murderer is not even considered.