Hacker News

An interesting story!

I've also had an AI cheater during a phone screen, but they were pretty clumsy... A question of the form "You mentioned you used TechX on your resume, tell me more about what you did with it" was answered with a long-winded but generic description of TechX and zero information about their project or personal contribution.

Another thing that I can take away from that is "take home project" is no longer a good idea in AI times - the simple ones that candidates can do in a reasonable time are too easy for AI, and if we assign a realistic system, it's too hard for honest candidates.






> Another thing that I can take away from that is "take home project" is no longer a good idea in AI times

Take-home projects were never meant to be evaluated in isolation.

It was common for candidates to have their friends review the take-home or even do it for them.

You had to structure the take-home so the candidate could then explain their choices to you and walk you through their thought process. When you got a candidate who couldn't answer questions about their own submission, you thanked them for their time and sent the rejection later that evening.


The difference is that AI can now feed them explanations as well. Their friends (who IME were usually also mediocre coders: everyone I've seen who did well on a take-home really was that good) didn't have the patience to sit around and help them memorize a bunch of extra nonsense.

At some point it feels like it would be easier to just get good at programming, and yet...


Take home projects >>>>> live coding sessions, unless you're interviewing for some kind of twitch streamer position.

Just have a 1- or 2-hour call with the candidate where you go through the project together.


I would never spend time doing a take home test. The best paying companies never require it, so why would I jump through hoops for nothing but middling compensation on the other side?

The best-paying companies jerk you around for months with hours and hours of in-person quizzes and expect you to memorize a bunch of trivia you will never use day-to-day so they can use their MIT intern interviews for everyone.

Take-homes are a much more reasonable expectation than memorizing how to implement quick-sort on a white board.


Given a choice between studying for admittedly meaningless leetCode-style interviews and making $250k+ as a mid-level developer at a BigTech or adjacent company, versus working really hard and slowly doing the corp dev grind for years to become a senior doing enterprise dev making $160K, why wouldn’t anyone who is young and unencumbered by kids try to do the former instead of dismissing those types of interviews?

$160K-$180K is about the median for a senior dev in most non-tech companies in most cities not on the west coast. You can verify this on salary.com.

Yes I know most of the 2.8 million devs in the US are on the enterprise dev side and that’s where you will end up. But why not shoot for the moon?

For context, I am 50. Spent all of my career until 2020 on the “enterprise dev” side of compensation until a pivot and a position at BigTech in the consulting division fell into my lap (full time direct hire with cash + RSUs like any other employee).

But I tell every new grad to do whatever it takes to get on to the public tech company gravy train if possible.

That being said, at 50, I would rather get a daily anal probe with a cactus than ever go back to BigTech again. I’m good with where I am working for a smaller company.


If they do it properly and walk through it with you afterwards, it can be a good opportunity for you to assess cultural fit as well based on the conversation that you have; are they hypercritical of unimportant details? Do they acknowledge good design and decisions? Do they offer their own insights, and if so, what do you think of those insights?

Also, you might find yourself in the unfortunate position of looking for a job without already having one; many people find that a compelling reason to "jump through hoops for nothing but middling compensation".


I've had situations where I submitted a take-home exercise, only to get feedback that it didn't match their required level.

After some back & forth I was able to (politely) prove their feedback was not correct, which actually granted me a follow-up interview.

Unfortunately, this was a unicorn; most companies don't give feedback, let alone admit they were wrong.

But take-homes are my preference: I want to use my IDE, with my keyboard shortcuts, etc.

Then there are take-home timed challenges on systems like hackerrank / leetcode etc, which are horrible in terms of accessibility and access. Not to mention that they are a pass/fail, and focus purely on speed, not quality.

Next to that, they don't allow you to work in an environment you're comfortable in. No debugger, etc. When an HVAC company hires a new tech, do they tell him/her to do a 1.5-hour repair with only a hammer and a lighter to diagnose and fix an issue? No, it's stupid. Why do developers have to do this, then?

And the same applies to live coding exercises. While there is an opportunity to explain yourself, you're still in an extremely uncomfortable environment. Why is there such an emphasis to put people in an environment where they are not set up to succeed?


>When an HVAC company hires a new tech

HVAC has certifications you can get. We should strongly consider this in our industry. I don't think it's an unreasonable compromise, especially now with the advent of LLMs.


We have some certs. The problem is that software development is about thirty different skills in a trench coat, and half of them we don't know how to evaluate (like slicing, or abstraction).

What ends up happening is that our certs end up being a bunch of multiple-choice questions that check people's ability to memorize trivia.

It is more like having a Certified Novel Writer or Certified Mural Painter or Certified Graphic Designer certificate than it is like HVAC or welding.


It would be nice if there was at least a bare-bones certificate that guaranteed the candidate knows at least some absolutely minimal baseline, like what a for loop and an if statement are. You’d still have to interview the candidate, but you wouldn’t have to start at Hello World or FizzBuzz.

I have interviewed at least one self-described Senior Software Engineer who didn’t know how to write a function that takes an integer parameter and then prints every integer from 0 to the argument passed.


People do take university courses in doing creative stuff; a fair number of successful novelists seem to have done one. RPG proposed that we could have something similar for software [1].

[1] https://dreamsongs.com/MFASoftware.html


Some parts of the IT industry do lean on certs.

IIUC, network engineering in particular is an area where vendor certs play a big role (mainly Cisco).

AWS, Azure and GCP all have certs. There are certs for Windows and Linux administration. Java has certs.

(I don’t know if anyone cares about the Java certs, but they do exist.)


That is an instructive example.

In regular systems administration, having certs kinda suggested that you didn't have the chops to get a job without a cert. Even people who had them would only include them on the resume when they were explicitly called for in a job description.

With the rise of "DevOps" and throwing half your raise at Amazon, the job moved away from being able to build and run networks of computers. Now it is mostly about configuring off-the-shelf tools in "the cloud". In that world, certs became way more meaningful. Sure, the AWS cert is just testing if you know the six different names Amazon has given one feature, but it is potentially more helpful to know that trivia than it is to actually understand LDAP or DNS.

If AI successfully de-skills software development, maybe certs will finally become useful for developers too.


> I don’t know if anyone cares about the Java certs, but they do exist.

The clients in some consulting projects definitely do.


I think, in part, the difference in what I mean about certification (perhaps licensure is a better word here) is an industry body - accepted and respected generally by the businesses within our industry - that will demonstrate some form of competence.

I would love to see a trade union-style group, where you are sponsored to join by an existing member and expected to do some work alongside existing members before being certified as journey-level and recommended to employers.

It would require that group to agree on what being a "good" developer meant, but there could be more than one and if you don't agree with this one you could form your own. Maybe one requires people to be able to write testable code and be able to label design patterns, and another expects pure functional programming, and another expects deep security expertise, and companies could know which of those they are looking for and inquire appropriately.

We have this a little bit with employers like Pivotal or ThoughtWorks, that have such strong learning cultures you can be sure that if someone spent five years there they know their stuff. But we could have a version where workers were willing to endorse each other, rather than relying on a specific for-profit company.

It is, like all certifications, only as valuable as the least-competent person who holds it. But the informal versions of this are pretty powerful.


I'd rather it be like passing the bar, accounting exams (CPA etc.) or actuarial exams. They test very relevant deep knowledge and act as proof of fundamentals - and software engineering does have technical fundamentals that could just as well be tested for in a meaningful way.

I am not sure we can come to agree on what competence is.

I think if we try hard enough we can get there.

If we can divide the industry into many small subindustries, each with their own licensing, maybe. If we want to treat it as the one big industry like we do right now, no chance. We won't even be able to find agreement on surface level things, never mind the nitty gritty.

> We should strongly consider this in our industry.

These were very hot for system admins in the late 1990s and early 2000s. Is it still a thing today? Do high-quality employers still care about these certs in 2025? I doubt it.

> Then there are take-home timed challenges on systems like hackerrank / leetcode etc, which are horrible in terms of accessibility and access. Not to mention that they are a pass/fail, and focus purely on speed, not quality.

This does not mirror my experience. Many times that I have interviewed with hackerrank/leetcode questions, I wasn't able to get all of the test cases to pass. After time was up, I explained my solution to the interviewer and talked about the failing test cases. Sometimes I passed the interview, and other times not. It was not binary, where imperfect means a 100% fail.

It’s like a scene in Swordfish where a hacker (Hugh Jackman) has to infiltrate a system while getting a blowjob and having a gun pointed at his head.

“If you can do the job under these constraints, imagine what you can do under optimal, normal conditions!! Hired!”


IMO take home projects still have value, provided you do a comprehensive follow-up interview with their project (which is the _actual_ interview, I feel). Those who just used AI on it are far less likely to talk about any tradeoffs, do deep dives, or even simple extensions of the project in the follow-up interview.

I think take home still has value, if it's of any size and they just vibe code it'll be full of long messy methods, unused variables, and lack of any thoughtful design.

They are, if anything, a more-accurate example now of the kind of code a candidate is going to produce on the job.

If we expect people to use AI, and it is available in most companies now, then being able to appropriately refactor, test, and sense-make of AI-generated code is even more important. The key is raising the bar on quality beyond mediocre, and not relying on those take homes to test skills they are no longer testing.


I think the better/mature response to this cultural change is to design take-homes anticipating the use of AI, and then seeing where the candidate got lost in the weeds, or gets lost when cross-questioned about it.

Don't see the point of these "take home projects". Just ask them what's the most difficult technical thing they had to do before, and have them walk you through it; probe, ask questions. If you don't like the one they talked about, ask about another one, or another one. You can generally weed out the bullshitters: they talk a lot about "we" and hardly ever use "I", meaning they didn't do anything.

I would say "we solved this issue" if all someone did was hand me a coffee while I was debugging.

What compels you to play linguistic games with people's livelihoods?


Well I find conversations with people in interviews to be less of a game than giving them “homework” to do, given that unless they’re totally green with no work experience, I’d assume they would actually have some stuff they’ve worked on and would like to talk about.

It’s completely bizarre to me that take home assignments have been normalized as part of an interview with professional working people.


It's a constant tug of war between standards and expectations.

I personally prefer hypotheticals, or some variant on live pair programming. Even as someone with enough free time to do take-homes, I also prefer code reviews over that one-off code, which then becomes a case of 100% "I did this and here's why".

Even with that last example I would say, "well to optimize, etc., we could do this".



