
I’d argue that pieces of that interview could still be gamed using this cheating platform, depending on how accurate, fast, and realistic-sounding the GPT responses are. And all of those attributes will only get better as new models are released.

I realize that this one project isn’t the only way to cheat in interviews, but I still think it’s naive to assume this tech will only harm what you perceive to be “bad interviews” and leave your own preferred interviews untouched. At the absolute minimum, it adds overhead to conducting interviews: you now also have to stay alert and try to figure out whether the interviewee is being coached like this.



If the candidate uses a tool to "cheat", and then keeps using that tool while they work for me, is anything lost?

Right now people "cheat" by using an IDE, but no one has trouble with that (and rightly so!).

So why should I care if someone uses LLMs to pass the interview and then do the job, as long as they’re successful?




