My brother has just finished a master's programme and is starting to apply for jobs, and he's coming up against a lot of these.
He's smart, motivated, very hard working, and has the necessary credentials on his CV, but when talking to him he's quite monotone and low energy. Because of this, he's failing these automated video interviews as he looks like he's not a "team player" or "personable".
While interviewing practice is becoming more aware of bias against things like this, the training data for these "AI" systems is going to be based on interviews conducted in the past, often with questionable outcomes. The software is sold as helping "remove bias", but it instead pulls interview bias forward, in more extreme form, into the CV-review stage, preventing candidates from ever getting in front of a human.
While multiple humans on a hiring team may overcome some of their biases working together, a single piece of software trained to look for stereotypes will exacerbate those biases.
I had never heard of these video interview analysis methods prior to this thread.
I think it’s at the very least way too early to rely on this and at the worst just... terrible.
ATSes are already notoriously bad, and this is a whole new level of just plain bad. Time is precious, sure, but so are human relationships, if time means anything.
It’s like nuance was completely disregarded in the pursuit of corporate contracts.
They do sound terrible and I hope to never encounter one, but my brother made a good point: the NHS get something like 12k applications per week, they have an average of 25k job openings each month. They apparently can't deal with that manually, which does make some sense.
I personally think they _should_ handle it manually, and if the system breaks we should fix it in another way because this way sucks, but I'm not sure what that would be. More consultancies or temp agencies? Not sure that's a better option.
If a company isn’t willing or able to commit the appropriate resources to hiring, they should reduce their number of open positions, not look to technology to solve a self-created problem.
Assuming they don’t have the resources to give people a proper interview and vet a candidate, can you imagine the situation those who do pass will actually encounter on the job? I’d never work at a place that makes these kinds of decisions, and it’s a good signal that the culture at the top of the company is completely broken.
Any company using one of these AI systems to combat attrition or justify growth is for sure a dumpster fire.
I agree. The AI hype makes it easier to drink the Kool-Aid, and let's hope that ceases soon. I'm not saying there aren't good problems to solve with AI, but adopting it so readily on people while it isn't working properly amounts to experimenting on people, which is a terrible thing to do. I'm not hoping for an AI winter, just for the end of the AI hype.
> the NHS get something like 12k applications per week, they have an average of 25k job openings each month
That’s just a couple of applications per job. How can the line managers possibly not have time to review just a couple of applications per person they’re apparently going to be leading?
Which NHS organisations in the UK are using automated video interviews?
I've done interviewing for a couple of NHS organisations, from band 6 positions all the way to very senior leadership, and automated interviews were not used and would not have been used by these organisations.
> the NHS get something like 12k applications per week,
It's not really "The NHS". It's a bunch of different companies. They each have their own corporate structure and boards of directors. They have their own HR teams that advertise and recruit to positions.
There were (in 2017) 207 clinical commissioning groups, 135 acute non-specialist trusts (including 84 foundation trusts), 17 acute specialist trusts (including 16 foundation trusts), 54 mental health trusts (including 42 foundation trusts), 35 community providers (11 NHS trusts, 6 foundation trusts, 17 social enterprises and 1 limited company), 10 ambulance trusts (including 5 foundation trusts), and 7,454 GP practices.
The largest is probably Barts Health NHS Trust, which employs about 15,000 people. The smallest is probably Weston Area Health NHS Trust which I think employs about 1,700 people.
You nailed it. Having encountered some of those kinds of agencies before, I'd say the other bad option seems better.
It just seems like we need to realign our incentives in a better direction; all the incentives currently seem to lie in what are (at least coming to look like) the wrong places.
The answer is simple, but you're not gonna like it.
1) Set minimum qualifications
2) For everyone who passes step one, run a lottery
3) That's it
Most interview processes are about as good as a coin flip at determining whether a candidate is a good fit. People are biased (especially people who think they're objective, which describes a lot of hiring managers), and so are the systems built on our decision-making. So, apply the same logic that supports index fund investing: cut "people" out of the decision-making, and make the process faster and cheaper by relying on a simple set of objective criteria and the way risk distributes across a population.
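As a sketch, the whole lottery process above fits in a few lines (the function and field names here are mine, purely illustrative, not anyone's production system):

```python
import random

def lottery_hire(applicants, meets_minimum, openings, seed=None):
    """Step 1: keep only minimally qualified applicants.
    Step 2: draw winners uniformly at random.
    Step 3: that's it."""
    qualified = [a for a in applicants if meets_minimum(a)]
    rng = random.Random(seed)
    # If there are fewer qualified applicants than openings,
    # everyone qualified gets through.
    return rng.sample(qualified, min(openings, len(qualified)))

# Example: the only minimum qualification is holding the required credential.
applicants = [
    {"name": "A", "credential": True},
    {"name": "B", "credential": False},
    {"name": "C", "credential": True},
    {"name": "D", "credential": True},
]
picked = lottery_hire(applicants, lambda a: a["credential"], openings=2, seed=0)
```

The point of the seed parameter is auditability: a published seed lets anyone re-run the draw and verify nobody's thumb was on the scale.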
This is where things will go south for most big organizations anyway. They don't know what they need, or they can't specify what they need, and often this part is done by someone other than the team that actually needs people. So you find jobs where the requirements are 3 years of experience with C# and at least a BSc in comp sci, while the team needs someone to work on embedded C & Ada and couldn't give a shit about a BSc (or any other degree for that matter), just that you can do the job.
Right, so then the question is if the other processes currently in place serve as mitigating factors to the inadequacy those qualifications present to the selection system's efficacy. I'd argue that they don't, in which case you're spending a lot more time and money for the same results.
If they do, then would fixing the qualification issue be less expensive than interviewing et al.? Again, assuming that you have a choice of similar outcomes at different price points.
I actually think these are a marvelous idea, because if you ever encounter a company using one of these, you know for certain you don't want to work for them: they don't value you enough to even give you a normal interview, instead opting for an impersonal, faulty machine.
Imagine the horror if you actually started working for them! If their hiring is this bad, their working conditions must be even worse! So it's pretty kind of them to reveal their true nature in such an easy to identify way. ;-)
Would it be correct to say your brother is also not the type to network with others easily?
An automated system is the last line in hiring; recommendations -- internal and external -- are the first line. If you find yourself manually submitting cold resumes, it's almost always more productive to start networking into the companies you want to work at and getting the recommendation firsthand, turning your resume into a hot one.
> recommendations -- internal and external -- are the first-line
I'm afraid this is a very tech-centric view. Outside of a few specific industries or the very top levels, this is essentially unheard of.
My brother's educational background is biomedical sciences so he's looking for essentially lab work doing analysis for a hospital, drug company, or similar. There are a fair few jobs doing it, but they are relatively low level, have no "community", no real way to facilitate referrals.
In tech it's easy to "network into companies" because companies are so open with their hiring – they hold events, they sponsor conferences that are priced so that people can pay their own entry, and there are community events where you can meet people from them. This is very far from the norm, until you get to the golf clubs where you can mingle with other execs.
No, this is how it works in almost every industry. Even if that's how they try to force hires into the pipeline, if you're simply accepting that instead of circumventing it, your success rate must be abysmal.
I don't work in tech, but I've got about a 60% lifetime success rate (job offers to applications), and 100% once I got to the interview stage. And that's in a variety of industries: EMS, academic research, the energy industry, and civil/environmental engineering.
I swear nobody has any hustle anymore. I've never bothered to make a LinkedIn or go to "networking" or "hiring" events. They're a waste of time. If you're really out of your existing network (you're probably already doing something seriously wrong if that's the case), you'd be better off figuring out where you want to work and then waiting at a nearby lunch spot for an obvious group of employees to come in around lunchtime (or after-work drinks) and start chatting them up. (I actually landed a job doing that.) Or better yet, find a CrossFit gym some of them go to. Sweat and bleed and bond with someone a bit before you leverage them as a recommendation. There's a million ways into an organization if you want it badly enough. If nothing else, you can get super good intelligence on how to craft your application to be desirable.
Do your research, know your shit, know exactly what they're looking for before you ever turn in an application. Become that person to the core. Get any new certifications you need to be that person. Make every document you turn in to apply for the position fit that profile. Make every searchable piece of information about you on the internet align with that profile. Know the way they conduct interviews before you get there, and practice and rehearse the questions and flow of your responses in broad ways. Leverage your contacts in the organization to get information about each of the interviewers and how they think and approach interviews.
> you'd be better off figuring out where you want to work and then waiting at a nearby lunch spot for an obvious group of employees to come in around lunchtime (or after work drinks) and start chatting them up
I'm going to go out on a limb and guess you're either a US citizen, or at least base this advice on the US.
I have only known 1, possibly 2 people who can make this sort of thing work here in the UK, people just don't do this.
Plus, CrossFit isn't really a thing here except in trendy bits of London. For many of these places there isn't a "lunch spot", people take their lunch in to their building in a business park where there's no lunch options or options for socialising.
Overall, while you don't work in tech, I think you're probably privileged enough to work in an industry that works pretty similarly. Most of your advice would be pretty good for me, but almost none of it would work for someone at the beginning of their career, aiming for a large company with out of town offices – a fairly typical starting point for many graduates.
Yeah, I am American, and I definitely understand there's certain informalities available to us here culturally.
I think the key for new graduates is to have been thinking about getting a job for the last 4-5 years. Don't start looking and preparing when you graduate, you're already behind.
I kept a job I started as a teen as a lifeguard for like 6 or 7 years even though there were much better opportunities available to me financially because I knew the stability was one of the best things I could bring to the table. Resume building.
But in addition to that, I started networking long before I left school. It's essential. Despite the fact that culture may differ in other countries, I don't think that fundamentally changes my advice. The tactics may differ, but the strategy is the same because human nature is the same.
> There are a fair few jobs doing it, but they are relatively low level, have no "community", no real way to facilitate referrals
I’m only tangentially exposed to the biomedical field but I’m fairly sure this isn’t broadly true. I’m a member of a single cell RNA sequencing slack group whose members host a meetup twice a month. Most members work in wetlabs and many conduct research into the effects of various drugs on cancer.
The groups are probably harder to find (than programming meetups) but I doubt there are none of them.
>but when talking to him he's quite monotone and low energy. Because of this, he's failing these automated video interviews as he looks like he's not a "team player" or "personable".
I'm looking forward to the day when my AI "clone" - a very personable, cheerleader-level bubbly and energetic deep-fake based on my face/voice - will charm away the dry, pedantic HR AI on the other side of the screen.
I'm amazed the lawsuits aren't already locked-and-loaded.
All you need to do is acquire the software, feed in a video clip of a male/female or white/black applicant saying the same things (maybe using deepfakes to ensure consistency in delivery), and if the score is one iota different, you've got the developer, at best, having to explain away a lot of ugly black-box logic, and at worst, a cut-and-dried discrimination case.
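That audit is essentially a paired test, and it's tiny to write down. A sketch, where `score_video` stands in for whatever black-box scoring call a vendor actually exposes (a hypothetical name, not a real API):

```python
def audit_pairs(score_video, video_pairs, tolerance=0.0):
    """Score matched pairs of videos that differ only in the applicant's
    apparent demographics (same script, same delivery) and flag any pair
    whose score gap exceeds the tolerance."""
    flagged = []
    for video_a, video_b in video_pairs:
        gap = abs(score_video(video_a) - score_video(video_b))
        if gap > tolerance:
            flagged.append((video_a, video_b, gap))
    return flagged

# Example with a stand-in scorer: one matched pair scores identically,
# the other shows a gap, which is exactly the kind of evidence described above.
fake_scores = {"white_m": 0.9, "black_m": 0.6, "white_f": 0.7, "black_f": 0.7}
flagged = audit_pairs(fake_scores.get,
                      [("white_m", "black_m"), ("white_f", "black_f")])
```

In practice you'd want many pairs and a statistical test rather than a zero tolerance, since even a fair scorer has some noise; the sketch just shows the shape of the evidence.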
A human screener is probably somewhat biased too, but he's at least trained to provide a legally acceptable excuse for his choices.
Who has the liability if the algorithm is found to be illegally biased? I assume it would be based on the contract (or perhaps there are other laws relating to having a service rendered illegally), but I wonder which way it goes
I consult. When it's remote, I sometimes recommend we have a person on the ground (at the customer's plant) to coordinate and manage data. If they don't agree, I hire a graduate anyway and ask that a place be made in their office for them. Ultimately the client finds them useful and they get hired on, because the client actually did need somebody to be an expert in the new software.
It's a responsibility to help young graduates find an entry to a company. All it takes is that first entry.
Question for you. How do you find good fit for graduates in your niche/good candidates for the role?
My experience consulting was that the spectrum of work I could be asked to do was so broad I'd have trouble, despite having a handful of junior engineer friends, feeling confident I could hand off work to them; let alone a graduate with enough locality to be onsite. (and despite wanting to, as well, I'd love to give them those opportunities while selfishly allowing myself to scale further)
I'm not sure whether this is a problem due to lack of specialization on my part ("big data/distributed system design"), or lack of finding properly aligned graduates; I'd love your take on building those connections.
Sure.
It generally goes like this: they think they want another Engineer, but what they/I need on site is a project coordinator/manager type. That's where they drag their heels on hiring somebody.
So I hire a Journalism or Business graduate, give them mentoring for the first month on-site, and being young and ambitious, they take off.
The principal thing in dealing with an Engineering organization (I tell them) is, when you don't understand what they're saying, say "I don't understand that. Can you elaborate?" Engineers generally love to explain their stuff. And it doesn't hurt to appear humble and eager either.
Anyway, it works for what I've been doing (data migration, engineering data management, test fixture setup and deployment) since it's more software configuration and vendor selection than technical Engineering. More a planning and coordination role. Especially since the departments I get injected into are already busy, and are glad to offload non-Engineering tasks.
It's a situation that's rich with opportunities to make yourself legitimately useful and productive. So the right candidate is not chosen so much by credentials, as skills in organizing, communicating and following up.
One very successful example had been a manager for an AmeriCorps section, looking after 20+ young people as young as himself, and keeping them motivated, supplied and supported. This person is now being groomed for a higher-level management position in the service company I introduced him into. I expect to see him as a Director or VP one day.
Btw, it doesn't hurt to have somebody at client companies that feels predisposed to look favorably on my services. Not only can I continue to get connected to whomever it takes to get a contract moving, I have a positive voice in their meeting rooms supporting us.
While I agree that using AI to rank the candidates based on their responses to pre-recorded videos is dystopian, I would like to point out that it only works in the environments where:
a) The supply of candidates greatly outweighs demand.
b) There is no clear way to assess the individual candidate's impact on the business' ability to make money, and the employees are essentially commoditized.
These environments have been known to be toxic for a while: filtering people based on bullshit 20-page personality questionnaires, managers hiring the most desperate candidates who would tolerate the most humiliation from them, and everyone involved in the process trying a wink-wink nudge-nudge with the others in order to land their nephew a cushy position.
While AI definitely starts another round in the battle, I would advise anyone who's sick of this bullshit to focus on the economic output of their prospective position. If a company is looking for an AI engineer who can halve the size of their inference model without losing more than 1% of accuracy, they won't waste time scoring your story about your weaknesses. They will look at your thesis and hire you if you already did something similar. The problem is that most jobs these days are about sitting quietly in a friendly team and smiling when the CEO shows off the team to the next round of potential investors. But those jobs won't give you much growth, and will end your career once you don't look young and cool anymore.
This has got to be a fake. The only way AI would be able to make any sort of correct judgment about someone's probable future performance is if it had some training set of data of employees who had their interviews recorded and who were known to be bad in some way. However, it's still a massive leap in logic to be able to correlate anything an AI could observe to some tangibly bad behavior later on.
I'm thinking the promises of this technology will never be realized.
You're right in that there's no reason to believe AI can do this effectively.
However, in my experience, HR departments and hiring processes are anything but rational. HRs buy any kind of snake oil by the barrel, and hiring managers generally reach their position for reasons entirely unrelated to the ability to identify and hire above average employees.
HR departments are by definition on the defensive, that is a large part of what "human resources" is - a legal shield department to make sure employees stay in their lanes, with employment and legal threats as guardrails. All they need is a few pop culture articles saying these "truth/personality/job filtering AIs" are liability traps, and they will cease to be popular instantly. HR is a fear driven department, play on that.
Not sure on the current state of things on the recruiting side but it was tried at Amazon. I don’t see any reason why they wouldn’t run it again (if they aren’t already) if they can solve some of the problems discussed https://www.reuters.com/article/us-amazon-com-jobs-automatio...
I’m not too worried about the long game though, humans are proving to be quite adaptable with tools like this one; an AI that optimizes your resume to match a given job description: https://www.jobscan.co/
Google and other large tech giants have been working on precisely that sort of AI with those types of data for at least 10 years now. They are famous for tinkering with their hiring process based on mountains of data, with mostly bad results.
At Google, machine learning was always used for vetting the 100k+ applications per year that the HR team just doesn't have time to go through.
The main goal was to filter people who have the biggest chance to go through the human interviews later in the process.
The only reason it's called "AI" now is that it's trendy. It can be just a simple linear model that takes education (university) and GPA into account.
The simplest way to skip these filters is to get a reference from a person who already works at the company.
I wonder why ultra-elite companies like Google don't swap to a model where they reach out to people they want to hire, rather than accepting random applications?
> I wonder why ultra-elite companies like Google don't swap to a model where they reach out to people they want to hire, rather than accepting random applications?
This is exactly what happens. Most experienced engineers I know with a good resume get emails almost weekly from Google and Amazon in particular.
I know - I meant why do they also have an open mailbox for millions of people not remotely qualified (I presume) to spam into, creating this problem for themselves.
If you’re at the level of working for Google you’re presumably already at the absolute top of your field internationally and will know people at Google you can reach out to personally.
Google hires tons of people fresh out of university, they're not going to be "at the absolute top of their field" or well-connected. (they do seem to hire previous interns a lot, but that moves the filter having to handle the flood to people applying for internships instead)
I think they were saying the claimed performance of the interview-processing software has got to be fake, not that the article must be. I read it the same way as you at first, though.
I can confirm that at least one UC Berkeley-affiliated PR was paid quite a lot of money to do exactly this kind of work for several years; that was at least five years ago. They did not use bombastic terms to describe the project, evoking the authority aspect; instead they used language that signaled "finding hidden talent".
In my opinion, this abuse of "AI" is criminal fraud. It is simply not possible to do what they claim. The entire "industry" of companies offering these services needs to be investigated and shut down.
I am not sure where I heard it, but I keep coming back to the quote "There is no algorithm for absolute truth". People who aren't in-the-know about AI tend to lend it much more credit, when it comes to its ability to accurately measure something, than is due.
There is no scientific method to recover, access, or identify another consciousness's mental state short of asking them, and then you must deal with the possibility of their answer being incorrect for an entire complex world of reasons. This class of "AI" is pure fraud.
Absolutely. Without that, it is just snapshotting and compressing the particular biases of the interviewers in the training set, in a lossy way, rather than training against ground truth (good hire vs. bad hire).
Whenever I see AI used for something like this, right after thinking "our dystopia is here," I wonder how it could be gamed. An experienced interviewer can tell when they're being told what they want to hear; I suspect an AI could not. The AI ends up rewarding preparation and acting skills. And when you make too many hoops for people to jump through, eventually you find you're just selecting for those who are good at hoop-jumping.
Students in South Korea have a strong prep culture and are already trying to game these types of interviews. It is much harder than gaming skill-based tests ("Don’t force a smile with your lips, smile with your eyes.").
> The AI interview is too new, so job applicants don’t know what to prepare for and any preparations seem meaningless since the AI will read our faces if we make something up.
I strongly feel the market will work itself out: If AI interviews bias companies to hire hoop-jumping and actors, in half a decade, other, more traditional hiring companies will eat their lunch. If the AI interviews only deliver top notch candidates... it is working as intended, and who am I to complain about my own lack of prep or acting skills? Not like whiteboarding or being able to quickly think on your feet without access to StackOverflow is biased against me already, and companies who use these for hiring seem to be doing just fine.
I also think you're self-selecting for desperate people who couldn't get jobs elsewhere, as most in-demand candidates would not put up with this nonsense. I know when I was looking, anywhere that had one of those 100-question inane personality tests ("Would you rather lead a meeting, or go skydiving?") etc I would bow out of the application process.
On the other hand, consider the position that had a 100-question personality test was never intended for a competitive candidate like you. It was intentionally crafted for the desperate job seeker who might take a lower wage.
I don't see a meaningful distinction between AI-driven hoop-jumping and in-person hoop-jumping. Whatever you're measuring becomes useless the moment it's publicly known -- anyone can optimize for it. Humans are just as likely as AI to reward preparation and acting skills, especially since they suffer from implicit bias.
It sounds like so far this is only being used as a very coarse, first-pass filter. It'll definitely get gamed, though.
It would be cool if there were a website where applicants could upload their videos and the hiring outcomes, to provide data from which organizations' hiring biases could be inferred.
I wonder the same thing. It's an arms race; perhaps someone will release a counter-measure, like the Chinese Meitu (http://global.meitu.com/), trained against these types of AIs.
"A new obstacle to landing a job: getting approved by AI"
Doesn't this same problem exist for anyone finding a job? It's probably even worse when you're looking and you're 40 or 50, and worse still if you're changing careers. I'd bet AI drops anyone over the age of 40, almost certainly 50.
I'm not saying "these darn kids have it so easy today", I'm sure they don't, but my half-educated guess is that the older you get, the harder it is to make it past these algorithms.
Most probably unusable in the UK, as the AI won't be able to explain why it rejected a candidate ("why did you score them low for team working?") and the candidate will take the employer to the cleaners. God only knows what hidden variables it'll start discovering and exploiting. My guess is it'll reinforce well-known prejudices.
I'm quite hard of hearing. I wonder what the algorithm's analysis of "facial expressions and the tonality of the job applicant's voice" would supposedly reveal about me.
Probably something actionable under the nondiscrimination laws here, thankfully.
It will be hard to predict what will be considered unusable post-Brexit. Although, I don't know how an algorithm will be able to adhere to a mish-mash of job requirements laid down by Dom Cummings for running the government, which involves assorted weirdos juxtaposed with data scientists and Y-Combinator alumni. However, if people like Uri Geller keep applying, it will at least keep the HR bods in gravy.
The vast oversupply of talent has gotten so large that employers can’t even be bothered to read resumes anymore: AI will. This is what happens when an ever-larger percentage of society gets more and more degrees.
> 2. A bill for 5 typewriter ribbons and 24 pencils totalled $9.85. If a typewriter ribbon costs $1.25, what is the price of a dozen pencils?
> The IBM Programmer Aptitude Test is from 1960 (? I think. It’s tough to read the date on the mimeograph). Is it just me or have technical interviews gotten much harder?
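For what it's worth, the ribbon question from that 1960 test works out in a couple of lines:

```python
# 5 ribbons + 24 pencils cost $9.85; a ribbon costs $1.25.
pencils_total = 9.85 - 5 * 1.25            # the 24 pencils cost $3.60
price_per_dozen = pencils_total / 24 * 12  # 12 pencils at $0.15 each
print(round(price_per_dozen, 2))           # a dozen pencils costs $1.80
```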
> I interviewed for a programmer job at Facebook several years ago, and one of the questions I got was, “Given a set of obstacle coordinates, write a function that finds the shortest path between two cells for a knight on a 3-D chessboard while avoiding the obstacles.”
> I remember this question because it was one of the few that I managed to solve. Probably a half-dozen others ended in fumbles. Not surprisingly, Facebook didn’t give me an offer.
> Some time ago, we compared technical interviews from the 1960s to those of today: They’ve gotten much, much harder. I suggested that the last few decades of lax immigration have invited overseas talent to raise the bar – it was unfounded speculation at the time, but some number crunching shows it might be true.
> According to this Joint Venture Silicon Valley report, 74% of Silicon Valley tech workers are foreign-born immigrants. A decade ago, 36% of Silicon Valley tech workers were born abroad. In 2000, only 29% were.
> Tech industry employment has increased from about 300,000 jobs in 2007 to 400,000 in 2016, so even though we created 100,000 engineering positions in the last decade, we’ve also displaced 88,000 domestic engineers.
I'm suddenly curious what other fields are staffed by 74% foreign-born immigrants.
According to Purdue, the first computer science department in the US was established there in 1962, so of course in 1960 IBM wasn't that selective - they couldn't hire CS graduates. Maybe math majors.
Also, I don't know when CS degrees became common among programmers, but it didn't happen right after they were invented, I'm sure.
It's an example of conventional attributes which didn't exist then because the whole field was unformed. It seems like that would inhibit specialized aptitude tests - if you're hiring math or whatever majors back then (not that it doesn't happen today) you can't ask them about linked lists or sorting algorithms.
Not to mention, a vast number of people who could be good programmers wouldn't have had access to a computer before a first programming job, even if they had a college degree. Today it's much safer to assume that if someone is any good, they will have been exposed.
But nobody's talking about interviews that ask for conventionalized CS material. The point is that the interview questions are more difficult. They aren't especially more computer-focused.
I wasn't asked more difficult questions in 2007, when I got my first programming job. Several years on, we had some discussions about what kind of questions to ask candidates. One place I applied to more recently had a flowchart reading test that must have been from the punched card era.
So I don't concede the point, anyway.
But, the examples above definitely do compare more and less computer focused things.
> Seems like a constant gripe about talent shortages but then conflicting stories about how to filter an overwhelming supply of talent.
> I'm honestly curious how both can be true.
There's an overwhelming number of applicants. That says nothing about their qualifications.
I don't know what Big Tech's numbers look like, but when I did hiring, the percentage of applicants that were even remotely qualified was usually in the single digits.
> There's an overwhelming number of applicants. That says nothing about their qualifications.
People explain away Google's high false negative rate, and desire to not do anything about it, by claiming they receive such large numbers of qualified applicants that it isn't worth caring about false negatives. How does that narrative jibe with the "talent shortage" or what you stated here?
There's a certain amount of innate talent involved in any profession, so additional training and education is unlikely to help. No amount of education or training would ever turn me into a professionally successful artist, for example.
Perhaps there is just a small group of people who apply for every job and are continually rejected. Whereas most good people aren't really applying for any jobs or very few.
There's an oversupply of people with great credentials, CVs, references and self-promotion skills and there's an undersupply of people actually willing to do the work.
Talent shortage is a myth. Companies place themselves in the worst locations which are the least accessible places in the US (extremely high rent and commute distance) and then wonder why they only get a modest number of applicants.
Also, I’ve noticed many companies are terrible at hiring.
I think job applicants should have to compete with each other in a craps tournament. Losers are knocked out of the pool until the number of applicants matches the number of open positions. I also think this is how we should choose the president.
There's an old joke that the first cut is or should be made by randomly throwing away a subset of applications, because you don't want to hire someone who is unlucky.
But seriously, I think it is probably a good idea, if you have to throw out 90% of applications to start, to do it randomly if the alternative is a dubious statistical or ML-derived model. It would make me feel better from the point of view of hiring or as a candidate, not to have the risk of an unforeseen undesirable bias.
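For the curious, that random first cut is trivial to implement, and unlike an opaque ML model its selection criterion is fully known: none. A minimal sketch (function name, fraction, and seed are made up for illustration):

```python
import random

def random_cut(applications, keep_fraction=0.1, seed=None):
    """Keep a random fraction of applications as an explicitly
    unbiased first cut, instead of a dubious statistical model."""
    rng = random.Random(seed)  # fixed seed makes the cut reproducible/auditable
    keep_n = max(1, int(len(applications) * keep_fraction))
    return rng.sample(applications, keep_n)

# e.g. 1000 applications, keep 10% for human review
survivors = random_cut(list(range(1000)), keep_fraction=0.1, seed=42)
print(len(survivors))  # 100
```

Logging the seed would even let a candidate (or a court) verify after the fact that the cut really was random, which no proprietary scoring model offers.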
This is why I am very skeptical when companies, especially tech companies, go running to Congress wanting more imported labor. They're ignoring qualified applicants left and right due to processes that are optimized to minimize the risk of ever making a bad hire even though it filters out lots of good hires. Fix your process before expecting laws to be changed to ameliorate the negatives of the process you chose.
I wonder at what point all those "business innovations" will turn democracy and freedom into a corporate gulag. What's the point of having a so-called free and democratic government system if you're back in the USSR five days a week, 9 to 5?
Pure fraudulent AI snake oil. Where is a hungry lawyer who wants to team up with computer scientists and take down these predator companies? There is serious wealth to be had by exposing these con men, and a hell of a lot of misery to be halted.
The flip side is that top talent won’t want to put up with the BS so a lot of these companies end up shooting themselves in the foot to save a few bucks on recruiting.
Just seems like a way to lose a lawsuit. All you need is one disabled applicant the system can't understand, or a demonstration that it's biased against an accent, and that's all it'll take.
The customer could blame the service provider, especially when the provider stresses how unbiased it aims to be. But then negligence by the customer is an issue too. I agree with you: bias in ML systems exists, applying it to more facets of life is tricky, and snake oil can be tasty. But this does also surface issues that require standards and assessments of bias.
It seems like small companies using this will probably fail. They will hire batches of AI-approved, but actually bad candidates. Or they will go long periods without being able to hire for core positions because they "can't find anyone". Both eventually result in the company going out of business.
But for government or big companies, it's just bad news, since they can't fail. They will just get more incompetent.
I tried to apply for an in-between job at a large outdoor store. They wanted me to use the HireVue software. Unfortunately, there was no way to appeal; the email link they sent was one of those "noreply@contractcompany.com" deals.
I’ve thought for years that a SaaS app could be created that elevates a prospect's chances of a call. I’d have to hire folks to work on the algorithm, but we could take their resume and apply to jobs with a very high score.
Literally all of them? I guess we can pretend that Applicant Tracking Systems (ATS) that scrape resumes for keyword matches aren't some specific definition of "AI" if we want to be naive, though.
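For anyone who hasn't seen one: the keyword matching these ATS products do is often about this crude. A toy sketch (the tokenizer and scoring are my own simplification, not any vendor's actual algorithm):

```python
import re

def ats_keyword_score(resume_text, keywords):
    """Toy ATS-style filter: what fraction of the required
    keywords appear anywhere in the resume text."""
    words = set(re.findall(r"[a-z+#]+", resume_text.lower()))
    hits = [kw for kw in keywords if kw.lower() in words]
    return len(hits) / len(keywords), hits

score, matched = ats_keyword_score(
    "Experienced Python developer with Django and SQL background",
    ["python", "sql", "kubernetes", "django"],
)
print(score, matched)  # 0.75 ['python', 'sql', 'django']
```

Note what this can't see: synonyms, experience described in prose, or anything about whether the candidate is actually any good. That's the filter between most applicants and a human.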
The fear mongering in here is interesting. "Secret, unproven" algorithms (much less intelligent than anything developed with machine learning) already control an individual's access to all sorts of things, like college, credit, insurance, etc. There is a positive side to approaching this problem, and more humans is never going to be it.