
Why are safety drivers not:

- working in pairs, so there is social pressure, conversation, and two pairs of eyes to increase alertness and safety?

- doing shifts of 30-45 minutes at most [1] (although they could potentially swap back and forth with a co-driver)

- issued a dumb-phone for emergencies and searched for entertainment devices (it's good enough for Amazon warehouse staff)

- being monitored by the driver-facing camera, with training and termination for drivers who can't hack it

- monitored automatically for attention using eye tracking or other methods, with the car safely stopping if lack of attention is detected

- required to take over on a random, regular basis for a short period to keep them engaged and attentive (and obviously, the car keeps driving if they don't take over, but they are marked down)

Because of the boredom, it is an extremely demanding job, and the way it is currently being done is clearly not good enough.

[1] I can't find anything published about how long the shifts are, but I'm guessing they are longer.
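For what it's worth, the eye-tracking and random-takeover ideas above could be sketched roughly like this. All names, thresholds, and intervals here are made-up assumptions for illustration, not any vendor's actual system or API:

```python
import random
import time

ATTENTION_TIMEOUT_S = 2.0        # assumed: max tolerated eyes-off-road time
PROMPT_INTERVAL_S = (300, 900)   # assumed: random takeover prompt every 5-15 min

class SafetyDriverMonitor:
    """Hypothetical watchdog combining gaze monitoring with random takeover prompts."""

    def __init__(self, now=time.monotonic):
        self.now = now
        self.last_attentive = now()
        self.next_prompt = now() + random.uniform(*PROMPT_INTERVAL_S)
        self.missed_prompts = 0  # "marked down" tally for unanswered prompts

    def on_gaze_sample(self, eyes_on_road: bool):
        """Called per frame from the driver-facing camera's eye tracker."""
        if eyes_on_road:
            self.last_attentive = self.now()

    def check(self) -> str:
        """Returns the action the vehicle should take this tick."""
        t = self.now()
        if t - self.last_attentive > ATTENTION_TIMEOUT_S:
            return "safe_stop"        # lack of attention detected: stop safely
        if t >= self.next_prompt:
            self.next_prompt = t + random.uniform(*PROMPT_INTERVAL_S)
            self.missed_prompts += 1  # cleared elsewhere if the driver takes over
            return "takeover_prompt"  # the car keeps driving either way
        return "continue"
```

The point of the injected fake clock (`now`) is that the logic can be tested deterministically without waiting out real shift-length intervals.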



I think these are sensible approaches to try and mitigate the problem.

Still, I'm not sure this is a problem that can be effectively mitigated. I've been _extremely_ suspicious of any AV system that requires a human to be attentive, alert, and ready to take over at a moment's notice.

Humans are just _so bad_ at that kind of task. Since we're apparently making people do that job anyway, I'm all for the mitigations you've outlined. But I think it's absolutely crazy that we have humans doing a job that we're just _really really_ bad at, and then making that the critical safety element of the testing.


All of those suggestions would increase cost and decrease profit. There's no law mandating any of them, so why would they go above and beyond for 'safety' reasons?

I understand it's the ethical decision, but at the end of the day, for-profit companies are interested in only one thing: maximizing profits.


Robin_Message basically says a lot of what I was thinking, but you also need to consider what maximises profits in the long term (and self-driving cars are a long-term project): is it doing the minimum now to reduce costs, or is it taking all reasonable precautions to avoid building the public perception that self-driving cars are unsafe?


Profit maximisation is not the only factor in running a business, even a public company.

But even given that, what are the PR and legal costs of this predictable [1] accident being made worse by a poor safety system? Plus the downtime for additional training and testing? And the likely harsher requirements that may now be enforced?

Maximizing profits doesn't mean doing everything on the cheap.

[1] in the sense there was likely to be an accident at some point.



