
> FSD

The F here stands for 'full', but...

> I need to remain mindful of the system at all times

It sounds like it should really be called supervised self driving (SSD) to me.

I think the technology is interesting, but I wholeheartedly object to the name and the promises it implies.

> If you pin the failure of a driver to oversee FSD on Tesla

I think it's reasonable to pin the failure on the system if you call it FSD.



Tesla's system requires less supervision than other manufacturers'. And that's where the rub is. They're saying -- hey, we'll make this tool available to you, and it really will function autonomously, but you have to know Tesla's not going to take the hit if there's an accident. In America's litigious society, I think that's the only way we'll ever get these tools. Otherwise, the plaintiffs' lawyers will destroy Tesla. I want the technology, and I'm comfortable assuming the risk if there's a fuck-up. If Musk needs a way to prove I wasn't watching so that the liability transfers to me -- that's fine. I want the option. And Ford/GM/BMW et al won't give it to me.


> I'm comfortable assuming the risk if there's a fuck-up.

Well, here's the rub. The risk is not just ours. It also involves others.

If said "fuck-up" is spilling some tomato sauce on the carpet, then sure, we can say "my bad" and take out our checkbook. It's fairly certain that the other person the risk exploded on will accept our amends.

However, if it is running over a child, I don't think the checkbook thing will work.


> I want the technology, and I'm comfortable assuming the risk if there's a fuck-up. If Musk needs a way to prove I wasn't watching and that the liability to transfer to me -- that's fine. I want the option. And Ford/GM/BMW et al won't give it to me.

It isn't just about you, it's about everyone else on the road as well. You don't automatically deserve to operate a less safe system on public roads just because you are willing to accept liability. Other manufacturers, at least with regard to autonomy, recognize that fact and design products that mitigate risks with a proper safety lifecycle and design domain.


But that's where you're wrong. All the data indicate it's SAFER than human drivers. My decision is, on average, making people safer, not less safe. https://www.tesmanian.com/blogs/tesmanian-blog/tesla-autopil...


There are serious statistical issues with Tesla's claimed rates, but even so, autopilot != FSD, and Tesla FSD is, imo, currently benefiting from the left-hand side of this chart:

http://safeautonomy.blogspot.com/2019/01/how-road-testing-se...

Their disengagement rate is so high that, as it stands, it keeps drivers vigilant. But as the system improves, driver vigilance WILL fade, and without robust mitigations FSD will become less safe than a human for a considerable amount of its development.


> Tesla's system requires less supervision than other manufacturers.

> I need to remain mindful of the system at all times

> and it really will function autonomously

To me this is a contradiction. It's said to be both autonomous and to require constant supervision. And that's why I think it's marketed incorrectly, despite being an interesting piece of technology.


If Tesla really believes it "really will function autonomously" then they shouldn't have a problem assuming liability. Further, if it needs to be supervised, then it's not autonomous, is it?

Congratulations on being so accepting of being sold a bill of goods, though.



