"Unfortunately, these warnings were not heeded in this incident. The vehicle logs confirm that the automatic Summon feature was initiated by a double-press of the gear selector stalk button, shifting from Drive to Park and requesting Summon activation. The driver was alerted of the Summon activation with an audible chime and a pop-up message on the center touchscreen display. At this time, the driver had the opportunity to cancel the action by pressing CANCEL on the center touchscreen display; however, the CANCEL button was not clicked by the driver. In the next second, the brake pedal was released and two seconds later, the driver exited the vehicle. Three seconds after that, the driver's door was closed, and another three seconds later, Summon activated pursuant to the driver's double-press activation request. Approximately five minutes, sixteen seconds after Summon activated, the vehicle's driver's-side front door was opened again. The vehicle's behavior was the result of the driver's own actions and as you were informed through multiple sources regarding the Summon feature, the driver is always responsible for the safe operation and for maintaining proper control of the vehicle."
Basically, they designed an autonomous-operation mode that was easy to activate by accident and incapable of reliably avoiding crashing into things; it appears someone activated it accidentally, his shiny Tesla crashed into a trailer as a result, and they responded by accusing him of intentionally activating the feature and misusing it.
Why is this being downvoted? The highly detailed log of the driver's every action is crazy creepy.
I get that the data is likely useful for debugging, and may very well be a function of the feature's beta status (can someone confirm? Or is this something that Teslas do all the time?), but it's still insanely creepy that every single action this guy took in his own car was remotely logged and accessible. This guy is basically driving a Telescreen from 1984 to work.
Looks like someone is carpet-downvoting everything in the subthread (my root post dropped ~3 points just as these were downvoted).
Yeah, it's a double-edged sword. On one hand it's a ton of data; on the other, there are multiple cases where you don't need to bring the car into the dealer for them to diagnose something.
Oh wow. I have been anti-Tesla due to their creepy "we still own your car" auto-update craziness, but that takes it up another level. No, Tesla, I will not buy your cars, not now, not ever, because you don't trust me and therefore I do not trust you.
But that seems to mostly be speed and throttle information stored in a black box in the car that logs in the event of an accident and isn't remotely accessible. That sort of thing is a far cry from "our server logs show you opened the driver side door at 5:53 PM" like Tesla is doing. If other manufacturers are recording that sort of granular data, too, then yikes.
I don't think the car's logs are automatically transmitted to Tesla. They reside on the car, and Tesla can log in remotely to view them if they have a valid reason to.
Who decides if it's a valid reason, and who authorizes such access?
If it's not the owner... then they aren't really the owner.
With the number of cameras/sensors on a Tesla, it's a privacy nightmare... I won't buy one until the answers to these questions are the ones that I want them to be.
What I also found irritating in this article was the sentence "you remain prepared to stop the vehicle at any time using your key fob or mobile app or by pressing any door handle". Stopping the car by mobile app?!? Imho, stopping the car in such situations is safety-critical, which absolutely requires hard real-time behavior. I don't see any chance of achieving this through consumer hardware/software or a Bluetooth/Wi-Fi connection.
"Unfortunately, these warnings were not heeded in this incident."
Cars chime at people so often, for such BS reasons, that it's a wonder anyone would design a UX where it's safety-critical to pay attention to a chime and a pop-up. That UX designer needs a good talking-to, or to be fired.
Heh. In contrast to most GPS systems, my Audi doesn't outright block the ability to enter destinations whilst en route, but instead says "don't do this while driving", and proceeds to let you do it if you so choose.
I agree with you in this case. A Tesla is a mass-market item. It should be held to very high safety standards.
Just ask the designers of that Chernobyl plant.
I somewhat disagree here. It was 1960s technology. Even now, complex systems like that inevitably have a plethora of ways that humans can screw them up. It's very hard to completely prevent determined idiots from destroying the equipment.
"Unfortunately, these warnings were not heeded in this incident. The vehicle logs confirm that the automatic Summon feature was initiated by a double-press of the gear selector stalk button, shifting from Drive to Park and requesting Summon activation. The driver was alerted of the Summon activation with an audible chime and a pop-up message on the center touchscreen display. At this time, the driver had the opportunity to cancel the action by pressing CANCEL on the center touchscreen display; however, the CANCEL button was not clicked by the driver. In the next second, the brake pedal was released and two seconds later, the driver exited the vehicle. Three seconds after that, the driver's door was closed, and another three seconds later, Summon activated pursuant to the driver's double-press activation request. Approximately five minutes, sixteen seconds after Summon activated, the vehicle's driver's-side front door was opened again. The vehicle's behavior was the result of the driver's own actions and as you were informed through multiple sources regarding the Summon feature, the driver is always responsible for the safe operation and for maintaining proper control of the vehicle."
Basically, they designed an autonomous-operation mode that was easy to activate by accident and incapable of reliably avoiding crashing into things, it appears someone did and his shiny Tesla crashed into a trailer as a result, and they responded by accusing him of intentionally activating the feature and misusing it.