And if your local police department wants to buy that data? What if a large law firm wants to buy all the data for use in lawsuits? Once you have passed it on to "third parties", there isn't much you can do to stop such things.
It happened a few decades ago as in-car GPS rolled out. Some rental car companies started issuing speeding tickets. That game lasted about a week.
>> Feb. 2002. A Connecticut man has taken a local rental-car agency to court, after the company used Global Positioning System technology and fined him $450 for speeding.
I’m not ignorant of your concerns, but that’s what privacy laws are for. These third parties have been scraping Tesla APIs at the behest of their customers for years. This is nothing new, simply more formalized.
If the laws are insufficient, that’s a call for better laws (which I agree are needed). The apps I use do not sell their customer data, but Tesla should probably stipulate that API integrations aren’t permitted to, as an extra guardrail (with violations being an API access death sentence, killing the app business).
Or forget the laws. I will buy a car that doesn't stream data. My current vehicle doesn't, and I have never felt the need. My next one won't either ... not if I have anything to say about it.
That doesn't change the fact that this framework is an improvement on the status quo for Tesla owners. And presumably most Tesla owners are not like you, so what you would do is not super relevant to this discussion.
The question is, why does a car have to stream data at all? And no, the closest we should get is a car that could share data with the OEM if you wanted it to, and we don't have that. As long as possible, I'd pass hard on cars where you don't have that choice.
If it is in the EU, yes you do. Just compare the difference between what Toyota does with customer data in the US and the EU. Preferably, I can remove the SIM card myself and all connectivity stops. If I cannot do that at the hardware level myself, I have to assume I am tracked.
The different OEM approaches between the US and EU show you how important legislation is. Especially since Tesla's track record regarding privacy is abysmal.
Think of the times Elon has shared in-car video feeds to try to blame the driver for Autopilot crashes, without that driver's authorization to do so. Legal? Sure, according to every stupid, untested EULA that everyone has to sign, but you didn't ask about legality.
Cool, but there are people who like to look at data, like Wh/km and efficiencies at different temperatures etc.
Funny thing I wanted to point out, though: you sound like one of those people who say "I will buy a car that runs on gas, I've never wanted a vehicle that runs on electricity and my next one won't either... Not if I have anything to say about it".
I feel like I've seen so many of those comments lately... I want to ask why?
It's like me saying something like "my phone doesn't have a screen. I've always had an analog connection and I've never felt the need to have a digital screen and any smart apps on my phone... Etc etc not if I have anything to say about it."
Or perhaps even something like not wanting an automatic transmission because a manual transmission is the only way to go lol..
The parent isn't rooting for ICE; he's rooting for having a vehicle that doesn't stream data to the outside.
There is nothing that mandates that a vehicle, whatever its technology, must send all its data to external servers.
I, and many others, want to have the right to possess a car that isn't relying on external connections to work, and doesn't send all its data out.
It's more akin to wanting a TV that's not connected online. Some people like the added features of a smart TV, but some just want a dumb screen that doesn't sell their viewing habits, or screenshots of what's on the screen, to third parties.
> I’m not ignorant of your concerns, but that’s what privacy laws are for.
It's a shame, then, that the few laws we have protecting consumers' privacy are not adequate to the task. It's reasonable, then, to do our best to avoid products and services that exploit the weakness of our laws until that situation is improved.
As an aside, how do you know that the apps you use aren't selling your data? Is it only because of their entirely non-legally-binding statements and privacy policies? What do you think could happen to that data when those apps/companies are sold or otherwise acquired by someone else? Even if they were telling the truth about not selling your data right now, do you think that means it isn't readily available to police or the discovery process in a legal dispute? Do you think that data collected or displayed by the apps could be exposed to Google or Apple and collected?
To answer your questions (I could not tell if they were rhetorical or not), the worst case outcome is that someone either legitimately or illegitimately has the history of my vehicles’ locations and the commands issued to the vehicles. As mentioned in my grandparent comment, that is within my risk appetite. It could happen, but I don’t care enough to worry about it.
> It's reasonable then to do our best to avoid products and services that exploit the weakness of our laws until that situation is improved.
Agree to disagree. If the cost of loss is low, going without the product or service is more costly than the potential data loss. The services I use within this context are very likely not actively lying about their privacy policies.
> worst case outcome is that someone either legitimately or illegitimately has the history of my vehicles’ locations and commands issued to the vehicles.
That is hardly the worst case outcome. The worst case outcomes would be those where that data is used against you because it was sold/leaked/subpoenaed. There's no end to the ways it could be used against you either.
Maybe a future potential employer doesn't like how often you visit bars, or what church you attend, and you are passed over for a job you want.
Maybe your car's recorded proximity to where a crime took place makes you a suspect in a crime you had nothing to do with and it costs you tens of thousands in legal fees to clear your name. (similar to what happened to this guy: https://www.nbcnews.com/news/us-news/google-tracked-his-bike...)
Maybe that data gets pulled up in a divorce or custody battle. GPS records and toll transponders are already being used in such cases to show things like patterns of working late hours, visits to girlfriends' houses, or undisclosed income.
Maybe that data is used by advertisers to more effectively manipulate you into parting with more of your money.
Maybe your insurance company (health or auto) buys it up and their algorithm decides to jack up your rates because you hit up a fast food drive-thru once too often, or you speed too much, or drive too many hours.
Maybe you visit or even park too close to a gay bar, mosque, or Planned Parenthood and you get harassed by an extremist group or dragged into a Texas courtroom.
Because the data never goes away, it can follow you for the rest of your life and be used by others again and again at any time in whatever way the person who gets their hands on it feels will benefit them.
> Agree to disagree. If the cost of loss is low,
Everybody is free to decide for themselves what level of risk is acceptable to them. With some kids you can tell them that the stove is hot and they will leave it alone, while others have to touch it and get burned.
I hope that you never suffer a consequence that makes you regret the data you gave away, assuming that you can trace it back to that data in the first place. At least you can say you were informed about the dangers and made an informed choice to roll the dice. I worry a lot more about the folks who don't even realize what data is being collected, the ways that it can be used against them, or who assume that they can count on laws and privacy policies to protect them from harm.
> Maybe a future potential employer doesn't like how often you visit bars, or what church you attend, and you are passed over for a job you want.
Take this as my opinion only, and I appreciate that some people don't have the luxury of this opinion, but for me there is zero Venn diagram overlap between jobs I want and employers who are as creepily obsessed with my private life as that.
> Maybe that data is used by advertisers to more effectively manipulate you into parting with more of your money.
If you want to frame advertising as manipulation, then I suppose so. Personally I don't see it that way. If I see an advertisement for something I want and I end up buying it, that isn't any more manipulative than seeing a particularly attractive banana in the supermarket and buying it.
Advertisements are a way for people who make things or do things to find people who want those things. Successful advertising isn't a bad thing so long as it isn't dishonest. I think if there's a 1% chance that I'm more likely to become aware of a product that actually interests me, I see that as a win for me. I have finite money and I'd rather spend it on things I want more than things I want less.
> I hope that you never suffer a consequence
I'll happily wear that risk. I greatly prefer it to the alternative, which is the guaranteed suffering which results from being obsessively paranoid.
> there is zero Venn diagram overlap between jobs I want and employers who are as creepily obsessed with my private life as that.
I feel the same way, but employers aren't going to tell you they're digging into your personal life or why you were turned down for the job. You just get ghosted. The problem isn't limited to employers either. It could be a landlord, or a bank. Part of the problem is that you aren't allowed to know when it's happening which makes it hard to avoid.
> If you want to frame advertising as manipulation, then I suppose so. Personally I don't see it that way.
Ads can be informative, but when was the last time you saw an ad that wasn't in some way manipulative? If you can't even see that it's happening, you're likely more susceptible to the effects, but even if you know it's happening you're still influenced by manipulation. We all are. Ads are carefully designed to exploit flaws in our brains. Ad companies have spent massive amounts of money and research to maximize the effects, even experimenting on children to learn things like how early a child can recognize a brand.
> I'll happily wear that risk. I greatly prefer it to the alternative, which is the guaranteed suffering which results from being obsessively paranoid.
You know what they say: it's not paranoia if they're really out to get you. The examples I gave of the harms that can result from abuse of your personal data are based on things that have already happened. People might be happier if they are blissfully ignorant or can convince themselves to ignore what's going on, but I feel better if I take some simple steps to avoid potential harms and stay aware of what's happening in the world. It's pretty easy to just not buy a car that collects your location 24/7, and even easier to avoid giving that data to unnecessary apps.
> but employers aren't going to tell you they're digging into your personal life or why you were turned down for the job.
So I don't learn why I didn't get a job I definitely wouldn't want. Perhaps not the ideal outcome, but not far from it. I really don't understand what the problem is here.
Would I prefer if I accidentally ended up working for an awful person because I never gave them a chance to reveal their awfulness? Absolutely not. I don't want to work for an awful person even if they never get a chance to be awful to me personally.
> even [if] you know it's happening you're still influenced by manipulation
I reject your framing, but to the extent there's any truth to it, I'm going to be manipulated by ads no matter what. Given that, I'd rather be manipulated by ads that are better targeted to me. This is an important point: manipulation is mostly orthogonal to targeting. Especially since hyper-targeting quickly becomes overtly creepy and loses its power to manipulate.
Considered holistically, there's no downside for me in being targeted rather than broadcast to. In fact, if targeting means that a company's customer acquisition costs are lower, there's a non-zero chance it could result in me paying a lower price for something I would have bought anyway.
> So I don't learn why I didn't get a job I definitely wouldn't want. Perhaps not the ideal outcome, but not far from it.
Would you say the same about the house you wanted to buy/rent or the loan you needed? It just makes it easier to hide discrimination that would otherwise be illegal. At least the bigoted HR person in charge of screening applicants is probably not someone you'd interact with once hired.
> I'd rather be manipulated by ads that are better targeted to me.
I doubt I can change your preference for targeted advertising, but I will offer you a few perspectives on the subject you might not have considered:
It was manipulation through ad targeting that made that whole Cambridge Analytica situation such a problem.
Targeted ads artificially limit what new products and services you get exposed to: only the things companies "think" you want, or what advertisers want you to buy, or whatever will make them the most money, versus things you might really be into or prefer if you'd had the opportunity to hear about them. That lets them shape culture and segregate populations. It's a similar problem to the "filter bubble" found in search engines.
Knowing that we're vulnerable to advertising, I'd much rather be tricked into forming an irrational association between happiness, or acceptance, or well-being and a product I know I'll never or rarely buy than have that kind of trickery influence my choice between two products in a category I buy often.
Correct! There are several apps that I've wanted to use over the years but the main blocker (for me) has been authorization. With this new feature from Tesla that all changes. I'm much more willing to purchase these apps now that I can limit their reach.
The real bonus is that it appears this update will help isolate the app from the vehicle. What I mean is that the way these apps currently work is to poll the vehicle for data. This is bad because it (potentially) wakes up the vehicle and increases battery consumption. The way I understand it, this new method will allow apps to "poll" Tesla's servers instead, which is a much better arrangement.
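To illustrate why cloud-side polling matters for battery, here's a toy sketch in Python. This is not Tesla's actual API; the class names, state fields, and push mechanism are all invented for illustration. The point is just the difference in wake behavior: reading a cached copy of the last state the car pushed never wakes the car, while polling the car directly does.

```python
class Vehicle:
    """Toy model of a car that sleeps to conserve battery."""
    def __init__(self):
        self.wake_count = 0
        self.state = {"battery_pct": 72, "locked": True}

    def poll_direct(self):
        # Old approach: each direct poll wakes the vehicle.
        self.wake_count += 1
        return dict(self.state)


class CloudCache:
    """Toy model of the OEM cloud holding the last state the car pushed."""
    def __init__(self, vehicle):
        # The car pushes state on its own schedule; we just keep a copy.
        self._last_pushed = dict(vehicle.state)

    def poll(self):
        # New approach: apps read the cached state; the car stays asleep.
        return dict(self._last_pushed)


car = Vehicle()
cloud = CloudCache(car)

# An app polling the cloud cache repeatedly never wakes the car.
for _ in range(10):
    state_via_cloud = cloud.poll()
assert car.wake_count == 0

# The old style: one direct poll, one wake.
state_direct = car.poll_direct()
assert car.wake_count == 1
```

The cached data can be slightly stale, of course, but for dashboards showing things like charge level or lock status, that trade-off is usually well worth the battery savings.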
Some companies charge police for the work it takes to fulfill their requests. I suspect that once it becomes a revenue stream, they are less likely to push back against requests for data than they might otherwise be.
https://www.cnet.com/culture/rental-car-firm-exceeding-the-p...