
Is there any way for me to get my information removed from Equifax?

Do I need to contact all of my line item creditors and ask them to remove references to Equifax?



It's not your information in the sense of you having any ownership or control over it. It's just information about you. I got information about you by looking through your comment history. You cannot make me forget it. Lucky for you, I will anyway in a few minutes, since I am not in the business of selling that kind of information and thus don't really care.


While what you say is factually true, does anyone else find it morally repugnant? The idea that information like your genetic code, behavioral data, photos of your likeness, etc. might be owned by someone else strikes me as both slightly ridiculous and incredibly dehumanizing.


Which is precisely why this is handled completely differently in the EU. Essentially, data about you (personally identifiable information) is your data, people generally may not store or distribute it without explicit consent, and generally you may revoke consent at any time. There are exceptions and the details are complicated, of course, but the fundamental principle is much more sensible.


That seems to be somewhat overstating the current position under EU data protection laws, though some member states already go further, and with the introduction of the GDPR next year the situation across the EU will move closer to what you describe.


Yes, it's not all that unified across countries yet, but the basic principle still generally applies, in contrast to the US model of "if you happen to have some information about someone else, good for you!", which then only has exceptions for medical information and stuff like that.


Wow. This is great! US privacy laws would benefit from being at least this strong.


It's good for you as a person, but it is a pain in the ass for companies. Things like the domain name used in an email address can be considered PII because if the domain name was your name, that is PII... IP addresses can, in many cases, also be considered PII. Once something is PII, you as a company have to treat that data differently--you have to be able to provide that data to the person and you have to be able to delete it.

Basically, the overall idea is good (data about you should be owned by you) but mapping that into actual nuts & bolts implementation details is a huge pain in the ass.


> Things like the domain name used in an email address can be considered PII because if the domain name was your name, that is PII...

That's ... just wrong. The email address itself, unless it happens to be a role address, is PII. Whether there is a person's name in there doesn't matter.

> IP addresses can, in many cases, also be considered PII.

Really, no, the address isn't PII, the information about someone's behaviour is. You may store IP addresses as much as you like. You just may not store it as a key that could be used to link information about someone to other information about them.

> Basically, the overall idea is good (data about you should be owned by you) but mapping that into actual nuts & bolts implementation details is a huge pain in the ass.

I wouldn't say it's always trivial, but it's not really that hard either. The only thing that is really hard is collecting data that you have no justification to collect. If you simply avoid collecting data, none of the other stuff affects you.


The only thing that is really hard is collecting data that you have no justification to collect. If you simply avoid collecting data, none of the other stuff affects you.

Unfortunately, since our modern digital world works with data, that particular tautology is next to useless.

With my small business hat on, I'm deeply concerned about the real world implications of the GDPR next year, even as with my privacy advocate hat on I'm happy that the kinds of big data-hoarding companies that cause most of the real problems are going to face more meaningful regulation.


> That's ... just wrong

The lawyers would disagree with you. If the email address contains anything that identifies you personally, it's PII and thus falls under GDPR. Therefore, if you collect email addresses you have to take great pains to protect them.

As for the rest of your stuff, while you are technically correct your comment is pretty useless. We live in a data-driven world that requires.... well... collecting data. Saying "just don't collect email" isn't helpful and collecting IP address info without associating it with user data is pretty pointless.

It is a huge, expensive undertaking to make any kind of "legacy" pre-GDPR infrastructure GDPR compliant.

> If you simply avoid collecting data, none of the other stuff affects you.

If you are doing anything useful on the internet, you are collecting data.


> The lawyers would disagree with you. If the email address contains anything that identifies you personally, it's PII and thus falls under GDPR.

You are simply missing the point. It's not required to "contain" anything. If it is the email address of a person, then that makes it PII.

> Therefore, if you collect email addresses you have to take great pains to protect them.

Well, yes, of course you do? If you want to profit from using other people's addresses, you better make sure they don't get harmed by it.

> We live in a data-driven world that requires.... well... collecting data.

Erm ... no? We don't live in a data-driven world, we live in a world where some people are completely unwilling to respect the boundaries of other people and consider it their right to do whatever it takes to manipulate them in their own interest. There is no such right, and there is nothing that requires that you violate other people.

> Saying "just don't collect email" isn't helpful

Well, whether it is helpful depends on what you expect to be helped with. But it is a perfectly viable thing to do. If people voluntarily hand you their email address so you can send them invoices, say, there is absolutely no problem with you doing so, because that obviously means that there is consent. Anything beyond that, and you are just selfishly trying to ignore the interests of other people, and it is perfectly viable to just not do that.

> collecting IP address info without associating it with user data is pretty pointless.

... so don't do it, then?

> It is a huge, expensive undertaking to make any kind of "legacy" pre-GDPR infrastructure GDPR compliant.

Erm, well, it's an expensive undertaking to stop being an asshole ... so what?

> If you are doing anything useful on the internet, you are collecting data.

So, if I publish a free software project on my own web server that doesn't write any log files ... that's not useful? Could you explain how exactly that reasoning goes?


> Saying "just don't collect email" isn't helpful

If you don't need that email address for anything, then don't collect it. If you do (perhaps because the customer has agreed to let you send them email), then fine, collect it, but also provide a way to delete it, permanently, if the customer wants to terminate their relationship with you.
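
To make that concrete, here's a minimal sketch of what such a deletion path could look like, assuming a simple SQL-backed user store (the table and column names here are hypothetical, not from any particular system):

    # Minimal sketch: permanently erase a stored email address on request.
    # Assumes a hypothetical SQLite schema: users(id, email, ...) plus an
    # erasures(user_id, erased_at) table recording that deletion happened.
    import sqlite3
    from datetime import datetime, timezone

    def erase_customer_email(conn: sqlite3.Connection, user_id: int) -> None:
        """Remove the stored email for a customer ending their relationship."""
        with conn:  # one transaction: both statements apply, or neither does
            conn.execute("UPDATE users SET email = NULL WHERE id = ?", (user_id,))
            conn.execute(
                "INSERT INTO erasures (user_id, erased_at) VALUES (?, ?)",
                (user_id, datetime.now(timezone.utc).isoformat()),
            )

The point is just that the erasure path is small if you design for it up front, and it leaves a record that deletion happened without re-storing the deleted data.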

> It is a huge, expensive undertaking to make any kind of "legacy" pre-GDPR infrastructure GDPR compliant.

Yes, it is. But worthwhile things are often not trivial.


But worthwhile things are often not trivial.

Indeed. But there is not trivial, and then there is prohibitively difficult and expensive to the point of being unreasonable.

Obvious example, exhibit A: Suppose you use a deduplicating backup system, and being a responsible service you also ensure that all user data is properly encrypted as part of your backup process. If each single customer who decides they don't want to deal with you any more can require you to remove any reference to them from anything you currently store or have ever stored, please describe a technically and commercially viable strategy for reliably cleansing your backups.


Here's the thing: tough shit. I am super tired of the "it costs too much", "it's too hard to do" excuses. You've built your business off of my data. Not yours. Mine. I dictate how it is used and who has it. You keeping my data can actually be a threat to my security and livelihood.

We have swung way too far into a regime where personal information is irresponsibly slung around the internet with no accountability and no consequences when that data is misused.

> If each single customer who decides they don't want to deal with you any more can require you to remove any reference to them from anything you currently store or have ever stored, please describe a technically and commercially viable strategy for reliably cleansing your backups.

That's not my job to do. That's "your" job as a responsible technologist to figure out, based on your knowledge of your systems and processes. I'm in the midst of preparing to be GDPR compliant at my employer, and I'm not saying it's easy, but all these problems are tractable. Would I rather be focused 100% on building new products and features? Sure, who wouldn't? But part of being a professional developer is taking responsibility for the software you write, which includes handling customer data with respect. GDPR is a step in the right direction for that.
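
For what it's worth, one pattern that comes up a lot for the encrypted, deduplicated backup case (an engineering approach, not anything the GDPR itself prescribes) is crypto-shredding: encrypt each customer's records with a per-customer key kept outside the backups, and honour an erasure request by destroying that key, so the copies sitting in old backups become unreadable without rewriting them. A rough sketch, with hypothetical names, using the third-party cryptography package:

    # Rough sketch of crypto-shredding with per-customer keys (hypothetical design).
    # Requires: pip install cryptography
    from cryptography.fernet import Fernet

    class CustomerVault:
        """Encrypt per-customer data; deleting the key 'erases' old backup copies."""

        def __init__(self):
            # In a real system the keys would live in a small, mutable key store
            # that is excluded from the immutable/deduplicated backups.
            self._keys = {}

        def encrypt(self, customer_id: str, plaintext: bytes) -> bytes:
            key = self._keys.setdefault(customer_id, Fernet.generate_key())
            return Fernet(key).encrypt(plaintext)

        def decrypt(self, customer_id: str, ciphertext: bytes) -> bytes:
            return Fernet(self._keys[customer_id]).decrypt(ciphertext)

        def erase_customer(self, customer_id: str) -> None:
            # Destroying the key makes every ciphertext for this customer,
            # including copies in old backups, unrecoverable.
            self._keys.pop(customer_id, None)

Whether key destruction counts as erasure in the legal sense is its own debate, but it's the kind of design that makes the backup problem tractable without rewriting every backup you've ever taken.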


I am super tired of the "it costs too much", "it's too hard to do" excuses. You've built your business off of my data.

I haven't built my business off your data or anyone else's, unless things like having your details to charge the agreed payments or keeping routine server logs counts. None of my businesses is in the data harvesting field, nor do any of them collect personal data that isn't legitimately relevant to what they do or share it with any third parties other than to help do whatever it is they do.

Not yours. Mine. I dictate how it is used and who has it.

Well, no, you don't. You might wish you did, but neither the law nor the facts are currently on your side. This will remain the case even under GDPR.

You keeping my data can actually be a threat to my security and livelihood.

Your paranoia may be a greater threat to your security and livelihood. There is very little that we do that could pose any significant threat to anyone even if our systems were compromised. How do things like keeping records of how people are using our own services and resources to help with our own planning and protection against abuse pose any threat to you whatsoever?

That's "your" job as a responsible technologist to figure out, based on your knowledge of your systems and processes.

And I suggest that for many businesses, no solution will exist that is responsible in terms of good practices like keeping proper records and back-ups, compliant with the letter of the law under the GDPR, and commercially viable in the face of people actually exercising their full rights under that law.

But part of being a professional developer is taking responsibility for the software you write, which includes handling customer data with respect.

I am a staunch advocate of privacy and protecting the rights of the little guy. I do treat all personal data with respect, and we go to considerable lengths in my businesses to ensure that such data is not collected or shared unnecessarily, is stored and processed according to good practices, and so on. We have always done so, from day one, even with only limited resources and sometimes at the expense of doing things that would no doubt have made us more money but crossed into territory we weren't happy being in.

My objection to the GDPR is precisely that despite doing reasonable things to run our businesses and doing nothing that most people would consider even remotely controversial or unreasonable, we are still subject to the kinds of excessive and expensive measures we have been talking about. It's rather ironic that the obligation to notify third parties to which data has been disclosed comes with caveats about being impossible or involving disproportionate effort, yet the obligation to erase data held directly does not.

However, given that this is the case under the GDPR, it's hard to imagine how most businesses could withstand full enforcement of the right to erasure by customers wanting to make trouble without compromising their ability to operate viably and responsibly in other respects. That is not, IMHO, a good law, even if you're a privacy advocate. In fact, since it sets an almost impossibly high standard for compliance, it is arguably worse than a more moderate law, because businesses may decide that they're screwed in any case if someone wants to make trouble so they have little to lose by not doing their best in terms of data protection.


PII just stands for personally identifiable information. If it identifies you, it is PII. Even if it isn't linked to any behavioral data explicitly, the fact that it's in your DB means that person is associated with you. Thus email and IP are PII.


No, if it is in your DB because it gets written to that DB as a result of them being associated with you, then that is PII, precisely because that is information about them being associated with you.

If you use an RNG to generate IP addresses, that does not represent any information about any person, hence no PII, even if one of them happens, in fact, to be the IP address of a person protected under the relevant regulation.


More specifically:

"If a business collects and processes IP addresses, but has no legal means of linking those IP addresses to the identities of the relevant users, then those IP addresses are unlikely to be personal data. However, businesses should note that if they have sufficient information to link an IP address to a particular individual (e.g., through login details, cookies, or any other information or technology) then that IP address is personal data, and is subject to the full protections of EU data protection law."

https://www.whitecase.com/publications/alert/court-confirms-...
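
One practical upshot of that distinction: some logging and analytics setups deliberately store only a truncated form of the address so it can't be tied back to an individual (similar in spirit to the "IP anonymization" options some analytics tools offer). A toy sketch, where the /24 prefix length is an arbitrary choice and none of this is legal advice:

    # Toy illustration: zero the host bits of an IPv4 address before storage,
    # so the stored value identifies a network rather than an individual.
    import ipaddress

    def truncate_ipv4(addr: str, prefix: int = 24) -> str:
        network = ipaddress.ip_network(f"{addr}/{prefix}", strict=False)
        return str(network.network_address)

    print(truncate_ipv4("203.0.113.57"))  # -> 203.0.113.0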


"implementation details is a huge pain in the ass"

Privacy is worth protecting. Plus, it's still a huge value add to your customers if you provide them these controls, whether you're legally required to or not.


Are you suggesting that it would be better if an entity should "own" / be in control of all information about it? I'm not trying to straw-man your argument, but I can't find any other self-consistent scenario in which what you describe as morally repugnant, ridiculous and dehumanizing is "solved".

Assuming that is what you meant, consider: Equifax's CEO owns all information about Equifax, looks at all this negative press recently and decides it should be scrubbed from the Internet / all publications / etc. If you meant it only to apply to people, let's play the Godwin's Law card and suggest that Hitler (or his descendants) wishes to scrub all information about the Holocaust, etc.

I think that scenario is far, far worse.


It's possible to draw a distinction between corporations and people, Hobby-Lobby notwithstanding. You could argue, self-consistently, that people have a right to the information about them, while corporations (as a legal fiction) have no such right. In your example, Equifax's CEO would have no recourse to negative press about Equifax, but he would have recourse to negative information about himself, which he kinda does under existing libel & slander laws, though truth is an absolute defense for those.

The EU operates under similar provisions, which as an ex-Googler and tech entrepreneur I find pretty annoying, but as a person find pretty encouraging.

(There are issues even under this distinction that are problematic: if you commit a crime, does the public have a right to know? What if your crime puts them at risk? If you have a reputation for screwing your business partners over, should future business partners have a right to seek this out? But at the same time, there are huge negative externalities to not being able to control this information. If a company has false information on you - as happens pretty frequently - do they have a right to sell it to so many parties that correcting it becomes impossible?)


There's a difference between holding data saying "This is Bob Smith and his email address is bob@smith.com and his SSN is 123-45-6789", data that is personally identifying and could reasonably be considered private, and holding factual (even newsworthy) information about a person's public activities ("Bob Smith was convicted of securities fraud").


I find this idea genuinely intriguing. If such an idea were to work, it would need a clear-cut definition dividing the two types of information about a person (since we're talking in a legal context here, and you'd need to be able to defend yourself if Bob argues you shouldn't have a given piece of information that you have).

What if I lent you money and you never repaid me -- could you legally prevent me from telling anyone about it, by virtue of it being private information about you? If you think that's unreasonable, isn't that more or less the basis of your credit score?


I think we should be having a discussion about the dimensions of that, yes.

A set of questions I've been thinking through are:

What is privacy?

Can it be quantified?

What is identity?

What are the reasons we want to check or confirm identity?

Can some of those reasons be eliminated?

It seems to me that privacy is the ability to define and defend boundaries of information and disclosure. That the question of what is or isn't known, and to whose benefit that information is used (and in particular, to the benefit of the subject of the information, of society as a whole, or some specific third party or parties), matters. That there is a continuum of interests in protecting and revealing relevant information, much of which has to do with the relationship of the subject to society, and that those with great power, or a history of abusing society's trust, have a far smaller claim to relevant privacy. (Balanced, perhaps, against potential consequent risks: the wealthy in parts of the world are subject to stalkers, frauds, kidnapping, and extortion threats, for example.)

Persons in positions of high power, or convicted of crimes, or who've violated public trust, should be faced with greater disclosures. That would include, on at least two counts, Equifax's former CEO.

Much of the reason for seeking information is to establish trust or credit. Should I trust what you say, the capabilities you claim to have, that you own or control or have created specific properties, resources, or communications? Are you the specific subject of a given medical history? Do you owe, or can you claim, a tax debt or a pension credit?

Also part of this question is what risks (or opportunities) disclosure of specific information portends. Do the specifics of a romantic relationship you had 30 years ago matter? Does the fact that you are, or aren't, a politician, a military officer, or gay, matter? Or that the romantic partner was or was not of age of consent? Or that you were or weren't?

When is knowledge power or control? When is it liberating? For how long should it be controlled?

Do rights expire after a fixed set of time? On death? Or are they carried forward according to, say, tribal customs or beliefs? For how long?


I can't imagine it'd be all that different from existing laws on using the "likeness" of a person in movies, TV, or advertising. All you'd have to do is extend those protections to third-party financial institutions and to the non-public-facing side of advertising.


Well, like a lot of things, it's a balancing act.

If we say "nobody is allowed to keep information about other people ever," that's clearly a huge reduction in freedom. Americans generally like freedom even if it brings some negative stuff with it. (Think of the KKK: they would be flat-out illegal in some countries, but in America they are protected under the law.)

Some people may say it's morally repugnant to restrict private entities from writing down information they happen to know about other people. If I write in my diary "HN user nostrademons said XXX" should HN user nostrademons own that information just because my diary might be stolen?

Right now, the way we balance it is that companies are allowed to collect this information all they want and look at it themselves all they want. BUT it can only be accessed by a third party with your permission (you give a bank permission when you open an account, your landlord when they run a credit check, your insurance company when you get a quote, etc.). Creditors may send targeted offers to you based on your credit file, but you may opt out (or in) at any time (you can do it here: https://www.optoutprescreen.com/). You can access your own file for free yearly, as well as whenever you were denied something as a result of what's on your file(s). You have the right to dispute the information in your file if it's inaccurate. Creditors are required to disclose to you certain information that they used to make their decisions. You have the right to freeze your reports so nobody can access them.


I don’t find all of that inherently morally repugnant. I own several photographs of other people, for instance. I think it’s clear that you’re really talking about the bad things that can happen when a large entity (probably a government or large corporation) gets a large amount of personal data. I don’t disagree that there are many bad things such an entity can do, but I certainly don’t therefore intuit that anyone possessing someone else’s personal information or likeness is inherently morally repugnant, ridiculous, or dehumanizing.


Honestly, no. Because the flip side of that stance is that all information that has anything to do with me belongs solely to me. That just seems...naive? Unrealistic? I certainly don't find it morally repugnant or dehumanizing that someone can take a picture of me and keep it. And that's probably the mildest counterexample...

IMO it's only a problem if I did not give away that information. Thing is, we give away a lot of data about ourselves.


> IMO it's only a problem if I did not give away that information. Thing is, we give away a lot of data about ourselves.

Sure, and why is it so ridiculous to expect that I might change my mind about certain types of information, when given to a corporation, and want them to delete that information? Why shouldn't I have the right to, say, tell a company with sensitive financial data about me to delete it and terminate my relationship with them? If I can't do that, then I'm at their mercy not to sell that data, be acquired and have the data used in new ways that I did not authorize by the new owner, or be compromised and have that data in the hands of parties that would misuse it.

We're not talking about censoring obviously public information, here, or even allowing people to hide when they've done something newsworthy. We're talking about controlling the flow of, and access to, private, personally-identifying information.


Sure, and why is it so ridiculous to expect that I might change my mind about certain types of information, when given to a corporation, and want them to delete that information?

It's not ridiculous, but it does impose a cost on them, and by extension everyone else dealing with them, because you changed your mind. Whether you should automatically be entitled to impose that cost on everyone else and under what circumstances is not an easy question.

Why shouldn't I have the right to, say, tell a company with sensitive financial data about me to delete it and terminate my relationship with them?

Maybe they need that data for their own financial records. Maybe those records are things they are required by law to keep.

Maybe they use that data to protect themselves against fraud or other abuses, and only use it in reasonable ways for those purposes, and allowing fraudsters to require them to delete all traces of their previous interactions would leave them unreasonably vulnerable.

If I can't do that, then I'm at their mercy not to sell that data, be acquired and have the data used in new ways that I did not authorize by the new owner, or be compromised and have that data in the hands of parties that would misuse it.

That's a false dichotomy. Practical data protection is almost always going to be about restricting not just the initial collection of data but also how that data may be used and by whom once it has been collected. An isolationist approach where everything can be kept totally secret is impractical, but it's usually not what we really want anyway, since then you couldn't do anything useful and intentional with the data either. It's more useful to ensure that people who have some data about us for legitimate purposes do not then repurpose that data or share it with others for less legitimate purposes or without sufficient transparency and additional consent as appropriate.


> It's not ridiculous, but it does impose a cost on them, and by extension everyone else dealing with them, because you changed your mind. Whether you should automatically be entitled to impose that cost on everyone else and under what circumstances is not an easy question.

Sure, and that's why we have initiatives like the GDPR that try to answer that question. It's not going to be perfect, but throwing up our hands and giving up isn't an answer either.

> Maybe they need that data for their own financial records. Maybe those records are things they are required by law to keep.

Needing that information to be personally-identifiable is pretty rare, and cases where that information needs to be retained for a long time are even rarer. But in cases where it's necessary, sure, of course, go for it. The point is that the data needs to be collected and kept for a legit business purpose.

> That's a false dichotomy.

No, it's not.

> Practical data protection is almost always going to be about restricting not just the initial collection of data but also how that data may be used and by whom once it has been collected.

That's been shown not to work all that well. Companies lose control of data all the time, whether due to a data breach or due to unscrupulous practices internally that take existing data and use it in new ways, even if not specifically authorized.

> An isolationist approach where everything can be kept totally secret is impractical, but it's usually not what we really want anyway, since then you couldn't do anything useful and intentional with the data either.

Sure, and nowhere did I suggest that's what I wanted. Please stop putting words in my mouth. I'm totally fine giving out "secret" data if there's a benefit to my doing so. But if there is no benefit to me, then companies should not be entitled to my data.


Sure, and that's why we have initiatives like the GDPR that try to answer that question. It's not going to be perfect, but throwing up our hands and giving up isn't an answer either.

I would argue that the approach taken by the GDPR is pretty close to just throwing up our hands and giving up, just coming down on the other extreme.

Needing that information to be personally-identifiable is pretty rare, and cases where that information needs to be retained for a long time are even rarer.

Not at all. Just look at the records you are required to keep under EU VAT rules. One of the big criticisms since the changes in 2015 has been that they require an already demanding standard of evidence to be kept for the location of every single customer you sell to (if you're selling something within the scope of the rule change, obviously), you're required to keep that information for years, and your records are subject to audit by any of 28 different national tax authorities.

But in cases where it's necessary, sure, of course, go for it. The point is that the data needs to be collected and kept for a legit business purpose.

And what happens when that data includes, say, an IP address that was subject to geolocation when a customer was charged, thus linking that IP address and everything else in every log or database entry that you ever collected that mentions it with that specific customer? Since you may effectively be forced to keep the IP address associated with the customer to meet mandatory standards for tax record-keeping, must you now purge every related record or log line even from backups, on demand and entirely at your own expense?

What if those backups are stored, as many are, in an encrypted, deduplicating format? Are you now required to go through every backup you've ever taken in the history of your business and systematically obfuscate or delete every mention of that IP address? Do you have to take steps to erase any trace of it from the storage media involved, in case the media are lost and subject to recovery measures after an ordinary deletion? Do you realise how much time and money would be involved in doing that, every time a customer decided they didn't want you storing any personal data about them any more? It's totally impractical. There has to be some measure of being reasonable and proportionate in what is required.

That's been shown not to work all that well.

It doesn't work well when the regulations aren't enforced and there are limited meaningful penalties even for the sort of gross negligence that we've seen in cases like the Equifax leak. The idea that what Equifax did was compliant with the current rules is laughable, yet they've barely taken a slap on the wrist for it, despite both the degree of negligence that led to the breach and the scale and nature of the potential damage.

There's nothing inherently wrong with the principle, though. After all, there are many other things that we could do, but which are illegal and most of us don't, and we penalise those who break those laws. Why should this be any different?

I'm totally fine giving out "secret" data if there's a benefit to my doing so. But if there is no benefit to me, then companies should not be entitled to my data.

But that wasn't the scenario we were talking about. We were talking about a situation where someone legitimately had personal data about you, and you subsequently changed your mind and wanted them to delete that data. From the point of view of someone controlling and processing personal data in legitimate ways, giving you an absolute right to revoke that permission regardless of the practical consequences to anyone else involved is a totally different situation to giving you a right not to be involved in the first place.


There's an excluded-middle fallacy here somewhere.

I don't really have a problem if I end up in the background of someone's family photo and then they stick it in an album or on a hard disk somewhere. I'd have a pretty big problem if someone snapped a photo of me in a public place and then sold it to a white supremacist magazine as "the face of minorities taking over this country". We'd have an even bigger problem if there was somebody out there who had hacked every security camera in the world and was collecting images to train facial recognition for a fleet of killer drones who would eliminate all his adversaries.

These situations are not the same. Most people have no problem with the first. Most people would be pretty terrified of the last.


There might be an excluded-middle fallacy here somewhere but you haven't provided any clues as to its location.

I think I get the gist of what you're pointing at, but surely you can understand why other people would be wary of deciding exactly what is problematic in this regard via legislation, police, courts, etc., particularly as they exist today.


>does anyone else find it morally repugnant

Yes, the idea that you are entitled to edit other people's memories is as repugnant as it gets.


> Is there any way for me to get my information removed from Equifax?

No. (EDIT: If someone has a better idea, please reply!) I filed a complaint with the CFPB with citations from their breach as well as congressional testimony requesting my credit file be removed. The response was boilerplate:

"Thank you for contacting Equifax. We remain focused on consumer protection and committed to providing outstanding service and support. Protecting the security of the information in our possession is a responsibility we take very seriously and we apologize for the concern and frustration this cybersecurity incident causes. We have developed a comprehensive portfolio of services to support all U.S. consumers. Please refer to our dedicated website, https://www.equifaxsecurity2017.com, for the latest information and updates or contact our dedicated call center at 866-447-7559. The call center was set up to assist consumers and is open every day (including weekends) from 7:00 a.m. – 1:00 a.m. Eastern Time."

> Do I need to contact all of my line item creditors and ask them to remove references to Equifax?

Even if you contact your creditors, Equifax is under no obligation to remove the data. Most credit lines have the possibility of falling off after 10 years (7 years for negative trade lines), but there is no obligation for them to be removed.


I'm honestly curious whether Equifax would fall under GDPR regulations. I'm sure there's some overlap of EU citizens who have lines of credit in the US.

We're having to prep for that at my corp currently, and it's VERY explicit about being able to pull up and remove all personal data, with some very hefty fines if you don't.

EDIT: thought about this further and peeked at our guidelines, they may be able to get around this by the "data is integral to the function of the business" exemption, but I'd still wonder if someone could speak with authority on this.


> they may be able to get around this by the "data is integral to the function of the business" exemption

That's probably more of an "integral to fulfilling its contractual obligations to those the data is about". It's more complicated than that, but the point is that you cannot simply declare it the purpose of your business to collect personal information and thus be exempt from data protection regulation.


Fairly confident the CFPB does absolutely nothing to read complaints. I sent in one for each credit agency and got the same boilerplate even though my complaint had nothing to do with the breach.


I filed the complaint through them as it acts as a notary function (complaint ID generated, my complaint and Equifax's response on file).


> Is there any way for me to get my information removed from Equifax?

No. Pay cash or get tracked.


Paying cash doesn't guarantee you won't end up in Equifax's databases. Your employer may report your income to The Work Number (owned by Equifax), your landlord may report your payment history, etc.

Furthermore, your case of "pay cash" pretty much excludes you from higher education and home ownership. Unless you are super wealthy.


>Your employer may report your income to The Work Number (owned by Equifax), your landlord may report your payment history, etc.

How is this legal?


Because that information is no different than anything in your credit report.

I forgot to mention the insurance companies have their own data brokers too. So you'll end up in there if you've ever had insurance.


So you have to be extremely wealthy in order to pay cash everywhere, pay the Obamacare insurance penalty, and cover any medical costs out of pocket in cash. Also, you cannot legally drive in most (all?) states in the US, since they require having some form of insurance (typically liability insurance) for your vehicle.

Wow, it's impossible to not end up in any of these systems.


Correct, the only way to avoid it is to live completely "off the grid," cabin in the woods style.

Paradoxically, if you were extremely wealthy you'd definitely want to purchase umbrella insurance to insure that wealth!!

I should mention that when I said insurance I meant car, homeowners, renters, and umbrella insurance, not medical insurance. I don't know about medical insurance and what sorts of data brokers they may use.

Banks use data brokers when you have a savings or checking account too.


Tracking is different than having your score data stored somewhere. Do you have a source for your "no" by the way? I'm genuinely curious.


I don't have a source, but you are the subject of Equifax's business, not a participant, so you have very little say. They scrape public data, and buy data from credit card issuers, store "loyalty" programs, etc. For the little you can do, see here: https://en.wikipedia.org/wiki/Credit_reporting_agency#United...

Note that there are all sorts of unregulated ways in which companies (e.g. Facebook) use CRA data.


The "source" is that 99.99% of financial-y things you do use data brokers, if not Equifax specifically.

Got a bank account? You're in ChexSystem and/or Early Warning.

Ever had a car loan? Credit card? Student loan? Mortgage? You're in TransUnion, Equifax, and Experian, plus more.

Got a job at a large company? You're more likely than not to be in The Work Number.

Ever returned an item to a store? You're probably in The Retail Equation.

Ever had car insurance, renter's insurance, and/or home owner's insurance? You're in LexisNexis.

It's virtually impossible, unless you live on a homestead completely off the grid, to avoid these data brokers knowing things about you.


All of your consumer activity is sold to data brokers. All of your financial transactions involving credit instruments are sold to data brokers. They then package your data into a profile and sell it to marketers who want to target different demographic slices and compare against competitors.

Most people don't have a listing in any phone book these days, yet you can type in a name and get credible hits from whitepages.com. Where do you think they get that data on names and where people live when you've never had a business relationship with them?



