
Counterpoint: I have opted out at multiple airports, including Boston, without incident. They all know opting out is a thing; they press one button to turn off the camera. I hold a piece of paper over the cameras anyway, just in case, and they have never said anything.

One reason to opt out, even though everyone says “they already have your picture 1000 different ways,” is that these are not normal cameras: they are stereoscopic close-range cameras that take a 3D image of your face. That takes facial recognition accuracy up to 95%+, from 70% or less on a normal photo.

Furthermore, they say they do not retain the image. That may or may not be strictly true. But they do not say they are deleting the eigenvector, facial measurements, a hash of the image, or any other useful derivatives of the image.
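The point about derivatives can be illustrated with a minimal sketch. This is a hypothetical average-hash-style computation, not what any airport system actually does; it just shows that a tiny derived value can outlive the deleted source image and still act as an identifier.

```python
# Hypothetical illustration: an "average hash" style derivative of an image.
# NOT what any airport system computes; it only demonstrates that a compact
# derivative can survive deletion of the image and still match a face.

def average_hash(pixels: list[list[int]]) -> int:
    """Pack each pixel's relation to the mean brightness into a bit string."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

# Two "captures" of the same face differ slightly in lighting...
capture_a = [[10, 200], [220, 30]]
capture_b = [[12, 198], [223, 28]]

# ...but their derivatives are identical, even after both images are deleted.
print(average_hash(capture_a) == average_hash(capture_b))  # True
```

Deleting the photo while keeping such a derivative still leaves a re-identifiable record.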



> One reason to opt out, even though everyone says “they already have your picture 1000 different ways,” is that these are not normal cameras: they are stereoscopic close-range cameras that take a 3D image of your face. That takes facial recognition accuracy up to 95%+, from 70% or less on a normal photo.

Interesting. I'm fairly sure the cameras at one US airport I was at (maybe Salt Lake City?) were just bog-standard Logitech C920s, like my previous webcam. Do you have more information on the cameras?


Haven't seen the setup, but it seems possible to me that there are several cameras: one that is clearly visible and used to give visual feedback on the computer screen so that you put your face in the right place, and another that actually does the biometric scan.


The automated passport gates have way more, but the manual photo checks by guards are indeed a mounted C920.


Single-sensor face recognition is already above 95% on surveillance cameras, which do not have ideal positioning even when they are installed specifically for face recognition.


Accuracy is heavily dependent on how large a population you're dealing with. If you're comparing footage from a surveillance camera against a database of 100+ million people, that's tough. Most algorithms are tested at 10^5 to 10^6.

Worse, unlike a false positive with fingerprints, the suspect will likely look similar to the cops, not just to the algorithm. There's a major incentive to opt out simply because of false positives.
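The scaling problem above can be put in concrete terms with a small sketch. The per-comparison false-match rate used here is an illustrative assumption, not a benchmark figure for any real system:

```python
# Expected false matches when searching one probe face against a gallery.
# The per-comparison false-match rate (FMR) below is an assumed, illustrative
# value, not a measured result for any deployed system.

def expected_false_matches(gallery_size: int, fmr: float) -> float:
    """Expected number of incorrect gallery hits for a single probe search."""
    return gallery_size * fmr

fmr = 1e-6  # assumption: one false match per million comparisons

for gallery in (10**5, 10**6, 10**8):
    hits = expected_false_matches(gallery, fmr)
    print(f"gallery={gallery:>11,}  expected false matches={hits:g}")
```

At the 10^5–10^6 scale typical of testing, false matches per search stay near or below one; against 100+ million people, the same per-comparison rate yields around a hundred false hits per probe.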


Even if it's an ordinary camera, they're capturing a fixed, posed photo. That's the gold standard for matching against a dataset.


> Furthermore, they say they do not retain the image. That may or may not be strictly true. But they do not say they are deleting the eigenvector, facial measurements, a hash of the image, or any other useful derivatives of the image.

So what? Do you think that the thousands of cameras you encounter every day aren’t doing this already?

I think the cultural aim should be to limit what can be done with these data, rather than to try to prevent their collection. Cameras are too plentiful and powerful not to expect your image to be linked to your identity, especially for lawful governmental uses like immigration screening.


Would you consent to DNA screening every time you fly? You have to draw the line somewhere and I would rather draw the line here (not consenting to airport face scans).


If it were noninvasive and already pervasive, yes.

It’s not that I like facial scanning. It’s that it is pervasive and I can do nothing about it.


> If it were noninvasive and already pervasive, yes.

That’s circular reasoning. It is all the more reason to oppose this crap today, so it does not become pervasive. Think of anything that you don’t like but that is pervasive. Don’t you wish the people who came before you had opposed it a bit harder, so you wouldn’t have to deal with it now?

The noninvasive part of your argument does matter, but I feel that’s much harder to define as everyone will draw the line somewhere different.


I don’t think it’s circular: things become pervasive not because of a slippery slope of a few cameras and then adding more, but because the tech in those cameras became cheap from other trends.

Cameras aren’t inherently cheap; it’s because chips, storage, and power are so cheap that cheap cameras follow. People chose all of the other non-camera functions, and that also led to cheap cameras.

For DNA, I wouldn’t be a fan of gradually adding DNA-identifying machines until they’re pervasive. I’d like to not do that. But what I mean is that DNA sequencing will become so cheap due to other factors that it will be trivial to add ID machines to air conditioners or whatever.


OTOH, maybe we shouldn't make it too easy for them by giving in without any resistance whatsoever. Fight this to delay or derail the next thing, which will be even worse.


It’s not like we chose. Cameras and transistors are just absurdly cheap and have millions of other purposes. So we’d have to shut down tons of other useful functions to change it.

Just like DNA sampling will one day be so cheap that air conditioners test for every person and every germ they see. It will be so cheap as to be hard to stop. I guess we can resist digital stuff making DNA cheaper, but we’d have to give up all the other benefits (real-time, targeted antivirals, etc.).


> So we’d have to shut down tons of other useful functions to change it.

What? No. The problem with this stuff is the privacy-invasion aspect.

For example, ubiquitous camera installations in public spaces and businesses are 100% fine if they're truly closed-circuit systems that have no storage capability. If all these cameras offer is tele-vision, then there's no privacy problem.

So, if this sort of stuff was guaranteed to be used for a purpose that's beneficial to individual clients and the public at large (rather than scraping up pennies by selling the recordings to data brokers and data analysis firms), there'd be no problem at all.

The solution becomes clear: make video, audio, still image, biometric, and behavioral data toxic waste.

* If a company ever uses such data in any way that's not clearly and conspicuously disclosed in plain English to their clients, fire every executive member of that company and permanently bar each of them from holding any executive or management position ever again.

* If a company ever sends any payment (whether monetary, or in goods and/or services rendered) to a company that is (or owns) a data broker, customer/consumer data analysis firm, or similar, then that company is considered to have engaged the services of said company and presumed to be sending the data of their clients to said company. Fire every executive member of that company and permanently bar each of them from holding any executive or management position ever again.


> The solution becomes clear: make video, audio, still image, biometric, and behavioral data toxic waste.

This doesn’t seem realistic to me because those data are useful for so many valid purposes and the tech is ubiquitous. So we’d have to regulate and it would be expensive and futile.

Your solutions are difficult to enforce (e.g., “fire every executive” didn’t happen even in things like Enron or mine disasters).


> So we’d have to regulate...

No shit? You don't say.

Seriously, regulations and laws are how you get companies to act against their own best interests.

> (eg, “fire every executive” doesn’t work even in things like Enron or mine disasters).

You seem to have forgotten the second part of the punishment. I'll quote the punishment again:

> [F]ire every executive member of that company and permanently bar each of them from holding any executive or management position ever again.

I don't remember any Enron executive being barred from holding any executive or management position ever again. Do you?

> Your solutions are difficult to enforce...

Oh? Someone comes to the relevant regulator, or Federal law enforcement with evidence of this crime happening. By law, FedGov will be obligated to investigate. Either they corroborate the evidence and deliver the punishment, or they do not and they do nothing.

This punishment is so extreme that you will only have to catch a few companies to prevent nearly everyone else from breaking this law. (After all, what executive team would risk decades of each of them getting a seven-to-eight-figure salary just so that a couple of them can get a one-off bonus one or two orders of magnitude smaller?)


There are all sorts of things that are cheap and easy, but society considers so dangerous that they are restricted by law. That doesn't stop all ne'er-do-wells, of course, but it stops most.


> Do you think that the thousands of cameras you encounter every day aren’t doing this already?

Of course that's happening. It's still worthwhile to reduce the amount of this sort of thing where possible, though.

> I think the aim for culture is to limit what we can do with these data, rather than to try to prevent collection of these data.

I agree that the real, serious issue isn't the photos themselves as much as the databases that hold them.

That said, why not address both problems? Particularly since we can't ever actually know if photos are being stored or not, but we (usually) can see the camera. In terms of verifiability, restricting the use of cameras is better than restricting the use of the data.


It would sure be nice to place those limits, but in the meantime it’s still worth opting out right?


Worth it to some people. People opt out of vaccines too, or of using smartphones. I think it’s a situation where people can opt out, legally and ethically, but few will.


Yes I agree completely. But given the capacity, isnt it a good move? I wasn’t making any statements about what most people have the time and energy for.


I have no idea where you live, but I for one do not encounter “thousands of cameras” every day. It’s not even close to that.


I live in the woods, a couple of hours from any city.

My car has 2 cameras, my house has 4, and my neighbors on either side have a bunch. Most folks around here also have dash cams in their cars. All the businesses (pizza places, auto parts shops, small-engine repair, etc.) have cameras everywhere. The local PD has new Axon body cams. We for some reason have red light/traffic cams. State routes and Interstates have cameras roughly every half mile.

If you are even remotely analogous to most people on HN, you are recorded at least once a day 1. without your knowledge and 2. without consent.


being recorded a dozen times =/= thousands of cameras. likely not even 100 cameras.

my neighbors all have door cams, but the likelihood of them seeing me as I drive by is low. even if I get tagged by literally 20 cams on the way to work and on the way back, plus another 20 getting lunch, I'm still nowhere near "thousands"


you must not live in a city or suburb, then? cameras in cars, cameras on doorbells, many security cameras with multiple angles at most stores, shops, and places of employment, a large fraction of houses, city streets, etc.


> you must not live in a city or suburb then?

I don’t live in the US. Maybe that’s the difference?



