I think the key difference here is that with the digital signage technology you mention, recognition was a necessary requirement to show the appropriate content (clothing, for example).
In this case, the recognition doesn't do anything for the end user of the vending machine. It is purely data for other brokers to use. The vending machine didn't say to the user that people of their demographic were buying Diet Coke over normal Coke.
Slapping facial recognition on something that didn't have it before, without any obvious functionality or benefit for the person whose face is being scanned, is very much cause for concern, in my opinion.
This was an M&M (candy) machine. To really know, someone would need to audit the hardware and software; without that, we're only speculating.
I was trying to make the point that there is a potential justification for the functionality, though a tenuous one.
I don't know why you think it's relevant. All that matters is the privacy law in the local jurisdiction. Some people will be fine with it; others won't. The real question is: what's the harm?
Because I don't necessarily think that local privacy law is all that great in the digital age, nor is our ability to ensure that we're complying with it, especially when third parties are involved.
People were clearly concerned about it, because the nature of its services wasn't all that clear to the end user. Students protested, it was investigated, and it was found that no PII was taken; great, no further action required.
My point is that the situation was clearly cause for concern because we have a lack of trust in these technologies and those who use them.
Case in point, from the article: "the embedded cameras inside Cadillac Fairview’s digital information kiosks used facial recognition technology to record over five million images of shoppers at malls without their customers’ knowledge or consent."
Exactly. All the company gets is probably anonymized aggregate data like "Product X is bought by men 40% and women 60%", which would pass the GDPR and other privacy-protection laws. The point is that we don't have access to the software, so we have to trust them that they are doing only that kind of aggregation. Furthermore, since we no longer trust companies, we fear that the cameras are used for other, less innocuous purposes.
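For what it's worth, "aggregate only" would mean something like the sketch below: the classifier emits a coarse demographic label, only per-bucket counters are ever stored, and raw frames are discarded. This is purely a hypothetical illustration of that claim; all names are made up, and nothing here reflects the machine's actual software.

```python
from collections import Counter

# Hypothetical sketch: the machine keeps only (product, demographic) counts.
# Raw camera frames are never retained in this model.
def record_purchase(aggregates: Counter, product: str, demographic: str) -> None:
    """Increment an aggregate bucket for one purchase."""
    aggregates[(product, demographic)] += 1

def share_of_product(aggregates: Counter, product: str) -> dict:
    """Percentage breakdown by demographic for one product."""
    buckets = {d: n for (p, d), n in aggregates.items() if p == product}
    total = sum(buckets.values())
    return {d: round(100 * n / total) for d, n in buckets.items()}

counts = Counter()
for demo in ["male"] * 4 + ["female"] * 6:
    record_purchase(counts, "Product X", demo)

print(share_of_product(counts, "Product X"))  # {'male': 40, 'female': 60}
```

Of course, the whole dispute is that nobody outside the company can verify the deployed code is this benign.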
By the way, does any third party inspect the software of all these machines with cameras before they are allowed to be installed? Probably not, so it's their word against the suspicions of privacy-minded people, until something goes wrong and some kind of investigation starts.