Presumably it would need to be trained with good examples of solutions to complex problems. Maybe there just won't be that many for the foreseeable future.
Yeah, exactly. AI was trained on real data, and in a few years there will only be AI-generated versions of real data to run off. i.e. a lot of that really crappy code will be deployed to GitHub, and the result will be that even more shitty code gets written.
That's pretty much what has happened over the last 30 years: you need fewer skills to produce more and more. Google search started it all. I remember back then I quipped that the offshored devs wouldn't be able to get a job without an internet connection, because previously we engineers prided ourselves on understanding and memorising APIs. The AI programmer will outpace India's cert mills in quick succession, but it won't benefit anyone except the large corporations who own the AI. At least the Google devs added questions to SO; the AI dev is a leecher who only profits and plagiarises, never contributes.
I recommend reading the comments, there are serious concerns about the validity of the study.
There are also fundamental math errors - which the authors acknowledge; search for "we acknowledge" in the comments. The funny part: even while acknowledging it, they still fail to do the math correctly - the PPV is 39.7%, not 59%.
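For anyone who wants to check that kind of figure themselves, the positive predictive value is just the fraction of positive calls that are true positives. A minimal sketch (the counts below are made up for illustration, not taken from the paper):

```python
def ppv(true_positives: int, false_positives: int) -> float:
    """PPV = TP / (TP + FP): the fraction of positive results that are real."""
    return true_positives / (true_positives + false_positives)

# Hypothetical confusion-table counts, chosen only to show the formula:
# 30 true positives and 70 false positives give a PPV of 30%.
print(ppv(30, 70))  # 0.3
```

Note that PPV depends on prevalence, which is exactly the kind of thing these studies tend to get wrong.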
Still not enough; read Esther Rodriguez's comment on the paper.
Might be innocent mistakes. Might even still be a correct conclusion - but this paper can't stand on its own, and there's nothing else backing it up (as far as I can tell with a quick search).
It makes sense to me. This is a database keyed by facial images. They don't know your name with certainty. The only way to look you up in the database is by face. Then presumably they need the ID to make sure it's you who's requesting the info. Hard to imagine how else to do it, given the nature of the technology.
Exactly. Otherwise, this is just a vector for any person to exploit the process and freely play cop by uploading a photo of someone they're trying to track.
Yes, but the value is to provide that in exchange for money, and ostensibly only to law enforcement agencies and similar organizations. (If you try to sign up on their website, it says "Clearview is available to active law enforcement personnel" and that you need to apply for access.) If you're a random citizen and can get the same data, especially for free, the value proposition breaks. And the privacy implications would be worse.
So, I get why they ask for ID, even though I also get the reluctance to give them your ID since it could help tune their system.
My wife gave birth last year in California, and immediately upon returning home we located the paperwork for requesting that the DNA be destroyed and submitted it. We received a response telling us that our request was received, but that's all. Unfortunately, neither of us can remember where we found the form we had to submit to request the destruction. Maybe it was in the hospital paperwork?
If you captured and replayed the IR flashing signals you might be able to abuse such a system to prevent people from taking pictures of other things, such as yourself. I wonder if a little bit of cryptography embedded in the signal and the phone could make replay attacks more difficult, perhaps something similar to the codes used by garage door openers and car keyfobs. Or maybe by encrypting a counter.
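The rolling-code idea speculated about above can be sketched in a few lines. Assuming the beacon and the receiver share a secret key (all names and parameters here are made up for illustration), each flash encodes a short HMAC of a monotonically increasing counter, and the receiver only accepts codes for counters slightly ahead of the last one it saw - so a captured signal can't be replayed:

```python
import hmac
import hashlib
import struct

# Hypothetical shared secret between the IR beacon and the camera firmware.
KEY = b"shared-secret"

def code_for(counter: int) -> bytes:
    """Truncated HMAC of the counter, like a keyfob rolling code."""
    msg = struct.pack(">Q", counter)
    return hmac.new(KEY, msg, hashlib.sha256).digest()[:4]

def verify(code: bytes, last_counter: int, window: int = 16):
    """Accept the code only for a counter within `window` steps ahead of the
    last accepted one; return the new counter, or None if it's stale/forged."""
    for c in range(last_counter + 1, last_counter + 1 + window):
        if hmac.compare_digest(code, code_for(c)):
            return c
    return None

# A fresh code advances the counter; replaying it afterwards is rejected.
fresh = code_for(5)
assert verify(fresh, last_counter=4) == 5
assert verify(fresh, last_counter=5) is None  # replay fails
```

The window exists because the receiver may miss some flashes; keyfobs use the same trick to tolerate button presses out of range.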
If the IR sensor was separate from the camera module you might be able to put tape over it and prevent the signal from being received by your phone, defeating it. But if the IR sensor was actually a custom component embedded inside the CCD chip (which is not out of the question for Apple) then this would be difficult to defeat. You'd have to carefully mask out the non-IR-filtered area of the CCD chip with material that filters IR.
Question for the developers: it looks like the iOS app allows you to download apps built with the desktop tool, which supports evaluation of JavaScript snippets. Isn't this against the App Store policy on downloading and executing arbitrary code?
Hi, one of the devs here! You're correct, PencilCase: Player can evaluate JavaScript using the JavaScriptCore framework included on the device. The App Store already has a lot of apps available that allow exactly this or other very similar options (Python, Lua, etc.) and we look forward to seeing how people take advantage of this ability.
The issue I'm specifically curious about is 'downloading' arbitrary code and evaluating it. Current app offerings don't have this support because Apple has forbidden it in the past unless the evaluation goes through the JavaScript context of a web view, which doesn't appear to be the case for PencilCase. I guess I'm curious whether they have relaxed their restrictions in this case, are unaware of the feature, or something else.
Yeah, App Store guidelines can be ambiguous, and JavaScriptCore becoming a public framework could be seen as an endorsement of this type of functionality, but we don't have any information that you wouldn't.