"From 2007 to 2016 Funch carried out his project 42nd and Vanderbilt in which he captures the same person twice, mid-commute, leaving the viewer to wonder if they were photographed days, months, or even years apart."
It's actually a different mechanism entirely.
Human Resource Machine is about optimising a single-threaded, small-memory process (Single Instruction, Single Data).
7 Billion Humans is about massively parallel programming, more like code running on a GPU (Single Program, Multiple Data).
The ways to solve the puzzles are very different because of this, and it's interesting, but not a style of optimisation I'm used to (so I haven't finished it yet!).
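The SISD vs SPMD distinction above can be sketched in a few lines. This is a hypothetical illustration, not code from either game; the function names and the `* 2` "task" are made up for the example.

```python
# SISD (Human Resource Machine style): one worker, one instruction
# stream, one small memory, handling every inbox item in sequence.
def sisd_double(inbox):
    outbox = []
    for item in inbox:          # a single worker processes items one by one
        outbox.append(item * 2)
    return outbox

# SPMD (7 Billion Humans style): many workers all run the SAME program,
# but each one operates on its own piece of the data.
def worker_program(my_item):
    return my_item * 2          # identical program for every worker

def spmd_double(inbox):
    # Conceptually, each call here runs on a separate worker in parallel;
    # a list comprehension just models that for the sketch.
    return [worker_program(item) for item in inbox]

print(sisd_double([1, 2, 3]))   # [2, 4, 6]
print(spmd_double([1, 2, 3]))   # same answer, very different execution model
```

The puzzles differ accordingly: in the SISD game you shave instructions off one loop, while in the SPMD game the challenge is writing one program that behaves correctly no matter which worker (and which data item) it lands on.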
That has nothing to do with why Apple owners prefer blue bubbles. It’s because SMS is unreliable and restrictive. You can’t share files, some carriers truncate messages and split them into multiple messages, media formats are limited, group chats don’t work right, typing indicators don’t work, read receipts don’t work, etc. A green bubble does not mean lower status; it means a significantly degraded experience.
Some kids may act like those with green bubbles are not part of the in-group, but this is done in jest. Kids do that with everything. My generation did it with video game consoles and sports teams. You had Sega people and Nintendo people. Nike vs Adidas. It wasn’t about status.
From an iPhone user's perspective, there is no problem that requires a solution: iMessage is installed by default, so there is no need for another app. And with iOS being so dominant in the US, no other app is going to get the traction it needs on iOS to become a successful cross-platform messenger.
If women are disproportionately hired at lower levels, pay equity will still be _very_ off, even if the data says otherwise at the surface level.
Ex: a woman with 4 years of experience is hired at T3; a man with 4 years of experience is hired at T4. Both are "in range" of the median comp for their level, but the man is being paid more for the same expertise.
Literally the second sentence:
"Its members locate flaws in software, privately report them to the manufacturers, and give them 90 days to resolve the problem before publicly disclosing it."
Privately disclosed to Apple, 90 days later they published. Simple as that.
I used to think that 90 days was quite unfair. And in many ways it can be. But it’s a great equalizer: no one company can claim that some other company got preferential treatment, and by now everyone knows that’s how it’s going to be. The alternative is the Project Zero team having a fruitless back-and-forth with vendors about the impact and what a reasonable disclosure timeline would be.
90 days is equality but not equity. Not all bugs can be fixed in the same way. Moreover, 90 days seems arbitrary to me, unless there was some prior study behind this number.
You're absolutely right! 90 days does seem incredibly arbitrary, like it was chosen for political reasons. And this policy is definitely equal, but wildly inequitable.
Is it perhaps possible that equitable treatments of vulnerabilities and companies might not be particularly high on the list of priorities for GPZ? Some might even argue that past attempts at equitable treatment have backfired badly, with many cases of companies abusing the time this gets them to not fix vulnerabilities.
Again, you're completely correct. Though I would genuinely love to hear your ideas of what equitable policy would look like - it could easily be better!
Google has the responsibility to inform users that the software they are using has known vulnerabilities as much as they have a responsibility to disclose them quietly to the software vendors that can fix them.
The way you laid things out, Google should just collect zero-days and sit on them? Do you see the absurdity of that? From a business perspective, leaving these vulnerabilities around makes it easier for Google's competitors to collect the same kinds of data about internet searches and private emails that Google collects through legitimate means. Getting vulnerabilities fixed widens Google's data moat.
Disclosing issues is not "policing". They are not arresting people, or taking any action other than stating the truth, that some software is vulnerable.
If they disclose at 90 days and harm ensues, the user bears responsibility for continuing to use the software. If they trust the software vendor to issue timely updates, then they can turn around and lay blame at the vendor for not fixing the issue. Or they can blame the hacker.
I assume the GP was asking if the 90 day rule was really important to uphold, or if the disclosure could just be delayed longer until the patch went out.
Well, expecting there to be a patch without the 90-day exposure deadline is very generous. The whole point of a 90-day (or any arbitrary stretch of time) deadline is that a lot of companies are funny when it comes to exploits. Security doesn't ever make a company money; it's a high cost that can only (at best) hope to spare the company from making reparations after a breach, or from losing a few customers. As such, many companies treat security reports with indifference and do nothing whatsoever about reported exploits until they're forced to. The only real lever private researchers or security groups like Project Zero have to light a fire under a company is to release the exploit to the public when it becomes clear the company isn't going to fix the vulnerability on its own. At least then consumers are made aware of the exploit and can make an informed decision on a plan of action. 90 days, 180 days, a year... it doesn't matter, because people would criticize the length of time no matter what it is.
> My naive reading of this is that Google is saying we need to rethink processors themselves if we want to fix this (and we really do want to fix it). Am I reading it correctly?