> I have a good friend at Google. The motto they go by is that unless 10000 people are impacted by an issue, it's really not worth their time to investigate.
At Google's scale, this makes perfect sense: Google (according to Wikipedia) currently has 139,995 employees, and somewhere between roughly 2 and 4.8 billion users (taking Chrome's user base as a floor and "everyone connected to the internet" as a ceiling). Amortized evenly, each employee has a potential impact on roughly 14,286 to 34,286 users. None of those developers would consistently produce useful output if they spent all day thinking about that much responsibility, and probably a majority would produce no output at all if they were tasked with interacting with every user their products impacted (the most impactful developers might face queues filling at maybe 300 tickets a second or more).
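For the curious, here's the back-of-the-envelope version of that arithmetic. The head count and user counts are the rough figures above; the ticket-rate inputs (a 2.6B-user product with 1% of users filing one ticket per day) are purely made-up assumptions to show how quickly a queue could fill:

```python
# Back-of-the-envelope: amortized users per Google employee.
employees = 139_995            # Wikipedia's headcount figure cited above
users_low = 2_000_000_000      # floor: roughly Chrome's user base
users_high = 4_800_000_000     # ceiling: roughly everyone on the internet

print(f"{users_low // employees:,} to {users_high // employees:,} users per employee")
# -> 14,286 to 34,286 users per employee

# Hypothetical ticket-queue rate for a very high-impact product.
# ASSUMPTION for illustration only: ~2.6B users, 1% filing one ticket per day.
product_users = 2_600_000_000
tickets_per_user_per_day = 0.01
seconds_per_day = 86_400

print(f"{product_users * tickets_per_user_per_day / seconds_per_day:,.0f} tickets/second")
# -> ~301 tickets/second
```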
However, at world scale, just about everything that touches humans is fractal-like enough that it doesn't track along any 10,000-entry/point/dimension/column/etc-sized graph/curve/vector space/BigTable/whatever, and you get issues like this one.
I can kind of understand (while scratching my head through this math for the first time) why Google hires so many external contractors (who presumably aren't counted in that employee figure?) to try to combat this sort of thing, but there's only so much that can be done there, too.
It's a really difficult problem.
Human empathy seems to have a serious bathtub curve for "things and problems that are human-sized", with maximum sensitivity somewhere around 1-20 people. Anything smaller than a human is only intrinsically interesting if it's cute, and anything bigger than a human can probably figure its problems out itself; it's only intrinsically worth my attention if whatever it's doing might kill *me* in particular (and possibly the group I'm in).
Sadly, this is a problem precisely *because* the humans on both sides of the fence bleed the same red and run the same legacy firmware, while the producer end of the queue is grossly under-represented.
Scaling up an individual's or group's impact unfortunately runs off the edge of that empathy bathtub curve in a hurry once you go beyond even a few hundred recipients, let alone a thousand. What am I supposed to make of 150,000 Twitter likes, or 20 million TikTok views, or 20 upvotes on HN? It all blurs out into a fuzzy "...:D" that holds zero semantic value (it doesn't provoke an intuitive, context-specific response) and close to zero intrinsic value (I don't know how to reason about it in isolation).
So, what are Google's engineers supposed to do to solve these kinds of problems? Serious question.
Saying "they're big enough, they'll figure it out" is just bumping into the edge of the empathy curve. Saying "well, they need to scale out their empathy" implies that Google's engineers have access to some intrinsically more sensitive model of intuition (the self-correcting kind that naturally occurs at the individual level and propagates outward through groups). A real solution is needed here.
The only thing Google can do is collectively whittle away at this fundamentally non-intuitive problem and figure out solutions using... *drumroll*... bureaucracy. "NOOOoooo," I hear you say... but that's the only glue available to collectively hold *checks* 139,995 empathy bathtub curves together. Yep, it's the PHP of duct tape, but sadly humans haven't figured out anything else yet, crazily enough... almost as if the empathy model we run doesn't find solving for the problem space as a whole interesting, or something...
--
Ignore the above, Google is a mean bully that abused their power for {killing {Google Reader/related-image search/other favorite product}/locking someone out of their account/locking Android down/having UI inconsistency/etc etc etc} D:<