> It's not like they can review every picture individually, so they're going to need a general policy.
Even with a general policy, they still have to review whether a given picture falls within that policy's boundaries. And there is no debate that they need a general policy; the debate is over what that policy should look like.
Maybe not, but they certainly can have a human review every single flagged image after the fact (which they do, if I remember correctly), because relying on an algorithm to do this is simply impossible, and therefore even dumber than failing to hire proper content screeners.
Even if it's executed in the brain of a human content screener, it's still an algorithm: if you want it to be applied consistently, it has to be documented and clearly communicated to the screeners.
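To make that point concrete, a documented policy really is just an ordered set of rules, whether a program or a human screener executes it. Here's a minimal sketch; the rule conditions, field names, and outcomes are invented for illustration and don't reflect any real platform's policy:

```python
from dataclasses import dataclass

@dataclass
class Image:
    # Hypothetical attributes a screener would judge for each image.
    contains_graphic_violence: bool
    is_newsworthy: bool

def review(image: Image) -> str:
    """Apply the documented policy rules in order; the first match wins.

    This is exactly what a human screener does with a written policy,
    just expressed as code instead of a checklist.
    """
    if image.contains_graphic_violence and image.is_newsworthy:
        return "escalate"  # ambiguous case: route to a senior reviewer
    if image.contains_graphic_violence:
        return "remove"
    return "keep"

print(review(Image(contains_graphic_violence=True, is_newsworthy=False)))
```

Whether the rules live in code or in a screener handbook, the consistency of the outcome depends on the rules being written down and followed in a fixed order.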
They messed up, received backlash, and changed their policy based on it.