Given the sheer volume, I honestly wonder if a human ever actually sees it unless it's called out on Twitter (or whatever).
Even if it's flagged as an Iconic Image, you still have to look at the context and make sure it isn't posted by Jared from Subway.
Easier to auto-flag than to build a "waiting for a glacially slow human" queue. (Esp. when One Million Moms has a protest van idling on the launch deck 24/7.)
All you really need is a filter that recognizes a few select images and allows them through. And adding something to that list shouldn't take much time, effort, or controversy.
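A minimal sketch of what that could look like, assuming an exact SHA-256 allowlist checked before the auto-flagging step. In practice you'd presumably want a perceptual hash (PhotoDNA-style) so crops and re-encodes still match, and the hash value and names below are placeholders, not anything Facebook actually uses:

    import hashlib

    # Hypothetical allowlist of digests for a handful of iconic,
    # editorially approved images (values are placeholders).
    ALLOWLISTED_IMAGE_HASHES = {
        "0f1e2d3c...": "The Terror of War (Nick Ut, 1972)",
    }

    def is_allowlisted(image_bytes: bytes) -> bool:
        """Return True if the uploaded image is on the allowlist."""
        digest = hashlib.sha256(image_bytes).hexdigest()
        return digest in ALLOWLISTED_IMAGE_HASHES

    # The flagging pipeline would consult this check before queueing
    # a post for removal or human review.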
> It's not like they can review every picture individually, so they're going to need a general policy.
Even with a general policy, they still have to review whether a picture falls within the boundaries of that policy. And there is no debate that they need a general policy; the debate is over what that policy should look like.
Maybe not, but they certainly can have a human review every potentially problematic image after the fact (which, if I remember correctly, they do). Relying on an algorithm to do this is simply impossible, and therefore even dumber than failing to hire proper content screeners.
Even if it's executed in the brain of a human content screener, it's still an algorithm: if you want it to be consistent, it has to be documented and clearly communicated to the screeners.
It is absolutely not difficult to distinguish between a photo of burning children and a photo of a sex act. In fact, it's pretty ridiculous to characterize this photo as that "of a nude child."
I don't see any flames in that picture. I've never seen any flames in that picture, but I've been told (since junior high school) that it is a picture of a burning child. What's up with that?
Common sense tells me that the child's clothing was on fire, and she took those clothes off and ran away, and that's what we see a photo of. Fine.
But then... why is this photo described as a picture that contains flames? Not just here. That's always how it is described.
She is surely burning. A napalm bomb was dropped on her village. The photo documents people fleeing. Look at her arms; you can see the skin peeling away.
The flame and smoke from the napalm attack that everyone in the picture is running from make up pretty much the whole background of the photo, though the fact that it's black and white and the cloud is fairly thick may make that less immediately recognizable.
> but I've been told (since junior high school) that it is a picture of a burning child.
It's not a picture of a child in flames [0], though it is a picture of a child experiencing burning as a result of the napalm drop (by her own recollection, she was screaming "too hot, too hot" when the photo was taken), and the absence of visible flame isn't the same thing as the absence of continued burning.
[0] Well, not the child who is the central feature of the photograph. There are, I suppose, children burning in that sense elsewhere in the picture, if you consider the background and what is actually burning there.
There are probably both. You lumped them together as if every porn account were a spam account, when that's very likely not true, and that is what SysArchitect was commenting on.
They could start by rejecting the default assumption that nudity should be considered sexual by default. And maybe the default assumption that sexuality is bad - I don't want them to shift to trying to detect if an image is arousing or not; but I do want whatever censorship system they feel they need to have in place to be able to distinguish between child pornography and a photograph of a child being burned by napalm. "Really fucking stupid" is an understatement.
They could start by rejecting the default assumption that nudity should be considered sexual by default. And maybe the default assumption that sexuality is bad
I agree with the suggestion, but can't help but laugh at how impractical it is. The people who would make the change are Bay Area progressives working in tech. And that means saying all the "right" things about sex positivity but stopping short of actually becoming sex positive.
I didn't see any napalm in the image. The girl had removed her burning clothes before the photograph was taken. The reason this particular image is considered acceptable is the historical context.
If I were to recreate the image by hiring actors, that would probably be considered child pornography.
> The girl had removed her burning clothes before the photograph was taken.
Removing clothes after the napalm has burned through them (which was manifestly the case here) certainly may reduce the degree to which you are being burned by napalm, but it does not mean you are no longer being burned by napalm.
> If I were to recreate the image by hiring actors, that would probably be considered child pornography.
Be considered by whom? Pornography requires more than simple nudity.
She may be burning in the photograph, but that is not obvious without the extra commentary.
> Pornography requires more than simple nudity.
Perhaps, but I can understand Facebook removing that photograph, just in case a judge thinks differently. Child pornography does not appear to be something the FBI takes lightly.
I was not talking about the photograph in the article, but about a hypothetical recreation. Even if a photographer were to explain that such a recreation was satire or had some artistic value, I think many people would feel uncomfortable with it. Certainly most would not want their children to pose for such a photograph.
From that thought experiment, I'm deducing that it's the historical context which makes the original photograph acceptable to distribute, and not anything intrinsic to the image.