
[flagged]



Meaning, difficult to create a filter that "knows it when it sees it."

http://blogs.wsj.com/law/2007/09/27/the-origins-of-justice-s...


Except in both instances they acknowledged that it was the result of a 'Report' and then a human review, and not an algorithmic decision.


[flagged]


Please stop. Insults are not OK on Hacker News, and the need for civility when discussing controversial topics is greater, not lesser.

https://news.ycombinator.com/newsguidelines.html


Given the sheer volume, I honestly wonder if a human ever sees it unless it's called out on Twitter (or whatever).

Even if it's flagged as an Iconic Image, you have to look at the context and see if it isn't posted by Jared from Subway.

Easier to auto flag than make a "waiting for glacially slow human" queue. (Esp. when one-million-moms has a protest van idling on the launch deck 24/7.)


Honestly, that person probably had a form that went:

[x] Children

[x] Naked

And the system did the rest.

I doubt they also have

[ ] Socially acceptable photo of naked children

on the form.


All you really need is a filter that recognizes a few select images and allows them through. And adding something to that list shouldn't take much time, effort, or controversy.
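A minimal sketch of what that could look like, assuming a small curated whitelist of iconic images matched by perceptual hash before the usual nudity filter runs. The hash function, file names, threshold, and `nudity_score` are all illustrative assumptions, not anything Facebook has described; the only external dependency is Pillow.

    # Sketch: let known iconic images bypass an automated nudity filter.
    # Everything here (file names, threshold, nudity_score) is hypothetical.
    from PIL import Image  # assumes Pillow is installed

    def average_hash(path, size=8):
        """Tiny average-hash: grayscale, shrink, threshold against the mean."""
        img = Image.open(path).convert("L").resize((size, size))
        pixels = list(img.getdata())
        mean = sum(pixels) / len(pixels)
        return tuple(1 if p > mean else 0 for p in pixels)

    def hamming(a, b):
        """Number of differing bits between two hashes."""
        return sum(x != y for x, y in zip(a, b))

    # Hypothetical whitelist of curated historic images.
    ICONIC_HASHES = [average_hash("napalm_girl_1972.jpg")]

    def should_auto_remove(path, nudity_score, threshold=5):
        """Skip removal if the upload closely matches a whitelisted image."""
        h = average_hash(path)
        if any(hamming(h, iconic) <= threshold for iconic in ICONIC_HASHES):
            return False  # known iconic image: allow it through
        return nudity_score > 0.9  # otherwise fall back to the usual filter

The point is only that the whitelist check is cheap; the hard part remains deciding, and agreeing on, what belongs on the list.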


I don't know, I can see their point. It's not like they can review every picture individually, so they're going to need a general policy.

They messed up, received backlash, and changed their policy based on it.


> It's not like they can review every picture individually, so they're going to need a general policy.

Even with a general policy, they still have to review whether a picture falls within the boundaries of that policy. And there is no debate that they need a general policy; the debate is over what that policy should look like.


> It's not like they can review every picture individually

It's exactly like that. They could disallow pictures entirely, not censor at all, or close down the business, too. Every option is open to them.


Uhhh, right, but those options aren't really options, since any of them would damage Facebook far more than a policy that needs to be tweaked occasionally.

"Shut facebook down" is a bit ridiculous. Congrats, you're technically correct that it's an option open to them.


So correct, then.


Maybe not, but they certainly can have a human review every potentially violating image after the fact (which they do, if I remember correctly), because relying on an algorithm to do this is simply impossible, and therefore even dumber than not being able to hire proper content screeners.


Even if it's executed in the brain of a human content screener, it's still an algorithm: if you want it to be consistent, it has to be documented and clearly communicated to the screeners.


I can't believe you're catching flak for this.

It is absolutely not difficult to distinguish between a photo of burning children and a photo of a sex act. In fact, it's pretty ridiculous to characterize this photo as that "of a nude child."


I don't see any flames in that picture. I've never seen any flames in that picture, but I've been told (since junior high school) that it is a picture of a burning child. What's up with that?

Common sense tells me that the child's clothing was on fire, and she took those clothes off and ran away, and that's what we see a photo of. Fine.

But then... why is this photo described as a picture that contains flames? Not just here. That's always how it is described.


She is surely burning. A napalm bomb was dropped on her village. The photo documents people fleeing. Look at her arms; you can see the skin peeling away.

Here's a photo of her back: https://i.ytimg.com/vi/KZZvVl11PvU/maxresdefault.jpg. This is the life-long damage: http://img2.timeinc.net/people/i/2015/news/151109/napalm-gir....


> I don't see any flames in that picture.

The flame and smoke from the napalm attack that all the people in the picture are running from is pretty much the whole background of the picture, though the fact that it's black and white and the cloud is fairly thick may make that less immediately recognizable.

> but I've been told (since junior high school) that it is a picture of a burning child.

It's not a picture of a child in flames [0], though it's a picture of a child experiencing burning as a result of the napalm drop (according to her, her recollection is that she was screaming "too hot, too hot" at the time the photo was taken), and the absence of visible flame isn't the same thing as the absence of continued burning.

[0] Well, not the child that's the central feature of the photograph. There are children burning in that sense as well in the picture, I suppose, if you consider the background and what is actually burning there.


What an odd thing to be pedantic about. I mean, that's what bothers you about this photograph?

Besides, burning does not require fire to be present. It only requires heat (or a chemical reaction).


I didn't mean to be pedantic. I merely was asking a question. Bear in mind that it's a question that I first wondered about when I was myself a child.


It is probably difficult because they don't want to employ real humans to discern the difference.

I'd take their policy over Twitter's. Do you know how many spam/porn accounts are littering Twitter? At least Facebook is not a home for spammers.


What's wrong with porn accounts on Twitter?


They are spam accounts, not porn accounts. Please don't tell me you can't tell the difference.


There are probably both. You lumped them together as if all porn accounts are spam accounts, when that's very likely not true, and that is what SysArchitect was commenting on.


They could start by rejecting the assumption that nudity should be considered sexual by default. And maybe the assumption that sexuality is bad - I don't want them to shift to trying to detect whether an image is arousing or not; but I do want whatever censorship system they feel they need to have in place to be able to distinguish between child pornography and a photograph of a child being burned by napalm. "Really fucking stupid" is an understatement.


> They could start by rejecting the assumption that nudity should be considered sexual by default. And maybe the assumption that sexuality is bad

I agree with the suggestion, but can't help but laugh at how impractical it is. The people who would make the change are Bay Area progressives working in tech. And that means saying all the "right" things about sex positivity but stopping short of actually becoming sex positive.


Well, yeah. That would hurt their bottom line.


> being burned by napalm

I didn't see any napalm in the image. The girl had removed her burning clothes before the photograph was taken. The reason this particular image is considered acceptable is the historical context.

If I were to recreate the image by hiring actors, that would probably be considered child pornography.


> The girl had removed her burning clothes before the photograph was taken.

Removing clothes after the napalm had burned through them (which was manifestly the case here) certainly may reduce the degree to which you are being burned by napalm, but it does not mean you are no longer being burned by napalm.

> If I were to recreate the image by hiring actors, that would probably be considered child pornography.

Be considered by whom? Pornography requires more than simple nudity.


She may have been burning when the photograph was taken, but that is not obvious without the extra commentary.

> Pornography requires more than simple nudity.

Perhaps, but I can understand Facebook removing that photograph, just in case a judge thinks differently. Child pornography does not appear to be something the FBI takes lightly.


> that would probably be considered child pornography.

What on earth are you talking about? Is there anything even vaguely sexual about this photograph? Did I miss something?


I was not talking about the photograph in the article, but a hypothetical recreation. Even if a photographer were to explain that such a recreation was a satire or had some artistic value, I think many people would feel uncomfortable with it. Certainly most would not want their children to pose for such a photograph.

From that thought experiment, I'm deducing that it's the historical context which makes the original photograph acceptable to distribute, and not anything intrinsic to the image.


I think this is actually an interesting art experiment. I wonder if anyone will try.


Yeah, the art is the question of whether or not such a thing would be illegal. Performance art, of a sort.



