I'm not surprised. I remember reading an article a long time ago about how porn was easier to find with Bing than Google, because the latter was doing a lot more censoring.
...and in general, I've found Bing to be more useful for searching obscure things. There's more spam in the results, but at the same time you're more likely to find what you're looking for amongst them.
But unfortunately with articles like this, it seems that might change... as much as I'm against CP and abuse in general, I'm also against censorship and the degradation of search engine results to only the most mainstream/popular topics.
Yeah, they filter their e-mail platforms for sure, but part of it might be the Justice Department tying their hands.
People in companies that track this stuff down get special hash lists from certain law enforcement agencies and the evidence itself is pretty much a controlled substance; so they need special authorization to safely handle and report it with a chain of custody. Two Microsoft employees got PTSD because of the large amount of data they had to check:
I knew a lawyer who had to defend someone in this type of case. He was allowed to view the evidence (he chose not to), but only him. No paralegals, no one else in his office that wasn't directly sitting at the defense table.
If you're not granted specific authorization, even looking at this content is illegal, so Microsoft, Facebook, Apple, Google, etc have a very limited staff of people who are even authorized to handle this stuff.
They don't just filter their e-mail platforms and consume hash lists from law enforcement agencies. On the contrary,
Microsoft funds and freely distributes the most common tool for checking photos (not file hash based, I think proprietary). That's likely the project the employees you mentioned were on, a job I wouldn't wish on anyone I know. I hope they were well supported both while working and afterward.
I want to avoid using the term - even though they use it - because it's not a hash function like the ones most people are familiar with from computer science. Their "hash" function is resistant to alterations and is more like a "fingerprint".
Edit: to be completely clear, PhotoDNA isn't a cryptographic hash. It's a hash function designed so that similar inputs produce similar hashes, and it is probably closer to a bloom filter in some respects.
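For anyone curious what "similar inputs produce similar hashes" looks like in practice, here's a minimal sketch of the general idea in Python -- a simple difference hash (dHash). This is emphatically not PhotoDNA itself (that algorithm is proprietary), just an illustration of the perceptual-fingerprint concept:

    # A toy perceptual hash: near-identical images yield hashes that differ in
    # only a few bits, unlike a cryptographic hash where one changed pixel
    # flips roughly half the output. Requires Pillow (pip install Pillow).
    from PIL import Image

    def dhash(path, hash_size=8):
        # Shrink and grayscale so re-encoding, resizing, and minor edits
        # barely change the result.
        img = Image.open(path).convert("L").resize((hash_size + 1, hash_size))
        px = list(img.getdata())
        bits = 0
        for row in range(hash_size):
            for col in range(hash_size):
                left = px[row * (hash_size + 1) + col]
                right = px[row * (hash_size + 1) + col + 1]
                bits = (bits << 1) | (left > right)
        return bits

    def hamming(a, b):
        # Matching means "distance below a threshold", not exact equality.
        return bin(a ^ b).count("1")

    # Hypothetical usage: a rescaled copy should land within a few bits.
    # print(hamming(dhash("original.jpg"), dhash("rescaled_copy.jpg")))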
> I knew a lawyer who had to defend someone in this type of case. He was allowed to view the evidence (he chose not to), but only him. No paralegals, no one else in his office that wasn't directly sitting at the defense table.
> If you're not granted specific authorization, even looking at this content is illegal, so Microsoft, Facebook, Apple, Google, etc have a very limited staff of people who are even authorized to handle this stuff.
At the risk of offending folks, this is such a retarded law/process. Holy fuck. This is dystopian-level letter of the law vs spirit of the law.
It’s the old argument in favor of DRM and restrictive software licensing. “Think of the children” has long been used to prevent people from gaining access to warez and software cracks. As a famous man once said: “Software is like sex: it’s better when it’s free”
The article seems purposefully designed to create a hysteria bubble. The subject is highly repugnant, and the associated emotional charge can be used to attack the target of the article (Microsoft), pressuring them towards an undesirable outcome.
1. "Google does it better than Bing"
Google unfortunately does it better than Bing by censoring all searches. It has been impossible to turn off safesearch on Google for many years (the "filter explicit results" setting only switches between soft and hard filtering). This has a positive outcome in this situation, but also greatly degrades the quality of available results in other cases.
2. Omegle?
The article mentions Omegle's role very explicitly; however, it reserves all of its vitriol for Microsoft specifically. Why? Omegle and the other platforms that are actually producing, hosting and facilitating the origination of this disgusting content should be the ones that have to get their shit together, or in some cases be prosecuted. But Microsoft is a "juicier" target.
3 "even people not seeking this kind of disgusting imagery could be led to it"
They turned off SafeSearch, as clearly seen in the article's illustrative screenshot. This is a sleazy statement meant to whip people into a frenzy.
4. "Microsoft must" "human moderators" "underfunding" "another example of tech companies refusing (...)"
The article explicitly orders people, over and over, to be outraged, despite Microsoft's prompt and appropriate response, and makes assumptions that are unconfirmed or untrue.
TechCrunch's conduct here was not great - it seems to me they handled the situation in a way that showed their only aim was to attract clicks rather than to get the problem resolved promptly. It was tabloid behavior.
I for one am furious about someone's news agenda and not about how Bing, a top-tier product featured on billions of computers and supported by the best engineers at Microsoft, serves child pornography
Right, the same technology that would be used to block CP and abuse could probably be easily reconfigured to block any topic of choice. Some things are obviously worth blocking, but it's hard to decide where to draw the line, especially when many topics that some would consider blocking are highly politicized.
edit: to clarify again, some things are obviously worth blocking (like CP)! The issue is that once that technology has actually been deployed and demonstrated, you can be sure that different groups will start pushing for other things to be blocked as well. It changes the dialogue from "it'd be nice if you could figure out how to block X" to "hey, you're already blocking Y so why not also block X?" and that's a fairly potent change in my opinion.
Right, the same technology that would be used to block CP and abuse could probably be easily reconfigured to block any topic of choice.
...and more deviously, no one would want to be seen arguing for CP, so it makes for an effective "memory hole" to censor anything.
I have on a few occasions been searching for some highly obscure and technical subjects with Google, using search terms that have basically no connection to porn nor children whatsoever, and saw the ominous "some results have been removed from this page because they contain suspected images of child abuse" message. It really makes one wonder, especially since the word "suspected" is somewhat unsettling: was it actual CP that was removed, or something else entirely for a different reason? The fact that mere possession of CP is illegal makes for a chilling effect ("how dare you even question it --- are you a pedophile? You would not have gotten that warning if you weren't searching for 'bad things', right?"), and an extremely powerful censorship tactic.
The exact line gets blurry with various media. Kim, the female protagonist in the Broadway musical "Miss Saigon", is a 17-year-old whore. There's implied sex, as well as multiple stripper dances throughout the musical to hammer in the sexual issues that Kim faces.
Something like Miss Saigon is allowed and not CP (despite the character being below the age of 18) because of cultural significance and whatnot.
There are clear-cut cases of CP of course. But when musicals / movies / various media toy with the idea of 16-year-old or 17-year-old girls who are exploring their sexuality... where do you draw the line? What should get censored?
Lolita, the novel + movie, also gets brought up a lot, as it is explicitly a story about a middle-aged man exploring sexuality / erotic elements with an underage girl. It toes the line and never becomes sexually explicit, but it's clear from the context what the novel is pushing.
No, because Miss Saigon is a fictional piece of work, as is Lolita. Committing a crime in a piece of fiction is obviously not equivalent to actual child pornography, just like showing an action movie isn't equivalent to footage of actual violent crime.
There is no blurry line here. Child abuse and child pornography are clearly defined terms. Justifying child abuse in a novel might be morally offensive, but it is not a crime; abusing an actual child, however, and publishing footage of it, is.
> There is no blurry line here. Child abuse and child pornography are clearly defined terms. Justifying child abuse in a novel might be morally offensive, but it is not a crime; abusing an actual child, however, and publishing footage of it, is.
Okay, let me give an alternative situation then.
A 14-year-old girl sends a sexual text message to her (also underage) boyfriend. In Minnesota, the 14-year-old girl was charged with child pornography.
If you think morality is black-and-white, then you are going to cause problems for many people. There's nothing "black and white" about distributing child porn. There are, unfortunately, a ton of gray areas.
Yes, there are clear-cut cases of CP that need to be banned. But you must ALWAYS be wary about the edge cases, lest you harm otherwise innocent people. A 14-year-old who sends naked pictures of herself to her friends is... well... objectionable, but it shouldn't result in a sex crime on her permanent record.
This is the way it must be. Otherwise an underage person could, say, send nude photos of herself to all her teachers, call the FBI, and have them all arrested while she faces no penalty. If simply having the file on your computer is illegal, then any creator of this content, no matter what his or her age, must be liable.
That is a fairly programmatic, not judicious, interpretation of how the law and morality intersect. Maybe the people involved, law enforcement and the courts are a bit better than mere computers. Just a thought.
Nope, if I hide cocaine in your house the crime is mine. If person X sends person Y unsolicited CP with the intention of calling the police on them, I am pretty sure X is committing a crime.
To be more precise, in most jurisdictions, there's strict liability for possession of child pornography. Which means that mens rea, or even awareness of the crime, is not required.
Minnesota’s criminal statute would have absolutely nothing to do with any federal statute or any other state’s criminal statute
Good news! If you are rich enough to pay and keep paying your lawyers, then you can move the case to the federal venue either immediately or in appeals court. Now we have a standardized venue to actually discuss any gray areas or lack thereof
That’ll be $2000 for this “hour” of work to afford your freedom, thanks
That's pretty interesting; it seems there have been some narrow circumstances for removal of criminal cases from state to federal venue, but it's not a viable option like in civil cases
> There is no blurry line here. Child abuse and child pornography are clearly defined terms.
In many places around the world it's clearly defined to include purely fictional depictions. It seems to be rare, but there have been convictions for possessing child pornography cartoons in the US.
I believe in many places it is illegal to distribute even a novel with explicit CP.
A relevant note is the distinction between production, distribution and possession (whether direct or indirect) when talking about corner cases.
A 14-year-old should not be incriminated for producing or possessing photos of his/her own naked body; he or she could be charged with distribution in extreme cases (the cited case should not be one of them)
> Child abuse and child pornography are clearly defined terms.
How well defined are they really? People have gone to jail for possessing cartoon imagery that depicts children in a way that is deemed pornography under child pornography laws. That does not match your definition of abusing an actual child and publishing footage of it. Such imagery is entirely fictional and seems closer to a story in a novel than a physical act that took place in the real world.
We're not talking about sending people to jail, we're talking about blocking content from search engines. Whether or not you think it should be illegal to possess that imagery (which it is in many countries), it seems totally appropriate to block pretty much anything that could be construed as CP from search engines.
Indeed we are. It really doesn't matter what should or shouldn't be illegal, the topic at hand is what defines child pornography. It does not seem to have a clear cut definition, and is entirely in the eye of the beholder. Some believe fiction is absolutely child pornography (as we saw in such legal proceedings), others believe that it requires a physical human to be present to facilitate in the creation of the content. There is nothing to suggest that it is actually cut and dry. Are Lolita and Miss Saigon child porn? Depends on who you ask.
Nudity and porn aren't the same thing. Porn is media intended to provoke sexual gratification. If you prefer: porn is what the majority would consider material intended for sexual gratification.
Babies aren't sexual, and they also don't, as far as I can tell, have any legal right not to have their parents take or share pictures of them. We would do well not to invent imaginary legal rights.
Do you really want to be the arbiter of right and wrong in all these cases? A lot of people think naked baby pictures are fine. Who are you to tell them they're not? It seems to me you want to police the whole world to make what you think is indecent disappear. Guess who also has similar ideals? Hardcore islamists (not that I'm saying wanting to block search results and creating a totalitarian regime are the same thing, but the ideal is wrong in both cases, in my opinion).
> Committing a crime in a piece of fiction is obviously not equivalent to actual child pornography
That depends on the jurisdiction. In the USA it isn’t, in Australia fictions in certain media (eg anime) can be considered CP. Even porn with adults who just look young can be considered CP.
> Even porn with adults who just look young can be considered CP.
That's terrifying. I can't think of a more subjective measure to codify into law than, "if she looks young." In the USA we already have a huge problem with subjectivity in interpretation of laws which allows our government/police to go after people they don't like (with real or imagined infractions).
Actually, now that I think about it, I think that's what the state of Texas is doing to Cody Wilson. They're charging him with child molestation and child prostitution by saying that even though he thought she was 18, she "didn't look like it." (Not defending Cody, just stating that the USA may be no better at all in the subjective-interpretation-of-law department.)
I wonder how the future will classify AI-generated content which depicts things like CP [1] but where none of the actors are real. We haven't quite crossed that threshold yet, but I wouldn't be surprised if something like this were to hit the headlines some time very soon. Will people claim that this is a valuable surrogate to keep these sick people from acting on real victims, or will we see claims that it encourages them (like terrorists getting radicalized, etc.)?
[1] But not only; other genres like snuff come to mind. How about an AI-generated take on "A Serbian Film"?
The line wouldn't be blurry if you were given 10 pictures and asked to sort them by what's allowed and what's not.
We don't have 10 pictures here, we have at least billions and we can't sort them by hand. Using an automatic filtering system is where the blurry line problem comes in.
Fully clothed photos of kids are treated as CP depending on the site where you find them, for example. It is "easy" to block everything that could be CP, but that would include photos of children in diapers posted on Facebook by their parents.
It is easy to classify the more horrible stuff, but as soon as you draw a line you can be sure some shithead will search for a way to blur it.
In more general terms, this is why many things should be at least a bit subjective. Clean-cut definitions are too easy to abuse.
This is even more problematic when attempting to use the automated filters that would be necessary at this kind of scale, which have no sense of context at all.
A recent demonstration was in Tumblr's attempt to block all pornography. They provided some canonical examples of things that are explicitly allowed. It was then discovered that their automatic porn filters blocked the images they used as the specific examples of things that shouldn't be blocked.
It’s more brilliant witty egocentric diary of a predator than justification by a predator, punctuated by irony and an unreliable narrator, but basically yes.
Fair enough. I should note, though, that there have been calls to ban Lolita, the book, on several occasions, because the sexual nature of the book makes people uncomfortable.
In any case, a search engine (like Google or Bing) may decide to ban sexual content like Lolita, for being sexual + about underage minors.
I describe an actual case elsewhere in this thread: a 14-year-old girl sends a sexually explicit image to her boyfriend, and is therefore charged with distribution of child pornography.
If I can give my uninformed and probably lacking opinion, the only reasonable definition of CP should revolve around crimes outside of the digital world.
(Don't take this too literally)
It is for sure CP if it involves actual child abuse or if it promotes child abuse (or normalizes the sexualization of children).
In Australia, young-looking porn actresses/actors and loli hentai are illegal because they promote CP as a fetish.
There is an anime I really liked, "Miss Kobayashi's Dragon Maid", except that there are small children (7-9) who, while never depicted in sexual situations, are often given excessive sexual maturity; I found that extremely unsettling (I never finished that otherwise truly beautiful series for this reason) because it is how one would try to normalize the idea that "maybe a kid would like it".
In a sense I think we should treat CP as the counterpart of hate speech. Hate crimes : hate speech ~ child abuse : CP.
This highlights the conflict between free speech and censorship.
If you are still reading, allow me one last opinion :-) Another problem with thinking that a problem is simple is that you might miss an important connection between seemingly unconnected problems.
Well, if you didn't have so many pedophiles working in movies and television, then maybe this issue would be a lot easier.
Look at Epstein, Weinstein, Spacey, Schneider, [other Disney/Nickelodeon people], and the ridiculous number of abusers in politics in the UK and US, as well as the music industry.
It's a huge problem.
I am personally surprised that not a single person at Youtube has been held accountable for all the blatant shit on that cesspool of a site.
If you need links and information on Youtube, bing 'Elsagate' as just one example.
Just like you don't need to draw an exact line to know which side porn should be on, or anti-government rabble-rousing, or blasphemy. "Graphic videos of murder" might also be there, given the parallels to CP.
And in case it's not clear, I think those should all be on the same side of the line. The "this is OK" side. (I'm pretty absolutist here and think the "obscenity exception" is complete bullshit)
But that's not what you think, is it? So perhaps that line isn't quite as precisely bounded as you're thinking.
While ideologically coherent, this position is inconsistent with another fact: it should be illegal to push people to commit crimes.
We could design strongly addictive porn that would significantly increase the market for production of actual child abuse.
More precisely, if you abolish the concept of the "obscenity exception", you are still left with the fact that a lot of the same material is illegal as "crime promulgation" (not a nice word; I could not think of anything better).
> it should be illegal to push people to commit crimes.
Ahhh, so Snoop Dogg's music should be illegal, then? Since it's supporting the consumption of Marijuana, a drug banned at the federal level in the US.
Or on a more serious note, should encouraging civil disobedience during the civil rights era have been illegal?
That is a stupidly broad statement, and it has consequences that I don't think you'd like.
> We could design strongly addictive porn that would significantly increase the market for production of actual child abuse.
We could also do it with some future form of ML-assisted CGI - which is also currently illegal in the states. I'd say that's a pretty close substitute good for the kind of CP that requires hurting kids.
But really, I don't think the number of people willing to commit a major felony on camera and distribute the video would go up significantly if the end product wasn't illegal to possess. I may have too high an estimation of how intelligent people are...
> While ideologically coherent, this position is inconsistent with another fact: it should be illegal to push people to commit crimes.
“Should” indicates a statement of subjective preference, not a fact.
> More precisely, if you abolish the concept of the "obscenity exception", you are still left with the fact that a lot of the same material is illegal as "crime promulgation" (not a nice word; I could not think of anything better).
The Supreme Court has found that the ability of the Government to restrict “crime promulgation” is narrow, due to the First Amendment:
“the constitutional guarantees of free speech and free press do not permit a State to forbid or proscribe advocacy of the use of force or of law violation except where such advocacy is directed to inciting or producing imminent lawless action and is likely to incite or produce such action” Brandenburg v. Ohio (1969).
Homosexuality used to be illegal. If we censored anyone talking about it, it would probably still be illegal today and millions of people would still feel disgusted at themselves for having "unnatural feelings."
Legality isn't always the best way to draw a line.
I didn't say it was moral or what's best for the world, I just meant it's an easily defined goal. And with regards to the other commenter asking about jurisdiction, you change it based on the IP address (obviously this can be spoofed).
I am not claiming that there is a simple solution for how to handle censorship that is optimal for the benefit of mankind. I am saying the way companies should approach implementing it is relatively straight-forward.
Otherwise you'll have endless debates about what is and is not moral, where nobody will ever agree and product managers and engineers and everybody else will constantly be in tension and nobody will be happy.
Censor as little as necessary based on laws and let people make their own decisions for what they want to view.
I agree with the sentiment, but the problem is that it's actually quite hard to distinguish "legal" and "illegal". Judges do that for a living and see how long a single case takes them. It's simply not feasible for an automated system to do it properly.
> Homosexuality used to be illegal. If we censored anyone talking about it, it would probably still be illegal today and millions of people would still feel disgusted at themselves for having "unnatural feelings."
But that's not what the issue is here. Displaying CP is literally illegal - was discussing homosexuality illegal? Was viewing nude males? It's the difference between blocking the literal illegal item or blocking items adjacent to it.
Yes there are countries where displaying homosexuality is illegal (e.g. Iran). I cannot comment on the legality of discussing homosexuality but that's irrelevant anyway since discussing CP is also legal.
> Displaying CP is literally illegal - was discussing homosexuality illegal? Was viewing nude males?
These are weird comparisons to try to make.
Discussing CP is legal. Viewing it is not. Viewing homosexual pornography has been illegal in the past. Viewing nude males wasn't illegal, just like viewing nude children isn't necessarily.
Perfect. Discussion is fine, but the actual illegal act is illegal. So... the line is easy to draw. Once viewing an image is illegal, you shouldn't allow the viewing of that image.
The problem is that humans don't suddenly turn green or grow horns or have any other obvious and distinctive change upon immediately reaching the legal age for porn, there's issues of consent, etc.
Although legal vs illegal is binary, the decision tree leading up to that is ridiculously complex.
Should we censor all links with the word marijuana in them? That's illegal in most jurisdictions. What if someone is trying to find a medical dispensary for it in a medical state and can't because - censored?
Obviously CP should be censored... but then search engines could start censoring political stuff, or anything that does or doesn't support fascism or socialism, depending on their mood and political leanings. Not that censoring fascism is bad, but people should still be able to make up their own minds about things through research.
Hopefully AI will become good enough to accurately 'age' and block pictures of minors altogether, regardless of whether they're nude or not. Honestly, I think minors should be protected a bit, although I'm not sure how that would work with minors in the public image like child actors.
Because there are numerous examples of companies misidentifying photos of young children in non-sexual contexts as CP, creating legal and other problems for their parents. This is especially problematic given that even if you're legally cleared, even just being investigated for this reason is a massive social stigma (because the moral panic around CP is so extreme, it thwarts presumption of innocence, due process etc).
Because what YOU think of as CP, what the US government thinks of as CP, and what an automated filtering system thinks of as CP might be three completely different things.
Search engines, messaging services, and social networks that are all threatened by this should forego their competitive instincts and share information about how to combat this issue. If Google doesn’t have this issue but Bing and DuckDuckGo do, I’d feel morally bound as Google to share best practices to help them prevent indexing, suggesting, and retrieving this content.
Microsoft also shares a lot of stuff with the industry, including PhotoDNA hashes of known CP images. Just because it is not publicly discussed doesn't mean there isn't a lot of work going on, even between organizations, to work on this problem.
There's a reason this was shared with TechCrunch before it was shared with Microsoft.
Do we know Google didn't also have this issue? Or did they have it, patch it, and then make the press aware? Look back at Facebook's PR groups spamming TechCrunch with "tips" and hoping to seed negative articles about their adversaries. This whole thing, while a very valid problem that needs to be addressed immediately, reads exactly like some PR group dropping tips about a client's competitors to TechCrunch.
Forgive me if I'm skeptical of the motives of someone who cares more about the press finding out first that Bing is leaking potential child porn, over actually removing access to child porn.
I don’t think this needs to be a conspiracy. Bing is far behind. I worked on a suicide awareness project when I was at Quora, and at least at the time Google had help links featured for suicide subjects and Bing had instructions on how to kill yourself. Arguably Bing had the better search results but not the high ground. This could be another one of those cases.
Bing is totally behind, agreed. I probably shouldn't have included the part that maybe Google had the issue and fixed it because I actually don't believe that, and it detracts from the point. I totally agree Google has the absolute high ground here. Which, I won't lie, made me suspicious; particularly with Microsoft doing well in the press recently and Google, well, not.
I'm biased because I generally like Microsoft better than Google, but this whole thing raises the question: why was this directed to the media before Microsoft? Both could've been made aware. Plenty of disclosure-like articles are written with the claim "at press time, the <problem> is no longer showing" and they're no less impactful. With child pornography, of all things, why the hell is TechCrunch pushing this story so quickly that they had to issue a warning not to look up the links because you could be liable? Like, Microsoft is going to be rightfully shamed either way, ya really need to maximize the shock value with that extra bit? At the cost of leaving active child pornography in the open. Come on.
>>I worked on a suicide awareness project when I was at Quora, and at least at the time Google had help links featured for suicide subjects and Bing had instructions on how to kill yourself. Arguably Bing had the better search results but not the high ground.
This is not being behind; it's showing what the user wants. Bing should have banners or ads on suicide prevention, that's all. If you want to kill yourself, a suicide prevention page is not the most relevant one.
This story is meant to give MSFT a black eye; they could have just told them instead of doing studies, but that doesn't bring clicks to their site. A lot of the time, competitors are behind such stories. Not necessarily Google; it could be a vendor hoping that MS hires them.
It’s definitely being behind. Helping your users die will affect your ability to profit from them in the future, it also costs almost nothing to show a suicide prevention banner with a help line, and they are known to help prevent suicide.
That said, I’m not at all arguing that a competitor or vendor wasn’t the reason for this article.
I'm not saying that this is right, but the reality is that Microsoft's failure here drives more people to Google, so there is actually a lot of capitalist incentive to allow Bing to fail at this.
Morally, we should all strive to minimize this kind of thing - but from a business perspective, this story will absolutely earn Google business and capital.
Excessive filtering (threshold set to take no chances) has started to make people seek Google alternatives. The easiest way for Google to respond to this threat is to push stories that will pressure the competition into damaging results with excessive filtering.
The search given at the start of the article, "porn kids", now returns that they have nothing matching. Unlike other two word searches, where if it doesn't find something matching both words, it looks for similar things, "porn kids" is a flat out statement that it has nothing, suggesting that they explicitly special cased it.
"porn teens" returns plenty of results, and I doubt that they are all teens who are legally old enough.
One of the two word phrases that should not have had any matches that I tested was "porn isomorphisms". As expected, it gave me a lot of porn that had nothing to do with isomorphisms, and a small amount related to isomorphisms that had nothing to do with porn. All the porn involved women.
I then tried "porn homomorphisms", expecting similar results to "porn isomorphisms", except maybe the "homo" in there would lead to more male porn. To my surprise, there isn't any porn in the results! It's almost all math stuff, or word stuff.
> "porn teens" returns plenty of results, and I doubt that they are all teens who are legally old enough.
Because they hire people who are 18 or 19 who look younger.
While I can't dig up an article about it, there was a guy in Florida who was brought to trial on child pornography charges. Some "expert" insisted the girl couldn't possibly be 18. The woman had to fly over and testify that she was 18 at the time.
> In 2009, federal agents arrested a man in Puerto Rico on suspicion of possessing child pornography that included Fuentes. At trial, a pediatrician testified that Lupe was underage based on her appearance. Lawyers for the defense subpoenaed her to show her passport, proving that she was 19 years old at the time of filming.
Teen porn is certainly fine, and I believe you'd be hard pressed to make the case that Bing's results for "teen porn" are illegal [1]... porn companies specifically hire of-age performers that look young, then further dress them up (hair, makeup, outfit) to look even younger. Nothing illegal about that, and I don't see anything morally wrong with people choosing to watch porn with people that look closer to their own age.
[1]: I haven't actually done the search myself, for obvious reasons.
> The search given at the start of the article, "porn kids", now returns that they have nothing matching. Unlike other two word searches, where if it doesn't find something matching both words, it looks for similar things, "porn kids" is a flat out statement that it has nothing, suggesting that they explicitly special cased it.
Doesn't really surprise me. I can just imagine the manager demanding that the child porn searches stop immediately, and this makes the most sense for a rapid solution. The question now, though, is how well they guarded it. I'm not going to experiment, though; those kinds of searches could probably land me in jail here.
This is what I was going to say. If I was an admin that's what I would've done. I'm actually surprised they hadn't already taken any steps to prevent this kind of fallout.
As for the latter point, ditto. However, I would also avoid searching that kind of stuff simply because I don't want to have such a gruesome image in my head.
> There’s no excuse for a company like Microsoft that earned $8.8 billion in profit last quarter to be underfunding safety measures.
Perhaps the problem is that there is an excuse: it is likely that they believe that they are neither responsible for the images they index, nor are they beholden to proactively engage with law enforcement.
This is where treading the line between conduit and publisher can bite the corporation who is attempting to do so. While Microsoft does not host the images, it does index them and serve links to them. What's more, they also host thumbnails and metadata that allow the images to be browsed and discovered.
Because they host the thumbnails and metadata, are they not a publisher? They're hardly acting as a disinterested router of opaque data when the data served is stored on their servers and indexed by their algorithms.
I don't see how torrent linking is a crime and Bing indexing and offering this kind of content is not. Especially when the anti-piracy lobbying crowd insisted on "what about the children?".
I don't think Microsoft will be using that argument anytime in the near future; the public backlash would be huge, and if they were knowingly indexing it they would still be subject to legal liability - heck, even if they were unknowingly indexing it they could be liable for negligence if due care wasn't taken to prevent these kinds of images from being circulated.
I think Microsoft would argue that there is no real way to prevent these images from being indexed considering that most indexing is automated and it's hard to train their systems to recognize offending images and their human employees can't possibly look through all the images to prevent child porn from being posted.
That's why Google hires people to curate their image database and flag illegal images, so that they can detect them automatically in the future.
It looks like Microsoft wasn't doing anything at all to stop new child porn. The fact that their recommendation engine pushes you further in the direction of pedophilia is proof enough that they weren't actively trying to avoid this kind of thing.
Having child pornography in your possession is a crime. If MSFT has indexed it without showing any good-faith effort to avoid indexing it, then the DOJ can and should prosecute them for the felony that it is.
> Having child pornography in your possession is a crime.
That really depends on the legal domain one is operating in, and even between those where it is a crime, the definition of what constitutes "child pornography" and what does not isn't as easily drawn as some people like to think [0].
Around 1,500 people a year, according to UK government stats (the VAWG report), are convicted of having animated "child porn". Robin Hoque was unfortunately not an isolated case.
Ah yes, the largest (or second largest, it varies per day) company is failing miserably on all fronts, according to an anonymous commenter on HN. Sounds legit.
Yes, but the company can't be put in jail. At most, they will punish someone if they did something knowingly. That's the issue with this law: it acts like companies are people, but in the end the punishments are not the same.
The root goes back to the fact that child porn laws do not contain a mens rea requirement. It's a statutory crime: you possessed the image, you're guilty.
Almost every other crime takes into account the mind of the accused. Someone who forgot their dog food was on the bottom of the cart and leaves isn't committing theft, as long as they go back and make it right. On the other end, we execute people who commit premeditated murder. And we treat an accidental homicide due to negligence (manslaughter) as deserving limited punishment.
And worse yet, how do you tell apart the image of someone who is 17 years and 364 days old on Pornhub from that of an 18-year-old? You get arrested for CP over the former if you're caught and they know it's of a minor. Did it matter if the minor perjured themselves? Nope. Did they use a fake ID? Who cares. You're at fault.
Most child porn statutes, even if they don't include a mens rea requirement in the plain language, often have it read in. Child porn is NOT a strict liability crime; rather, it's treated more like a drug crime.
Corporations aren't people, but that doesn't mean that they and the people that run them are immune from prosecution. As an example, the Deputy AG in California is considering murder or manslaughter charges against PG&E [1].
For statutory crimes, this is how it works for people.
"We don't care the reason or mitigating circumstances you have, or even if you did it. You have the $thing , so you're at fault."
(To the downvoters: this is how statutory crimes work. I'm stating a fact, not making ethical determinations about it. I personally think that mens rea absolutely needs to be incorporated at a minimum.)
As I've mentioned above, child porn is NOT a strict liability crime. The prosecutor will definitely need to prove you knew or should have known it was child porn.
I do not believe there is any purposeful or malicious intent from Microsoft. But that doesn't mean they cannot or should not be held responsible for the data that they host and serve.
Wave after wave of recent subreddit bannings, due to Reddit now believing it will be directly responsible for data it is helping to serve, should make one stop and reflect on how exactly such a law would be enforced.
Fine, but responsible in what way? It's not like they did it on purpose. Their bots found the images; Bing just failed to exclude them. Now they're doing a lot more work to exclude them.
I think Microsoft is being purposefully laxer than other engines. Their main intent must be to have more 'regular' porn, but child porn also falls into the basket.
What's the gain? Relevance among porn users.
Perhaps we can compare this to fishing: nobody catches dolphins on purpose, but a lot of fishermen will use wide nets knowing full well that some dolphins will be trapped in them.
If there wasn't a gain, why would they process or index them in the first place? If you don't want to be responsible for the data you process, don't process it.
Because classifying huge amounts of content is highly non-trivial. It's not like there are a million employees sifting through every spidered resource and manually deciding whether it should be indexed.
This article doesn't do justice to the work Microsoft has done with combatting child pornography in an attempt to paint Bing as some intentionally neglected backwater and Microsoft as some criminally negligent monolith. Microsoft's PhotoDNA is far and away the most widely shared detection tool and is in place with a ridiculous range of organizations. More importantly, Microsoft works closely with LE and makes it easy for their partners to integrate with LE. Google avoids the problem in the article entirely by doing a nudity blacklist for image results, which nicely solves the problem of some reporter doing a hit job, but avoids actually doing anything that makes finding material more difficult.
There are plenty of problems with black-listing based approaches, and I strongly disagree with some of the technical decisions made in PhotoDNA [1], but Microsoft should be credited for their commitment to reporting and prosecuting, and bringing their processes to other companies who otherwise would be unable to dedicate the resources required to be effective.
What I find bizarre is that there would be so much child pornography still left in the indexable public internet. I would have imagined that pedophiles would have been pushed underground long ago. Why hasn't law enforcement attacked the hosters and sources of such content? Surely finding them can't be that difficult if even bing finds the material.
Apparently one of the reasons for removal of all NSFW content from tumblr was the sheer amount of CP on it - someone on reddit wrote a whole comment about it. Basically people created lots of blogs linking to other CP content, which was re-shared, but even if the original blog was deleted you could still see the re-shares and find the original content.
There was so much CP on Tumblr I wonder if it wasn’t an elaborate honeypot. It was sooo bad and over the top. People have no idea. There’s also been a lot of CP on Instagram, including child sex trafficking rings advertising their wares. Backpage was the largest single source of child sex trafficking operations in the US and many in LE knew it for years before it was finally shut down.
I’ve unfortunately seen far too much of what people don’t believe is out there. That’s what I get for helping anons “dig”. As they say “nothing is beyond our reach”.
The real story is much different. Interpol, the FBI, etc have been handed a golden tool to find child porn users. They could be prosecuting all those sites and investigating their visitors, but instead, they turn a blind eye.
Rather than stop real predators seeking out this garbage, they run sting operations to entrap people who are too uninformed and poor to fight back then pat themselves on the back about fighting crime.
Chasing users feels like a complete waste of time. Taking down the sources (e.g. on the darkweb) and physically recovering the kids involved is surely a better use of their time, right? Ideally they'd do both, but finite resources...
Chasing users is useful because those people are high-risk to be abusing children in their lives.
The government drones on and on about the dangers of "The dark web" and how we need to give up all our security (because if we don't, they can't possibly catch anyone).
Instead, we find hundreds of sites so easily accessible that a standard web crawler can locate them. It wouldn't be too difficult to track down the site and monitor it to find the owner. At that point, shut down the site and squeeze the owner. Sooner or later, one of them is going to have links to producers that can be followed.
I would hope that any content found by human moderators (or possibly AI programs, in situations where the AI is particularly confident) gets reported to law enforcement.
This allows law enforcement—given the requisite time and money—to do the things you describe, without also endangering regular people.
Unless you're advocating the FBI should purposefully leave these results in place as a form of entrapment, which has a plethora of other issues.
Bing -- which obviously has a magical ability to find child porn as it crawls the web that no LEO can match. There was more than a little sarcasm intended.
This isn't someone hiding on TOR. This is someone on an easily traceable, publicly available site who could be tracked down and squeezed to find the source (and hopefully find and rescue the children).
>“Omegle for 12 years old” prompted Bing to suggest searching for “Kids On Omegle Showing”,
Results called "kids on omegle showing" suggests that the kids were being prompted by predators to produce child pornography on social networks. There has got to be some rule about letting kids access these social networking platforms. Who could possibly think it's a good idea to let a child post their photos videos and profile information online and leave that open to the public for any predator who wants to reach out to them. And what's worse these kids are probably using these things unsupervised.
I wonder how a search company could hope to really effectively combat this content considering it's probably constantly being produced and circulated on a daily basis. Although one should expect them to keep track of and closely monitor keyword phrases routinely associated with child porn.
Omegle is more of a chat room than a social network. Users can video chat with random strangers. What I found odd is that Omegle video chat is moderated, probably to curb unwanted sexual content, but there are also two other video chat categories: adult, which allows sexual content, and unmoderated, which sounds like it has no systems in place to remove unwanted content like child pornography. It seems absurd to me to have an unmoderated category on top of an adult category - what legal things would you need an unmoderated video chat with strangers for?
The results for "kids on Omegle showing" are most likely other users who have captured screenshots or recordings of the illegal video streams on Omegle.
Wow. That must've been a pretty disgusting story to investigate. I heard people at Facebook who have to moderate this kind of stuff develop PTSD-like problems.
"That must've been a pretty disgusting story to investigate."
Please, this is limited to law enforcement. DO NOT TRY TO INVESTIGATE THIS yourself. Looking for and finding CP could already be considered a crime in your jurisdiction.
At the same time, citizen investigations can be powerful things.
Investigating corporations for flippantly allowing child abuse materials to be indexed and distributed on their platforms should never be illegal. The fact that Israeli investigators did this research, rather than Americans (Google is an American company), shows that our own law enforcement clearly isn't doing its job here.
I have a toolchain for finding child abuse content on Youtube, and I use it to report videos to Youtube for takedown. It's absolutely insane that I can be held criminally responsible for finding this content - but Youtube is immune from the consequences of hosting it.
Some of the videos I have found had millions of views.
Many that I have reported have not been taken down.
I’m still haunted by some of the things I stumbled across many years ago when I was exploring FTPs just to see what I could find. That all ended the day a video labeled “AMV 1” turned out to be (at least the few seconds I saw) a very young boy tied up and being raped. I reported anonymously to the FBI, but even then it was clear they were overwhelmed and I never heard back.
If I had to see that kind of thing as a job, I’d die.
My friend is on the other side of this fence: an FBI agent who has had to conduct surveillance of child pornographers after tips like this, to collect sufficient evidence to prosecute. Luckily, he was eventually transferred into a white-collar crime division.
There are a lot of revolting, but vital jobs out there. Hats off to those with the psychological disposition to handle them, even if not sustainable in the long term.
When I was looking at which professional direction to take, I started looking at government jobs (mostly law enforcement) and spoke with some industry people. My take was: get prepared to look at a LOT of ugly stuff. I also heard horror stories that, just thinking about them, make me want to puke and punch somebody at the same time.
The more I think about this: you could have a serialized A->B filter. Filter A is the large group of convicted pedos; they won't likely be as traumatized as the B group. The B group is the existing moderators and will get whatever the A group missed. The A group should minimize the PTSD impact on the B group.
I said leader of the free world. They will choose to not work in the coal mine and prefer places with kids. They have served their time so we have to give them options they will choose.
I'm trying to figure out what the hell is going on inside your head. Child predators that get released from prison can and do have restrictions on employment placed on them. They're not allowed to work with kids. Are you unaware of that or something? Do you actually oppose that? I can't figure out your angle here, but you're distressing me.
I came up with a valid solution to a real world problem. Denial just continues such problems. That is where my head is: pragmatic, viable solutions. I oppose doing the same thing over and over again and expecting different results. Today they are most certainly allowed to work in places that have children, just not in places that are specifically meant for children, such as schools. The moderation room at Microsoft does not have children.
I never said to roll back existing restrictions. I feel that I am being projected on. Perhaps this topic is too emotionally charged for HN.
> I came up with a valid solution to a real world problem.
No, you absolutely did not. The real solution, which has already been arrived upon, is to legally forbid pedophile felons from working with or around kids. Rolling that back is not 'pragmatic', it's fucking moronic. Feeding their fantasies by paying them to look at pornography is the worst idea I've heard in a LONG time. You are seriously creeping me out.
> The real solution, which has already been arrived upon, is to legally forbid pedophile felons from working with or around kids
That doesn't stop them, especially if they have not been convicted of a crime. Instead, giving them the option to get paid to look at said images would get a lot of potential criminals into a job away from any children - plus they won't be getting PTSD from looking at them, unlike non-pedophile operators.
> Feeding their fantasies by paying them to look at pornography is the worst idea I've heard in a LONG time
Mind if I ask why? It's not like most of them are not already looking at CP.
I've heard rumors that PhotoDNA incentivized the creation of new child pornography that ostensibly isn't in their database, rather than the traditional sharing of older images.
That's an interesting idea. I don't see how they would establish cause and effect, so I'm inclined to not believe it. But it is interesting to consider unexpected incentives.
I mean, the effects (your life being ruined forever if caught) aren't exactly amenable to the scientific process and fully establishing cause and effect. It's totally rational for them to be extra cautious beyond what they can establish.
I meant, how would you prove that more was produced as a result? How do you know that more isn't produced simply because there are now more people in the world in general? Or some other cause?
It shouldn't be. PhotoDNA produces fingerprints of known illegal images, but there still must be a human moderator in the chain to flag an image as illegal in the first place. Preventing new illegal images from being indexed is an entirely different, and much harder, problem.
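To make the division of labor concrete, here's a rough Python sketch of how a fingerprint list of already-confirmed images could be consulted at index time, and why brand-new material slips past it. The names and threshold are made up for illustration; this is not Microsoft's actual pipeline:

    KNOWN_BAD_FINGERPRINTS = set()  # fingerprints of images already confirmed by human reviewers
    MATCH_THRESHOLD = 10            # max bit-distance to still count as "the same image"

    def should_block(image_fingerprint, review_queue):
        # Return True if the image is a near-duplicate of known illegal material.
        for bad in KNOWN_BAD_FINGERPRINTS:
            if bin(image_fingerprint ^ bad).count("1") <= MATCH_THRESHOLD:
                return True  # matches a confirmed fingerprint: keep it out of the index
        # Never-before-seen images can't be caught this way; at best a separate
        # classifier routes suspicious ones to human moderation, which is the
        # hard (and traumatic) part of the job.
        review_queue.append(image_fingerprint)
        return False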
It is ironic but not entirely surprising. Microsoft is enormous and the process for managing Bing as a product is probably so onerous that even integrating an “in-house” system might take years and incur heaps of bureaucracy.
If Bing really wanted to do this, it’d probably be the same process as a separate company, except that the email domain of the parties working on the integration would be the same.
I would suspect every search engine has this problem, and every social network has the problem WhatsApp did with trading pictures in private groups.
Calling out Bing in particular isn’t useful, but drawing attention to the problem might be — or it could turn into another repressive crusade. Hard to say.
It's self-evident that a privacy-focused service is inevitably going to attract illegal/anathematic behavior. And it's possible that after enough stories like this and "terrorists coordinated on <random E2E encryption messaging app>", average people may decide that the tradeoffs inherent to privacy aren't worth it and start supporting things like Australia's encryption ban.
The way this usually works is: a problem surfaces, a team is created that builds countermeasures, they measure the results, they solve it, and then the metrics become part of a dashboard. They run into a wall improving the metrics, they move on, team funding is cut in half (or moved to another location, like India/China, with a junior team). Meanwhile, adversaries find new ways to counter the measures. The article mentions several components of the stack, developed in different locations. I presume right now they have put up a tiger team to fix it. The cycle begins again...
Some of the search terms seemed like pretty low-hanging fruit. "Porn CP"? I thought that there was some kind of initiative to just show no results for CP-seeking queries, and that seems like one of the ones that would be pretty obvious to block. I wonder if we're not just seeing the result of a bug.
Google does that; some are obvious, but even those that seem obvious end up causing problems.
You can't use the word "minor" at all in combination with some terms, even though it's a common last name.
I don't know what the answer is, but it seems like we should be able to do a much better job selectively returning results to "wall off" the things we want to separate and avoid accidentally letting through, rather than simplistically blocking words, which is clearly what's happening in some cases.
You don't necessarily have to identify every single image to know that certain terms should not be returning results from sites and pages that have a high probability of being porn, and certain NSFW terms should not be returning results from sites aimed at, or content known to have been made by, children.
Anecdotally it seems like all we've done is to edge closer to ruining search for legitimate situations without accomplishing much.
> I don't know what the answer is, but it seems like we should be able to do a much better job selectively returning results to "wall off" the things we want to separate and avoid accidentally letting through, rather than simplistically blocking words, which is clearly what's happening in some cases.
This is just defense in depth. If you know a query returns 90% CP before filtering, even if your filters are 99% sensitive, you're going to get some CP in the first few pages for that set of terms. So if you identify a query as CP-seeking, then you would rather probably just show nothing at all. Of course, the definition of CP-seeking would have to be tuned, but the ratio of legitimate to CP results would have to be a component of that.
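Rough arithmetic behind that point, with purely illustrative numbers:

    # Expected number of CP results that leak through an image-level filter.
    # All inputs here are illustrative assumptions, not measurements.
    def expected_leaked(results_per_page, pages, cp_fraction, filter_sensitivity):
        total = results_per_page * pages
        return total * cp_fraction * (1.0 - filter_sensitivity)

    # 90% CP-seeking query, 99%-sensitive filter, first 3 pages of 50 results:
    print(expected_leaked(50, 3, 0.90, 0.99))   # ~1.35 leaked results -- unacceptable
    # Same filter on a query where only 0.1% of candidate results are CP:
    print(expected_leaked(50, 3, 0.001, 0.99))  # ~0.0015 -- effectively zero

Hence showing nothing at all for queries identified as CP-seeking, rather than trusting the per-image filter alone.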
> You don't necessarily have to identify every single image to know that certain terms should not be returning results from sites and pages that have a high probability of being porn
We definitely already do that, and have been doing at least since the late 2000s.
> Anecdotally it seems like all we've done is to edge closer to ruining search for legitimate situations without accomplishing much.
Objectively, this is not true. We have accomplished a great deal in terms of CP suppression, and search is better than it ever has been for the vast majority of legitimate queries. Regardless, I'm sorry that you feel that way.
You would think that people who have illegal content would try to hide it from a mainstream search engine? Aren't they just sitting ducks waiting for police to come arrest them? I thought this kind of thing was on the dark net, and there was no way to accidentally see it?
The problem with this article is that it is untrue, and because the material may not be viewed, almost nobody has any means to check that it is untrue.
Apart from incidents, which are probably removed very fast, there has not been and there is no child pornography on Bing in the past year. It is true that the keyword recommendations are very disturbing; they probably reflect what other people have been searching for.
Source: I'm sexually attracted to (some) children, and I sometimes use Bing to search for legal photos of (naked) children, because I know that Bing uses PhotoDNA to filter out everything illegal. I have never seen any child pornography. There's only naturism without any sexual posing, and models (women) that might look 15-17 but are actually from legit porn sites.
You may think that's disgusting or immoral, and I can understand the disgusting part. I have known that I'm sexually attracted to young boys since I was 15/16 years old, and I have decided to never act on that attraction. I have a stable relationship with another adult. However, viewing naturism photos of children is not illegal, and I don't think I harm anyone by viewing such photos.
If you want some information about pedophilia, as there are lots of myths about it, I think this is a good resource with linked sources: https://pedofieltweets.wordpress.com/2018/12/30/pedophilia-e... . The main point to take away is that most abuse isn't committed by pedophiles, and that it is very likely that most pedophiles don't abuse children.
For the report embedded in the techcrunch article, I'm getting "This document has been removed from scribd" when attempting to view or download it. Why couldn't they just publish it as a PDF file on the techcrunch website?
Without even reading the article, I know exactly what they are talking about. I completely stopped using Bing a few weeks ago after stumbling onto the results of the search phrase "nude beach" with safe search turned off. Do not do this.
"Nude beach", and other terms related to social nudity and body freedom, have long been coopted by child pornographers and sex traffickers.
It's a very common phrase that has been used by those in the CP community for a very long time (decades), often in connection with selling or buying images or videos.
Tangent: I'm often surprised by the number of suggestive images that show up in image results even with safe search selected (I need to start noting the search terms I'm using when I notice this, so I can confirm I'm not just ignorant of the connection). I usually use DuckDuckGo, but sometimes also Google image search.
Search is one of those businesses where just delivering a service that, to most consumers, seems not to change much from one year to the next actually requires a lot of work.
Actually improving noticeably requires an astonishing amount of work. Improving to the degree that users overcome their brand biases is harder still.
1) The article should redact search terms if it is illegal to try them - we can't even verify the findings are real. Also, I predict a number of people (likely a small share of readers, but still a lot of people) searched the terms to verify.
2) A lot of pornographic websites will advertise "young", "girls", "teen", and maybe even "kid", but all persons will be of legal age; some, though, do dress or look much younger.
A) Maybe the people doing the investigation can't tell / have poor judgement.
B) How would Microsoft Bing know whether the young "stars" look young, or are actually young?
My vote would be to ban the damn search keywords altogether. Let investigators find and prosecute those in the darker corners of the web, but just ban searches for keywords like the ones in the article.
The article accuses Bing of two things:
1. Not filtering out all child pornography images.
2. Suggesting search terms relating to child pornography.
The second point is the most damning, in my opinion. Here's an example:
1. Navigate to Bing Images.
2. Turn off safe search.
3. Search Bing images "sex"
4. Click on the second image.
At the top of the image detail page, there will be suggested search terms. One of them, for me, is "baby sex fetish"
They seem to process each search result image and suggest other search terms based on what they think the image contains. For example, if I search for "breast" and click on an NSFW picture containing a smaller-chested model, I end up with suggestions like "Skinny Small Tit Girls Naked".
For what it's worth, Google doesn't provide suggestions for NSFW searches at all. I think this approach goes a long way toward solving this specific issue.
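As a rough sketch of that policy (the function names here are hypothetical, not anyone's real API): gate the related-search feature on a query-level NSFW check, so adult queries simply get no suggestions at all.

    # Hypothetical sketch: suppress related-search suggestions entirely for
    # NSFW queries instead of trying to sanitize the suggestions themselves.
    def related_searches(query, suggest_fn, is_nsfw_query):
        """Return suggestions only when the query is classified as safe."""
        if is_nsfw_query(query):   # assumed classifier: True for adult queries
            return []              # no suggestions at all for NSFW queries
        return suggest_fn(query)   # assumed backend that produces suggestions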
Duckduckgo does this too. Rather than typing 'ne' to autocomplete this very site, I typed 'nn' once and there was child porn on duckduckgo's first page.
I've switched from Google to DuckDuckGo for the majority of searches and found that DuckDuckGo's image search is too NSFW to be used at work.
One time I searched for a building called "Banana Alley" and was trying to find a photo of it to send to a friend. Google correctly returned a photo of the building; DuckDuckGo, however, returned some "interesting things adults do with bananas".
I believe Google has a pre-processor that works out whether you're searching for adult content before deciding whether to show adult content in the image results.
So if I search for random terms - Google will never show porn, unless I include sex-related terms in the search query. And only then will the NSFW filter be removed.
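If it does work roughly like that, the gating logic might look something like this sketch (purely illustrative names, not Google's actual pipeline):

    # Purely illustrative sketch of query-intent gating: adult images are only
    # eligible when the query itself signals adult intent.
    def image_results(query, search_index, has_adult_intent):
        results = search_index(query)      # assumed: yields (image, is_adult) pairs
        if has_adult_intent(query):        # assumed classifier for explicit queries
            return [img for img, _ in results]
        return [img for img, is_adult in results if not is_adult]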
Microsoft is, as the article points out, a multi-billion dollar company. Though they clearly have not invested enough effort into preventing this, they've almost certainly invested a great deal of effort. And it was still ineffective.
What does this mean for smaller companies that make products which index the open web? Put another way, does this stuff show up on Duck Duck Go?
Is there a reason they don't run a child exploitation detection ML engine (PhotoDNA) on the media they index? I'd guess they already run an ML engine to "describe" an image while they index it, so this additional step shouldn't be too expensive.
PhotoDNA isn't ML-based. My understanding is that it's essentially a Bloom filter for images -- it'll tell you if your image is in the set of known-bad images, but it won't recognize newly created images.
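To illustrate the known-bad-lookup idea (this is a generic perceptual-hash sketch with made-up hash values, not PhotoDNA's actual algorithm): compute a robust fingerprint and compare it against a blocklist with a distance threshold rather than exact equality.

    # Generic perceptual-hash sketch (NOT PhotoDNA's real algorithm): a robust
    # fingerprint is compared against a blocklist with a distance threshold,
    # so minor edits (resizing, re-encoding) still match, but new images don't.
    def hamming_distance(a, b):
        return bin(a ^ b).count("1")

    def is_known_bad(image_hash, blocklist, threshold=8):
        """True if the hash is close enough to any hash on the blocklist."""
        return any(hamming_distance(image_hash, bad) <= threshold for bad in blocklist)

    # Usage with made-up 64-bit hash values:
    blocklist = {0xDEADBEEFCAFEBABE}
    print(is_known_bad(0xDEADBEEFCAFEBABF, blocklist))  # True  (1 bit differs)
    print(is_known_bad(0x0123456789ABCDEF, blocklist))  # False (unrelated hash)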
Based on the redactions in the images in the article, those searches are returning ~75% CP. I guess that could still be a small portion of what's out there, but that's horrifying.
Porn accounts for 1% of images. Child porn accounts for a one-in-a-million fraction of that. So that's 10 per billion images.
If you figure the filters exclude 90% of child porn images, that’s 1 per billion which will show up in search results.
I can’t find a good estimate of total number of images, but YouTube shows 5 billion videos a day and gets 1800 minutes of video per minute.
So if we estimate a trillion photos, then we’d expect around a thousand child porn images to make it past the filters, and Bing to be able to return a few pages of 75% child porn when we accidentally stumble on a term in that category.
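Spelling the arithmetic out (all inputs are the guesses above, not measured figures):

    # Back-of-the-envelope estimate using the guessed inputs above.
    total_images  = 1e12   # guess: roughly a trillion indexed images
    porn_fraction = 0.01   # guess: 1% of images are porn
    cp_fraction   = 1e-6   # guess: 1 in a million porn images is CP
    filter_recall = 0.90   # guess: filters catch 90% of CP

    cp_images = total_images * porn_fraction * cp_fraction   # about 10,000
    surviving = cp_images * (1 - filter_recall)               # about 1,000
    print(round(cp_images), round(surviving))                 # 10000 1000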
There's a slightly bizarre situation here where search algorithms are working against themselves, yeah. Presumably anything spiked by PhotoDNA isn't returned at all - which frees up those spots for the next-most-relevant result. And the more effective Bing's indexing is, well, the worse that result will be...
Since PhotoDNA is basically a known-bad tool, it presumably can't win the battle unless turnover is fairly low. Penalizing or hiding sites with many PhotoDNA hits (or perhaps a high percentage of hits) might do better by targeting concentrators, but that would depend on what sort of sites are serving this stuff. I assume they have to be fairly small/scattered to stay operational, which in turn makes it harder to predict what sort of content they have.
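A purely illustrative sketch of that "target the concentrators" idea, with arbitrary thresholds:

    # Illustrative sketch of site-level penalties: demote or delist whole hosts
    # whose indexed images keep matching the known-bad list, rather than only
    # removing individual images. Thresholds here are arbitrary.
    def site_action(indexed_images, known_bad_hits):
        if indexed_images == 0:
            return "no action"
        hit_rate = known_bad_hits / indexed_images
        if known_bad_hits >= 50 or hit_rate >= 0.05:
            return "delist the entire site"
        if known_bad_hits >= 5 or hit_rate >= 0.01:
            return "demote the site and queue it for human review"
        return "remove only the matched images"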
(And despite the article, it doesn't seem clear that Google has solved this problem, so much as bypassed it with a whitelist approach to nudity in general.)
From the article: “We index everything, as does Google, and we do the best job we can of screening it. We use a combination of PhotoDNA and human moderation but that doesn’t get us to perfect every time. We’re committed to getting better all the time.”
Is there a reason not to block the engine from returning image results or suggesting related results based on the search terms? I realize that's probably naive, but it seems less difficult than starting by trying to classify the images.
Naive thought/question: if CP becomes easily accessible for free through search engines, won't that cause less CP to be produced, since the producers will be less able to make money from it?
The Jevons paradox suggests that making something cheaper or easier to consume tends to increase total consumption, and there might be a similar effect here: the availability might make it more socially acceptable, which in turn might increase demand, and therefore also the demand for original productions.
These are probably automated keyword suggestions... what the police need to do is go after the source of the problem (take down and prosecute the ones hosting these files).
While this makes total sense, it is just part of solving this problem. Often, those files are hosted on foreign servers with next to no chance for law enforcement to get access and take them down. A filter will definitely help with the spread of these atrocities, and it also protects the dignity of the victims.
Google does not show these search results so it must be entirely possible to filter them out. Why can't MS do the same thing for Bing?
We have seen the FBI blocking content that was hosted in other countries... why put each and every company in charge of policing? Is law enforcement too lazy?
Because _Microsoft_ is sourcing and indexing these images, not the FBI or society in general. Going from there, I think it is entirely reasonable to place the responsibility with Microsoft.
I'm not saying that the police should not go after the ones searching for these pictures (even if the offenders are cops themselves)... only that Microsoft didn't program these keywords into the search engine.
I stopped using Bing to search for porn specifically because the recommendations were creeping me the fuck out.
I remember searching for something like "Blowjob Cumshot", and that recommendation bar started giving me options for "Preteen Blowjob Cumshot", "Middle School Blowjob Cumshot", and "Little Girl Blowjob Cumshot".
All of the thumbnails for these buttons were clearly lolicon/hentai, but it blew my mind that Bing was actually suggesting these search terms as alternatives to mine...
This was years ago, so it blows my mind that this is still a fucking problem...
It still rubs me the wrong way that simple possession of child pornography is illegal. It's obviously a good idea to criminalize production of child pornography, and possibly the sale of as well, assuming that sales of child porn incentivize production of it, but images and video do not actually harm any children, meaning criminalizing possession does nothing, and is in fact an example of censorious and moralistic government overreach.
Consider that it is perfectly legal to watch cartel or ISIS execution videos, and that the acts depicted in such videos are far worse than child rape. You disagree? Are you saying you would rather be flayed alive than be retroactively raped as a child and live with the trauma of it? Under what principle does it make sense that possession of gore videos is legal, and possession of child pornography is not?
>but images and video do not actually harm any children
Yes, they do. In the creation of that video, a child was harmed. Criminalizing the consumption of a good is aimed at stemming the production of that good.
Yes, but I am stating it's near guaranteed that criminalizing consumption in this case has no effect on the production of it. The "good" being child rape, do you actually believe there are people who have raped children purely to take pictures and videos, and distribute (not sell) them? Absent people doing that, criminalizing child pornography possession protects no children.
Note that child rape is not a required part of the definition of child porn.
It could be a teen, posing provocatively, taking a selfie. Maybe they are 17. Maybe they feel the photo will promote a pop star career, or that it could be sold to pay for college. Maybe they just want it for personal reasons, to have 40 years later.
There's also some nice secondary effects of criminalizing possession of child pornography. What little evidence we have about pedophiles suggests that the majority of consumers of child porn have also sexually abused children personally. And, while child sexual abuse tends to go un-reported and can be difficult to prove in court (usually the child is the only witness, limited physical evidence, etc.) possession of child porn is much easier to get a conviction on. So not only is criminalization stemming the production of child porn, it is helping to find and convict sexual abusers.
I would argue that this is a bias introduced by the legal and societal regime surrounding child pornography. We only have statistics for those who have been caught, and the prosecution also has an incentive to make the charges as extreme as possible.
Also, hopefully, the police are going heavily after producers and achieving high conviction rates, which would produce that statistical trend.
I over-stated the evidence. I was thinking of a study claiming that 85% of felons convicted of child porn possession also admitted to committing child sexual abuse in anonymous reporting. But it seems there's a lot of controversy around that study: https://www.nytimes.com/2007/07/19/us/19sex.html .
That said, my understanding is the evidence does seem to indicate a correlation between the two, though in the 30-40% range. See https://web.archive.org/web/20080111204617/http://www.ndaa.o... for example. Not a majority, but still significant. And, of course, this sample is just people who were convicted of possession, not all people who ever viewed such material, so I'd have to qualify my statement to "convicted consumers of child porn"
I agree with this post. Lots of people viewing images of child sexual abuse will go on to commit a contact offence (or already have committed a contact offence).
But there are plenty of people who won't go on to commit a contact offence.
This is a problem for law enforcement because they need to monitor all people who've viewed images of child sexual abuse in order to stop them committing contact offences, and this second group inflates the numbers. Maybe it's only 20% more, but maybe it more than doubles the numbers.
Some (many?) people do kill themselves, being unable to cope with the trauma of being sexually abused as a child. I don't think it's a useful approach to state that execution is "worse" and to consider the legality of the images by comparison.
I think we should view "simple possession" of society-harming materials as a health issue and not a criminal one; same as with drugs of all kinds. Pedophiles have a mental disorder and if we as a society believe in the ability of individuals to overcome their primal predilections and behave legally, then we need to help them do so and not punish their thoughtcrime. But it's difficult for most people to stomach the risk that an innocent person will be hurt because we didn't take drastic measures to eliminate holders of drastic thoughts.
I don't know the actual chance that a pedophile-in-thought will become a pedophile-in-action, but consider: if it was a 50% chance, would you be okay letting a demonstrated pedophile-in-thought run free, until they actually demonstrated their inability to restrain themselves? Knowing that at least one (and probably more) vulnerable persons would be irreparably harmed before the damage could be contained.
Are you seriously suggesting we should preemptively lock people up, before they even attempt or conspire to commit a crime? Surely there are less drastic actions — increased surveillance and offers of psychiatric help, for instance, just like there are special programs in high crime areas (which I think we can agree shouldn't include arbitrarily harassing and arresting people).
Reread, I made no such suggestion. In our black-and-white culture, though, we don't have much of a bin for "that's not illegal but it's societally risky so we're going to have to keep an eye on you". Which is why we've made "simple possession" illegal, so we don't have to manage that nuance. My point was just that making it legal isn't the right move either.
I'm not much more comfortable with your response. You seem to be saying "simple possession" is not really harmful, but we'll pretend that it is (and that's how the courts treat it — in fact, victims can recover damages from people in possession of their materials) in order to preemptively punish people who may actually do real harm.
I disagree. Where there is demand, there is profit to be made, and there will be supply. You really have to mitigate on both sides of the equation.
What I do think might merit a discussion is the definition of possession. If my hard drive caches an image served up by Bing, which was auto-suggested, I shouldn't be charged with possession.
I agree to an extent, but it's also easier to pressure people to give up their sources if possession is illegal. I have children, and I really can't accept other children being abused, so I'm willing to accept a little censorship if it can lead to fewer children being abused.
However, this is one of the few cases where I am okay with censorship, so I'm very open to proof that censorship of this content does not decrease prevalence of child abuse.
If the goal were really to stop non-consensual production, then we'd allow possession in exchange for letting law enforcement know where it comes from. People would register it, and then law enforcement would better be able to track down the source.
The ban on possession means that many people would rather destroy a hard drive than tell anybody. Coming forth as a witness is too risky. It seems our real goal is more along the lines of "out of sight, out of mind" and the demonization of people with curiosity or other disturbing urges.
>> so I'm very open to proof that censorship of this content does not decrease prevalence of child abuse.
Well, I guess people who molest children do it because they like molesting children - not because they want to produce CP. Producing CP is in addition to an illegal activity they already do anyway.
But yes, I'd like to see proof on whether the censorship is doing anything here, I just don't know how anyone could ever do any research into it.
Children have no choice in the matter, and this context will exist for the rest of their lives. It's pretty damaging.
It is interesting that photos of children shot to death are legal; it shows the really weird relationship humans have with sex, which is its own troublesome thing.
At the same time, the prohibition on the content may actually keep some people out of jail. I wonder how many famous people appear in photos of child abuse, who are powerful enough to avoid jail time. With the context being so controlled, juries simply may not recognize high ranking executives or Fortune 500 people in evidence at trials, and I wonder how much the control keeps abusers out of prisons.
>It still rubs me the wrong way that simple possession of child pornography is illegal.
Interesting to note that it's actually not, at least in the US. It's illegal to deliver or receive it across state lines. Also, child abuse is illegal. This is why parents can take baby photos without going to jail.
Federal law prohibits the production, distribution, reception, and possession of an image of child pornography using or affecting any means or facility of interstate or foreign commerce (See 18 U.S.C. § 2251; 18 U.S.C. § 2252; 18 U.S.C. § 2252A). Specifically, Section 2251 makes it illegal to persuade, induce, entice, or coerce a minor to engage in sexually explicit conduct for purposes of producing visual depictions of that conduct. Any individual who attempts or conspires to commit a child pornography offense is also subject to prosecution under federal law.
> using or affecting any means or facility of interstate or foreign commerce
That's a legal cliche that's basically necessary for the law to be constitutional. But what "interstate commerce" means in practice is not just products crossing state lines. An infamous Supreme Court decision ruled that growing your own wheat for personal consumption "affects interstate commerce", for example.
That's just because the Commerce Clause (and the Foreign Commerce Clause) puts limits to federal jurisdiction, so there has to be a nexus connecting the conduct to interstate or foreign commerce. At least in theory, because the Supreme Court considers almost everything to "affect" interstate commerce (see Gonzales v. Raich). But I'd be surprised if all the states didn't have similar offenses.
Who was the guy who found this and why the fuck was he searching "porn kids" and "omegle kids showing"? Like I get that it's an issue and all but this dude was looking up really sketchy shit that you wouldn't just stumble upon by accident.
They don't say exactly what the "anonymous tip" was. It might've just been "hey, child porn is easy to stumble across on Bing". If the filtering's this bad, maybe they searched "kids underwear" when trying to find some new undies for a toddler, and got a bit of a shock.
The specific keywords appear to be ones the researchers came up with; they were (intentionally) trying to find keywords that turn up this kind of material.
Just searching for "sex" with safe search turned off results in child pornography search suggestions. You don't have to be looking for illegal content for this to surface.