The opposite is true. The new law makes things considerably riskier for large companies, because it is specifically designed to hold them to account for conduct on their platforms. The (perceived) risk for small websites is unintended, and the requirements are very achievable for small websites. The law is intended for, and will be used to, eviscerate Facebook etc. for their wrongs. We are far more likely to see Facebook etc. leave the UK market than we are to see any small websites suffer.
A small website operator can keep child pornography off their platform with ease. Facebook have a mountain to climb — regardless of their resources.
> A small website operator can keep child pornography off their platform with ease. Facebook have a mountain to climb — regardless of their resources.
Facebook can actually train AI to detect CSAM, and is probably already doing so in cooperation with NCMEC and similar organisations/authorities across the world.
Your average small website? No chance. Obtaining training material actively is seriously illegal everywhere, and keeping material that others upload is just as bad in most jurisdictions.
The big guys get the toys, the small guys have to worry all the goddamn time if some pedos are going to use their forum or whatnot.
But you shouldn't need to use file uploading services! File upload doesn't require additional services; it has been a well-understood part of HTTP for decades. You can do file upload using normal web form submission in your web server/CMS/Rails/Laravel/CGI program without paying a monthly subscription to some service at an exorbitant markup.
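To be concrete, here is a minimal sketch of a plain HTTP file upload in Python with Flask; the route name and upload directory are my own placeholders, not anything from a specific forum package:

```python
# Minimal sketch: plain multipart/form-data upload, no third-party upload
# service involved. UPLOAD_DIR and the /upload route are hypothetical.
import os

from flask import Flask, abort, request
from werkzeug.utils import secure_filename

app = Flask(__name__)
UPLOAD_DIR = "uploads"
os.makedirs(UPLOAD_DIR, exist_ok=True)

@app.route("/upload", methods=["POST"])
def upload():
    # Browsers have sent this for decades via
    # <form method="post" enctype="multipart/form-data">.
    f = request.files.get("file")
    if f is None or f.filename == "":
        abort(400, "No file submitted")
    # Never trust the client-supplied filename when writing to disk.
    f.save(os.path.join(UPLOAD_DIR, secure_filename(f.filename)))
    return "stored", 201
```

The HTML side is a single form element. No subscription required.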
Also, those filters are obviously imperfect. Remember the man who got his Google account terminated because he took a photo of his son's rash to send to his doctor? Pedo alert, pedo alert, a child is naked in a photo. My parents must be pedos too, they took a photo of me sitting in the bath when I was a toddler. Call the police.
Have you run a forum in, say, the last decade? The amount of spam bots constantly posting links to everything from scams to pills to guns is immense - and no, captchas don't solve it.
“We’ve heard concerns from some smaller services that the new rules will be too burdensome for them. Some of them believe they don’t have the resources to dedicate to assessing risk on their platforms, and to making sure they have measures in place to help them comply with the rules. As a result, some smaller services feel they might need to shut down completely.

So, we wanted to reassure those smaller services that this is unlikely to be the case.”
“If organisations have carried out a suitable and sufficient risk assessment and determined, with good reason, that the risks they face are low, they will only be expected to have basic but important measures to remove illegal content when they become aware of it. These include:
easy-to-find, understandable terms and conditions;
a complaints tool that allows users to report illegal or harmful material when they see it, backed up by a process to deal with those complaints;
the ability to review content and take it down quickly if they have reason to believe it is illegal; and
a specific individual responsible for compliance, who we can contact if we need to.”
>they will only be expected to have basic but important measures to remove illegal content when they become aware of it. These include:
>easy-to-find, understandable terms and conditions; a complaints tool that allows users to report illegal or harmful material when they see it, backed up by a process to deal with those complaints; the ability to review content and take it down quickly if they have reason to believe it is illegal; and a specific individual responsible for compliance, who we can contact if we need to.
All of those things are buttons to click, and they ship with every piece of forum software from the last decade. No forum can survive without moderation because of spam, so these tools and policies will already be in place on every website with user-generated content.
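To put that in perspective, a bare-bones version of the "complaints tool" is a few dozen lines. A sketch, assuming SQLite and Flask, with made-up table and route names (real forum packages already bundle the equivalent):

```python
# Toy sketch of a user-facing report/complaints tool. The reports table
# and /report route are invented for illustration, not production-grade.
import sqlite3

from flask import Flask, request

app = Flask(__name__)
db = sqlite3.connect("reports.db", check_same_thread=False)
db.execute(
    """CREATE TABLE IF NOT EXISTS reports
       (post_id INTEGER, reason TEXT, status TEXT DEFAULT 'open')"""
)

@app.route("/report", methods=["POST"])
def report():
    # A user flags a post with a reason; moderators work through the
    # open rows, review the content, and take it down if it is illegal.
    db.execute(
        "INSERT INTO reports (post_id, reason) VALUES (?, ?)",
        (request.form["post_id"], request.form["reason"]),
    )
    db.commit()
    return "Report received", 201
```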
"We pinky swear to totes not enforce the law as written [unless and until we decide, with no notice or warning, to do so] up to and including criminal penalties". Not as reassuring as you claim it to be.
Exactly - the liability risk is huge, and relying on them not enforcing the law against small sites, just because they say they are 'unlikely' to, is not a risk any sane person would take.
That's not what they are saying. What they are saying is that the law as written doesn't require the things that many small sites have been saying will be too expensive to comply with. The law as written only requires those things for large sites and sites with an elevated risk of certain harms. For most small sites, any required changes will just be minor tweaks to things they are already doing.
We don’t need to trust what they say, we just need to engage in a little critical thinking. What’s the benefit for Ofcom in pursuing tiny websites? There’s no political benefit, no financial benefit… the guidance from Ofcom reaffirms the natural conclusion.
There is no political benefit to imposing liability on any online forum operator for content posted by others?
Governments can abuse their power to silence speech they don't like. Governments can use agitators to develop pretext for legal action. Governments can make examples out of small-time defendants to send chilling effects. Governments can have prosecutors who may not be evil, but merely overzealous and harmful.
In the end, it is about a default to freedom of speech and content online (short of objectively illegal content) or a default to self-censorship and closing the gates on open forums.
Sorry, but that's foolish beyond belief. The law allows and probably mandates them to do so. You can pretend that's not what the law says but it clearly does. And it was written with intent and advice, so that's what the writers intended as well.
But if it's so simple, volunteer. Take on the criminal penalties yourself and perform the reviews.
I'll remind you of two things which a lot of people often forget with hobbies/volunteering, and which may make this argument moot for you. First, just because someone gives time for free doesn't mean that time costs them nothing, or that it can easily be increased without significantly impacting the giver. Second, some parts of a hobby can be work that is required for the fun part of the hobby, and changing the ratio of fun:work can kill any motivation for the hobby.
To your point: even in your extract from the link, there are compliance costs.
>So, we wanted to reassure those smaller services that this is *unlikely* to be the case
Your source admits there are extra costs that will likely cause some small services to have to shut down if the costs are too burdensome for them; they are just saying that they hope the costs are small enough that it doesn't put most small services in that position.
Even in your quote, it explicitly lists extra costs, e.g. the cost of a compliant complaints tool. Obviously the government isn't going to implement it or spend the time moderating reports or abuse of reports. Which means the cost of extra hours moderating and setting it up falls on the service provider.
"Must have an individual responsible for compliance". So either employ someone to take this risk or take on the risk and responsibility yourself and the associated due diligence costs (lawyers in the UK are only free if you're already losing hours of your life to the court system).
These costs will definitely push some people over the line to not wanting to host such services. Especially when the wording is so wide that you need to moderate out insults in your forum.
Jesus Christ! Your comment would probably be flagged as foreign propaganda to soft-pedal broken UK policies - that is, if the US had such rules. My comment should be flagged because that could be an insulting insinuation, or the expletive at the start of this paragraph could be stirring up religious hatred by being needlessly blasphemous. And a moderator has to read the entire post to get to the non-compliant part.
Many of the provisions of the act apply to all user-to-user services, not just Schedule 1 and Schedule 2 services.
For example, the site must have an "illegal content risk assessment" and a "children’s risk assessment". And the children's risk assessment is a four-dimensional matrix of age groups, types of content, ways of using the service and types of harm. And it's got to be updated before making any "significant" change to any aspect of a service’s design or operation. It also makes it mandatory to have terms of service, and to apply them consistently. The site must have a content reporting procedure, a complaints procedure, and maintain written records.
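For a sense of scale, here's a toy sketch of that matrix in Python; the category values are illustrative guesses, not the statutory lists:

```python
# Toy model of a children's risk assessment matrix. Every combination of
# the four dimensions needs its own evidenced rating; the values below
# are illustrative, not Ofcom's actual categories.
from itertools import product

age_groups = ["0-5", "6-12", "13-17"]
content_types = ["text posts", "images", "links"]
usage_modes = ["browsing", "posting", "private messaging"]
harm_types = ["illegal content", "bullying", "grooming"]

risk_matrix = {
    cell: "unassessed"  # each cell must be rated and kept up to date
    for cell in product(age_groups, content_types, usage_modes, harm_types)
}
print(len(risk_matrix))  # 81 cells, even in this tiny toy version
```

And per the act, the whole thing has to be redone before any "significant" change to the service.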
Now obviously the operator of a bicycling forum might say "eh, let's ignore all that, they probably don't mean us".
But if you read the law and interpret its words literally, a bicycling forum is a user-to-user service, and a public forum is almost certain to be read by children from time to time.