graemep 4 days ago

The problem with focusing on porn behind age verification as the main effect is that it ignores all the other effects: community forums and wikis closing, uncertainty about blog comments.

It is actually (as noted in many previous discussions about the Online Safety Act) pushing people towards big tech platforms, because they can no longer afford the compliance cost and risk of running their own.

ekianjo 4 days ago | parent | next [-]

> pushing people to using big tech platforms

so big tech platforms will cheerfully embrace it. as expected, major players love regulations.

miohtama 4 days ago | parent [-]

GDPR killed small and medium online advertising businesses and handed everything to Google and Facebook.

freeone3000 4 days ago | parent | next [-]

It’s a shame it wasn’t able to get them all.

DaSHacka 4 days ago | parent [-]

I agree, an adtech monopoly is surely much better for society

AngryData 4 days ago | parent [-]

I think they were saying that all advertising is crap and likely 99% of it shouldn't exist.

the_other 4 days ago | parent | prev [-]

Frankly, that's their fault for pursuing individually targeted advertising. The sad thing isn't that some small shitty businesses lost out, it's that some large shitty businesses didn't.

Spivak 4 days ago | parent [-]

And since individually targeted ads perform better and are less expensive, this pushes everyone to big US tech platforms.

This isn't small advertisers' fault; the law signed their death warrant. It made the local grocery stores more expensive and worse quality but left Walmart untouched. No one could have predicted what would happen.

hnlmorg 4 days ago | parent | prev | next [-]

Those sorts of sites already had better moderation than big tech because they had their own smaller teams of volunteer moderators.

I suspect any smaller site that claims the Online Safety Act was a reason it closed actually needed to close due to other complications. For example, an art site that features occasional (or more) artistic nudes: stuff that normal people wouldn't consider mature content, but that the site maintainers wouldn't want to take the risk on.

Either way, whether I'm right or wrong here, I still think the Online Safety Act is a grotesque piece of legislation.

graemep 4 days ago | parent [-]

I think the impact is a lot worse than that. There are still compliance costs, especially for volunteer-run sites. Ofcom says these are negligible because they are unlikely to be more than "a few thousand pounds". Then there are the risks if something goes wrong and you have not incorporated.

HN has already discussed things like the cycling forum that shut down. lobste.rs considered blocking UK IPs. I was considering setting up a forum to replace/complement FB groups I help admin (home education related). This is enough to put me off, as I do not want the hassle and risk of dealing with it.

I think what you are missing is that this does not just cover things like porn videos and photos. That is what has been emphasised by the media, but it covers a lot of harmful content: https://www.legislation.gov.uk/ukpga/2023/50/section/62

It took a fair amount of legal analysis to establish that blog comments are OK (and it's not clear whether off-topic ones are). Links to that and other things here: https://www.theregister.com/2025/02/06/uk_online_safety_act_...

hnlmorg 4 days ago | parent | next [-]

Some good points. And I do agree with your general opinion of this law, albeit not all of the specific points you've made:

> I think the impact is a lot worse than that. There are still compliance costs, especially for volunteer-run sites. Ofcom says these are negligible because they are unlikely to be more than "a few thousand pounds". Then there are the risks if something goes wrong and you have not incorporated.

What are these "compliance costs"? There are no forms that need to be completed. Sites don't have to register themselves. For smaller sites, the cost is just what I described: the time and effort of volunteer moderators who already moderate the site. If they're already removing adult content, then there's no extra work for them.

> HN has already discussed things like the cycling forum that shut down. lobste.rs considered blocking UK IPs. I was considering setting up a forum to replace/complement FB groups I help admin (home education related). This is enough to put me off, as I do not want the hassle and risk of dealing with it.

None of this proves your point though. It just proves that some sites are worried about potential overreach. It's an understandable concern, but it's a different problem to the one the GP was describing, in that it doesn't actually make it any harder for smaller forums in any tangible way. Unless you count being "spooked" as a tangible cost (I do not).

> I think what you are missing is that this does not just cover things like porn videos and photos.

I didn't miss that. But you're right to raise that nonetheless.

There's definitely a grey area that is going to concern a lot of people, but no site is going to be punished for mild or occasional "breaches". What the government are trying to police is the stuff that's clearly inappropriate for under-18s. The UK (and the EU in general) tends to pass laws that are a little vague in definition and trust the police and courts to uphold "the spirit of the law", a little like how US law can be defined by past cases and their judgments. This ambiguity will scare American sites because it's not how American law works. But the UK system does _generally_ work well. We do have instances where such laws are abused, but they're infrequent enough that they make national news and subsequently get dropped because of the embarrassment they bring to the department involved.

That all said, I'm really not trying to defend this particular law. The Online Safety Act is definitely a _bad_ law and I don't personally know of anyone in the UK (outside of politicians) who actually agrees with it.

andybak 4 days ago | parent [-]

> There's no forms that need to be completed.

One of us has completely misunderstood the legislation.

By my reading - there's a ton of red tape and paperwork. Heck, there's a ton of work even getting to the point of understanding what work you need to do. And dismissing the fear of life-changing financial liability as "being spooked" is not helpful.

I've got an open-source 3D sharing site almost ready to launch and I'm considering geo-blocking the UK. And I live in the UK.
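
For illustration, here is a minimal sketch of what geo-blocking UK visitors could look like at the application layer, assuming a CDN or reverse proxy in front of the site sets a trusted country header (Cloudflare's CF-IPCountry is used here; a self-hosted setup would typically consult a GeoIP database instead). The Flask app and handler are placeholders, not anything taken from the legislation or from a specific site:

    # Sketch: refuse requests that an upstream proxy/CDN has tagged as UK traffic.
    from flask import Flask, request

    app = Flask(__name__)
    BLOCKED_COUNTRIES = {"GB"}  # ISO 3166-1 alpha-2 code for the United Kingdom

    @app.before_request
    def geoblock():
        # Assumes the proxy sets a country header we can trust (e.g. CF-IPCountry).
        country = request.headers.get("CF-IPCountry", "").upper()
        if country in BLOCKED_COUNTRIES:
            # 451 Unavailable For Legal Reasons
            return "This service is not available in your region.", 451

Whether to block in the application like this or earlier at the CDN/firewall is a separate decision; this only shows the basic shape.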

hnlmorg 4 days ago | parent [-]

> By my reading - there's a ton of red tape and paperwork.

It might help if you referenced the section that defines those requirements.

I don't recall seeing anything that required such red tape unless there were special circumstances after the fact (for example, reporting child porn that was uploaded to your site, or responding to a police or court order).

But these kinds of rules exist for freedom of information and the like too.

Maybe I’ve missed something though?

> Heck, there's a ton of work even getting to the point of understanding what work you need to do.

That is a fair point.

Unfortunately it's also not novel to this legislation. Running any site that allows public contributions opens oneself up to lots of different laws from lots of different countries. In some countries in the EU, Nazi content is illegal. Different countries have different rules around copyright. Then there are laws around data protection, consent, and so on and so forth.

This law certainly doesn't make things any easier, but there has been a requirement to understand this stuff for decades already. So it's a bit of a stretch to say this one new law suddenly makes the burden of running a site insurmountable.

However I do agree with your more general point that it’s getting very hard to navigate all of these local laws at scale.

> And dismissing the fear of life-changing financial liability as "being spooked" is not helpful.

It’s an unfounded fear though, so my language is fair. You’d use the same language about any other unfounded fear.

This is the crux of the point. People are scared, and I get why. But it’s completely unfounded. If people still want to discriminate against UK IPs then that’s their choice as they have to weigh up the risks as they perceive them. But it doesn’t mean it’s any likelier to happen than, for example, being in a plane crash (to cite another fear people overcome daily).

———

That all said, maybe everyone blocking UK IPs could be a good thing. If everyone shows they don’t consider it safe to operate in the UK then our government might consider revoking this stupid law.

mytailorisrich 3 days ago | parent [-]

In principle you need to have records of your risk assessment(s) and you need to have good T&Cs (like for GDPR, etc). But that's about it for red tape.

Templates have popped up to help with both.

mytailorisrich 4 days ago | parent | prev [-]

Compliance costs amount to a couple of hours at most to do and record the assessments, plus the effort and discipline to moderate and react quickly to reports. That's it.

A lot of misplaced fear and over-reaction. For instance, lobste.rs could basically ignore the whole thing safely, being a small, low-risk forum based in the US.

> It took a fair amount of legal analysis to establish blog comments are OK (and its not clear whether off topic ones are)

It looks like it only took someone to actually read the Online Safety Act, as Ofcom's reply kindly points to the section that quite explicitly answers the question.

I don't think that the Online Safety Act is a good development but many of the reactions are over the top or FUD, frankly...

wizzwizz4 4 days ago | parent | prev | next [-]

If you have examples of this happening, please add them to the ORG list: https://www.blocked.org.uk/osa-blocks

bogdan 4 days ago | parent | next [-]

Ironically this is blocked at my workplace.

IshKebab 4 days ago | parent | prev [-]

I clicked on loads of those and only a minority of them are actually blocked for me. E.g. it lists lobste.rs as "Shutting down due to OSA" but it clearly isn't.

mytailorisrich 4 days ago | parent | prev [-]

I am very skeptical that the Online Safety Act forces community forums and wikis to close. By and large the Act forces forums to have strong moderation and perhaps manual checks before publishing files and pictures uploaded by users, and that's about it.
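
As a rough sketch of what "manual checks before publishing" could mean in practice: user uploads land in a pending queue and only become visible once a moderator approves them. The names below (Upload, visible_uploads, review) are illustrative placeholders, not taken from any particular forum software:

    from dataclasses import dataclass, field
    from datetime import datetime, timezone

    PENDING, APPROVED, REJECTED = "pending", "approved", "rejected"

    @dataclass
    class Upload:
        user: str
        filename: str
        status: str = PENDING
        submitted_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

    def visible_uploads(uploads):
        # Only approved uploads are ever shown to other users.
        return [u for u in uploads if u.status == APPROVED]

    def review(upload, approve):
        # A moderator decision moves the upload out of the pending queue.
        upload.status = APPROVED if approve else REJECTED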

Likewise, I suspect that most geoblocks are out of misplaced fear, not actual analysis.

ijk 4 days ago | parent | next [-]

It has caused many community forums to close, past tense.

Many cited the uncertainty about what is actually required, the potential high cost of compliance, the danger of failing to correctly follow the rules they're not certain about, and the lack of governmental clarity as significant aspects of their decision to close.

The fear may be misplaced, but the UK government has failed to convince people of that.

mytailorisrich 3 days ago | parent | next [-]

I don't think it is so much a failure of the government to communicate as it is the vocal opposition to this law, which has focussed on and amplified the maximum penalties and caused fear.

Now, I don't think this is a positive law but it is not armageddon, either, and objectively many reactions do seem overblown. Time will tell.

hnlmorg 4 days ago | parent | prev [-]

It was misplaced, but the UK government has a long history of incompetence when it comes to legislation regarding the use of technology. So I cannot blame people for erring on the side of caution.

I mean, it’s not like this particular piece of legislation isn’t stupid to begin with. So I cannot blame people for assuming the worst.

freeone3000 4 days ago | parent | prev [-]

“Strong moderation” and “manual checks” and pro-active age verification are exactly the burdens that would prevent someone from running a small community forum.

mytailorisrich 4 days ago | parent [-]

You do not need age verification in the vast majority of cases.

Moderation is part and parcel of running forums, and all platforms and software provide tools for this; it's nothing new. If someone is not prepared to read submissions or to react quickly when one is flagged, then perhaps running a forum is too much of a commitment for them, but I would not blame the law.

In fact I believe that forum operators in the UK already got in legal trouble in the past, long before the Online Safety Act, because they ignored flagging reports.

pjc50 4 days ago | parent [-]

Right, so 24 hour coverage is already too expensive for most forums and your position is that small forums should not exist.

mytailorisrich 4 days ago | parent [-]

Small forums should and can exist. They are not required to have 24 hour coverage.

pjc50 4 days ago | parent [-]

You said above "If someone is not prepared to read submissions or to react quickly when one is flagged".

Does the Act specify "quickly"? Does several hours count as "quickly"?

hnlmorg 4 days ago | parent | next [-]

Not the OP but I don’t recall the actual law saying 24/7 moderation was required.

Given the UK already has a "watershed", whereby terrestrial TV can only broadcast mature content after a set time (9pm), I cannot see why there shouldn't be a similar expectation that moderation won't necessarily happen outside of reasonable hours.

The typical approach with laws in the UK (and the EU too) is to use more generalised language, allowing the law a little more flexibility to apply correctly in more nuanced circumstances, such as what is practical for a small forum to achieve when its specialty has nothing to do with adult content.

You'll definitely find examples where such laws are abused from time to time. But they're uncommon enough that they make national news and create an uproar, and the case then goes nowhere due to the political embarrassment the department draws to itself.

Though to be clear, I’m not defending this particular law. It’s stupid and shouldn’t exist.

mytailorisrich 4 days ago | parent | prev [-]

Reacting quickly means doing whatever is proportionate and reasonable. This is quite standard wording for a law.

The Act (section 10 about illegal content) says that "In determining what is proportionate for the purposes of this section, the following factors, in particular, are relevant—

(a) all the findings of the most recent illegal content risk assessment

(b) the size and capacity of the provider of a service."

"24 hour coverage" is the maximum that can be achieved so it's not going to be proportionate in many, if not most, cases. People have to ask themselves if it is proportionate for a one-man gardening forum to react within 5 minutes at 3am, and the answer is not going to be "yes".

Obviously you can also automatically hide a flagged submission until it is reviewed, or have keyword-based checks, etc. I believe these are common functionalities and they will likely develop further (and yes, a consequence might be to push more people towards big platforms).
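
For illustration, a minimal sketch of those two mitigations: a flagged submission is hidden immediately pending review, and new posts are screened against a keyword list before publication. The blocklist contents and the Post fields are assumed placeholders:

    from dataclasses import dataclass

    BLOCKLIST = {"example-banned-term"}  # placeholder; real lists are curated

    @dataclass
    class Post:
        author: str
        body: str
        hidden: bool = False
        flagged: bool = False

    def flag(post):
        # A user report hides the post right away rather than waiting for a moderator.
        post.flagged = True
        post.hidden = True

    def passes_keyword_check(post):
        # Crude pre-publication screen; anything caught goes to manual review instead.
        words = set(post.body.lower().split())
        return not (words & BLOCKLIST)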

People need to make a calm analysis, not give in to hysteria or politically-induced obtuseness, whatever one might think of this Act. If they are small and not in the UK they can probably ignore it completely in any case.