UK Expands Online Safety Act to Mandate Preemptive Scanning
(reclaimthenet.org)
54 points by aftergibson 4 hours ago
Well, yes, because it is designed to protect UK citizens. As much as GDPR applies "everywhere in the world" when interacting with EU citizens.
Just as much as my communications are scanned under PRISM when interacting with US citizens. I'd argue that is exponentially more dangerous and nefarious given its apparent illegality and (former) top secrecy.
> The UK Department for Science, Innovation and Technology (DSIT)
It should be called the Ministry of Truth at this point.
> Unwanted
How do you know if a nude is unwanted? The premise itself makes no sense. The only way this could potentially work is if you had the whole context of the relationship somehow embedded in the messages and then deciphered the intent behind them. Even then, what about sarcasm or double entendre?
>How do you know if a nude is unwanted? The premise itself makes no sense
If the app has sufficient permissions to infer user demographics, a sufficiently jaded person should be able to come up with a set of rules that gets you a 99% solution pretty easily.
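For illustration only, a rough sketch of the kind of blunt rules meant here, assuming (hypothetically) the app can see a little conversation metadata; the names are made up and nothing here claims real-world accuracy:

    def likely_unwanted_nude(image_is_explicit,
                             messages_from_recipient_to_sender,
                             recipient_blocked_sender_before):
        """Crude recipient-side guess at whether an explicit image is unsolicited."""
        if not image_is_explicit:
            return False
        # No engagement from the recipient, or a previous block: call it unsolicited.
        if messages_from_recipient_to_sender == 0:
            return True
        return recipient_blocked_sender_before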
In the future, phones will refuse to take pictures of dicks unless men register their height and income levels so that useful and relevant information can be added to the image metadata.
Perhaps there should be an "Allow X" setting that has to be set per contact. By default it would disallow nudes.
I think something like this already exists, by the way: potentially pornographic images are screened and you have to explicitly confirm before viewing them.
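A minimal sketch of how such a per-contact, default-deny setting could look on the client, using hypothetical names rather than any platform's actual API:

    from dataclasses import dataclass

    @dataclass
    class ContactSettings:
        # Default-deny: explicit images from this contact are screened unless allowed.
        allow_nudes: bool = False

    def should_screen(image_flagged_explicit: bool, contact: ContactSettings) -> bool:
        """Recipient-side check: blur/hold a flagged image unless this contact is allowed."""
        return image_flagged_explicit and not contact.allow_nudes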
So wait - would this be something like: you try to send a dick pic via WhateverMessenger, the content is scanned first, and you're presented with a message along the lines of "This message cannot be sent as it violates our T&Cs"?
Don't buy into that framing either. Optional scanning already exists: if a user wants it, they are free to download government spyware onto their phone or computer and do all the scanning they want, local or otherwise. No new laws needed.
Externally. When is anything ever scanned internally.
More likely it would just silently not be sent, and potentially a week later you get a visit from the cops. Censors hate drawing attention to their actions; that is why you never see a "this message was censored on government request" notice as sender or recipient.
This is where someone conflates it with anti-spam and acts confused, because showing such a notice for every spam message would make a service unusable. As if spam were equivalent, as if users could not be given the choice to opt in or out of however much anti-spam and other filtering they want as recipients, and as if "this was censored" messages could not be collapsed and shown per category, e.g. "Messages blocked: 12 spam, 4 unwanted sexual content, 5 misinformation/lacking context, 7 hate/harmful content" (a rough sketch at the end of this comment). As a rule, when someone raises an objection that can be resolved with less than 60 seconds of thought, they are not being genuine.
But more importantly, it would make it illegal to provide any kind of messaging software without government approval, which is only given by letting government-designated censorship and surveillance services act as middlemen. And then the law can be more or less strictly applied, depending on how much the government dislikes the general sentiment spread on your network, regardless of its legality, thus controlling discourse.
I am not speculating here - this is what the UK government has admitted they want:
First, we are told, the relevant secretary of state (Michelle Donelan) expressed “concern” that the legislation might whack sites such as Amazon instead of Pornhub. In response, officials explained that the regulation in question was “not primarily aimed at … the protection of children”, but was about regulating “services that have a significant influence over public discourse”, a phrase that rather gives away the political thinking behind the act. - https://archive.md/2025.08.13-190800/https://www.thetimes.co...
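Re the collapsed per-category notice mentioned above: a minimal sketch, assuming blocked messages are simply tagged with a category string client-side (hypothetical structure, not any platform's real API):

    from collections import Counter

    def blocked_summary(blocked_categories):
        """Collapse individual block notices into a single per-category summary line."""
        counts = Counter(blocked_categories)
        return "Messages blocked: " + ", ".join(f"{n} {cat}" for cat, n in counts.items())

    # blocked_summary(["spam"] * 12 + ["unwanted sexual content"] * 4)
    # -> "Messages blocked: 12 spam, 4 unwanted sexual content"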
> To meet the law’s demands, companies are expected to rely heavily on automated scanning systems, content detection algorithms, and artificial intelligence models trained to evaluate the legality of text, images, and videos in real time.
This means either devices need to evolve to do this locally, or the items need to be sent unencrypted to external service providers, usually based outside the UK, for scanning.
I also assume this means the government here in the UK is okay with all the WhatsApp messages it sends being shipped to an LLM outside the UK to be scanned for legality?
I understand the rage generated here, but what is the alternative?
If a service implements privacy invading 'features' then we have the choice not to use that service. Letting tech companies self-regulate has failed, and too many people leave morality at the door when engaging online, something which doesn't happen at scale IRL.
What are we to do if not monitor? And how to make that scalable if not to introduce automation?
I don't know what the alternative is, but I don't think I've ever found a situation yet where the solution has been His Majesty's Government being able to exercise more control over what people can see and hear.
> but what is the alternative
If an app can be installed on someone's hardware without their intervention, launch that hardware into the air and use it for target practice. If a website requires some crypto-crap to verify objects were scanned, then upload to smaller platforms and let others link to the objects from the big platform. The big platforms can play whack-a-mole removing links; it's a fun game. The smaller sites can give the crawler alternate images. Better yet, just use small, semi-private, self-hosted platforms. Even better, ensure those platforms are only accessible via .onion domains, requiring a Tor-enabled browser. People can then make sites that proxy/cache objects from Tor onion sites onto easier-to-access sites.
> Letting tech companies self-regulate has failed, and too many people leave morality at the door when engaging online, something which doesn't happen at scale IRL.
I completely agree with this point.
We also have some tech companies (X) openly hostile to the UK Government. At what point does a sovereign country say "you're harming the people here and refusing to change, so you're blocked"?
>too many people leave morality at the door
Yep, that's life, if something bothers you and it's already a crime then report it.
There is precious little in life that can be undertaken without some risk of something unwanted however small (hah).
> Yep, that's life, if something bothers you and it's already a crime then report it.
I think that's the issue with this, and why we are seeing new laws introduced.
If someone is assaulted in real life, the police can intervene.
If people are constantly assaulted at a premises, that premises can lose its licence (for example, a pub or bar).
When moving to the online space, you are now potentially in contact with billions of people, and 'assaults' can be automated. You could send a dick pic to every woman on a platform for example.
At this point, normal policing, and normal 'crime', goes out of the window and becomes entirely unenforceable.
Hence we have these laws pushing this on to the platforms - you can't operate a platform that does not tackle abuse. And for the most part, most platforms know this and have tried to police this themselves, probably because they saw themselves more like 'pubs' in real life where people would interact in mostly good faith.
We've entered an age of bad faith by default: every interaction is now framed as 'free speech', but the speakers never face the consequences. I have a feeling that's how the US has ended up with its current administration.
And now the tech platforms are sprinting towards the line of free speech absolutism and removing protections.
And now countries have to implement laws to solve issues that should have just been platform policy enforcement.
There is for the governments: control of information and all that.
I don't think the fine is automatic like that; it's more that you get fined if you don't have an appropriate mechanism to manage it. In other words, you need a content policy that is enforced.
A mod who deletes nude pictures is probably enough to not get fined.
I think the real issue is what I just said: "probably enough". That's the real problem with the Online Safety Act. People are mostly illiterate on the law, and asking them to understand a complex law and implement it (even when the actual implementation is little or no effort for well-run spaces) is the real issue.
> What are we to do if not monitor?
Simple: you can choose to only use platforms that apply the most stringent scanning technologies to you and your family.
You give the UK government (or the equivalent that applies to you) the right to continuously scan everything from pictures to emails to messages, and then obviously you give them the right to prosecute you and come after you when one of their AI algorithms mistakenly detects child porn on your device or in your messages, just like this guy: https://www.theguardian.com/technology/2022/aug/22/google-cs...
For the rest of us, we should be free to opt out from being surveilled by machines 24/7.
Then everyone is happy.
Personally, I think this is the answer too - rather than mandating it across all platforms, they could have created a service which provides scanning so that there was an additional app people could choose to install (and would, presumably, present as an accessibility addon so it could access content in other apps).
That's not without its own issues though - creating external deps is more or less what they did the first time they tried to mandate age verification.
Although their plans fell through, they created an industry that expected a captive market and started lobbying heavily. Eventually, it worked and we've ended up with mandatory age verification.
> but what is the alternative?
We already have alternatives, this legislation is taking them away. If I want heavily censored discourse, I can go to reddit. If I want the wild west, I can go to 4chan. If I want privacy, I can use signal. And lots of services on different parts of that spectrum, or where different things are allowed.
But the UK government wants to eliminate that choice and decide for me. And most importantly, they don't want to call it censorship, but "safety". To keep women and girls "safe" (but nobody is allowed to opt out, even if they're not a woman or girl, or don't want this "safety")
How's that lawsuit with 4chan going, Ofcom? Last I checked (just now), the site is still online.
Time to move my colocated servers out of the UK.
Most of these comments, I think, are off the mark. For some reason anything to do with the EU or the UK legislating to protect the citizenry is seen as some Orwellian conspiracy to mind-control people. I agree some of the policies feel like always reaching for a hammer - but I strongly suspect that's because the tech industry is clearly not playing ball.
Children being sent dick pics, or AI generated nudes of them being sent around schools, etc. are real problems facing real normal people.
I think people here need to be reminded that the average person doesn't care about technology. They will be happy for their phones to automatically block nude pictures by government rule if the tech companies do not improve their social safety measures. This is the double-edged sword: these same people are not tech-savvy enough to lock down their children's phones; they expect them to be safe, they paid money for them to be "safe", and even if you lock a phone down, it doesn't stop their classmates sending them AI porn of other classmates.
Musk is living proof that a non-zero number of these giant tech companies are happy for child porn ("fake" or not) to be posted on their platforms. If I were in his shoes, it would be pretty high up on my list to make sure Grok isn't posting pornography. It's not hard to be a good person.
The things you mention are already illegal. The effective proven solution is to enforce existing laws, to punish and deter bad behaviour like any other crime.
This incongruence is why a lot of people don't take the reasoning at face value and see it as only rhetorical justification for increased surveillance, which is widely understood as something the state wants to do anyway.
I posted a reply here https://news.ycombinator.com/item?id=46599842 that addresses why I think "this is already a crime" doesn't go far enough, and why these laws are being introduced.
Adobe isn't the creator of child porn when Photoshop is used by a child pornographer.
So why are you considering xAI the creator when it's just the tool being used?
The human child pornographer using tools is the one who's creating it, not the tools.
The Sex Pistols are more relevant than ever.
God save the Queen
The fascist regime
It made you a moron
Potential H-bomb
God save the Queen
She ain't no human being
There is no future
In England's dreaming
Don't be told what you want
And don't be told what you need
There's no future, no future
No future for you
Nothing any government in my lifetime has done has arrested this feeling of decay, decline and desperation. It's like the occupational political class has a miserable vendetta and must inflict it upon the population. But I'm not actually miserable like you, I don't want to feel like you; we invented liberty in this country, now fuck off, the lot of you, thank you.
Tech industry walked right into this one, well done Musk.
UK government publicly making a fool of itself is probably not counter to the interests of Elon Musk at all... His political faction have been keen to insult the British government whenever possible. The more absurd their public enemies act, the more reasonable they look in comparison.
Musk is implicitly allowing child pornography on his platform. There's no way around that. Apple/Google should have removed X a while ago.
Come on now. That's obviously not true. CSAM is absolutely banned on twitter, and all other American platforms.
> Musk is implicitly allowing child pornography on his platform.
That is blatantly false and you know it. Musk has lot to answer for but we don't need to start making up imaginary crimes here.
> Apple/Google should have removed X a while ago.
Those who ask willy-nilly for censorship always end up being surprised when the system comes after them in the end, as it always does.
If tomorrow Apple and Google ban an app that you like, will you still agree that censorship is ok?
Okay, everyone here is talking about dick pics, but let's be clear: the goal is
>A major expansion of the UK’s Online Safety Act (OSA) has taken effect, legally obliging digital platforms to deploy surveillance-style systems that scan, detect, and block user content before it can be seen.
Do we really believe that no government, ever, is going to use this to prevent certain "misinformation" from circulating?
And by misinformation we mean things like MPs breaking COVID lockdown rules, or "problematic" information about the PM being involved in a scandal - the list is endless.
Let's be clear: this isn't, and never has been, about dick pics. This is 100% about being able to control what you can see and share.
I don't understand the downvotes that you are getting.
There is a clear intent to muzzle the population going on in Europe, with this new legislation and then with Chat Control. Those who can't see that need to remove the blinders they have on.
First it's nudes, then it's something else. Once there is a capability to filter what can be shared between two adults in a private message, can anyone say that any government is not going to come back for more and ask for more and more things to be removed or censored?
How will it know if the dick pic is wanted or unwanted?
Who knows, I haven't searched for unbiased data to be honest
But they can be more judicious with whom they share contact details, and use the block button. They are not forced to be the recipient of any message.
Do you think the only solution is for a government backdoor?
No, I don't. But does the government think the only solution is a backdoor? Yes.
They are the ones in power, not you & I.
Wow nobody saw this coming /s
They whipped up a mini pandemic of people being subjected to an onslaught of unwanted dick pics (never once mentioning the "block" feature on every single platform) to justify it.
This is the Ministry of Truth building up their toolset
> A major expansion of the UK’s Online Safety Act (OSA) has taken effect, legally obliging digital platforms to deploy surveillance-style systems that scan, detect, and block user content before it can be seen.
If this is implemented as it reads, just a note to everyone else, everywhere in the world:
For this policy to work, everything must be scanned. So now, every time you communicate with someone in the UK, your communications are no longer private.