Original blog post: https://bsky.social/about/blog/07-10-2025-age-assurance
I was looking for more details on who is considered in-scope, e.g. if the determination is decided at account creation time or at content access time. Especially considering how the AT protocol is a decentralized protocol that is not exclusive to a single region, it's not obvious how the enforcement will work.
Enforcement occurs at the application level. You can think of applications as analogous to search engines. They're aggregating view layers (which is why they're called AppViews). They are largely where moderation or regulatory requirements are enforced.
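To make the architecture concrete, here is a minimal sketch of how an AppView might gate labeled adult content for viewers in a regulated region. All names here (`Viewer`, `PostView`, `applyRegionalGating`, the label strings) are illustrative assumptions, not Bluesky's actual API; the point is only that the decision lives in the aggregating view layer, not in the protocol or the PDSes.

```typescript
// Hypothetical sketch: records flow through PDSes and relays unchanged;
// only the AppView decides what a given viewer actually sees.

interface Viewer {
  region: string;        // e.g. "GB" for the UK
  ageVerified: boolean;  // has this account completed age assurance?
}

interface PostView {
  uri: string;
  labels: string[];      // moderation labels attached at the view layer
}

// Assumed for illustration: which regions require gating, and which
// labels count as adult content.
const GATED_REGIONS = new Set(["GB"]);
const ADULT_LABELS = new Set(["porn", "sexual", "nudity"]);

function applyRegionalGating(posts: PostView[], viewer: Viewer): PostView[] {
  // Viewers outside gated regions, or those who have verified their
  // age, see the feed unfiltered.
  if (!GATED_REGIONS.has(viewer.region) || viewer.ageVerified) {
    return posts;
  }
  // Otherwise, drop any post carrying an adult label.
  return posts.filter((p) => !p.labels.some((l) => ADULT_LABELS.has(l)));
}
```

Because this logic sits in the view layer, a different AppView over the same underlying data could apply different (or no) gating, which is what makes per-application, per-jurisdiction enforcement possible on a shared network.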
What alternative AppViews are there? Anyone know?
deer.social zeppelin.social
I poked around in the Bluesky repos [1] to figure out where the source code is for this feature, but didn't find it. Anyone know?
[1] https://github.com/bluesky-social
Here is some of it: https://github.com/bluesky-social/atproto/pull/4030
Thanks!
Another discussion: https://news.ycombinator.com/item?id=44524571
> the platform says it will let users verify their age by scanning their face
Seriously?
Multiple platforms have had that as an option. It's really odd as a concept to me. Does it only work for very obvious cases, with people around 18-30 asked for an ID anyway?
Error: 403 - Account holder's testicles have not sufficiently descended.
How do you determine that from a face scan?
From the first paragraph
> the platform says it will let users verify their age by scanning their face, uploading an ID, or entering a payment card.
Note the "or"
> reputation for cp and other forms of abuse
That "reputation" is slander from haters and those who don't want others to have nice things
Another example is JD Vance, who created an account, made his first post a trans-trolling one [1], and then crowed about how awful Bluesky was to him and how he got blocked. He's still posting as recently as July 4, but he's really just become mostly a social media troll, and Bluesky users generally block trolls instead of engaging with them.
[1] https://blebbit.app/at/did:plc:tkspefzlu72575kljanhe3uj/app.... (3rd of the 3 part post)
> CSAM content on Bluesky has risen ten times in just a week
https://news.ycombinator.com/item?id=41474672
Peculiar hill you choose to die on.
And then you can see my reply to that right at the top, so thanks, newly created anon account, for helping to add clarity to the situation
Bluesky is one of Roost's partners and is working to address this societal-level problem. There's nothing particularly unique to any one company when it comes to CSAM.
https://roost.tools/partnerships
The roost crew has been great. I’m happy to be working with them
It is a social media platform that allows image posts, and allowing adult content is by design. Draconian "internet safety" laws like this one should not become the baseline expectation for what you have to put your users through just to see nude photos on the internet.
I'm also not sure where that supposed reputation is coming from, because my experience is that they've been criticized for over-moderation, if anything...
I agree with most of what you say. Internet safety is only a cover for other intentions in such laws. But there is a legitimate need to protect kids from the dangers that lurk online. Things like nudity, drugs, violence, radicalism, vanity culture, fake news, etc. have a disproportionate and lasting impact on children (and even on adults in many cases, though adults at least have an incentive to avoid them, since they're held responsible for their own actions). And it's even worse than that, because there are those who specifically seek out and target vulnerable populations like children.
Parental supervision often isn't enough, because there are several circumstances where kids circumvent it, not least because an average kid is far more crafty than an average adult. And education isn't enough either, because it only teaches them what is bad. Kids are naturally prone to trying out what they know to be risky (which is why they have less legal liability than adults).
What would be a proper and sincere solution to that problem?
I am willing to give up all my civil liberties if it’s for the kids…
You do not have 'civil liberties' in the UK. There are no Miranda rights, and their bill of rights is strictly based on the Magna Carta; it is not as comprehensive as the US Bill of Rights, which was itself based on the UK version.
Per the wiki:
> Civil liberties have been gradually declining in the United Kingdom since the late 20th century. Their removal has been generally justified by appeals to public safety and National Security and hastened on by crises such as the September 11 attacks, the 7/7 bombings and the 2020 COVID-19 pandemic.[4][5][6] The pandemic oversaw the introduction of the Coronavirus Act 2020, which was described by former Justice of the Supreme Court Lord Sumption as "the greatest invasion of personal liberty in [the UK's] history."
> The pandemic oversaw the introduction of the Coronavirus Act 2020, which was described by former Justice of the Supreme Court Lord Sumption as "the greatest invasion of personal liberty in [the UK's] history."
It's these sorts of transgressions that make people hesitant about, or opposed to, lifesaving interventions like mass vaccination campaigns and pandemic-era mask mandates. If you give an inch in the name of safety, you'll end up permanently giving a mile in terms of freedom.
Of course it's in the UK. I'm surprised they don't have an internet licence yet but I'm sure it's coming.