| ▲ | NeutralCrane 16 hours ago |
| I live in an area that has been declared among the safest in America. Two months ago a 17-year-old girl from our city disappeared. It turns out she had been groomed for a year over Discord and in Roblox by a 39-year-old from the next state over. He eventually convinced her to let him pick her up, after which he filmed himself having sex with her, killed her, and then dismembered her body. He had apparently been grooming other underage girls in a similar way as well. The digital age brings with it novel forms of danger for children, and for adults for that matter, and there is absolutely no way to effectively address these risks without some amount of reduction in privacy. And before someone inevitably says “where were the parents?” and washes their hands of the situation: a healthy society should care for and protect all children, especially those whose parents do not. It’s one thing to hold the opinion “I am willing to sacrifice some number of lives in order to preserve privacy.” That is an honest and potentially justifiable opinion someone may hold. But declaring the situation to simply be a facade to harvest people’s data seems to me like a reflexive response to avoid uncomfortable truths regarding the situation. |
|
| ▲ | chinabot 13 hours ago | parent | next [-] |
| If the government knew the name, address, and phone number of every single user on the internet, and what they had for breakfast, it would not stop monsters like this, or even slow them down. |
|
| ▲ | fc417fc802 12 hours ago | parent | prev | next [-] |
| There will always be weird tail risks. The law should only get involved where there are widespread systemic problems. People are occasionally hospitalized because they, their family, or their friends handled food improperly. That doesn't warrant legal intervention, whereas dining establishments do. > before someone inevitably says “where were the parents?” and washes their hands of the situation Nope, that's exactly what I say. The law cannot reasonably replace responsible parenting if society is to remain a pleasant place to live. |
| |
| ▲ | defrost 11 hours ago | parent [-] | | I live in extremely fire-prone areas. Many of us are pretty damn okay at beating back the flames and controlling the flow of the worst of things away from homes, but nobody is perfect. We don't expect every family and parent in these areas to have firefighting skills; self-evacuation is recommended. Parents everywhere now find themselves surrounded by the deliberately laid gasoline of addictive social media, grooming risks, et al., and it's infeasible to expect every parent to be skilled in defensive cyber security. It's reasonable to expect communities to want simple barriers and means of protection: the existence of reasonable control and throttling options for parents. | | |
| ▲ | fc417fc802 11 hours ago | parent [-] | | I agree with that, however I'm puzzled by your comment, because in the context that you're responding to I don't think I said anything that would imply otherwise. Being particularly skilled in "defensive cyber security" isn't a requirement to avoid grooming of your child in the general case - some combination of communication, supervision, and filtering is. > It's reasonable to expect communities to want simple barriers and means of protection, the existence of reasonable control and throttling options for parents. I agree 100%! However, ID verification is not a reasonable (or even particularly effective) solution to that. I apologize if I've misconstrued your intended meaning, but given the broader context that's what it seems like you're implying. Realistically there's no way to prevent grooming other than keeping tabs on your child. The least labor-intensive (but also most intrusive) way to do that is probably whitelist parental controls and watching for unauthorized devices. It is not even remotely realistic to expect a communication platform to detect that a child is speaking with an adult they don't know (as opposed to one they do), and also that it isn't a benign interaction (such as a gaming group, etc.), and then somehow act on that information (how?) without manufacturing an absurd dystopia in the process. When it comes to filtering, I think it would be reasonable to impose a standard self-categorization protocol on all website operators. That would make non-whitelist filtering much more reliable (a boon to parents, educators, and employers) without negatively impacting privacy or personal freedoms. | | |
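[Editor's note: one real-world instance of the self-categorization idea in the comment above is the RTA ("Restricted To Adults") meta label, a fixed string that adult sites can embed in their pages so that filtering software can recognize them. A minimal sketch of a client-side check, assuming the page HTML has already been fetched; `is_self_labeled_adult` is an illustrative helper name, not an existing API:]

```python
from html.parser import HTMLParser

# The published RTA label value used by participating sites.
RTA_LABEL = "RTA-5042-1996-1400-1577-RTA"

class RatingMetaParser(HTMLParser):
    """Collects the content of <meta name="rating" ...> tags from a page."""

    def __init__(self):
        super().__init__()
        self.ratings = []

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            d = dict(attrs)
            if (d.get("name") or "").lower() == "rating":
                self.ratings.append(d.get("content") or "")

def is_self_labeled_adult(html_text: str) -> bool:
    """True if the page self-labels as adult content via the RTA meta tag."""
    parser = RatingMetaParser()
    parser.feed(html_text)
    return any(r.strip().upper() == RTA_LABEL for r in parser.ratings)
```

A filter built on a label like this only works for sites that cooperate, which is exactly the trade-off the comment describes: it helps non-whitelist filtering without requiring any user identification.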
| ▲ | defrost 9 hours ago | parent [-] | | Okay, in the specific upthread context: * There are very few urban population clumps on the planet that don't face the threat of child grooming and exploitation, both before and after the digital device explosion. * That threat vector significantly increased and morphed with the spread of personal digital devices for children; the threat no longer comes only from people with contact in real life, it has now expanded to include potentially the entire digital world and can now be automated via groomGPT. * A simple "where were the parents" response on a per-parent basis is unfair in the sense that spotting grooming in a digital device world is a difficult challenge .. even a simple constrained playground with stock babytalk language construction can be socially backdoored (See: "I want to stick my long-necked Giraffe up your fluffy white bunny"). * Concerned parents will look for solutions; communities at local, state, and federal levels should devote resources to providing solutions in informed contexts and graduated levels. * Unaware parents will exist, and will likely dominate the demographics, or not? * Is the correct _default_ social policy here (answer varies by country and culture) to shield the less cyber-aware from the worst of the worst with filters ... that the better informed can bypass or deselect? I guess where we diverge in PoV is where the perimeter of Swiss cheese protection should extend to. |
|
|
|
|
| ▲ | choo-t 7 hours ago | parent | prev | next [-] |
| > But declaring the situation to simply be a facade to harvest people’s data seems to me like a reflexive response to avoid uncomfortable truths regarding the situation. Well, your example wouldn't be solved by age verification in any way. They could still legally access Roblox or a Discord private chat (or even another private chat method) after this law. So the example shows how this is about irrational fear and not protection in any way. And this is a tragic edge case; if you want to take this kind of edge case into consideration, you also have to take into consideration the tragic edge cases that age verification itself would imply. |
| |
| ▲ | imtringued 4 hours ago | parent [-] | | I'm here wondering why it would make a difference whether the girl is under 18 or not. You could argue that the criminal had to cover up his crime by getting rid of the evidence (murder) because the girl wasn't 18 yet, and that it therefore makes sense to stop girls under 18 from using the platform because they are living evidence, but that actually sounds more like a problem caused by the law itself. After all, dating apps are an even more extreme version of this: if you're attractive enough, you get to have many one-night stands and many murder opportunities. |
|
|
| ▲ | AJ007 14 hours ago | parent | prev | next [-] |
| Discord & Roblox - no encryption, privacy, or anonymity on either of those platforms, by the way. |
|
| ▲ | heavyset_go 10 hours ago | parent | prev | next [-] |
| I'm sure the same government that held the Epstein class responsible will get right on making sure his protégés are brought to justice; we just need to give up more freedoms first. |
|
| ▲ | mindslight 13 hours ago | parent | prev | next [-] |
| Still, none of that necessitates the type of mandatory partial-ID verification being pushed by these laws. Roblox could straightforwardly require ID verification on its own, of both the parent responsible for the account and the children directly (request documentation from their school, a birth certificate, etc. Yes, it's high-touch to verify these documents. But we're talking about protecting children here, right?). If anything, this type of legislation is about absolving them of the responsibility of doing so! Imagine a company making its offering "for adults only", with de facto kid usage as parents relent and just let their kid claim an older age on the computer. |
|
| ▲ | weird_tentacles 16 hours ago | parent | prev [-] |
| [dead] |