| ▲ | Bender 6 hours ago |
| The one and only method I will participate in is server operators setting an RTA header [1] for URLs that may contain adult or user-generated or user-contributed content, and clients having the option to detect that header and trigger parental controls if they are enabled by the device owner. That should suffice to protect most small children. Teens will always get around anything anyone implements, as they already do. RTA headers are not perfect, nothing is nor ever will be, but there is absolutely no tracking or leaking of data involved. Governments could easily hire contractors to scan sites for the lack of that header and fine non-participating sites into oblivion. I, a small server operator and a client of the internet, will not participate in any other methods, period, full-stop. Make simple, logical, and rational laws around RTA headers and I will participate. Many sites already voluntarily add this header. It is trivial to implement. Many questions and a lengthy discussion occurred here [1]. I doubt my little private and semi-private sites would be noticed, but one day it may come to that, at which point it's back into semi-private Tinc open source VPN meshes for my friends and me. [1] - https://news.ycombinator.com/item?id=46152074 |
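A minimal sketch of the server side being described (Python standard library only; the handler class and structure are illustrative, but the `Rating` header name and fixed label value are the ones published by rtalabel.org):

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

# Fixed label value published by rtalabel.org; every participating
# site sends the same string.
RTA_LABEL = "RTA-5042-1996-1400-1577-RTA"

class RTAHandler(BaseHTTPRequestHandler):
    """Serves pages that self-label as restricted-to-adults."""

    def do_GET(self):
        body = b"<html><body>user-contributed content here</body></html>"
        self.send_response(200)
        # The one static header a participating operator adds.
        self.send_header("Rating", RTA_LABEL)
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, fmt, *args):
        pass  # keep the sketch quiet

# To run: HTTPServer(("", 8000), RTAHandler).serve_forever()
```

In nginx or Apache the rough equivalent is a single static `add_header` / `Header set` line, which is why the comment calls it trivial to implement.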
|
| ▲ | rpdillon an hour ago | parent | next [-] |
| This is exactly the way it should be done. Device with parental controls enabled disables content client-side when the header is detected. As far as I can tell, it's a global optimum, all trade-offs considered. |
| |
| ▲ | SoftTalker 44 minutes ago | parent [-] | | Well, why haven't all the big tech companies done it, then? They have only themselves to blame. They had years to fix the problem of inappropriate content being delivered to kids, and their response was sticking their fingers in their ears and saying "blah blah blah parenting blah blah blah." And it really should be the opposite: assume content is not kid-safe by default, and allow sites to declare if they have some other rating. | | |
| ▲ | jonplackett 18 minutes ago | parent | next [-] | | The reason is that this whole push for age verification has nothing to do with actually stopping kids from seeing the content. If it did, this kind of solution is what would be legislated for. It’s just about making everyone identifiable. | |
| ▲ | fc417fc802 40 minutes ago | parent | prev [-] | | Because it isn't in their financial interest. They've either done nothing or actively lobbied for these ID laws. You can plausibly explain it in a number of ways, including regulatory capture, deanonymization, spam reduction, etc. |
|
|
|
| ▲ | LooseMarmoset 18 minutes ago | parent | prev | next [-] |
| An outstanding idea. Those lobbying for age verification hate it though, because they want to be the arbiters of age, and all that juicy PII that they can analyze and resell. |
|
| ▲ | kyledrake 2 hours ago | parent | prev | next [-] |
| Interesting, I've never heard of this. I see an example that involves an HTTP response header "Rating: RTA-5042-1996-1400-1577-RTA". But does this actually still get used by parental controls? I didn't find much documentation about this, including on the very badly designed RTA web site https://www.rtalabel.org/ For anyone curious, the value is just a fixed string everybody decided to use, for reasons that aren't clear to me. I would deeply prefer to do it this way, but my goodness, the RTA org needs a serious brush-up of their web site and of the information on how to use this. |
| |
| ▲ | Bender an hour ago | parent [-] | | But does this actually still get used by parental controls? Some parental control applications will look for it, but it is not yet legislated to be mandatory in a majority of user-agents. All I am suggesting is that we legislate the header to be added to URLs that may contain material not appropriate for small children, and mandate that the majority of user-agents, the ones installed by default on tablets and operating systems, look for said header to trigger optional parental controls. Child accounts created by parents on the device should not be able to install alternate user-agents or bypass the controls (at least not easily). Parents should be guided through this on device setup. Indeed, their site is old and rarely touched. The ideas and concepts have not changed. It really could just be a static text site formatted in ways that lawmakers are used to, or someone could modernize it. |
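The client side of what Bender describes can stay entirely local. A sketch (the function name and structure are hypothetical; only the fixed RTA label value is real):

```python
# Fixed label value published by rtalabel.org.
RTA_LABEL = "RTA-5042-1996-1400-1577-RTA"

def should_block(response_headers: dict, parental_controls_enabled: bool) -> bool:
    """Decide, purely client-side, whether to block a page.

    The decision uses only the server's response headers and a local
    device setting; nothing about the user's age or identity is ever
    sent to the server, which is the privacy property argued for above.
    """
    if not parental_controls_enabled:
        return False
    return response_headers.get("Rating", "") == RTA_LABEL
```

An adult device with controls disabled renders everything; a child account sees labelled pages blocked, with no age signal leaving the device.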
|
|
| ▲ | traderj0e 3 hours ago | parent | prev | next [-] |
| Or could have a header saying this is not adult-only content, and a parentally-controlled device will block things that don't participate. |
| |
| ▲ | fc417fc802 30 minutes ago | parent | next [-] | | Yes, the RTA header was primarily a solution specific to porn sites. The broader problem is that parental controls don't have reliable standardized signals to filter on, which has led to the current nonfunctional mess. So ideally you want a standardized header that can be used to self-classify content into any number of arbitrary and potentially overlapping categories. The presence of that header should then be legally mandated, with specific categories required to be marked as either present or absent. So for example HN might be "user generated T, social media T, porn F" or similar, with operators being free to include arbitrary additional categories (but we know from experience that most of them won't). While this would be required by law, I imagine browser vendors might also drop support for loading sites that don't send the header in order to coerce global compliance. | |
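A sketch of how such a multi-category header might be parsed client-side. The header name `Content-Categories` and the `category=T/F` syntax are invented here, modeled on the "user generated T, social media T, porn F" example; the block-by-default treatment of undeclared categories follows the whitelist-leaning proposals in this thread:

```python
def parse_categories(header_value: str) -> dict:
    """Parse a hypothetical 'Content-Categories: a=T, b=F' header."""
    result = {}
    for part in header_value.split(","):
        part = part.strip()
        if "=" not in part:
            continue
        name, _, flag = part.rpartition("=")
        result[name.strip()] = flag.strip().upper() == "T"
    return result

def blocked(categories: dict, filtered: set) -> bool:
    # Categories the site failed to declare are treated as present,
    # i.e. block by default, so silence cannot dodge the filter.
    return any(categories.get(c, True) for c in filtered)
```

A device configured to filter `social-media` would then block HN under the example labeling, while one filtering only `porn` would allow it.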
| ▲ | Bender 3 hours ago | parent | prev | next [-] | | That's a good idea. There could be two headers, the existing RTA header that adult sites use today [1] and another static header that explicitly states there shall be no adult content. [1] - https://www.shodan.io/search?query=RTA-5042-1996-1400-1577-R... [THESE ARE ADULT SITES, NSFW] | | |
| ▲ | bluGill 2 hours ago | parent [-] | | What is adult content? I know parents who have no problem with their kids seeing porn. I know parents who give their kids a beer. I know parents who take their kids to violent movies. I used to know parents who will give their kids cigarettes. Most parents I know will disagree with their kids doing one of the above. I know songs that were played on the radio in 1960 that would not be allowed today, even though today we allow some swearing on the radio. | | |
| ▲ | briffle 6 minutes ago | parent | next [-] | | That was our struggle with implementing "blocking" tech at a school I worked at. Is a kid looking up how to do a breast self exam porn? What about a self testicular exam.. What about actual Sex Ed kinds of sites? | |
| ▲ | Bender 2 hours ago | parent | prev | next [-] | | That's between parents and their local governments. Yes, when I was a kid my mom let me watch whatever and go wherever. The parent in my example ultimately decides what a kid may or may not do, which is in alignment with existing laws. If the parent is endangering their kid, that is up to them and their government to sort out. Point being, put the controls entirely into the hands of the device owner. Options could be to default to: - Block everything unless the header states otherwise. - Block only sites that state they are adult. - Do nothing; obey the operator. (Controls disabled on child accounts, or make them an adult or otherwise unrestricted account on the device). I think the options are limited only by our imagination. | |
| ▲ | mikestorrent 2 hours ago | parent | prev | next [-] | | > I know parents who have no problem with their kids seeing porn. Surely you mean at least teenagers, and not literally children, right? Consider the prevalence of violence, racial stereotyping, and escalation of fetishism into degeneracy that clearly exists within this medium; what's the line that these parents draw? Are they making sure it's only something vanilla? Or is there no line whatsoever? | | |
| ▲ | bluGill an hour ago | parent [-] | | They don't care. The kids won't think to ask until they are teens, and they are not showing it until then, but it is technically available. |
| |
| ▲ | aqme28 2 hours ago | parent | prev | next [-] | | Then those parents can turn off their browser/client’s age protections. I think that’s actually a decent argument for the solution posed by this thread. | | |
| ▲ | traderj0e 2 hours ago | parent [-] | | There is such a thing as making the "kid ok" header so rare or "18+" so eager that nobody takes it seriously, so that'd need to be kept in mind. |
| |
| ▲ | traderj0e 2 hours ago | parent | prev | next [-] | | There are already laws defining this. Had to draw the line somewhere, and they did. | | |
| ▲ | lokar 2 hours ago | parent [-] | | In which legal jurisdiction and culture? Many or most websites have users from many locations. Is the header a JSON-encoded map from country code to age rating? | |
| ▲ | traderj0e 2 hours ago | parent [-] | | The US. If they want to serve users in other countries, or if certain states make their own rules, it's business as usual whether to serve different content there or serve a different header or take the legal risk. | | |
| ▲ | lokar 2 hours ago | parent [-] | | That seems unworkable as a practical matter | |
| ▲ | fc417fc802 42 minutes ago | parent | next [-] | | It's the exact same problem that age verification faces. There are different laws in different jurisdictions and operators have to figure out how to comply with the ones that matter to them. Think of the (current) header as meaning "we would have blocked you if we saw you were under 18" or whatever equivalent and it should make sense. | |
| ▲ | traderj0e 2 hours ago | parent | prev [-] | | They already do this, like there's Victoria's Secret's US website vs Qatar. |
|
|
|
| |
| ▲ | tristor 2 hours ago | parent | prev [-] | | > I know parents who have no problem with their kids seeing porn. I don't agree with showing actual children porn, but I also fully expect teenagers to find some way to get access to it in the age of the Internet. Part of the challenge here is cultural. Different places in the world think about sex, sexuality, and even the concept of what a child is differently. In the US, showing a woman's bare breasts to a person under 18 is generally considered wrong, and in many cases is illegal. In most of Europe it wouldn't even raise an eyebrow, because bare breasts are on television, sometimes even in commercials. Setting aside for a moment the question of age verification and age limits: we cannot even agree in any universal sense on what qualifies as porn or adult content, or at what age someone should be able to see it. There's a difference between a 7 year old and a 17 year old seeing the same type of content, and there's also a difference between a photographic nude and a video of people engaged in coitus. The story is basically the same for everything else you listed. These age verification laws in many ways are trying to use the most heavy-handed mechanism possible to enforce American cultural norms on the entire planet. That's clearly wrong to do. What the GP suggested with RTA headers, though, puts the control into the parents' hands, which is as it should be. | | |
| ▲ | traderj0e an hour ago | parent | next [-] | | We don't need to care what France or China thinks when we make our laws that are about our own citizens. They do the same over there. > These age verification laws in many ways are trying to use the most heavy-handed mechanism possible to enforce American cultural norms on the entire planet. That's clearly wrong to do. Yes there's a chance our rules spill over there naturally, and I don't consider that wrong either. | |
| ▲ | hirvi74 an hour ago | parent | prev [-] | | I considered many of the same points you mentioned. Though, one area I am still struggling to grasp is the harm that governments are trying to mitigate. If a child were to see inappropriate material, then what harm can truly arise? Also, why do governments need to enact such laws when the onus of protecting children should be on their parents? I am not trying to start any kind of flame war, but I really cannot see any other basis for all this prohibition that is not somehow traceable back to Western religious beliefs and the societies born and molded from such beliefs. |
|
|
| |
| ▲ | Induane 2 hours ago | parent | prev [-] | | I always love seeing pros and cons of whitelist vs blacklist sorts of strategies in different scenarios. | | |
| ▲ | traderj0e 2 hours ago | parent | next [-] | | Yeah, and this is a good one. Blacklist is less likely to be ignored by parents. Both have risks of corps doing CYA strats, but less so with the blacklist. Whitelist has the advantage of being more feasible without an actual law, and also better matching how parenting works. Generally kids are given whitelists irl. | |
| ▲ | 2 hours ago | parent | prev [-] | | [deleted] |
|
|
|
| ▲ | big85 5 hours ago | parent | prev | next [-] |
| Back in the late 90s or so, there was a proposal to have sites voluntarily set an age header, which parents/employers/etc. could use to block the site if they wished. People said it would never work, because adult sites had a financial incentive not to opt in and reduce their own traffic. |
| |
| ▲ | masfuerte 5 hours ago | parent | next [-] | | The porn companies already set the RTA header. It was designed by an organisation funded by the porn companies. https://en.wikipedia.org/wiki/Association_of_Sites_Advocatin... | | |
| ▲ | motbus3 4 hours ago | parent [-] | | It seems there is a GitHub repo somewhere mapping Meta money to lobbyists inside other companies, which is at least interesting |
| |
| ▲ | thesuitonym 5 hours ago | parent | prev | next [-] | | What, in the same way movie studios wouldn't comply with the Hays Code, or comic book publishers wouldn't comply with the CCA, or games publishers wouldn't comply with the ESRB? The financial incentive is to police yourself, because government policing is much, much worse. | |
| ▲ | nine_k 5 hours ago | parent | next [-] | | There's a great relevant quip: "If you think that the cost of compliance is high, try noncompliance". | | | |
| ▲ | breezybottom 4 hours ago | parent | prev [-] | | Sure, but the government doesn't police corporations in the US anymore. The Hays Code was before neoliberalism. | |
| ▲ | shevy-java 4 hours ago | parent [-] | | Quite true. The US corporations act like a giant global rabid dog. Fake legislation appears in the USA - lo and behold, it is copy/pasted into the EU. At the least lobbyists are getting rich right now. | | |
| ▲ | htek 3 hours ago | parent [-] | | At least the EU has GDPR. In the US, our personal data is collected by every app and website and company and packaged, sold and sifted through by a vast collection of private data brokers which the government already ingests. |
|
|
| |
| ▲ | iamnothere 5 hours ago | parent | prev | next [-] | | You’d think that one could simply block sites that don’t have the age header set on child computers. This may block kids from hobbyist sites that don’t bother to set their headers as kid-friendly, but commercial sites would surely set their headers properly. Over time sending proper rating headers would become more normalized if they were in common use. This still isn’t perfect, as it creates an incentive for legislators to criminalize improper age header settings and legislate what is considered kid-appropriate. But it’s still better than this age verification crap. | | |
| ▲ | Scaled 4 hours ago | parent [-] | | Yes, that's how parental filters already work. They use a combination of RTA tags and external data to block pages. It even works with Google SafeSearch, firewall devices, etc. The RTA ecosystem is already built out and viable. | |
| ▲ | nativeit 2 hours ago | parent [-] | | I think the better tack is to stop acting like these laws are being pushed by honest actors with good faith intentions of protecting children. |
|
| |
| ▲ | Bender 5 hours ago | parent | prev | next [-] | | What I am suggesting could address most of that. If they do not participate, they get fined. The government loves to fine companies. This assumes they put enough "teeth" into the law to prevent companies from accepting fines as the cost of doing business. This would also require legislation that could block sites operating from countries that do not cooperate with US laws: mandatory subscriptions to BGP AS-path filters, CDN block-lists (which already exist), etc. People could still bypass such restrictions with a VPN, but that would not apply to most small children. Sanctions and embargoes are always an option. | |
| ▲ | Barbing 5 hours ago | parent [-] | | >fined Exactly. If you’re hurting kids to make more money selling porn videos, straight to jail. I’m glad there are solutions that won’t ruin the Internet. Now the uphill battle to convince our legislators (see: encryption & fundamentally technically ignorant calls for backdoors). I’m here to die on this hill! |
| |
| ▲ | btilly 4 hours ago | parent | prev | next [-] | | People were wrong. We pay money online mostly through credit cards. Credit card transactions can be reversed. If children spend money on porn, those payments are likely to be reversed. This is really bad for the porn sites' ability to keep receiving credit card payments and to keep making money. An age header is a trivial step that can reduce the odds of an adult site receiving payments that later get reversed. Win-win. But if someone is willing and able to pay, then the adult industry wants the choice of whether to access content to be up to them. If government tries to regulate them, they'll engage in malicious compliance - do the minimum to not be sued, in a way that still lets them reach customers. For example, Utah tried to institute age verification. The porn industry blocked all IP addresses from Utah. Business boomed for VPN companies in Utah. Everyone, including the porn companies, knows that a lot of that is for porn. But if you show up with a Nevada IP address, the porn companies' position is, "You're in Nevada. Utah law doesn't apply." Even if the credit card has a Utah zip code. If you live in Utah and you're able to purchase a VPN, the porn companies want your money. | |
| ▲ | scythe 3 hours ago | parent | next [-] | | >But if someone is willing and able to pay If someone is willing and able to pay, they have a source of money. If they aren't allowed to buy something, that control should be applied at the level where they get the money. If the child is using an adult's credit card, responsibility lies with the adult. If children need to have their own credit cards, the obvious point of control is the credit card itself. But also, most porn is ad-supported, pirated or free. Directly paid content is a small fraction. So all of this is moot for porn. | |
| ▲ | numpad0 2 hours ago | parent | prev [-] | | There was a random comment here on HN a few days back saying that adult content has lower chargeback rates than everything else. So I guess stop spreading hallucinatory misinformation? |
| |
| ▲ | Lammy 5 hours ago | parent | prev [-] | | > Back in the late 90s or so, there was a proposal This one: https://www.w3.org/PICS/ | | |
| ▲ | Bender 5 hours ago | parent [-] | | PICS was very complicated and attempted to cover all possible "categories" of adult content. It was confusing, incomplete, and only a handful of sites voluntarily labelled their sites with it. RTA is one simple static header that any site operator could add in seconds, unless they get more complicated with it by dynamically adding it to individual videos, say on Youtube, in which case the server application would need to send that header for any video tagged as adult. I added PICS to my forums but it was missing many categories of adult content. I ended up just selecting everything, as I could not predict what people might upload, which made for a very long header. | |
| ▲ | dylan604 4 hours ago | parent [-] | | > unless they get more complicated with it by dynamically adding it to individual videos say, on Youtube YT already does this. I never watch YT signed in, and I often see videos that require you to be logged in as the video is age restricted. | | |
| ▲ | Bender 4 hours ago | parent [-] | | Agreed, though in my example the point would be to set the header in the case where the child is logged in but for whatever reason the site does not know their age. Instead of a third-party site, a header is sent with the video tagged as adult that triggers parental controls if they are enabled by the device owner. |
|
|
|
|
|
| ▲ | snvzz 7 minutes ago | parent | prev | next [-] |
| >I, a small server operator and a client of the internet, will not participate in any other methods, period, full-stop. You will, however, follow the law if it mandates otherwise. Which is why "age verification" should be stopped before it's too late. |
|
| ▲ | hooverlabs 4 hours ago | parent | prev | next [-] |
| Servers could then infer users' ages by whether or not the client renders pages given those headers, no? See whether secondary page requests (e.g. images, scripts) are made from a client. A bad actor could use this to glean age information from the client and see whether the person viewing the page is a small child. That should be scary. |
| |
| ▲ | Bender 3 hours ago | parent | next [-] | | I disagree. The ability to render a page could simply mean that parental controls were not enabled on the device. Some parents have assessed the situation and trust their children to be psychologically ready for adult situations. The client could be literally any age. Today devices do not default to accounts being child accounts. Some day this may change and may require an initial administrator password or something to that effect, but this can evolve over time. | |
| ▲ | NoMoreNicksLeft 3 hours ago | parent [-] | | >I disagree. The ability to render a page could simply mean that parental controls were not enabled on the device. Not being able to detect all children doesn't mean that being able to detect 80% of them is somehow less disturbing. | | |
| ▲ | Bender 3 hours ago | parent [-] | | The point and overall goal should be to not signal anything to the server operator unless a credit card is being used. Everyone is whoever they claim to be as far as anyone is concerned, until payments are required, which today means sharing identity and age (via the credit card information on file with the financial institution, which is shared today). In the case of RTA the only signalling taking place is a server header being transmitted to the client. The client could be anyone at any age. Nothing to explicitly leak or disclose. Server operators can guess all they desire, as some do using AI based on user behavior, which they sometimes get wrong. |
|
| |
| ▲ | nirava 3 hours ago | parent | prev | next [-] | | That's true. But leaking an age threshold is not the same as private companies being able to link all your online activities to a single legal person. | |
| ▲ | e44858 3 hours ago | parent | prev [-] | | Adults could also use this to filter out unwanted content without needing to rely on outdated filter lists. |
|
|
| ▲ | _ink_ 5 hours ago | parent | prev | next [-] |
| How are they supposed to fine sites out of their jurisdiction? |
| |
| ▲ | Bender 5 hours ago | parent [-] | | One possible method [1], though I am sure the network and security engineers here on HN could come up with simpler methods. Just blocking domains on the popular CDNs would kill access for most people, as by default most browsers are using them for DoH DNS. [1] - https://news.ycombinator.com/item?id=47950843 | |
| ▲ | filoleg 4 hours ago | parent [-] | | The question was about fining entities outside of the original jurisdiction, so I am not sure what you have in mind that could be done by network/security engineers here. | | |
| ▲ | Bender 4 hours ago | parent [-] | | In terms of fines, if they do not pay the fine, their country is at risk of sanctions or embargoes, which is probably a bit heavy-handed but may incentivize their government to also enforce the rules: collect fines, keep some for themselves, and pass the original fine back to the countries implementing child safety controls. | |
| ▲ | filoleg 3 hours ago | parent [-] | | This is extremely naive and short-sighted. There is a literal example of this happening right now, and hopefully you will see why your approach isn't that good. UK's OFCOM is currently issuing legal threats to 4chan for allegedly serving adult content and not being willing to implement age verification. 4chan's lawyer tells them to pound sand [0], on the basis that 4chan is hosted in the US and has zero business presence in the UK, and that the UK is more than welcome to ban the website on their end through UK ISPs. The saga has been ongoing for a while, and the lawyer has been pretty prolific online talking about the case. Anyway, following your approach, the UK should embargo the US over 4chan not being willing to implement age verification as required by UK law? I plainly don't see this happening, or even being considered, ever. 0. https://www.bbc.com/news/articles/c624330lg1ko | |
| ▲ | Bender 2 hours ago | parent [-] | | 4chan servers are in the US and the owner is in Japan. If the US wanted to, they could seize all the servers, but they will not, because they have had real-time monitoring of all activity on the boards ever since Christopher testified before congress and the site was sold. If anything, 5-eyes want that site to be unrestricted. 4chan has been a goldmine of people self-reporting wanting to shoot up or bomb places, as has Reddit, leading to many body-cam videos of the site users and in some cases the moderators being busted. The IP addresses are all captured by Cloudflare. It is literally next to impossible to post on 4chan without enabling javascript on Cloudflare or buying a 4chan-pass, which leaves a money trail. Not perfect, nothing is, but most mentally unstable people do not think these things through. Should legislation be added to require the RTA header, 4chan could and likely would add it in a heartbeat. They already have some decent security headers in place. |
|
|
|
|
|
|
| ▲ | kevin_thibedeau 4 hours ago | parent | prev | next [-] |
| > fine sites not participating into oblivion. That would also amount to compelled speech. |
| |
| ▲ | Bender 4 hours ago | parent | next [-] | | That would also amount to compelled speech. I disagree. The legal requirement to apply a warning label is a well-known, understood, and accepted process that is applied to a myriad of hazards to children and adults. As just one example, businesses in some states, most notably California, are compelled to add warning labels to foods and other products that could cause cancer. | |
| ▲ | SpaceNoodled 3 hours ago | parent | next [-] | | That's not the best example, since the levels set for Prop 65 warnings are so low that the warnings are effectively useless; every single commercial building in CA now somehow causes cancer. | | |
| ▲ | Bender 2 hours ago | parent [-] | | Surely we both understand the point I was making: labels are already compelled by law today. Fine: cigarettes must be labelled as being a risk of causing cancer. The punishment for failing to do this includes both civil and federal penalties, including massive fines and federal prison time. |
| |
| ▲ | sailfast 3 hours ago | parent | prev [-] | | Do you believe using the Internet should require a license? Isn’t that what covers these product warning labels? | | |
| ▲ | Bender 3 hours ago | parent [-] | | I never implied an internet license. Rather, if a server operator, a business, has content that may be adult in nature, they must label their site. Businesses require a license already, but that is unrelated to this. |
|
| |
| ▲ | Ekaros 3 hours ago | parent | prev | next [-] | | Clients could refuse to show content that does not have the headers set. On the other hand, servers might choose to lie. After all, that is their free speech right. So maybe you need some third-party vetting list. Of course, that one should be fully liable for any damages misclassification can cause... But someone would step up. | |
| ▲ | AlienRobot 3 hours ago | parent | prev [-] | | Being compelled to disclaim facts is good compelled speech, though. |
|
|
| ▲ | duped 4 hours ago | parent | prev | next [-] |
| This doesn't address the wider array of age-verification related problems that people want to solve, like social media where age verification is needed to police interactions between users. |
| |
| ▲ | jdasdf 4 hours ago | parent | next [-] | | Such censorship shouldn't exist in the first place. | |
| ▲ | Bender 4 hours ago | parent | prev | next [-] | | I could be misunderstanding the context, but to me that sounds like a moderation issue, assuming we even want small children on social media in the first place. There should probably be a dedicated child-safe social media site that limits what communication can take place for small children and has severe punishments for adults pretending to be children for the purposes of grooming. | |
| ▲ | duped 3 hours ago | parent [-] | | Moderation is like law enforcement, it doesn't prevent crimes from happening it just punishes the people they can catch. There exist severe punishments for the kinds of behavior I'm talking about, but unsurprisingly, this does not stop kids from being harmed and it doesn't undo it. This isn't hypothetical, by the way. There are adults catfishing kids into producing CSAM [0], kidnapping and assaulting minors [1], [2], and in the most extreme case, there's a borderline cult of crazy young adults who do terrorize people for fun [3]. It is a constant game of whackamole by moderators/admins to keep this behavior out of online spaces where kids hang out. I recognize that this is a "think of the children" argument, but indeed that's the point. The anonymous web was created without thinking about the children, just like how all social media was created without thinking about how it could be used to harm people. Age verification is the smallest step towards mitigating that harm. Now I disagree very strongly with the laws proposed (and indeed, I've been writing/calling/talking with state reps about this locally, because I don't want my state's bill passed). But the technical challenge needs to address the real problems that legislators are trying to go after. [0] https://www.justice.gov/usao-wdnc/pr/discord-user-who-catfis... [1] https://www.nbcnews.com/news/us-news/kidnapping-roblox-rcna2... [2] https://www.nbcmiami.com/news/local/nebraska-man-charged-wit... [3] https://www.fbi.gov/contact-us/field-offices/boston/news/ope... | | |
| ▲ | Bender 2 hours ago | parent [-] | | I am only interested in protecting the majority of children, which I believe my proposal more than covers. There will always be exceptions. Today teens share porn, warez, pirated movies and music with small children in rated-G video games. I am not proposing anything for that. It is up to businesses to detect and block such things. Point being, there will be a myriad of exceptions. I am not looking to address the exceptions. Those can be a game of whack-a-mole, as they are today. I am proposing something that would prevent the vast majority of children from being exposed to the trash we today call social media, and of course also porn sites. | |
| ▲ | trinsic2 2 hours ago | parent | next [-] | | Look, please don't sideline/marginalize people by using the "whataboutism" term. That's being used more and more to silence dialog from people who see problems outside the focus of a specific area. It's important that we see ALL sides of the problem. | |
| ▲ | Bender an hour ago | parent [-] | | Fair enough. Even though I do not perceive it that way I removed it in the event a majority of others have come to this conclusion. | | |
| ▲ | trinsic2 an hour ago | parent [-] | | Thank you for understanding. I know sometimes topics can get out of hand with comments about related things, but in this case we might be better off looking at all the extremities. |
|
| |
| ▲ | duped an hour ago | parent | prev [-] | | These aren't exceptions or whataboutism. It's the debate being had on the floors of state legislatures. > It is up to businesses to detect and block such things. Which is exactly why age verification legislation is hitting the books. No one (serious) cares about whether kids can download porn and R rated movies. Parental controls already exist if the threat model is preventing access to specific content that is able to report itself as _being_ that content. Your proposal also doesn't address the other domain that these legislators are targeting, which is addictive content. They define specifically what classifies as an addictive stream and put the onus on service providers to assert that they're not delivering addictive streams of media to kids. An HTTP header isn't enough, because it's not about the content being shown to kids but the design patterns of how it's accessed. Essentially: age verification isn't about porn. 18+ content stirs the pot a bit with the evangelical crowd but it's really not what people are worried about when it comes to controlling digital media access with age gates. | | |
| ▲ | Bender an hour ago | parent [-] | | Your proposal also doesn't address the other domain that these legislators are targeting, which is addictive content. That sounds simple to me. If a type of content is addictive, then require the RTA header: - Adult content, or possible adult content. - User-contributed or generated content (this covers most of social media). - Site psychological profiles that are deemed addictive (TikTok and their ilk). Overall we are describing things that are harmful to the development of the minds of small children. If adults wish to avoid such content, they can create a child account on their device for themselves to be excluded from this behavior as well. I use a child account in a couple of popular video games to avoid most of the trash talking and spam. I'm not hiding my age, as the games have my debit card information; rather, I opt in to parental controls. |
|
|
|
| |
| ▲ | svachalek 4 hours ago | parent | prev [-] | | This is assuming children should be on social media at all, which I for one would debate. |
|
|
| ▲ | crabbone 2 hours ago | parent | prev [-] |
| How would this work with sites like YouTube which allow sharing of content, potentially not appropriate for children, but the content is generated by the site's users? Who will be fined for "violations"? And how would such a fine be levied, especially internationally? |
| |
| ▲ | Bender 42 minutes ago | parent [-] | | I think that initially the onus would be on Youtube to figure this out. They have some very intelligent engineers. For example, if the Youtube client is receiving affiliate funds, then they are easy to ID and fine. If they are random people, then Youtube would have to share the violation data with the other countries, and the US or UK would have to pressure those countries to participate in fining the end user. There could be financial incentives for the foreign country to participate. They can also just force-label a video as adult, as they do today when enough people report it, which is admittedly not uniformly applied. |
|