tqi 2 days ago

I think it would be helpful to engage with the possibility that they are neither stupid nor ignorant, rather that they simply have different values and priorities than the early internet users.

Levitz 2 days ago | parent | prev | next [-]

And what would those values and priorities be? Because it doesn't seem to me that their stated values align with what they actually do.

For example, it seems to me there is a whole lot of worry around megacorporations, often related to capitalism and the inequalities it brings.

In that context, if you don't place privacy as a priority, how are you not either stupid or ignorant? Is my premise just wrong?

ndriscoll 2 days ago | parent | next [-]

You can be in favor of privacy while simultaneously thinking porn, gambling, and advertisers shouldn't be targeting children. The age verification bills I've read have steep penalties for retaining information, so that seems fine since that's literally more protection than you get in person.

It's really more just concluding that those corporations should be liable for their behavior. It also has nothing to do with "the Internet," which is largely unaffected. Except, of course, for the ideas coming out of California for mandating OS behavior, which are obviously bad.

I actually think things could be a lot simpler if we just made the laws like alcohol: it's illegal (with criminal liability) for a non-parent adult to provide <restricted thing> to a child. Simple enough. Seems to work fine as-is for Internet alcohol purchases. Businesses dealing in restricted industries can figure out how to avoid that liability. That's entirely compatible with making it illegal for businesses to stalk everyone, which we should also do!

choo-t 2 days ago | parent | next [-]

> The age verification bills I've read have steep penalties for retaining information, so that seems fine since that's literally more protection than you get in person.

The best way (and the only way) to prevent information from being retained is to not share it in the first place.

> You can be in favor of privacy while simultaneously thinking porn, gambling, and advertisers shouldn't be targeting children.

There are other methods to achieve this without mandatory identification. You could require this content to be served with an HTTP header declaring its legal minimum age or content type, and block it browser-side. Governments could maintain filter lists for different age brackets and release them to everyone, allowing easy compliance via device parental-control settings.
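A minimal sketch of how that browser-side check might work. The header name `X-Min-Age` is invented here for illustration; no such standard header exists, and a real scheme would need an agreed-upon label format:

```python
# Hypothetical sketch: a site declares its minimum age in a response
# header, and a client-side filter decides whether to block the page.
# "X-Min-Age" is an invented header name, not any real standard.

def should_block(response_headers: dict, user_age: int) -> bool:
    """Block if the site declares a minimum age above the user's."""
    declared = response_headers.get("X-Min-Age")
    if declared is None:
        # Unlabeled content passes here; a stricter profile could
        # instead fail closed and block anything without a label.
        return False
    try:
        return user_age < int(declared)
    except ValueError:
        return True  # malformed label: fail closed

# A parental-control profile configured for a 13-year-old:
print(should_block({"X-Min-Age": "18"}, 13))  # True: blocked
print(should_block({"X-Min-Age": "13"}, 13))  # False: allowed
print(should_block({}, 13))                   # False: no label present
```

The point of the design is that the site only ever emits a static label; all the decision-making stays on the user's device.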

ndriscoll 2 days ago | parent [-]

Headers could maybe work in a world where the technology were ubiquitous and people knew how to set it up (cf. the V-chip's failure), and kids couldn't just buy their own device for $20 and use it on the actually ubiquitous free public wi-fi to avoid any restrictions.

And actually I think it's a better world where kids can obtain e.g. a raspberry pi that they completely control no questions asked and free public wi-fi exists all over, and the onus is on service providers to not deal with children if they're not supposed to. Basically, a high trust society.

In any case, "don't retain records" is actually a pretty easy task. Trivial, actually (use a device with no disk to handle PII, an API that just returns yes/no to the rest of the system, and heavily restrict the firewall, e.g. no outbound connections). Or you buy a token/gift card in person with ID check. If you think the penalties aren't steep enough to get compliance, just raise them (e.g. business ending fines plus jail time).
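A sketch of that "yes/no only" design. Everything here (the function names, the in-memory-only handling) is illustrative of the architecture being described, not a reference to any real bill's requirements:

```python
# Hypothetical sketch of a verification service that sees PII,
# answers a single boolean, and retains nothing: no logging, no
# storage, and the rest of the system only ever receives yes/no.
from datetime import date

def is_adult(date_of_birth: date, today: date, threshold: int = 18) -> bool:
    """Compute age from DOB; the DOB itself never leaves this function."""
    age = today.year - date_of_birth.year - (
        (today.month, today.day) < (date_of_birth.month, date_of_birth.day)
    )
    return age >= threshold

def verify_request(pii: dict, today: date) -> dict:
    """The only thing returned to the caller is the yes/no answer.
    The pii dict simply goes out of scope when this returns."""
    return {"verified": is_adult(pii["dob"], today)}

print(verify_request({"dob": date(2010, 6, 1), "name": "..."}, date(2026, 1, 1)))
# {'verified': False} -- the caller learns the answer, not the DOB or name
```

In the diskless-appliance version of this, the process holding the PII would additionally be firewalled off so the boolean is the only thing that can leave the box.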

fc417fc802 2 days ago | parent | prev [-]

If you implemented that simple solution the expected outcome is businesses collecting ID at the door. But unlike the age verification bills there'd be no prohibition of or penalty for misuse of the collected information. It's a strictly worse outcome.

You can make intentional targeting illegal without criminalizing the accidental. And mandating self categorization of content by service providers would enable standardized filtering that was broadly effective.

The above won't get kids off of social media and it won't serve the purposes of the surveillance state but it will meet the stated goals of those pushing these measures.

Keeping children off of social media is a much trickier problem. I think we'd be better served by banning certain sorts of algorithmic feeds.

ndriscoll 2 days ago | parent [-]

Okay, so make it illegal for them to record any information which is what the actual laws do (or better, explicitly criminalize all the other current stalking). The point is you don't need to be prescriptive about how to prevent children from accessing the sites. Just make it so you can face massive fines and be arrested if you don't. They can figure out how to comply with the law, and they can be effective or be shut down.

They're not actually owed a solution for how to make their business model work. They can just be told that what they're doing is unacceptable, and they can figure out what they'd like to do next. If you're worried they might react with some other unacceptable thing, we can clarify that that's not okay either.

fc417fc802 2 days ago | parent [-]

I agree that open ended requirements are better than the imposition of prescriptive solutions. But I don't want online ID verification and that's where your proposal logically leads so I am equally opposed to it.

> They're not actually owed a solution for how to make their business model work. They can just be told that what they're doing is unacceptable,

You listed a few different things previously. Which one are we talking about here?

I think the rest of us are owed a solution where we can still do what we want without having our privacy violated. Regulations need to take the end user into account.

I already proposed what I think would be a workable solution to achieve the stated goals without unduly eroding the status quo. Do you have any response to it?

ndriscoll 2 days ago | parent [-]

Self categorization has been the status quo since the 90s and has proven to be insufficient. More generally, assuming people agree that something is a social problem/should be restricted, I don't think "have a third party come up with a solution that people can buy to filter us" makes sense. The liability belongs on the people dealing in the restricted item.

We don't give kids special debit cards that detect and block purchases of cigarettes and alcohol and say "make sure your kids don't get cash". We make it a crime to sell those things to a child.

Why is online ID verification a problem for e.g. porn and gambling but it's fine for alcohol? Why should it be fully anonymous? Should we also allow anonymous porn and cigarette vending machines in person? Why is online special?

This whole idea of anonymous access can't even work in a world where you actually pay for things online, which makes the whole proposition even more dubious. If you're an adult and spending money online, you already told them who you are (modulo darknet markets with crypto). Or you could buy a porn gift card in person with an ID flash like other restricted physical items if you're uncomfortable with online payments. And treat the gift card as restricted as well: giving it to a minor is a crime. So then what's the problem exactly? Ad supported porn specifically somehow is important enough to be special?

More to the point: as far as I know, if you perform a sex act in plain view inside of a private establishment that's open to the general public with no restrictions, then that's public indecency/lewd conduct, a criminal act, even if the owner consents. If children are present it can become a felony and you're going on the sex offender list along with jail time. Why is an unrestricted public website different?

Why are you "owed" this privacy online when someone running an open to all, fully anonymous, unchecked porn theatre in person would be arrested? How about the privacy you are owed is that your business stays between you and whomever you interact with, and even they can be asked/required not to keep or share notes about you? But they can still be expected to know you are an adult before they sell you adult services.

fc417fc802 2 days ago | parent [-]

TBH I think this is all either fundamentally flawed or incredibly weak except for your final paragraph. That one actually poses a somewhat interesting question - why the seeming disparity between online and offline porn regulations in the US? Still, it fails to address (or even acknowledge) the differences in the impact of requiring ID between those scenarios.

Also I think you have this entire thing exactly backwards. It's not on the rest of us to convince the other camp that ID shouldn't be required. Rather it's on the other camp to put forward a convincing case that ID should be required - that there is no realistic alternative and that the tradeoffs are worth the cost. Otherwise the current status quo wins out.

> Self categorization has been the status quo since the 90s and has proven to be insufficient.

What are you on about? Legally mandated self categorization has never been tried and would presumably work if there were penalties for violations. You don't even need 100% compliance, you just need high enough compliance that the default becomes to filter out any site that fails to do so.

Voluntary self categorization isn't particularly useful because almost no operators bother to do it. So you're left with no (workable) option other than whitelist filtering.

> have a third party come up with a solution that people can buy to filter us

I never suggested anything of the sort.

> The liability belongs on the people dealing in the restricted item.

The items are not currently restricted and I don't agree with you that they should be. However I would agree to changing things to make all providers liable for accurately self categorizing the content they serve up by means of a standardized header format or some other protocol.

> Why is online ID verification a problem for e.g. porn and gambling but it's fine for alcohol?

Presumably because you have to take receipt of the shipment so the vendor is already going to collect your PII.

Why is legally requiring that a gambling website send a header categorizing itself as such unworkable yet somehow it's all going to work out just fine if we require them to do the much more complicated thing of securely handling and accurately verifying identification documents? That seems like an obvious contradiction to me.

> Why should it be fully anonymous? Should we also allow anonymous porn and cigarette vending machines in person?

Don't we effectively do exactly that? There's no requirement for ID retention on sale of alcohol or cigarettes and until recently the norm was for the clerk to briefly eyeball your license. They also didn't used to bother checking ID if you looked old enough. (That's changed at the major retailers around here lately but that's a different matter.)

Anyway I never claimed the brick and mortar way of doing things was ideal so arguing as though I've agreed to that seems rather disingenuous.

> If you're an adult and spending money online, you already told them who you are

But I did not give them a copy of my ID or any otherwise unnecessary PII and do not want to be required to do so. Also there are plenty of ways to pay for things online without readily revealing your identity to the counterparty. I expect you are well aware of that fact.

> Why is an unrestricted public website different?

For practical reasons I'd imagine. Analogies are great and all but at the end of the day a global electronic communication network has rather different properties than a physical brick and mortar location that you walk into.

Regardless, the reputable services all seem to agree with you (as do I) and thus go out of their way to send headers marking them as adult only. It's roughly equivalent to a shop hanging a "no under 18 allowed" on the door but then not bothering to ID anyone. If parents can't be bothered to configure even the most basic of controls on their children's devices why should the rest of society be made to suffer for that?

ndriscoll 2 days ago | parent [-]

Sending a header is unworkable because nothing obeys it, there are embedded browsers all over, and even if you mandated that every app/browser do so, kids can get a computer/phone for $20 with no restrictions.

There's no requirement for ID retention online either. In fact, unlike in person, it is banned. And a framework where you just say "you are liable for what you provide to children" actually allows for a site employee to briefly eyeball your ID or just look at you and decide you look old enough (though that doesn't really work with realtime video generation).

Record retention is a different question from checking. I think I and the actual relevant laws have been pretty clear that we should disallow that. No, we do not have anonymous cigarette vending machines (at least anywhere I've been in the US). They are always behind a counter with an ID check.

Except for crypto, I don't think I am familiar with any way to pay for something online without revealing my identity. I'm pretty sure 100% of online purchases I've made over the last 20 years have required name/address and usually phone number as part of payment details. Even with crypto, as far as I know common wisdom on darknet markets is (or was?) to use your real name/address as that's the least suspicious. I don't actually know a single place where you don't give that info to your counterparty. I can't imagine it's common.

What parental controls? As far as I know, Safari is the only modern browser that checks RTA headers (if it still does). There are no options for Chrome or more importantly Firefox, which is the only browser that's fit for purpose with malware blocking (especially for children). Similarly Android has no controls.

I don't see what part of being online makes it less practical to check ID. It seems more practical to me. It's just cheaper not to, and online businesses are big on avoiding labor. That's not some fundamental right of theirs.

fc417fc802 2 days ago | parent [-]

The browsers don't support it because only a few major sites bother to send it. The issue here is not support by client software; it is lack of participation. That could be fixed via legal mandate, no different than requiring ID checks or anything else.

Right now if you want to build out a filtering solution there's nothing to base it on. We could fix that via regulation and then filtering would just work.

> kids can get a computer/phone for $20 with no restrictions.

At that point ID checks are no good either. They can just visit a site from a different country that doesn't respect our legal framework or hop on tor or bittorrent or whatever else.

In fact when it comes to ID checks if you don't enable parental controls and filtering then they will be able to bypass it in the exact same way as above except using their regular device that you gave to them! No need to go purchase a new one!

So you're inevitably going to end up needing a client side filtering solution regardless. As I keep telling you, the solution you're gunning for here is strictly worse than content filtering based on mandatory headers.

> Except for crypto, I don't think I am familiar with any way to pay for something online without revealing my identity.

There are also virtual credit card services. Or gift cards (which you yourself mentioned earlier).

Of course anything shipped needs a name and address (and likely phone number) but there are plenty of services you can pay for that don't involve shipping a physical item.

> That's not some fundamental right of theirs.

Never said or even implied that to be the case. I think I've been pretty clear that I see it as a threat to privacy, that I don't personally want it, and that I don't think it's the best (or even a particularly good) solution for the stated problems.

It's bizarre to me. You are putting all this effort towards advocating for new regulation that would require a change to how services operate. Simultaneously you argue against a less intrusive solution on the basis that no one currently does it. For some reason everyone can start checking IDs but sending a header is a bridge too far? It's inconsistent.

ndriscoll 2 days ago | parent [-]

> They can just visit a site from a different country that doesn't respect our legal framework

That's called noncompliance. This is why a simpler framework is better: do you demonstrably serve content to children in this jurisdiction illegally? Then you'll incur fines and a warrant here. Better not have revenue or visit here. And we could put the same liability on advertisers funding it so there's just no financial incentive for anyone.

Bittorrent is trivial to block, other countries are easy to block on your router, and it would be simple enough to just say running an open proxy incurs liability for anything you front if you obscure the originating location or allow international traffic. Again the basic principle is "are you providing access to the general public with no gating to restricted material?" In any case, obscure Russian forums you can access through Tor are an afterthought compared to e.g. Reddit, which hosts both Roblox forums and porn today with no wall between them. There's no reason to allow that.

Note also that provider liability doesn't mean we can't also have filtering. Liability just creates the correct incentives for providers to help ensure the solution actually works. If liability with no prescription for a solution would lead to ID checks and not working with vendors to have working filters, that kind of reveals what we think would actually work.

As far as virtual cards go, do they not still require payment information? Surely businesses don't want to deal with anonymous purchases, since that's begging for fraud? In any case, service provider liability is still compatible here. I didn't say they need to check ID. Neither does e.g. the Texas law. It says someone needs to verify age. They can use a commercial service for it. The virtual card provider or gift card retailer could provide that service and assume or share liability.

I'm not even necessarily advocating for a new regulation. I'm saying recognize public indecency/lewd behavior for what it is, and ban things like gambling in children's games. Recognize that public websites with no access gates are public spaces and act accordingly. And yes I consider checking ID for a handful of specific services to be less intrusive than everyone supporting some header. I don't consider the former to be intrusive at all really. The latter is basically impossible if for no other reason than there are already billions of devices that don't. It's a fantasy non solution that basically amounts to "do nothing".

jart 2 days ago | parent | prev [-]

I don't know why I'm the only person online willing to steelman this, but...

The early Internet users weren't people who subscribed to AOL to look at porn in the 90's. They were the people who were granted access to the ARPANET to work in the 80's. The Internet was an exclusive community back then. You had government employees, knowledge workers, and elite university students who had all passed institutional screening processes. You were only allowed to use the ARPANET if you were using it to do something useful and aligned. Therefore you could feel reasonably assured that anyone you talked to online was going to be better than the average person you'd find going outside and walking down the street. If you wanted to know who they were, you could just finger their username. If you wanted to know who owned a domain, you could whois it, get their name and then even write them mail or call them.

People have wanted that old Internet back for a long time, i.e. the one that existed before Eternal September. Those are the people who run your tech companies. The ones who remember what it was like. These people understand that what people actually want isn't always the same thing as what they say they want. They understand why the only truly successful Internet spaces on the modern Internet are the ones like Facebook that got people to be non-anonymous. Another example: the companies folks desperately want to work for, like Google, run intranets much more like the old Internet. These are really the only Internet spaces that normal people want to use. Because people want to interact with other people who are similar to them. Because people want to know who other people are. Otherwise we can't operate as the social creatures that evolution designed us to be. I don't think any civilization in history has operated its public square as a gigantic red light district where everyone is required to wear a mask. So why should we?

Overcoming the anonymous religion problem that somehow glommed onto the hacker and cyberpunk movements is more important and urgent now than it's ever been, because the Internet has been filling up with billions of AI agents. It's gonna be Eternal September in overdrive. Humanity is really facing a tradeoff where you'll have to have gatekeeping again and won't be allowed to conceal who you are, or you can be gaslit by machines forever in your own robot fantasy.

sillysaurusx 2 days ago | parent | prev | next [-]

I’m not sure it’s possible to have different priorities without being stupid or ignorant of history. Once you concede a certain right, such as a right to privacy, you rarely if ever get it back. Most people seem not to care about this, despite ample evidence that it’s something worth caring about. Stupid is the obvious term for it, though obtuse could work as well.

Of course, I don’t blame them. They haven’t lived in a context where they need to care. All of the reasons they’ve heard to care have come from stories of people who lived before them. But ignoring warnings for no good reason is still dumb.

A better thing to engage with is whether we can meaningfully change the situation. It might still be possible, but it requires an effective immune response from everybody on this particular topic. I’m not sure we can, but it’s worth trying to.

Kim_Bruning 2 days ago | parent | next [-]

> They haven’t lived in a context where they need to care.

You might believe you don't need opsec, and then new laws are passed, or your national supreme court overturns the case that gave you your rights, or someone invades; and now suddenly you're wanted for anything from overstaying a visa to outright murder, or simply for existing.

In the USA, right now, people's lives are being destroyed because the wrong people got their data. Lethal consequences exist in Russia, Ukraine, Israel, Palestine, Lebanon, Iran.

Certain professions, by definition: journalists, lawyers, intelligence, military.

Certain ethnicities (Jewish, Somali); faiths...

It doesn't need to be quite this dramatic though. You might accidentally have broken some laws and not even know about it yet. Caught a fish? Released a fish? Gave the wrong child a bowl of soup [1]? Opened the door; refused to open the door. Signed a register; didn't sign a register. The list of actual examples is endless. The less people know about you, the less they can prosecute.

[1] A flaw in the Dutch Asylum Emergency Measures Act (2025) that would have criminalized offering even a bowl of soup to an undocumented person. The Council of State confirmed this reading. A follow-up bill was needed to fix it.

closeparen 2 days ago | parent [-]

There is no world where a totalitarian government’s law enforcement ambitions on some object-level question are thwarted by the same government’s enforcement of privacy law. Countries with GDPR that are thinking of rounding up and kicking out the refugees know perfectly well who and where the refugees are.

gzread 2 days ago | parent | next [-]

The law is irrelevant in that case but the actual situation is not. If people have never put their personal information online, the bad government can't get it from online. A new phone coming out during the time of the bad government, that says the government requires you to enter your name and address, will not be received as well as if it comes out during good government times.

nandomrumber 2 days ago | parent [-]

> will not be received as well as if it comes out during good government times.

What bearing does that have on anything?

fc417fc802 2 days ago | parent [-]

Making the point that people tend to engage in short term thinking. The reception of the same law, product, or practice will be colored by the current government as opposed to potential future ones.

Kim_Bruning 2 days ago | parent | prev [-]

You're not entirely wrong; ultimately, if they put enough resources towards it, they can probably catch quite a number of people. But governments have limited resources and really don't track everyone all the time; not even in 2026 are they able to do that. It helps if you maintain some level of opsec. If they really want to get you, they can get close, but see e.g. Ed Snowden, who managed to stay ahead of the US government just long enough to reach relative safety (FSVO).

nandomrumber 2 days ago | parent [-]

Snowden’s experience doesn’t generalise to, well, anyone really.

Kim_Bruning 2 days ago | parent [-]

Well, I wouldn't personally recommend single-handedly taking on the most powerful nation on earth, myself.

But turns out that if your opsec is decent, and even using mostly publicly available tools like Snowden did, you might survive even that.

It would seem to follow that, in the more ordinary case, normal people applying normal opsec can handle more normal threats.

closeparen 2 days ago | parent | prev | next [-]

I have the right to my own senses, my own observations, my own memories. I have the right to photograph what I can see with my eyes, and to write down what I can remember. Unless enjoined by a specific duty of care (doctor/patient, attorney/client, security clearance, etc) I have the right to discuss my memories with others. This obtains even when using electronic tools and even when working in association with others.

I don’t intend to give up or accept limitations on these rights because you consider yourself to have “privacy rights” or ownership interests in my records, my memories, my perceptions, or the reality in front of me. I find the notion of the government or another person interfering in this process, the perception and recollection of reality, to be creepy and totalitarian by itself.

In 1984, it is not only that the government is aware of Winston, but that it routinely tampers with or destroys evidence of the past & demands to control the perception of the present. I do not think we should let a government do that, even for a good reason like “protect your privacy” any more than we should let it destroy general purpose computing “for the children.”

Kim_Bruning 2 days ago | parent | next [-]

I'm actually fine with that; so long as that is restricted to your own senses, observations, and memories; and doesn't somehow spill over and somehow pertain to mine. Basically the typical freedom to swing your fists ends at the tip of my nose argument. This is probably a solvable problem between reasonable people; give or take.

fc417fc802 2 days ago | parent | prev [-]

It can remain legal to operate a security camera while being illegal to upload unencrypted footage to any third party. I'm not worried about individuals, only about big business and the government.

> This obtains even when using electronic tools and even when working in association with others.

I think it is reasonable to place limits on public "speech" (e.g. uploading videos of people) without interfering with private (in the case of electronics, E2EE) communications.

gzread 2 days ago | parent | prev [-]

There are many rights that people don't have, and they're okay with that and even support it: not having the right to stab people, not having the right to steal from a store, not having the right to take nude pictures of children... What if this one is like that?

micromacrofoot 2 days ago | parent | prev [-]

they are saddled with more problems than they can reasonably care about, and broader issues like privacy drop off their radars because they've never had it