cortesoft 7 hours ago

While I agree with the premise, I do wonder how you can write a law that would stop the behavior we want to stop without hurting beneficial features or allowing the law to be too easily bypassed.

How do you describe in a legal way the difference between a useful feature people want and an addictive feature they don’t want?

ryandrake 5 hours ago | parent | next [-]

> How do you describe in a legal way the difference between a useful feature people want and an addictive feature they don’t want?

I don't know how you'd write it in a law either, but if you're in a meeting at your tech company, and the product owner or tech lead uses language like "We need to get users to do..." and "We need to incentivize..." and "It should be easy to do X and hard to do Y..." then do whatever is in your power to steer away from it or stop it. You're not really building a product users want; you're pushing a behavior-modification scheme onto users.

pbasista 5 hours ago | parent | next [-]

> It should be easy to do X and hard to do Y

> you're pushing a behavior-modification scheme onto users

In general I think that your comment is reasonable. I would just like to point out that such "behavior-modification" schemes are sometimes introduced for genuinely good and ethical reasons.

For instance, it is, in my opinion, desirable to make it more difficult for users to delete all their photos, e.g. by requiring them to confirm the decision in a dialog first, because that prevents them from accidentally doing something they might not want to do and which is potentially impossible to revert.

cortesoft 5 hours ago | parent | prev [-]

I feel like they will just frame it differently: “Users aren’t getting the full value from product x, so let’s change the workflow to help enable them to get more value with no additional effort” or “Users are losing out on a ton of value by cancelling their subscriptions without realizing what they are losing out on, so let’s implement feature x to make them less likely to mistakenly cancel”

akersten 6 hours ago | parent | prev | next [-]

> How do you describe in a legal way the difference between a useful feature people want and an addictive feature they don’t want?

For laws like this it always boils down to "I'll know it when I see it" which is such a shockingly poor way to write legislation that I'm flabbergasted it doesn't immediately fail any amount of rudimentary scrutiny. Not to mention the latitude it grants for selective enforcement. It's basically Washington asking (through the Economist) for a leash on platforms that host their critics that they can yank at any time the population gets too rowdy, with the convenient justification that the algorithm is too good and our attention spans are in danger or whatever.

conductr 6 hours ago | parent | prev | next [-]

Agree. My first thought is that, in the early days, most people didn't even want to start using PCs for work to begin with. The businesses generally had to mandate it. I imagine many people are facing the same thing today with AI.

traderj0e 6 hours ago | parent | prev | next [-]

One way is intent. If a company's internal communications show that they're intentionally making it addictive, or, worse, that they know it causes harm, you have the smoking gun. This of course doesn't catch all the abuse, but at least it makes it much harder to keep this up down an entire reporting chain. They have to get really good at winking.

One famous case was Apple suing Samsung over patents. It was hard to prove until internal comms surfaced showing intent to copy the iPhone.

cortesoft 5 hours ago | parent [-]

Companies are onto this, though, and do training with their staff about how to phrase things in emails to make it look better.

traderj0e 3 hours ago | parent [-]

Yeah I've done those trainings. That's expected. Even if people learn to say things without saying them, it's a lot harder to communicate across multiple people. And some people are still loudmouths, like at Samsung evidently.

Seattle3503 2 hours ago | parent | prev | next [-]

You create an agency and give it a mandate that requires it to balance concerns.

octoberfranklin 2 hours ago | parent [-]

This answer can be applied to pretty much any social question.

If it were so easy, we'd do this all the time. We already do it a lot, and there are heaps of examples where it goes wrong.

general1465 6 hours ago | parent | prev | next [-]

Very simple - force companies into data interoperability. That will allow users to move to competition without any data loss. I.e. nobody actually cares that GitHub is constantly down because you can move your repos to a different git provider or to your own server.

Aurornis 5 hours ago | parent [-]

> I.e. nobody actually cares that GitHub is constantly down because you can move your repos to a different git provider or to your own server.

I honestly can't tell if this is serious or satire, so apologies if I missed the joke.

Pushing a git repo to a new server is built into git itself.
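
As a minimal sketch (assuming git is installed; the "new server" below is simulated with a local bare repository, where in practice you'd use the new host's SSH or HTTPS URL):

```shell
# Simulate a fresh repo and a new hosting provider in a temp dir.
tmp=$(mktemp -d)
git init -q --bare "$tmp/newhost.git"        # stand-in for the new provider

git init -q "$tmp/work"
cd "$tmp/work"
git -c user.email=you@example.com -c user.name=you \
    commit -q --allow-empty -m "initial commit"

# The actual migration: add the new remote and mirror everything to it.
git remote add newhost "$tmp/newhost.git"
git push -q --mirror newhost                 # copies all branches and tags
```

`--mirror` pushes every ref (branches, tags, notes), so the new host ends up with a complete copy of the repository's history.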

Github project data is easy to export: https://docs.github.com/en/issues/planning-and-tracking-with...

There are import tools for many competing projects that will transfer it over in various ways.

octoberfranklin 2 hours ago | parent [-]

> Github project data is easy to export: https://docs.github.com/en/issues/planning-and-tracking-with...

Only the project owner can do that.

Aurornis an hour ago | parent [-]

As a project owner, I don't want random individuals exporting my project data and cloning it somewhere else.

y0eswddl 5 hours ago | parent | prev | next [-]

dark patterns are pretty well documented and understood at this point. I don't think identifying them is all that hard.

Infinite scroll is one obvious one, as is forcing algorithmic feeds of accounts we don't follow.

thaumasiotes 7 hours ago | parent | prev [-]

Well, you could look to the gambling market for inspiration and let people voluntarily sign up for a blacklist on that feature.

That would be a lot of extra work for the platforms, but I think the results would be interesting. It amounts to legislating that certain features have to be optional and configurable.