stingraycharles 2 days ago

This is one of those things that don’t translate to legal reality very well, as then you have to define “what is an algorithm”.

Is adding advertisements an algorithm?

Is including likes an algorithm?

Is automatically starting the next video after a previous one has finished an algorithm?

Is infinite scroll an algorithm?

Etc

andybak 2 days ago | parent | next [-]

This kind of complex legislation already exists in many areas of the law: revenue collection being the most obvious one. We could choose to treat "societal harm" the way we treat "tax collection".

I'm not saying there aren't infinite edge cases and second-order effects - but we tolerate those already for many things. I'm not pretending this is simple or even desirable - I'm merely stating it's possible if we want to do it.

My biggest fear is that (like the UK Online safety act) this acts to favour the huge corporations because they are the only ones that can afford a team of lawyers. Any legislation should aim to carve out exceptions to avoid indirectly helping monopolies.

stingraycharles 2 days ago | parent [-]

Great example. These companies are already experts at circumventing taxes, what makes you think they can’t weasel their way around some arbitrary written law?

Just look at the malicious compliance that Apple and Google have around the App Store stuff, they’ll find a way to comply with the law and implement different addictive dark patterns.

I’m not saying that I disagree that these companies need to be regulated, I absolutely do. I just think it’s going to be a complicated process, and not “oh just ban everything that’s an algorithm”.

And I have absolutely 0 faith in companies like Meta willfully complying.

soVeryTired 2 days ago | parent | next [-]

I have a feeling taxes are possible to circumvent only because a government tends to have one arm that wants to collect taxes, and another that wants to reduce them to encourage certain outcomes (like having a business setting up shop within its borders).

The US may have this dual incentive structure since it wants to build its tech giants while limiting their control, but the EU doesn't. The arrival of a foreign tech social media giant might make the legislation a bit more palatable to pass.

It will undoubtedly be complex to regulate all dark patterns away. But there are a few obvious, easy wins. It'd be a shame to make perfect the enemy of good.

bootsabota 2 days ago | parent | prev | next [-]

Yeah it’s a tough situation to figure out.

But here’s the real problem: people don’t care. And I say that as someone who hasn’t used social media since 2014.

My observation of people’s behavior indicates that when all is said and done, people don’t care—they would rather get the endorphins from posting, liking, following, etc.

But the solution is to allow people to control their own algorithm, and to have open source solutions where communities manage their own social network.

It’s not the algorithm that is the problem; it’s that people don’t have the choice to curate their own content.

tsunamifury 2 days ago | parent [-]

People don’t care because the broader reality offers no alternative today.

We have failed at creating a society with hope for the future.

This is like prohibition talk. It was all to avoid the fact that America in that period was a social hellscape.

AndrewKemendo 2 days ago | parent | prev [-]

Regulated by who?

There’s no political organization (yes, Mamdani actually out-raised Cuomo, so let that sink in) that isn’t being actively bribed.

bee_rider 2 days ago | parent [-]

Although it should be noted that Mamdani’s average donation size skewed much smaller than Cuomo’s, so it is possible that Mamdani was “bribed” by the general public.

kubb 2 days ago | parent | prev | next [-]

This is some kind of a meme where people believe things can’t be defined in legal terms and therefore can’t be regulated. These people are usually not lawyers.

Does anyone know where it’s coming from? I can certainly believe that incompetent jurisdictions have a ton of issues with people misapplying the law and using loopholes.

biophysboy 2 days ago | parent | next [-]

Albert Hirschman wrote a great book 35 years ago about the rhetoric people use to stifle policy proposals. “It’s futile; it won’t ever work” is one common argument. It’s not a meme so much as a cynical, reflexive intuition.

AdamN 2 days ago | parent [-]

One that's reinforced by those against whichever legislation or regulation is being proposed.

naravara 2 days ago | parent | prev | next [-]

> This is some kind of a meme where people believe things can’t be defined in legal terms and therefore can’t be regulated. These people are usually not lawyers.

No they’re engineers who think rules have to function as rigidly in every field as they do in programming.

They either can’t or don’t want to accept that the law is a social construct, and that what it actually means to you is determined by the weight of precedent, as applied by judges and regulatory bodies. Things are vaguely worded in the law all the time. If people want to dispute how enforcement is done, they sue, and a judge decides how the rule should be applied.

SpicyLemonZest 2 days ago | parent | prev | next [-]

The point isn't that it can't be regulated. What the original comment said was

> This is pretty easy to solve. If you present data by algorithm, you are no longer an impartial common carrier and are liable for the content you present.

But this is not in fact easy. It's hard to define what "present data by algorithm" means in a coherent way, and it's hard to extend liability for the content you present to liability for the manner in which you present it. You could make it work, if for some reason you really wanted to, but it's easier to pursue the strategy described in the source article of regulating specific abusive patterns.

specialist 2 days ago | parent [-]

Who has agency in the relationship, the server or the client?

Said another way: push vs pull.

owebmaster 2 days ago | parent | prev [-]

It probably comes from the same pockets that influence legislation.

throwawayffffas 2 days ago | parent | prev | next [-]

"By algorithm" can be easily defined.

A simple benchmark: any feed that displays data in any way other than the following is considered an editorial choice, and thus the platform is liable as a publisher:

1. In a chronological order, and only filtered based on user selected options.

2. In any other order explicitly selected by the user.

An exception can be made to allow filtering out content that violates the platform's terms and conditions.

Alternatively there can be no exception, effectively making these platforms unworkable. This is also a choice. We do not need these platforms, including this one.
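The rules above can be sketched as a small classifier. This is a hypothetical illustration of the proposed benchmark, not an implementation of any actual law; the names (`Feed`, `is_editorial`, the ordering labels) are all made up for the example:

```python
from dataclasses import dataclass

@dataclass
class Feed:
    ordering: str                       # how posts are ranked, e.g. "chronological"
    user_selected: bool = False         # did the user explicitly pick this order?
    filters_user_selected: bool = True  # are all active filters user-chosen?

def is_editorial(feed: Feed) -> bool:
    """True if, under the proposed rule, the platform is acting as a
    publisher and is therefore liable for the content it presents."""
    # Rule 1: chronological order, filtered only by user-selected options.
    if feed.ordering == "chronological" and feed.filters_user_selected:
        return False
    # Rule 2: any other order explicitly selected by the user.
    if feed.user_selected:
        return False
    # Everything else counts as an editorial choice.
    return True

assert not is_editorial(Feed("chronological"))                   # exempt
assert is_editorial(Feed("engagement_ranked"))                   # liable
assert not is_editorial(Feed("most_liked", user_selected=True))  # exempt
```

The interesting property of this framing is that liability turns on who chose the ordering, not on how the ordering is computed.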

tmvphil 2 days ago | parent [-]

If the user selects "sort by algorithm" then I don't see how you've changed anything other than the default. I think it's pretty obvious just changing the default won't work.

xigoi 2 days ago | parent | next [-]

Changing the default makes a huge difference because 99% of people don’t change settings.

tmvphil 2 days ago | parent [-]

That's because, 99% of the time, the default is the way the app is designed to be used. If the default is regulated, then they will just say "sorry, the default is boring, click here to bring back the feed" and everyone will just click.

stingraycharles 2 days ago | parent [-]

As a matter of fact, the social media companies will then have an incentive to make the default really bad, which is absolutely what they will do. This would be the malicious compliance I was referring to elsewhere.

I think most people over here are oversimplifying this and underestimating the ability of these companies to get what they want.

throwawayffffas 2 days ago | parent | prev [-]

"Sort by algorithm" is not explicit, hence the "explicitly" in my wording. They can have that option if they want, but then they must be liable for the content.

orbital-decay 2 days ago | parent | prev | next [-]

"Algorithm" is a method of selecting the content to display. You're listing presentation types, not selection types. Presentation has nothing to do with supervised selection. Selecting the next video in the infinite scroll would be the algorithm, not the infinite scrolling mechanism itself.

itissid 2 days ago | parent | prev | next [-]

Instead, a regulation could mandate administering an anonymized, unbiased evaluation at the end of every week/bi-week/month, just like instruments for psych evaluation (e.g. "Do you feel your <mental-health-metric> has become worse in the last <time-period>, on <scale>? Did you have <mental-health-marker> after watching content on social media?").

The regulation could then mandate that, after calibration and correction, the feed pull back, with the algorithm retrained to adjust it via rapid A/B tests.

This is all doable by the companies themselves, but since they won't, the key is to mandate it and publish the aggregate results regularly, like making it part of the quarterly SEC shareholder reporting requirements or something.

itissid 2 days ago | parent [-]

I would say it's naive to regulate the algorithm rather than its effects. The effects are all that matter in the end.

randunel 2 days ago | parent | prev | next [-]

Everything other than sorting the list of entities by a standard measurement unit (time, length, mass, temperature, amount) needs to be covered by this law.

The moment you add other entities to the list (e.g. ads inbetween posts), then it's also subject to the same restrictions.
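A minimal sketch of how that bright-line test might look. The function name, unit labels, and entity kinds are hypothetical illustrations, not from any statute:

```python
# Hypothetical sketch of the proposed rule: a feed escapes coverage only
# if it is sorted by a standard measurement unit AND contains a single
# kind of entity (i.e. no ads interleaved between posts).

STANDARD_UNITS = {"time", "length", "mass", "temperature", "amount"}

def is_covered(sort_key: str, entity_kinds: set[str]) -> bool:
    """True if the feed would fall under the proposed law."""
    exempt = sort_key in STANDARD_UNITS and len(entity_kinds) == 1
    return not exempt

assert not is_covered("time", {"post"})    # pure chronological feed: exempt
assert is_covered("engagement", {"post"})  # ranked feed: covered
assert is_covered("time", {"post", "ad"})  # ads interleaved: covered
```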

stingraycharles 2 days ago | parent [-]

This effectively means “every online platform ever” and would also have included MySpace and the OG Yahoo etc, and as such would not really single out the truly bad actors.

And then we’ll end up with another cookie-banner-style law that had good intentions but missed the point entirely.

bee_rider 2 days ago | parent | next [-]

Maybe MySpace should be covered. I mean, MySpace probably(?) had the technical capacity to act maliciously in the manner that modern social media sites do; the business model just hadn’t evolved to the modern toxic state yet.

The cookie banner law is fine for the most part. Sites that do the malicious-compliance thing of over-prompting the user for permissions are providing a strong signal that they are bad actors. It’s about as much as we can expect without banning them entirely…

randunel 2 days ago | parent | prev | next [-]

I stopped using Facebook around 2015-ish, when they stopped allowing sorting by date. Prior to that, hi5 and the like also allowed sorting by date. So no, not every online platform ever.

progval 2 days ago | parent | prev [-]

It even includes email providers with a spam filter.

3form 2 days ago | parent | prev | next [-]

This doesn't differ much from the legal reality that I've seen. Terms need to be defined, yes. It will require work to do so. And that work should be done even if it's a bother.

tzs 2 days ago | parent | prev | next [-]

New York did a pretty good job in their law that limits addictive feeds. Here's what their law says:

> "Addictive feed" shall mean a website, online service, online application, or mobile application, or a portion thereof, in which multiple pieces of media generated or shared by users of a website, online service, online application, or mobile application, either concurrently or sequentially, are recommended, selected, or prioritized for display to a user based, in whole or in part, on information associated with the user or the user's device, unless any of the following conditions are met, alone or in combination with one another:

> (a) the recommendation, prioritization, or selection is based on information that is not persistently associated with the user or user's device, and does not concern the user's previous interactions with media generated or shared by other users;

> (b) the recommendation, prioritization, or selection is based on user-selected privacy or accessibility settings, or technical information concerning the user's device;

> (c) the user expressly and unambiguously requested the specific media, media by the author, creator, or poster of media the user has subscribed to, or media shared by users to a page or group the user has subscribed to, provided that the media is not recommended, selected, or prioritized for display based, in whole or in part, on other information associated with the user or the user's device that is not otherwise permissible under this subdivision;

> (d) the user expressly and unambiguously requested that specific media, media by a specified author, creator, or poster of media the user has subscribed to, or media shared by users to a page or group the user has subscribed to pursuant to paragraph (c) of this subdivision, be blocked, prioritized or deprioritized for display, provided that the media is not recommended, selected, or prioritized for display based, in whole or in part, on other information associated with the user or the user's device that is not otherwise permissible under this subdivision;

> (e) the media are direct and private communications;

> (f) the media are recommended, selected, or prioritized only in response to a specific search inquiry by the user;

> (g) the media recommended, selected, or prioritized for display is exclusively next in a pre-existing sequence from the same author, creator, poster, or source; or

> (h) the recommendation, prioritization, or selection is necessary to comply with the provisions of this article and any regulations promulgated pursuant to this article.
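Stripped of the statutory language, the definition above has a simple shape: a default rule (personalized recommendation of user-shared media) plus an exhaustive list of carve-outs (a)–(h). A rough structural sketch, emphatically not legal advice; the carve-out labels are hypothetical shorthand for the statute's paragraphs:

```python
# Illustrative only: a rough structural reading of the NY definition.
# Each label is invented shorthand for one of paragraphs (a)-(h).

CARVE_OUTS = {
    "non_persistent_info",         # (a)
    "privacy_or_device_settings",  # (b)
    "explicitly_requested",        # (c), (d)
    "direct_message",              # (e)
    "search_response",             # (f)
    "next_in_sequence",            # (g)
    "legal_compliance",            # (h)
}

def is_addictive_feed(personalized: bool, claimed_exemptions: set[str]) -> bool:
    """A feed is 'addictive' if it recommends user media based on
    information about the user, and no recognized carve-out applies."""
    if not personalized:
        return False
    return not (claimed_exemptions & CARVE_OUTS)

assert is_addictive_feed(True, set())                    # covered
assert not is_addictive_feed(True, {"search_response"})  # carve-out (f)
assert not is_addictive_feed(False, set())               # not personalized
```

Note how the drafting avoids the "what is an algorithm" trap entirely: the trigger is the use of user-associated information, not any particular ranking mechanism.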

baggachipz 2 days ago | parent | prev | next [-]

Ok so then the "algorithm" must be made available to authorities (or even better, the public at large) and be approved or rejected based on a court or a law. Obviously an algorithm based on "engagement" or "narrative" should be rejected with prejudice every time.

pessimizer 2 days ago | parent | prev [-]

I don't see a single difficult example here. The answer is "NO." It's strange that you couldn't even find one.

I mean "Is including likes an algorithm?" You might as well ask if having a dog in the video is an algorithm. Any question about "likes" would be if you're manipulating the video selection based on likes, or is the user given a control to manipulate the video selection based on likes. If it's you it's an algorithm. If it's the user, it's a control. If you lie about the likes, then it's an algorithm. If you're transparent about the likes, then it is a control.

The other ones aren't even worth discussing. You might as well ask if having a blue logo is an algorithm, or if Comic Sans is an algorithm. "It's all so complicated!"

-----

edit: that being said, the EU does not care about this issue at all, and has had plenty of mandate and plenty of time to have done something about it if it did. They too are going to say "it's all so complicated," because their problem is the unpopularity of center-left neolib governments that are just barely holding on, with extreme minority support, through the bureaucratic means they get from having written the regulations. They want to keep what came for British Labour during the recent council elections from coming for them.

So I guarantee that content will somehow become an "algorithm." The goal is to keep people who don't like them from speaking to each other.