Aurornis 2 days ago

> If you present data by algorithm, you are no longer an impartial common carrier and are liable for the content you present

Hacker News is a site that presents data by algorithm. Under your definition, Hacker News goes away, too.

A more accurate framing would be that they’re going after personalized recommendation algorithms. It’s not obvious that offering a recommendation algorithm would mean that the site is no longer an impartial common carrier.

another-dave 2 days ago | parent | next [-]

Goes away, or is liable for the content promoted to the frontpage under the OP's take?

But I'd agree, that it's personalisation rather than just curation that's the issue.

I think even requiring sites to have a "bring your own algo" version (where ads are targeted to the algorithm rather than the person) would cure a lot of ills.

As is, even with something like Spotify, where you _are_ paying, there's no easy way to "reset" your profile to neutral recommendations.

Aurornis 2 days ago | parent [-]

> Goes away, or is liable for the content promoted to the frontpage under the OP's take?

Same thing. There is no Hacker News if Y Combinator becomes liable for user submitted content.

It’s an obvious backdoor play to make sites go away. If a site becomes liable for content posted, you cannot allow users to post content without having the site review and take responsibility for every comment and every post.

The people proposing it haven’t considered how damaging that would be for the ability of individuals to share ideas and their content. When every site with “an algorithm” is liable for content posted, nobody is going to allow you to post something. It’s back to only reading content produced and curated by companies for us. Total own-goal for the individual internet user.

SoftTalker 2 days ago | parent | next [-]

I think you could finesse it by saying that on HN, the users submit the content and the users also determine (by voting) what is popular. Ycombinator doesn't promote or bury any particular post with their own algorithms; they don't exercise any editorial review or control. (I don't think that's exactly true today, but it could be).

But to the larger point, I would actually agree that sites should "review and take responsibility for every comment and every post." They are the ones amplifying and distributing this content; why should they have zero responsibility for it?

Yes that would dramatically change what gets published online, but I think that would be a good thing.

pibaker 2 days ago | parent | next [-]

And how do you think any other website decides what to recommend you, if not other users' actions? Remember the Netflix prize? The data set they gave you is how other people rated movies. You can absolutely build a recommendation system without manual input from the operator.

And HN absolutely does promote submissions at the moderators' discretion. The moderators sometimes give old but overlooked submissions a second chance, and they also turn off the flamewar detector on some stories they think deserve more attention, which effectively promotes them against users' will.

AlecSchueler 2 days ago | parent | prev | next [-]

> users also determine (by voting) what is popular

The algorithm considers various other things, such as the ratio of votes to comments, the age of the post, etc.

Just compare how different the front page is to /active

> Ycombinator doesn't promote or bury any particular post with their own algorithms

Certain things do get put above the popular stuff if they're fresh enough and your account is deemed to be a taste setter.

> they don't exercise any editorial review or control.

They can decide things like overturning the flagging of a post or burying something even without the flag etc.

fc417fc802 2 days ago | parent [-]

Importantly, all but one of those things are impartial to the user, and even that one is merely binning based on a single category. "Algorithm" here is a red herring, IMO; people are objecting to a couple of fairly specific things. One is personalization carried out by the other party; the other is designs that introduce partisanship or are detrimental to the end user (i.e. addiction and other dark patterns).

voxic11 2 days ago | parent | prev | next [-]

So do you think the same logic applies to ISPs? Should they be reviewing all the content that they allow to transit their network and ban you if you try to evade their controls by using uncrackable encryption because if they mess up and allow you to distribute copyrighted or defamatory material they will be held liable? Remember that section 230 was originally enacted to protect them from liability.

SoftTalker 2 days ago | parent [-]

No, I don't think it applies to ISPs. They aren't involved in selecting or soliciting the content, or in providing the software and platform that creates or distributes it. They are "just pipes." Their purpose is to move bits.

voxic11 a day ago | parent [-]

This is not a correct understanding of ISPs though. They do already have certain obligations to restrict content on their networks. In particular they are required to remove subscribers when they become aware that those subscribers are participating in copyright infringement.

singleshot_ 2 days ago | parent | prev | next [-]

> They are the ones amplifying and distributing this content, why should they have zero responsibility for it?

If LinkedIn started allowing hardcore pornography, many of their advertisers would leave.

With that in mind, are you certain LinkedIn takes “no responsibility” for the content they distribute? It would seem they have a multimillion-dollar stake in the outcome of their efforts to shape their commercial product.

charcircuit 2 days ago | parent | prev [-]

And on TikTok users vote what is popular by giving videos watch time. It is no different.

fc417fc802 2 days ago | parent [-]

Is TikTok really so straightforward? I don't believe your assertion is correct but I'm open to evidence.

charcircuit 2 days ago | parent [-]

The main difference is that HN uses time to segregate cohorts and TikTok uses interests to segregate cohorts. If enough people within these cohorts upvote / give watch time then the content is shown to more cohorts.
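The cohort-escalation scheme described here could be sketched roughly as follows. This is purely illustrative: it is nobody's real ranking code, and the cohort names and threshold are invented for the example.

```python
# Illustrative sketch of cohort-based escalation: show an item to a small
# seed cohort first, then widen exposure only while engagement (upvotes,
# watch time) clears a threshold. Cohorts and threshold are made up.

def escalate(engagement_rate, cohorts, threshold=0.1):
    shown_to = []
    for cohort in cohorts:
        shown_to.append(cohort)
        if engagement_rate(cohort) < threshold:
            break  # stop spreading once a cohort doesn't engage
    return shown_to

rates = {"seed": 0.5, "regional": 0.05, "global": 0.3}
reach = escalate(lambda c: rates[c], ["seed", "regional", "global"])
# reach == ["seed", "regional"]: the item stalled at the regional cohort
```

Whether HN's time-based queues and TikTok's interest clusters really reduce to the same loop is exactly what's disputed downthread.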

fc417fc802 2 days ago | parent [-]

I understand the basic principle. Clearly that's one of the inputs. What I'm questioning is your implied assertion that there's nothing else to it.

I don't for a second believe that tiktok (or facebook or any of the others) employs a primitive algorithm that impartially orders results based on a simple and straightforward metric without consideration for their own interests.

nemothekid 2 days ago | parent [-]

>I don't for a second believe that tiktok (or facebook or any of the others) employs a primitive algorithm

Is your contention that whatever future law would have some mechanism to decide the complexity of the algorithm? How would you design a law such that the reddit ranking algorithm is "primitive" but TikTok's algorithm is "advanced"?

fc417fc802 2 days ago | parent | next [-]

You're changing the subject. I said nothing about the law, only objected to a claim about the internal mechanisms of tiktok.

If we're discussing hypothetical laws, then my preference is for several: banning various dark patterns (what the EU is doing here), banning opaque individualization outside the control of the individual in question, and banning motivated editorialization (such as intentionally promoting a particular political position). And yes, a straightforward application of what I wrote there would make the Netflix recommendation algorithm as it currently stands illegal. I have no problem with that.

dTal 2 days ago | parent | prev [-]

Reddit is as bad as the others, now.

andrewjf 2 days ago | parent | prev | next [-]

I agree with what OOP said. But it’s not my intent to “shut sites down.” I have this view to try to increase diversity of media consumption and break people out of echo chambers. If your business model is so shit you have to exploit weaknesses in human brains to keep people viewing ads and can’t adapt, then that’s your problem.

If you have an algorithm whose sole purpose is to drive "engagement" with your own platform (by intentionally and purposely pushing clickbait, ragebait, and media that keeps reinforcing your clicks), you should no longer get section 230 protections: you are no longer a neutral party. These algorithms exist to create echo chambers and keep you clicking so you can consume more ads.

I would love to hear other ways of solving the problems of social media.

Aurornis 2 days ago | parent | next [-]

> I have this view to try to increase diversity of media consumption and break people out of echo chambers.

Making sites liable for all user-posted content would do the reverse of this. Every platform that lets people submit content would have to stop doing that, because it’s an impossible liability to manage.

You’d have to host your own site. You wouldn’t be able to share anything about it on a social media site because it’s user-generated content. No visitors unless you advertise it through paid contracts with companies that can review it and decide to accept the liability.

ryandrake 2 days ago | parent | next [-]

Newspaper "Letters to the Editor" manage to do this. Users "submit" things to the newspaper, the editor curates and decides what to keep and what not to, and then the newspaper publishes the user generated content. Just like social media: Users submit things to the site, TheAlgorithm curates and decides what to keep and what not to, and then the site publishes the user generated content.

If web sites and social media can't "scale" to do this, then maybe they should scale down. "Making sites liable for all user-posted content" would not kill social media, but would definitely scope it down to what can be effectively curated.

throwaway902984 2 days ago | parent [-]

I don't think there are enough dangs to effectively curate much of the internet. And scaling it back by how much would be the result? 95%? That's before settling on a definition of "effectively curate," I suppose.

fc417fc802 2 days ago | parent [-]

"Effectively curate" here simply means "willing to take legal responsibility for" (although in practice I assume there would be an insurance policy involved because that's just how things are done).

fc417fc802 2 days ago | parent | prev | next [-]

I notice that parent describes "engagement" algorithms and you somehow jump to "all sites". So I think we'd see "engagement" algorithms disappear and very primitive approaches with prominent transparency measures in place would replace them. I expect we'd all be better off were that to happen.

freejazz 2 days ago | parent | prev [-]

>Every platform that lets people submit content would have to stop doing that, because it’s an impossible liability to manage.

This is a huge assumption that is offered constantly, and always, without any evidence at all.

throwaway902984 2 days ago | parent [-]

"Letters to the editor" curated by employees would become a part of their business model, and regular contributions would go away? Why would that assumption be incorrect? I wouldn't run a website where a casual user having a moment could result in my imprisonment. I would only allow non-LGBTQ content that didn't mention race or immigration, as the chilling effect there is real. A DA would for sure come after me if my site became influential.

freejazz 2 days ago | parent [-]

Was hoping for a more reasoned opinion from OP

thfuran 2 days ago | parent | prev [-]

Ban third-party advertising.

freejazz 2 days ago | parent | prev | next [-]

>Same thing. There is no Hacker News if Y Combinator becomes liable for user submitted content.

Why is this assumed to be true?

NewsaHackO 2 days ago | parent [-]

If Y Combinator has to officially approve every article submitted, then it becomes the publisher of a news site, not a social media site. Essentially, it would be a New York Times with unpaid writers.

freejazz 2 days ago | parent [-]

And? The New York Times website exists, last I checked.

NewsaHackO 2 days ago | parent [-]

I guess I am not seeing your point. A site that is completely a blank page exists also.

freejazz 2 days ago | parent [-]

Well, the argument was that Hacker News would no longer exist. I asked why, and your response was that it would be like the NY Times. But the NY Times website does exist, so I don't understand what point you are trying to make.

NewsaHackO 2 days ago | parent [-]

Got it. If the page doesn't fulfill the original purpose that made people want to go to it, it ceases to be interesting. The fact that the page merely exists is meaningless, much like a blank website.

freejazz 2 days ago | parent [-]

Well, you pointed to the NYTimes which, again, has not changed, so what is your point? Maybe the NYTimes is not a good example? I don't know, you brought it up. Are you saying the NYTimes is not an interesting website? It seems to also have the news and discussion of the news, so what exactly am I missing?

2 days ago | parent [-]
[deleted]
weregiraffe 2 days ago | parent | prev | next [-]

> It’s back to only reading content produced and curated by companies for us

I didn’t know only companies can have websites.

buu700 2 days ago | parent [-]

It's a matter of resources, not corporate status per se. For better or for worse, the current status quo largely democratizes content promotion. You and I can post these two comments here and put our ideas and names in front of a bunch of strangers for $0.

In a world where the risk-adjusted cost of allowing third-party comments on your platform shoots up, someone has to pay that cost. A personal blog hosted on your server might struggle to find any significant reach without a real advertising budget, because distributing speech/content that promotes your platform would no longer be ~free.

I don't necessarily believe that the major social media platforms would fully evaporate, but I'd expect some or all of these changes across the ecosystem:

* Massively scaled up LLM-based moderation/censorship.

* Replacement of direct user content posting with an LLM-based interface (to chat with an LLM about what you want it to write on your behalf).

* Payment-gated public posting, e.g. monthly or per-post fees to cover liability/insurance and/or LLM inference costs. Possibly higher fees for direct authorship vs LLM pair posting.

* Massive rise in adoption of decentralized architectures, either via current mainstream platforms if legally tolerated or via anonymous dark web platforms otherwise. Maybe Tor becomes as normalized as VPNs, or maybe the Western legal environment shifts hard against general-purpose computing.

I understand where this sentiment is coming from, but I think it's taking a lot of the current status quo for granted. What you guys are proposing isn't necessarily a targeted change that would simply make bad guys stop doing bad things. It's more likely a massive structural change that would dramatically alter the social and economic fabric of the internet as we know it, and not in a way that most of us would like.

buellerbueller 2 days ago | parent | prev [-]

>It’s an obvious backdoor play to make sites go away.

Oh no.

tencentshill 2 days ago | parent | prev | next [-]

The algorithm is not personalized. It's the same for every user. No issue there.

tolerance 2 days ago | parent | next [-]

But still an algorithm. The difference is that we (at least some of us) place a greater trust in the integrity behind how information surfaces on HN. I think that some parts of it are open source, and the moderators are transparent enough about what isn't public + there is a mix of folk knowledge that explains how HN works under the hood.

Depersonalized algorithms or recommender systems aren't inherently better than personalized ones. HN is an exceptional example of the former but I think at scale people would come up with a different crop of complaints for them.

tencentshill 2 days ago | parent [-]

Yes it's still an algorithm. Cable TV programming is another example. Everyone sees the same content. The ads are changed at the local broadcaster level but are not tailored to the individual, and are not harmful in the ways the EU is regulating. If anything, everyone watching the same thing is good for social cohesion. Everyone discusses the latest TV episode the next day at the office.

tolerance 2 days ago | parent [-]

Right. Setting aside the fact that cable television doesn't appear to be the typical distribution method anymore, how do broadcasters select and schedule their programming?

fc417fc802 2 days ago | parent [-]

What's your point? It seems like you're pedantically focusing on a single word without regard for the actual meaning of the broader statement. No one is proposing to regulate things done in the traditional manner of cable tv, nor other uniform and impartial approaches.

tolerance 2 days ago | parent [-]

@conception (root): "If you present data by algorithm, you are no longer an impartial common carrier and are liable for the content you present"

@Aurornis: "Hacker News is a site that presents data by algorithm. Under your definition, Hacker News goes away, too."

@Aurornis (cont'd): "When every site with “an algorithm” is liable for content posted, nobody is going to allow you to post something. It’s back to only reading content produced and curated by companies for us. Total own-goal for the individual internet user."

@Aurornis (cont'd): "If a site becomes liable for content posted, you cannot allow users to post content without having the site review and take responsibility for every comment and every post."

@tencentshill: "The algorithm is not personalized. It's the same for every user. No issue there..."

Me: "But still an algorithm".

@tencentshill: "Yes it's still an algorithm. Cable TV programming is another example."

Me: "...how do broadcasters select/schedule their programming?"

***

If the "broader statement" that you're referring to is @conception's, then I agree with @Aurornis that this would have negative effects on how websites like Hacker News operate. Failing to distinguish personalized recommendation systems from depersonalized ones, and proposing regulation that affects them the same, is an imprecise approach.

The speculated consequence is that platforms (e.g., Hacker News) will not want to assume liability for the content that users share. [0] If this were to happen only a few platforms would exist, at least on the clear/open web. The general online experience would become something like a pastiche of 60s cable television with three or four providers authorized to broadcast media.

With the direction that democracy is trending across the world that would mean state-run or state-approved media. Or all online communities will have to organize and operate like more traditional institutions like this biking community in London is doing: https://www.lfgss.com/conversations/401988/.

[0]: Some parts of this community already suspect that moderation conveniently buries controversial or subversive submissions. See this one from today! https://news.ycombinator.com/item?id=48110927

fc417fc802 2 days ago | parent [-]

Legislation needs to be clear and unambiguous, sure. Nonetheless no one had chronological sort or raw vote count or whatever else in mind when they used the term "algorithm" here so pretending they did is obtuse and pedantic. Misinterpreting the position of the other party does not typically make for enlightening or insightful conversation.

Cable TV is an example of something that no one is objecting to. The EU is targeting specific practices (particularly addictive UX patterns). Some people (myself included) would also like to see algorithms that provide personalized (on the individual or small cohort level) output banned. HN is clearly not that.

I think there's an interesting discussion to be had about where exactly the line is between a general class and a small cohort. Certainly applying more than a few general classes simultaneously can quickly land you back in near-individual territory.

tolerance 2 days ago | parent | next [-]

> Nonetheless no one had chronological sort or raw vote count or whatever else in mind when they used the term "algorithm" here so pretending they did is obtuse and pedantic.

No one until you it seems.

> Cable TV is an example of something that no one is objecting to.

@tencentshill's reference to cable TV originates from the question of whether Hacker News operates via algorithm and would be subject to the sweeping regulation proposed by @conception. The answer is yes.

If I wanted to be pedantic, I'd try to argue that cable TV operates according to its own kind of algorithm. And I almost did, so you got me there at least. But there are enough factors that contribute to television programming that it's debatable how far it is from using one (or a recommender system, rather), and whether under different circumstances the EU's issue with "endless scroll" and "autoplay" would be aimed at TV.

Of course, the main difference is that television in Europe is probably regulated differently than the internet.

I'm not objecting to the internet being regulated like television. For the record, I don't hold one to the same standard of utility as the other. I'm speculating on what would happen if the internet were to be regulated like television according to the combined scenarios advanced by @conception and @tencentshill. Do you follow?

fc417fc802 2 days ago | parent [-]

I believe you are obtusely misinterpreting the other two commenters. The pedantry is your insistence on an overly zealous interpretation of the use of the word "algorithm" by @conception (well really it was @aurornis with the pedantry but you followed on). It's clear enough that the original wording was sloppy; forcing analysis of an unintended scenario isn't fruitful.

> I'm speculating on what would happen if the internet were to be regulated like television ...

You've lost me. Previously you were arguing that all platforms with user generated content would disappear. Now you appear to acknowledge that the scenario as described permits platforms that operate analogously to cable TV, which is to say they don't present individualized content.

I'm no longer clear what your current position is nor what you might be attempting to communicate to others or advocate for here.

tolerance a day ago | parent [-]

Great, it's settled! Everything I have to say is an affront to your intelligence. Let's avoid each other.

2 days ago | parent | prev [-]
[deleted]
AlecSchueler 2 days ago | parent | prev | next [-]

> It's the same for every user

It isn't. Users who vote and flag more often are more likely to have things from /new surfaced on their main page for example.

achenet 2 days ago | parent | prev | next [-]

The facebook/meta algo might be the same for all users, but it has different inputs for each user.

On HN, on the other hand, everyone has the same front page. If I like a post I can favorite it to 'bookmark' it, but HN won't modify my front page based on what I favorite, whereas facebook will.

I think the GP's argument is, when it comes to social media, "one size fits all" might be less addictive than "custom made" :)

tencentshill 2 days ago | parent [-]

That's what I was saying. I was referring to HN, not facebook.

jimbob45 2 days ago | parent | prev [-]

And the algorithm here is also title-blind: the content of the story bears no sway over its place in the rankings. I do not believe dang cherry-picks either, except for the very rare sticky?

AlecSchueler 2 days ago | parent [-]

> the algorithm here is title-blind

Is that true? I thought I had seen it said that there were keyword penalties, to discourage things like political posts, that could be turned off and on.

cameldrv 2 days ago | parent | prev | next [-]

It would be a lot better if the user just had more control over the recommendation algorithm, either to replace it with an alternative or tune it. For example, I never want to watch YouTube shorts. Every time I see them, I click "show less often" since it is the only way I can express this preference, and still YouTube shows me them.

Obviously YouTube knows that even among people who do this, they still get good engagement out of YouTube shorts, so they keep showing them, but these users have explicitly asked YouTube to not show them.

It would be like a recovering alcoholic whose landlord comes by every week and leaves free samples of booze, because they get paid by the booze company, even though the alcoholic has asked them to stop.

alkonaut 2 days ago | parent | prev | next [-]

> Hacker News is a site that presents data by algorithm

Does it though? I mean by "algorithm" in this context we mean "personalized algorithm meant to maximize engagement and retention".

Not e.g. "sort by upvotes and decay by time" or even "filter content based on coarse user location".

Does HN show me a different front page than everyone else based on which articles I have read or upvoted? That would make me feel worse about the site, because I don't want a personalized HN feed; I want to read what everyone else is reading (which, incidentally, is why I refuse to give up linear TV).
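For what it's worth, the "sort by upvotes and decay by time" style of ranking really is only a few lines. A sketch below, where the constants follow the widely circulated approximation of HN's formula; the site's actual constants and penalties are not public.

```python
# Depersonalized "upvotes with time decay" ranking. The gravity exponent
# (1.8) and hour offset (2) follow the commonly cited approximation of
# HN's formula; the real values and moderation penalties are not public.

def rank_score(points: int, age_hours: float, gravity: float = 1.8) -> float:
    return (points - 1) / ((age_hours + 2) ** gravity)

stories = [
    {"title": "old-but-popular", "points": 200, "age_hours": 24},
    {"title": "new-and-rising", "points": 40, "age_hours": 1},
]
front_page = sorted(
    stories,
    key=lambda s: rank_score(s["points"], s["age_hours"]),
    reverse=True,
)
# "new-and-rising" outranks "old-but-popular" despite far fewer points
```

Nothing about the viewer enters the formula, which is exactly the property being pointed at: everyone sees the same ordering.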

Aurornis 2 days ago | parent [-]

> Does it though? I mean by "algorithm" in this context we mean "personalized algorithm meant to maximize engagement and retention".

I addressed that in the second half of my comment already.

But yes, HN qualifies as a site that displays by algorithm. If you mean personalized recommendation algorithm then it’s important to call that out. The last thing we want is regulation so broad that it catches every site that ranks things.

alkonaut 2 days ago | parent [-]

No one _ever_ considers "algorithms" in the CS sense here (such as "sorting"), and even bringing that notion up would be deliberately dumbing down the discussion (yet it keeps happening in this thread over and over again, because people are for some reason very "well ackshually, sorting is an algorithm").

"Algorithm" in this context is very clear what it is. It is not what the word means in Computer Science or in general. Just from the context and without any clarification needed "algorithms" in social media means "addictive personalized feeds".

ryandrake 2 days ago | parent [-]

I think we need a different word, so that Computer Science grads stop getting wrapped around this axle. We're obviously not talking about Quicksort when we're talking about social media algorithms and other recommendation/discovery algorithms. Heck if I know what that word would be.

alkonaut 2 days ago | parent | next [-]

Yes absolutely. Sadly I think that ship has sailed. Now if you ask 100 people in the street what "algorithms" are, I bet a majority among those who answer anything at all will answer it's something related to evil social media corporations.

addaon 2 days ago | parent | prev | next [-]

We can call them crypto. Won't make any less sense than the current usages, and it's a really good indicator to stop listening.

AlecSchueler 2 days ago | parent | prev [-]

Personalised algorithms.

xigoi 2 days ago | parent | prev | next [-]

> Under your definition, Hacker News goes away, too.

It doesn’t have to go away, just switch to chronological sorting.

Analemma_ 2 days ago | parent | next [-]

Have you ever browsed by New and seen the firehose of shit which doesn’t make it to the front page? HN sorted by new is effectively useless and you might as well shut the site down at that point.

“Chronological only” might work for something like Twitter, where you’re choosing to follow specific individuals to see their posts; it can’t work for curation sites like HN/Reddit.

xigoi 2 days ago | parent [-]

That could be solved by allowing users to filter by score or number of comments.

throwaway902984 2 days ago | parent | next [-]

Which would lead to everyone having their own personalized front page? Not controlled by dang so much but still.

xigoi 2 days ago | parent [-]

Not being controlled by the website owner is the point.

throwaway902984 2 days ago | parent [-]

Yeah, for sure, I see what you were saying. Changing that part might not achieve the desired effect, though, is what I was saying. It's context-dependent on the site, of course, but in a general sense I could see meta et al. being unfazed by this to a significant extent.

nemothekid 2 days ago | parent | prev [-]

I think you are just reflexively trying to argue your point without even thinking about it.

30 million TikToks are posted per day. What do you mean you are going to allow "users to filter"? This "regulation" will be trivially defeated by TikTok-Videos LLC uploading videos and TikTok-DataScience LLC providing the most popular filtering algo.

At the end of the day, many children will simply default to using the best algorithm, and all this regulation helps no one.

shimman 2 days ago | parent [-]

Wow clearly this a problem that can never be solved, better to not regulate these tech giants that have anti-democratic and anti-human beliefs. We're simply too powerless to regulate these entities!

nemothekid a day ago | parent [-]

Yeah, I clearly prefer non-hamfisted regulation under the guise of "protect the children". Half-assed attempts like this are worse than useless.

Just like with GDPR, the tech giants will put their foot on the scale and continue to operate how they see fit, and the smaller guys will die under a thousand paper cuts. I'd rather not add more regulation that cements Meta as the sole media platform of the internet.

veeti 2 days ago | parent | prev [-]

Sorting is not an algorithm?

xigoi 2 days ago | parent [-]

Not what people are talking about when complaining about algorithmic feeds.

kjkjadksj 2 days ago | parent | prev | next [-]

Hacker News' front page would go away, but not /new or any top ranking. It would be nice to have an HN without the second-chance queue, imo.

2 days ago | parent | prev | next [-]
[deleted]
dangus 2 days ago | parent | prev | next [-]

The difference is you can’t prove that hacker news has a bunch of psychologists on staff who are dreaming up ways to make the website addictive.

If you take TikTok to court and go through discovery you’re going to find internal communications of people talking about ways to get people to stay on the app longer, ways to make the content more addictive, ways to maximize ad reach, etc.

Hacker news just tossed a simple upvote downvote system and called it a day.

Plus it has no endless scroll, no graphics at all, limits your comment frequency, has no push notifications, etc.

2 days ago | parent | prev | next [-]
[deleted]
jackdoe 2 days ago | parent | prev | next [-]

> Hacker News goes away, too.

so be it.

vasco 2 days ago | parent [-]

This is a strange thing to comment on HN. If you truly believed it why would you be here?

buellerbueller 2 days ago | parent | next [-]

The majority of terminally addicted people I have interacted with at length have both recognized the terminal nature of their addiction and been unable to do anything about it.

That's the nature of addiction.

coffeefirst 2 days ago | parent | prev [-]

Honestly the damage done by TikTok et al is so severe that I’m okay with a little collateral damage. We will build new things.

But I also see no reason you can’t separate out forums with upvoting from the personalized engagement optimized feed. They are fundamentally different designs. (In other words, Subreddits are safe, the Reddit homepage is regulated unless it changes.)

2 days ago | parent | prev | next [-]
[deleted]
SideburnsOfDoom 2 days ago | parent | prev [-]

When we talk about "The Algorithm" in terms of social media, the term has just about taken on a meaning opposite to the original one.

My coder's view of the simplest possible algorithm: "People that I follow, their posts, ordered by most recent first." it's transparent, easy to understand, consistent, and a few lines of code.
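That "few lines of code" claim holds up. A minimal sketch, where the post field names (author, created_at) are illustrative assumptions:

```python
# The "simplest possible" feed: posts by accounts I follow, newest first.
# Field names (author, created_at) are assumed for illustration.

def simple_feed(posts, following):
    mine = [p for p in posts if p["author"] in following]
    return sorted(mine, key=lambda p: p["created_at"], reverse=True)

posts = [
    {"author": "alice", "created_at": 100},
    {"author": "bob", "created_at": 200},
    {"author": "carol", "created_at": 150},
]
feed = simple_feed(posts, following={"alice", "bob"})
# feed lists bob's post first, then alice's; carol's is excluded
```

Transparent, consistent, and nothing in it depends on engagement metrics or the reader's past behavior.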

Big Social media's "The Algorithm": a complex, utterly opaque, personally targeted, frequently shifting internal set of rules that they manipulate to maximise engagement and revenue, while hiding this from the users whose attention is being monetised, designed according to business priorities.

Clearly, if you use the second kind of Algorithm then you are no longer an impartial common carrier.