blibble 2 hours ago

I think it's worse; cigarettes never threatened democracy

the solution is real easy: section 230 should not apply if there's a recommendation algorithm involved

treat the company as a traditional publisher

because they are: they're editorialising by selecting the content

vs, say, the old-style facebook wall (a raw, unranked feed from a user's friends), which should qualify for section 230
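Roughly the distinction being drawn, as a toy sketch (names and fields are purely illustrative, not any platform's actual code):

    # hypothetical sketch: "raw wall" vs "recommendation feed"
    from dataclasses import dataclass

    @dataclass
    class Post:
        author: str
        timestamp: float             # seconds since epoch
        predicted_engagement: float  # score from some ranking model

    def wall_feed(posts, friends):
        # old-style wall: only friends' posts, newest first, no platform choice
        return sorted((p for p in posts if p.author in friends),
                      key=lambda p: p.timestamp, reverse=True)

    def recommendation_feed(posts):
        # algorithmic feed: the platform decides what you see and in what order,
        # i.e. the "editorialising by selecting the content" above
        return sorted(posts, key=lambda p: p.predicted_engagement, reverse=True)

The argument is that the second function is an editorial act and the first isn't.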

jcgrillo 23 minutes ago | parent | next [-]

> As interpreted by some courts, this language preserves immunity for some editorial changes to third-party content but does not allow a service provider to "materially contribute" to the unlawful information underlying a legal claim. Under the material contribution test, a provider loses immunity if it is responsible for what makes the displayed content illegal.[1]

I'm not a lawyer, but idk that seems pretty clear cut. If you, the provider, run some program which does illegal shit then 230 don't cover your ass.

[1] https://www.congress.gov/crs-product/IF12584

jballanc an hour ago | parent | prev | next [-]

The problem with this is that section 230 was specifically created to promote editorializing. Before section 230, online platforms were loath to engage in any moderation, because they feared that even a hint of moderation would push them into the realm of "publisher", where they could be held liable for the veracity of the content they published. Given the choice between no moderation at all and full editorial responsibility, many of the early internet platforms would have chosen no moderation, as full editorial responsibility would have been cost-prohibitive.

In other words, that filter that keeps Nazis, child predators, doxing, etc. off your favorite platform only exists because of section 230.

Now, one could argue that the biggest platforms (Meta, Youtube, etc.) can, at this point, afford the cost of full editorial responsibility, but repealing section 230 under this logic only serves to put up a barrier to entry for any smaller competitor that might dislodge these platforms from their high, and lucrative, perch. I used to believe the better fix would be to amend section 230 to shield filtering/removal but not selective promotion; TikTok, however, has shown (rather cleverly) that selective filtering/removal can be just as effective as selective promotion of content.

intended 6 minutes ago | parent | next [-]

Platforms routinely underinvest in trust and safety.

T&S is markedly more capable in the dominant languages (English is ahead by far).

Platforms make absurd margins when compared to any other category of enterprise known to man.

They operate at scales where even a 0.001% error rate still produces far more mistakes than humans could manually review.
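Back-of-envelope, with assumed volumes (the numbers are illustrative, not any platform's actual figures):

    # illustrative arithmetic only; both inputs are assumptions
    decisions_per_day = 1_000_000_000   # assume ~1B moderation decisions per day
    error_rate = 0.001 / 100            # 0.001% as a fraction

    wrong_calls = decisions_per_day * error_rate
    print(wrong_calls)                  # 10,000 bad calls every single day

    # catching them by hand means re-reviewing the whole stream
    items_per_reviewer = 200            # assumed items one reviewer checks per day
    print(decisions_per_day / items_per_reviewer)  # 5,000,000 reviewer-days, per day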

Customer support remains a cost center.

Firms should be profitable, and they have a job to do.

We do not owe them that job. Firms are vehicles to find the best strategies and tactics given societal resources and goals.

If rules to address these harms make the current business models unviable, that unviability is not a defense of the current business models.

Currently we are socializing costs and privatizing profit.

Having more customer support, more transparency, and more moderation will be a cost of doing business.

Our societies have far more historical experience thinking about government capture of speech than about flood-the-zone-style private capture of it.

America developed the FDA, and every country has rules on how food hygiene must be maintained.

People can still start small and then grow into medium or large businesses. Regulation is typically framed around the size of the org.

Many firms fail, but failure and re-creation are natural parts of the business cycle.

arcticfox an hour ago | parent | prev [-]

Even if they can't afford it... Too bad for them?

I am kind of rooting for the AI slop, because the status quo is horrific; maybe the AI slop cancer will put social media out of its misery.
