blibble 2 hours ago
I think it's worse; cigarettes never threatened democracy. The solution is really easy: Section 230 should not apply if a recommendation algorithm is involved. Treat the company as a traditional publisher, because that's what they are. They're editorializing by selecting the content, unlike, say, the old-style Facebook wall (a raw feed of posts from a user's friends), which should still qualify for Section 230.
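(A minimal sketch of the distinction being drawn here, with hypothetical names and an invented engagement signal: the first function merely relays friends' posts in time order, while the second has the platform itself select and rank content.)

```python
# Hypothetical illustration only; names and the engagement signal are invented.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Post:
    author: str
    created_at: datetime
    predicted_engagement: float  # stand-in for whatever the platform optimizes

def raw_feed(posts: list[Post], friends: set[str]) -> list[Post]:
    """Old-style wall: friends' posts, newest first. No selection by the platform."""
    return sorted((p for p in posts if p.author in friends),
                  key=lambda p: p.created_at, reverse=True)

def recommendation_feed(posts: list[Post]) -> list[Post]:
    """The platform itself picks and orders content: the editorial act at issue."""
    return sorted(posts, key=lambda p: p.predicted_engagement, reverse=True)
```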
jcgrillo 23 minutes ago
> As interpreted by some courts, this language preserves immunity for some editorial changes to third-party content but does not allow a service provider to "materially contribute" to the unlawful information underlying a legal claim. Under the material contribution test, a provider loses immunity if it is responsible for what makes the displayed content illegal.[1]

I'm not a lawyer, but that seems pretty clear cut: if you, the provider, run some program that does illegal shit, then Section 230 doesn't cover your ass.
jballanc an hour ago
The problem with this is that Section 230 was specifically created to promote editorializing. Before Section 230, online platforms were loath to engage in any moderation, because they feared that even a hint of moderation would tip them into the realm of "publisher," where they could be held liable for the veracity of the content they published. Given the choice between no moderation at all and full editorial responsibility, many early internet platforms would have chosen no moderation, since full editorial responsibility would have been cost prohibitive. In other words, the filter that keeps Nazis, child predators, doxing, etc. off your favorite platform only exists because of Section 230.

Now, one could argue that the biggest platforms (Meta, YouTube, etc.) can, at this point, afford the cost of full editorial responsibility. But repealing Section 230 under this logic only serves to put up a barrier to entry for any smaller competitor that might dislodge these platforms from their high, and lucrative, perch.

I used to believe the better fix would be to amend Section 230 to shield filtering/removal but not selective promotion. TikTok, however, has shown (rather cleverly) that selective filtering/removal can be just as effective as selective promotion of content.