sgnelson | 2 hours ago
I've got a book for you to read... https://en.wikipedia.org/wiki/The_medium_is_the_message
|
InitialLastName | an hour ago
If I have trillions of monkeys on typewriters generating every possible combination of characters, and then from what they "produce" I carefully select what I want to show everyone who comes to my website, how responsible am I for what my visitors see?
|
notatoad | an hour ago
They pay people to create content for their platform, and use their editorial control to determine what gets surfaced for you to see. How is that not "producing content"?
|
jounker | 2 hours ago
And yet people struggle to get Elon Musk out of their feeds on Twitter.
|
some_furry | 2 hours ago
No, but they decide the moderation policy that incentivizes the content produced (by nature of selecting which users feel comfortable using their product and which do not). For example, I do not feel comfortable using the same platform as people who post child sexual abuse material. X's Grok is infamous for generating such content on demand. I opt to use platforms that do not have this as a first-class feature. X has selected against my participation and for the participation of people who hold a contrary opinion to me. Even if Grok stops producing CSAM, that selection bias will persist.