▲ AreShoesFeet000 3 hours ago
I agree, but what would be the actual mechanism that would allow that? I believe we're out of ideas. TikTok's crime was just being firmly successful because of good engineering. There's no evil sauce apart from promotional content and occasional manipulation, which has nothing to do with the algorithm per se. And about whitelisting, I honestly don't think you're comparing apples to apples. The point of the algorithm is dynamically recommending new content. It's about discovery.
▲ cwillu 20 minutes ago
We're allowed to create laws to avoid a result we don't like, regardless of how many good intentions paved the road that brought us to that result.
▲ Jensson 2 hours ago
> I agree, but what would be the actual mechanism that would allow that?

Governments saying "if you are a social content platform with more than XX million users, you have to provide these options on recommendation algorithms: X, Y, Z". It is that easy.

> And about whitelisting, I honestly don't think you're comparing apples to apples. The point of the algorithm is dynamically recommending new content. It's about discovery.

And some people want to turn off that pushed discovery and just get recommended videos from the set of channels they subscribed to. They still want to watch some TikTok videos; they just don't want the algorithm to try to push bad content on them. You are right that you can't avoid such an algorithm when searching for new content, but I don't see why it has to be there in content pushed onto you when you haven't asked for anything new.
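The "recommend only from my subscriptions" option described above is a simple filter over whatever candidates the recommender produces. A toy sketch, assuming nothing about TikTok's actual system; the `Video` type, channel names, and `build_feed` function are all invented for illustration:

```python
from dataclasses import dataclass

@dataclass
class Video:
    channel: str
    title: str

def build_feed(candidates, subscriptions, discovery_enabled):
    """With discovery on, pass the recommender's candidates through unchanged;
    with it off, keep only videos from channels the user subscribed to."""
    if discovery_enabled:
        return list(candidates)
    return [v for v in candidates if v.channel in subscriptions]

# Hypothetical recommender output and subscription list.
candidates = [
    Video("cooking_daily", "5-minute pasta"),
    Video("viral_clips", "You won't believe this"),
    Video("woodworking101", "Dovetail joints"),
]
subs = {"cooking_daily", "woodworking101"}

print([v.title for v in build_feed(candidates, subs, discovery_enabled=False)])
```

Search could still use the full discovery algorithm; the toggle only governs the pushed feed, which is the distinction the comment is drawing.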