▲ Jensson | 2 hours ago
Why is it good that you need self-control to avoid slop? It's much better if you can just turn that off and relax, rather than having to stay alert against content the platform serves to trick you into consuming more slop. Distancing yourself from temptations is an effective, proven way to break addictions; programs constantly trying to get you to relapse are not a good feature. Imagine a fridge that constantly restocks itself with beer: that would be terrible for alcoholics, and people would just say "just don't drink the beer?" even though it's a real problem with an easy fix.

▲ james_marks | 2 hours ago
Basically, I want to set boundaries while in a healthy frame of mind, and have that default respected when my self-control is lower because I'm tired, depressed, bored, etc. "The algorithm" of social media is the opposite.

▲ AreShoesFeet000 | an hour ago
I think your reply has me convinced. You really can't expect to have that much self-control all of the time. Damn.

▲ AreShoesFeet000 | 2 hours ago
It's because curated content can never reach the same level of relevance as direct feedback from user behavior. You mix in all kinds of biases, commercial interests, the curator's ideology, etc., and you inevitably get irrelevant slop. The algorithm puts you in control a little bit more.

▲ Jensson | 2 hours ago
> The algorithm puts you in control a little bit more.
Why not let users choose a less addictive algorithm? Older algorithms were less addictive, so it's not at all impossible to do, and many users would want it.

▲ AreShoesFeet000 | 2 hours ago
I just don't think the addiction is exclusively due to the algorithm. There's a real lack of affordable, varied options for learning a trade and for entertainment. As we say in Portuguese: don't throw the baby out with the bathwater.

▲ Jensson | an hour ago
I don't see what harm could come from saying "a less addictive algorithm must be available to users". For example, suppose there were an option to only recommend videos from channels you subscribe to; that would be much less addictive, so why isn't it an option? A regulation forcing these companies to add such a feature would only make the world a better place.

▲ AreShoesFeet000 | an hour ago
I agree, but what would be the actual mechanism that would allow that? I believe we're out of ideas. TikTok's only crime was being firmly successful because of good engineering. There's no evil sauce apart from promotional content and occasional manipulation, which has nothing to do with the algorithm per se. And about whitelisting, I honestly don't think you're comparing apples to apples. The point of the algorithm is dynamically recommending new content. It's about discovery.

▲ Jensson | an hour ago
> I agree, but what would be the actual mechanism that would allow that?
Governments saying "if you are a social content platform with more than XX million users, you have to provide these options on recommendation algorithms: X, Y, Z". It is that easy.
> And about whitelisting, I honestly don't think you're comparing apples to apples. The point of the algorithm is dynamically recommending new content. It's about discovery.
And some people want to turn off that pushed discovery and just get videos from the set of channels they subscribed to. They still want to watch some TikTok videos; they just don't want the algorithm trying to push bad content onto them. You're right that you can't avoid such an algorithm when searching for new content, but I don't see why it has to be there in content pushed onto you when you haven't asked for anything new.

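The "subscriptions-only" mode described above is simple to sketch. All names here (`Video`, `recommend`, `engagement_score`) are illustrative assumptions, not any real platform's API; the point is only that the whitelist is one filter on top of whatever ranking already exists.

```python
# Hypothetical sketch of a "subscriptions only" recommendation mode.
# Nothing here reflects a real platform's internals.
from dataclasses import dataclass

@dataclass
class Video:
    title: str
    channel: str
    engagement_score: float  # what the default algorithm optimizes for

def recommend(candidates, subscriptions, subscriptions_only=False):
    """Rank candidates; if subscriptions_only is set, drop everything
    outside the user's subscribed channels before ranking."""
    if subscriptions_only:
        candidates = [v for v in candidates if v.channel in subscriptions]
    return sorted(candidates, key=lambda v: v.engagement_score, reverse=True)

feed = [
    Video("woodworking basics", "CraftChannel", 0.4),
    Video("viral outrage clip", "RageBait", 0.9),
    Video("jointing a board", "CraftChannel", 0.5),
]
subs = {"CraftChannel"}
print([v.title for v in recommend(feed, subs, subscriptions_only=True)])
# → ['jointing a board', 'woodworking basics']
```

The high-engagement video from an unsubscribed channel never enters the ranking, which is exactly the "pushed discovery off" behavior being argued for.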
▲ AreShoesFeet000 | an hour ago
Fair enough. I'm not really a fan of regulation; the capitalist State is a total mess. But I really think we should try your idea.

▲ kylecazar | 2 hours ago
They're optimizing for time spent on the platform.

▲ Jensson | an hour ago
And that is why these algorithms need to be regulated. People don't want to pick the algorithm that makes them spend the most time possible on their phones; many would want one that optimizes for quality over quantity of time in the app, so they get more time for other things. But corporations don't want to provide that, because they don't earn anything from it.

▲ Forgeties79 | 2 hours ago
I don't agree, tbh. This is part of how people wind up down extremist rabbit holes. If you're just lazily scrolling, it can easily trap you in its gravity well.

▲ AreShoesFeet000 | 2 hours ago
But you can get into extremist rabbit holes regardless of the control surface. Remember 4chan? Dangerous content is a matter of moderation, regardless of the interface.

▲ Forgeties79 | 2 hours ago
4chan is nothing like TikTok, though yes, I agree heavy moderation is necessary for both.