| |
| ▲ | SoftTalker 2 days ago | parent | next [-] | | I think you could finesse it by saying that on HN, the users submit the content and the users also determine (by voting) what is popular. Ycombinator doesn't promote or bury any particular post with their own algorithms; they don't exercise any editorial review or control. (I don't think that's exactly true today, but it could be.) But to the larger point, I would actually agree that sites should "review and take responsibility for every comment and every post." They are the ones amplifying and distributing this content; why should they have zero responsibility for it? Yes, that would dramatically change what gets published online, but I think that would be a good thing. | | |
| ▲ | pibaker 2 days ago | parent | next [-] | | And how do you think any other website decides what to recommend to you, if not from other users' actions? Remember the Netflix Prize? The data set they gave you was how other people rated movies. You can absolutely build a recommendation system without manual input from the operator. And HN absolutely does promote submissions at the moderators' discretion. The moderators sometimes give old but overlooked submissions a second chance; they also turn off the flamewar detector on some stories that they think deserve more attention, which effectively promotes them against users' will. | |
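The Netflix Prize setup described above — predicting tastes purely from other users' ratings, with no manual input from the operator — can be sketched as a toy user-based collaborative filter. The users, movies, and ratings below are made-up illustration data, not anything from the thread:

```python
import math

# Hypothetical ratings matrix, Netflix Prize style: user -> {movie: stars}.
ratings = {
    "alice": {"Heat": 5, "Alien": 4},
    "bob":   {"Heat": 4, "Alien": 5, "Up": 2},
    "carol": {"Heat": 1, "Up": 5},
}

def cosine(u, v):
    """Cosine similarity over the movies both users have rated."""
    common = set(u) & set(v)
    if not common:
        return 0.0
    dot = sum(u[m] * v[m] for m in common)
    nu = math.sqrt(sum(u[m] ** 2 for m in common))
    nv = math.sqrt(sum(v[m] ** 2 for m in common))
    return dot / (nu * nv)

def predict(user, movie):
    """Predict a rating as a similarity-weighted average of other
    users' ratings for that movie -- user behavior in, ranking out,
    with no editorial judgment anywhere in the loop."""
    num = den = 0.0
    for other, their in ratings.items():
        if other == user or movie not in their:
            continue
        w = cosine(ratings[user], their)
        num += w * their[movie]
        den += w
    return num / den if den else None
```

`predict("alice", "Up")` blends bob's and carol's ratings, weighted by how similarly each has rated movies alice also rated. Real systems (matrix factorization, as in the winning Prize entries) are more sophisticated, but the input is the same: other users' actions.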
| ▲ | AlecSchueler 2 days ago | parent | prev | next [-] | | > users also determine (by voting) what is popular The algorithm considers various other things, such as the ratio of votes to comments, the age of the post, etc. Just compare how different the front page is from /active. > Ycombinator doesn't promote or bury any particular post with their own algorithms Certain things do get put above the popular stuff if they're fresh enough and your account is deemed to be a taste setter. > they don't exercise any editorial review or control. They can decide things like overturning the flagging of a post, or burying something even without a flag, etc. | | |
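For reference, the base HN ranking is widely believed to follow a simple gravity formula trading points against age; the live site layers undisclosed adjustments (the flamewar detector, domain penalties, moderator re-ups) on top, so this is only an approximation of the documented core:

```python
def hn_rank(points, age_hours, gravity=1.8):
    """Widely cited approximation of HN's base story ranking.
    Score decays with age; gravity controls how fast fresh
    stories displace older, higher-voted ones."""
    return (points - 1) / (age_hours + 2) ** gravity

# A fresh story with modest votes outranks an older, far more upvoted one:
fresh = hn_rank(points=20, age_hours=1)
old = hn_rank(points=200, age_hours=12)
```

Here `fresh` scores higher than `old` despite a tenth of the votes, which is the "fresh enough gets put above the popular stuff" behavior the comment describes, even before any moderator adjustment.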
| ▲ | fc417fc802 2 days ago | parent [-] | | Importantly, all but one of those things are impartial to the user, and even that one is merely binning based on a single category. The algorithm here is a red herring, IMO; people are objecting to a couple of fairly specific things. One is personalization carried out by the other party; the other is designs that introduce partisanship or are detrimental to the end user (i.e. addiction and other dark patterns). |
| |
| ▲ | voxic11 2 days ago | parent | prev | next [-] | | So do you think the same logic applies to ISPs? Should they be reviewing all the content that they allow to transit their network and ban you if you try to evade their controls by using uncrackable encryption because if they mess up and allow you to distribute copyrighted or defamatory material they will be held liable? Remember that section 230 was originally enacted to protect them from liability. | | |
| ▲ | SoftTalker 2 days ago | parent [-] | | No, I don't think it applies to ISPs. They aren't involved in selecting or soliciting the content, or in providing the software and platform that creates or distributes the content. They are "just pipes." Their purpose is to move bits. | | |
| ▲ | voxic11 a day ago | parent [-] | | This is not a correct understanding of ISPs, though. They do already have certain obligations to restrict content on their networks. In particular, they are required to terminate subscribers when they become aware that those subscribers are repeat copyright infringers. |
|
| |
| ▲ | singleshot_ 2 days ago | parent | prev | next [-] | | > They are the ones amplifying and distributing this content, why should they have zero responsibility for it? If LinkedIn started allowing hardcore pornography, many of their advertisers would leave. With that in mind, are you certain LinkedIn takes “no responsibility” for the content they distribute? It would seem they have a multimillion-dollar stake in the outcome of their efforts to shape their commercial product. | |
| ▲ | charcircuit 2 days ago | parent | prev [-] | | And on TikTok users vote what is popular by giving videos watch time. It is no different. | | |
| ▲ | fc417fc802 2 days ago | parent [-] | | Is TikTok really so straightforward? I don't believe your assertion is correct but I'm open to evidence. | | |
| ▲ | charcircuit 2 days ago | parent [-] | | The main difference is that HN uses time to segregate cohorts and TikTok uses interests to segregate cohorts. If enough people within these cohorts upvote or give watch time, then the content is shown to more cohorts. | | |
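That cohort-graduation claim can be written down as a toy model — content is shown to the next, larger cohort only if engagement in the current one clears a threshold. The cohort names and rates below are hypothetical; neither site publishes its actual thresholds:

```python
def spread(engagement_by_cohort, threshold=0.10):
    """Return the cohorts an item actually reaches: it graduates to
    cohort i+1 only if its engagement rate in cohort i clears the
    threshold. (Cohort names and numbers are illustrative only.)"""
    shown = []
    for cohort, rate in engagement_by_cohort:
        shown.append(cohort)
        if rate < threshold:
            break  # engagement fell short; spread stops here
    return shown

# HN-style: cohorts are time slices of the audience.
hn_path = spread([("new page", 0.30), ("front page", 0.12), ("top", 0.05)])

# TikTok-style: cohorts are interest clusters, seed audience first.
tiktok_path = spread([("seed fans", 0.40), ("adjacent interests", 0.08)])
```

Under this model the two sites differ only in how cohorts are drawn (time vs. inferred interest), which is exactly the claim being made — and exactly what the reply below disputes, since real systems fold in many more inputs than a single engagement rate.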
| ▲ | fc417fc802 2 days ago | parent [-] | | I understand the basic principle. Clearly that's one of the inputs. What I'm questioning is your implied assertion that there's nothing else to it. I don't for a second believe that tiktok (or facebook or any of the others) employs a primitive algorithm that impartially orders results based on a simple and straightforward metric without consideration for their own interests. | | |
| ▲ | nemothekid 2 days ago | parent [-] | | >I don't for a second believe that tiktok (or facebook or any of the others) employs a primitive algorithm Is your contention that whatever future law emerges would need some mechanism to decide the complexity of the algorithm? How would you design a law such that the Reddit ranking algorithm is "primitive" but TikTok's algorithm is "advanced"? | | |
| ▲ | fc417fc802 2 days ago | parent | next [-] | | You're changing the subject. I said nothing about the law; I only objected to a claim about the internal mechanisms of TikTok. If we're discussing hypothetical laws, then my preference is for several: banning various dark patterns (what the EU is doing here), banning opaque individualization outside the control of the individual in question, and banning motivated editorialization (such as intentionally promoting a particular political position). And yes, a straightforward application of what I wrote there would make the Netflix recommendation algorithm as it currently stands illegal. I have no problem with that. | |
| ▲ | dTal 2 days ago | parent | prev [-] | | Reddit is as bad as the others, now. |
|
|
|
|
|
| |
| ▲ | andrewjf 2 days ago | parent | prev | next [-] | | I agree with what OOP said. But it’s not my intent to “shut sites down.” I hold this view to try to increase the diversity of media consumption and break people out of echo chambers. If your business model is so shit that you have to exploit weaknesses in human brains to keep people viewing ads, and you can’t adapt, then that’s your problem. If you have an algorithm whose sole purpose is to drive “engagement” with your own platform (by intentionally and purposely pushing clickbait, ragebait, and media that keeps reinforcing your clicks), you should no longer get Section 230 protections - you are no longer a neutral party. These algorithms exist to create echo chambers and keep you clicking so you can consume more ads. I would love to hear other ways of solving the problems of social media. | | |
| ▲ | Aurornis 2 days ago | parent | next [-] | | > I have this view to try to increase diversity of media consumption and break people out of echo chambers. Making sites liable for all user-posted content would do the reverse of this. Every platform that lets people submit content would have to stop doing that, because it’s an impossible liability to manage. You’d have to host your own site. You wouldn’t be able to share anything about it on a social media site, because it’s user-generated content. No visitors unless you advertise it through paid contracts with companies that can review it and decide to accept the liability. | | |
| ▲ | ryandrake 2 days ago | parent | next [-] | | Newspaper "Letters to the Editor" manage to do this. Users "submit" things to the newspaper, the editor curates and decides what to keep and what not to, and then the newspaper publishes the user generated content. Just like social media: Users submit things to the site, TheAlgorithm curates and decides what to keep and what not to, and then the site publishes the user generated content. If web sites and social media can't "scale" to do this, then maybe they should scale down. "Making sites liable for all user-posted content" would not kill social media, but would definitely scope it down to what can be effectively curated. | | |
| ▲ | throwaway902984 2 days ago | parent [-] | | I don't think there are enough dangs to effectively curate much of the internet, and how much would scaling it back cut? 95%? That's before settling on a definition of "effectively curate," I suppose. | | |
| ▲ | fc417fc802 2 days ago | parent [-] | | "Effectively curate" here simply means "willing to take legal responsibility for" (although in practice I assume there would be an insurance policy involved because that's just how things are done). |
|
| |
| ▲ | fc417fc802 2 days ago | parent | prev | next [-] | | I notice that parent describes "engagement" algorithms and you somehow jump to "all sites". So I think we'd see "engagement" algorithms disappear and very primitive approaches with prominent transparency measures in place would replace them. I expect we'd all be better off were that to happen. | |
| ▲ | freejazz 2 days ago | parent | prev [-] | | >Every platform that lets people submit content would have to stop doing that, because it’s an impossible liability to manage. This is a huge assumption that is offered constantly, and always, without any evidence at all. | | |
| ▲ | throwaway902984 2 days ago | parent [-] | | "Letters to the editor" curated by employees would become part of their business model, and regular contributions would go away? Why would that assumption be incorrect? I wouldn't run a website where a casual user having a moment could result in my imprisonment. I would only allow non-LGBTQ content that didn't mention race or immigration, as the chilling effect there is real. A DA would for sure come after me if my site became influential. | |
|
| |
| ▲ | thfuran 2 days ago | parent | prev [-] | | Ban third-party advertising. |
| |
| ▲ | freejazz 2 days ago | parent | prev | next [-] | | >Same thing. There is no Hacker News if Y Combinator becomes liable for user submitted content. Why is this assumed to be true? | | |
| ▲ | NewsaHackO 2 days ago | parent [-] | | If YCombinator had to officially approve every article submitted, it would become the publisher of a news site, not a social media site. Essentially, it would be a New York Times site with unpaid writers. | | |
| ▲ | freejazz 2 days ago | parent [-] | | And? The New York Times website exists, last I checked. | | |
| ▲ | NewsaHackO 2 days ago | parent [-] | | I guess I am not seeing your point. A site that is completely a blank page exists also. | | |
| ▲ | freejazz 2 days ago | parent [-] | | Well, the argument was that Hacker News would no longer exist. I asked why, and your response was that it would be like the NY Times; but the NY Times website does exist, so I don't understand what point you are trying to make. | | |
| ▲ | NewsaHackO 2 days ago | parent [-] | | Got it. If the page no longer fulfills the original purpose people went to it for, it ceases to be interesting. The fact that the page merely exists is meaningless, much like a blank website. | | |
| ▲ | freejazz 2 days ago | parent [-] | | Well, you pointed to the NY Times, which, again, has not changed, so what is your point? Maybe the NY Times is not a good example? I don't know; you brought it up. Are you saying the NY Times is not an interesting website? It seems to also have the news and discussion of the news, so what exactly am I missing? | | |
|
|
|
|
|
| |
| ▲ | weregiraffe 2 days ago | parent | prev | next [-] | | > It’s back to only reading content produced and curated by companies for us I didn’t know only companies can have websites. | | |
| ▲ | buu700 2 days ago | parent [-] | | It's a matter of resources, not corporate status per se. For better or for worse, the current status quo largely democratizes content promotion. You and I can post these two comments here and put our ideas and names in front of a bunch of strangers for $0. In a world where the risk-adjusted cost of allowing third-party comments on your platform shoots up, someone has to pay that cost. A personal blog hosted on your server might struggle to find any significant reach without a real advertising budget, because distributing speech/content that promotes your platform would no longer be ~free. I don't necessarily believe that the major social media platforms would fully evaporate, but I'd expect some or all of these changes across the ecosystem: * Massively scaled up LLM-based moderation/censorship. * Replacement of direct user content posting with an LLM-based interface (to chat with an LLM about what you want it to write on your behalf). * Payment-gated public posting, e.g. monthly or per-post fees to cover liability/insurance and/or LLM inference costs. Possibly higher fees for direct authorship vs LLM pair posting. * Massive rise in adoption of decentralized architectures, either via current mainstream platforms if legally tolerated or via anonymous dark web platforms otherwise. Maybe Tor becomes as normalized as VPNs, or maybe the Western legal environment shifts hard against general-purpose computing. I understand where this sentiment is coming from, but I think it's taking a lot of the current status quo for granted. What you guys are proposing isn't necessarily a targeted change that would simply make bad guys stop doing bad things. It's more likely a massive structural change that would dramatically alter the social and economic fabric of the internet as we know it, and not in a way that most of us would like. |
| |
| ▲ | buellerbueller 2 days ago | parent | prev [-] | | >It’s an obvious backdoor play to make sites go away. Oh no. |
|