mikeocool 16 hours ago

Kinda seems like we’re rapidly headed for the complete collapse of the internet as we know it.

Every site that is driven by user posting seems to be headed towards being overrun by AI bots chatting with each other, either for sake of promoting something or farming karma.

And there’s really not much point in publishing good content anymore, since AI is just going to slurp it up and regurgitate it without driving you any traffic.

Though it’ll be interesting to see what happens to ChatGPT and the like once the amount of quality content for them to consume slows to a trickle. Will people still use ChatGPT to get product recommendations without Reddit posts and Wirecutter providing good content for those recommendations?

deanc 6 hours ago | parent | next [-]

The bot problem cannot be solved. Even if you strongly authenticate, people are letting bots act on their behalf (moltbook is a great example of this), and what's to stop people from doing that in the future? Build your identity and reputation autonomously, with the benefits that come with that.

This happens now on OnlyFans too. Content creators hire agencies which, in the best case, outsource chatting with "customers" to armies of cheap labour in Asia, and in the worst case use bots.

The dead internet theory [1] is probably not just a theory anymore. HN recently made a policy disallowing AI posts and posters, but do you honestly think that's going to work? I would place a bet that within the next year a top HN poster is outed as using AI to post on their behalf.

[1] https://en.wikipedia.org/wiki/Dead_Internet_theory

gzread an hour ago | parent | next [-]

"content creators" https://fgiesen.wordpress.com/2025/07/06/content-creator/

echelon 18 minutes ago | parent [-]

The word "content" is gross.

"Creator", on the other hand, is beautiful. It means you don't have to pick a lane. Anything can be creative. Documentary filmmaking, stop motion, dance, costume work, historical reenactment, indie animation, economics essays, game dev...

The problem is we don't have a nice word that holistically captures the output of creators. They're not all making films or illustrations. So what do you call it? "Art" is awkward.

"Content" works, but it sounds like slop. We need a better word, one that elevates creative output.

tlonny 3 hours ago | parent | prev | next [-]

Indeed - the future is RL meet-ups and small, intimate online communities.

Perhaps not the worst thing in the world?

JimDabell 4 hours ago | parent | prev [-]

> people are letting bots act on their behalf (moltbook is a great example of this) and what's to stop people doing that in the future.

Verifiable credentials: services can be given persistent pseudonymous identifiers that are linked to a real-world identity. Ban them once and they stay banned. It doesn’t matter if a person lets a bot post inauthentic content using their identity if, when they are caught, that person cannot simply register a new account. This solves a bunch of problems – online abuse, spam, bots, etc. – without telling websites who you are or governments what you do.
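
A minimal sketch of how such per-service pseudonyms could work, assuming a credential issuer that has already verified the real-world identity (all names and keys here are illustrative, not any real protocol):

```python
import hashlib
import hmac

# Hypothetical issuer secret: held only by the credential provider,
# never shared with the services or the user.
PROVIDER_SECRET = b"issuer-secret-key"

def pseudonym_for(real_identity: str, service_domain: str) -> str:
    """Derive a stable, per-service pseudonymous identifier.

    Same person + same service -> same ID, so bans stick.
    Same person + different services -> unlinkable IDs, so privacy holds.
    """
    msg = f"{real_identity}|{service_domain}".encode()
    return hmac.new(PROVIDER_SECRET, msg, hashlib.sha256).hexdigest()

# The forum always sees the same ID for this person, but cannot
# correlate it with the ID the shop sees.
alice_on_forum = pseudonym_for("alice-passport-123", "forum.example")
assert alice_on_forum == pseudonym_for("alice-passport-123", "forum.example")
assert alice_on_forum != pseudonym_for("alice-passport-123", "shop.example")
```

The key design point is that the service only ever learns the HMAC output, so a ban list of pseudonyms is durable without the service learning who anyone is.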

gzread an hour ago | parent | next [-]

The ability to make a new account is an important defense against abusive bans. You don't want it to be possible for Google to unperson you.

dom96 3 hours ago | parent | prev | next [-]

This is exactly right. The problem is the friction that this kind of system adds.

Even so, I implemented this and I wrote about it here: https://blog.picheta.me/post/the-future-of-social-media-is-h...

flomo 3 hours ago | parent | prev | next [-]

IMO this is inevitable. HN is freaking out about the end of the anonymous internet, but it's already over and we're just figuring it out. Eventually the bots will find their 90s cyberpunk cosplay IRC channel too.

Terr_ 4 hours ago | parent | prev [-]

I'd rather have a system where there's a small investment cost to making an account, but you could always make another.

Imagine a system where there's a vending machine outside City Hall: you spend $X on a charity of your choice, and you get a one-time, anonymous token. You can "spend" it with a forum to indicate "this is probably a person, or close enough to it."

Misuse of the system could be curbed by making it so that the status of a token cannot be tested non-destructively.
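
A toy sketch of that last property, assuming the issuer stores only token hashes: the sole verification API spends the token, so there is no way to probe a token's validity without consuming it (everything here is illustrative).

```python
import hashlib
import secrets

class TokenIssuer:
    """One-time anonymous tokens with destructive verification only."""

    def __init__(self):
        self._unspent = set()  # hashes only; raw tokens are never stored

    def issue(self) -> str:
        token = secrets.token_hex(16)
        self._unspent.add(hashlib.sha256(token.encode()).hexdigest())
        return token

    def redeem(self, token: str) -> bool:
        """The only check offered: spends the token if it is valid.
        Deliberately, there is no read-only is_valid() method."""
        h = hashlib.sha256(token.encode()).hexdigest()
        if h in self._unspent:
            self._unspent.remove(h)
            return True
        return False

issuer = TokenIssuer()
t = issuer.issue()
assert issuer.redeem(t) is True   # the forum accepts the signup
assert issuer.redeem(t) is False  # any second use, or a probe, fails
```

Because checking is spending, a middleman cannot verify a batch of tokens before reselling them, which is the misuse-curbing property described above.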

gzread an hour ago | parent | next [-]

Something Awful made you pay $10 for an account, directly to the forum. If you got banned, you could pay another $10 to try again. Somehow this didn't lead to incentives as bad as you'd think it would.

tlonny 3 hours ago | parent | prev | next [-]

I’d love something like this implemented for email.

Sending an unsolicited email to a random person X requires you to pay a small toll (something like 50p).

Subsequent emails can then be sent for free - however person X can “revoke” your access any time necessitating a further toll payment.

You would of course be able to pre-authorise friends/family/transactional emails from various services that you’ve signed up for.

This would nuke spam economics and be minimally disruptive for other use cases of email IMO…
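
The proposed flow can be sketched as a small allowlist state machine (the toll amount, method names, and behaviour are my illustrative assumptions, not a real email protocol):

```python
class TollInbox:
    """Sketch of the toll scheme: first contact costs a toll,
    subsequent mail is free until the recipient revokes access."""

    TOLL = 0.50  # e.g. 50p per unsolicited first contact

    def __init__(self):
        self.allowed = set()  # senders who have paid or been pre-authorised

    def preauthorise(self, sender: str) -> None:
        # Friends, family, and transactional services you signed up for.
        self.allowed.add(sender)

    def revoke(self, sender: str) -> None:
        # Recipient kicks a sender out; their next email needs a fresh toll.
        self.allowed.discard(sender)

    def accept(self, sender: str, toll_paid: float = 0.0) -> bool:
        if sender in self.allowed:
            return True
        if toll_paid >= self.TOLL:
            self.allowed.add(sender)  # subsequent emails are free
            return True
        return False

inbox = TollInbox()
assert not inbox.accept("stranger@example.com")             # no toll: rejected
assert inbox.accept("stranger@example.com", toll_paid=0.5)  # toll paid
assert inbox.accept("stranger@example.com")                 # now free
inbox.revoke("stranger@example.com")
assert not inbox.accept("stranger@example.com")             # must pay again
```

The economics argument is visible in the sketch: a spammer's cost scales linearly with the number of first contacts, while a legitimate correspondent pays at most once per relationship.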

jimmydorry an hour ago | parent [-]

>transactional emails from various services that you’ve signed up for

These are among the main culprits of unwanted emails... and a toll system would make them all the more valuable for even worse actors to take advantage of.

JimDabell 4 hours ago | parent | prev [-]

Do you think there is a price point that locks out spammers without locking out poor people?

bobthepanda 3 hours ago | parent [-]

Probably not. The problem is that spammers/scammers are looking for whales, and if you are talking about draining the retirement accounts of an American who's been saving all their life, that's quite a big payout, in the six or seven figures.

phtrivier 18 minutes ago | parent | prev | next [-]

Asking people for money in order to read stuff, and promoting the content people are actually willing to part with real money for, is an interesting first step. (See: Substack, Patreon, etc...)

I know this is going to sound horrible, but: how about charging money to contribute, period? Maybe have a free tier of a couple of comments, etc... But if you want to build a troll factory, sure... show us the cash?

kdheiwns 7 hours ago | parent | prev | next [-]

With AI running rampant, it seems security through obscurity is basically the best thing we have. Everyone knows reddit, facebook, xitter, etc, so any clown can and does have bots running loose. HN is "obscure" in that most normies don't know about this place, and so it's relatively safe from the floods of spam. But I think it's just a matter of time until non-tech people start looking for those few bastions of human comments online and come across this place, and a great flood will begin that can never be undone. After that, I guess it'll be a rise of invite-only forums like we had in the early 2000s all over again.

tlonny 3 hours ago | parent | next [-]

HN may not be “mainstream” but it is certainly _very_ vulnerable to bot spam given the topics discussed and the make-up of the audience.

You can already see it happening now, at least with the bots that write like vanilla Claude/ChatGPT. Presumably there is a much larger hidden cohort of bots that are instructed to talk more naturally and are thus more adept at flying under the radar…

r721 5 hours ago | parent | prev | next [-]

Dang told me in 2019 that HN gets 150M page views a month, so it's not that obscure actually:

https://news.ycombinator.com/item?id=21201120

ahofmann 5 hours ago | parent [-]

150M page views a month is peanuts and very far away from the "social" networks' numbers. I don't have their numbers, but I know how many page views we had in 2011 while running a German browser game community.

armchairhacker 3 hours ago | parent [-]

The internet seems to have grown massively within the past couple years (unfortunately, almost certainly because of bots). I bet the number today is orders of magnitude higher.

firecall 4 hours ago | parent | prev | next [-]

> After that, I guess it'll be a rise of invite-only forums like we had in the early 2000s all over again.

Which would be totally fine with me TBH.

Rather amusingly, invite-only torrent sites might be the only semi-public authentically human hangouts left on the internet!

ultratalk 6 hours ago | parent | prev | next [-]

Eternal AI september.

JetSetIlly 5 hours ago | parent [-]

Eternal LLMber

Mountain_Skies 6 hours ago | parent | prev [-]

I've asked ChatGPT a question about something I read in a thread here and it responded with a comment from that thread, even though the thread was less than an hour old. HN is well known in the tech community and there are certain subjects, especially anything involving Israel or India, that nearly instantly result in a flood of comments from bad actors. HN isn't Reddit but it's also a shadow of what it once was, which is driving away more of the productive participation in favor of agenda-based posting.

gzread an hour ago | parent | next [-]

Search engines seem to index HN in near real time. They must have custom scraping code to follow the incrementing post IDs.

WesolyKubeczek 4 hours ago | parent | prev [-]

Note that these topics often involve comments which you can predict very easily. Internet users are like that, agenda or no. Wasn’t it in the heyday of forums that you could recognize the most prolific/annoying members by their style and vocabulary? A model should have no problem pulling such things off.

october8140 5 hours ago | parent | prev | next [-]

The future is human curated content. Provide the same experience people get today but without the noise. Give them just the good stuff and don't let just anyone make a post. A book has an author, a movie has a director, maybe websites can have webmasters again who filter through the garbage for you.

spicymaki an hour ago | parent | next [-]

The future is meeting in person and watching performers actually perform live.

nicbou 3 hours ago | parent | prev | next [-]

AI is sucking up that content and denying traffic to its creators. This model is becoming obsolete.

Gud 4 hours ago | parent | prev | next [-]

It’s what I’m trying to accomplish with my website(link is in my profile). Just trying to crank up the signal to noise ratio.

rambambram 3 hours ago | parent [-]

Nice. I like how clicking a tag also makes the word 'tag' light up.

Gud an hour ago | parent [-]

Thanks for the kind words!

I got encouraged by another HN poster a few days ago, let me know if you have any suggestions.

I’m always open to criticism.

kaizenb 3 hours ago | parent | prev | next [-]

A curator with great taste and judgement is king.

kolinko 3 hours ago | parent | prev | next [-]

human curated -> human moderated. I, for one, don't care if it's ai, or human-written. I care if it's interesting/useful.

kaizenb 3 hours ago | parent [-]

results are important, not the tools or process. (on this matter)

jacamera an hour ago | parent [-]

Results over time are important. Or at least they should be.

b112 5 hours ago | parent | prev [-]

Yes, precisely.

This means that only sites which verify identity will have any value in the future. And by verified, that means against government ID and verified as real.

No amount of sign up fee works as an alternative.

Note that a site can verify identity, prevent sock puppets, ban bad actors and prevent re-registration, all while keeping that ID private.

You still get a handle and publicly facing nick if you want it.

The company which handles this correctly will have a big B after it. Digg actually has a chance at this.

It has no users, so the outrage won't exist in the same capacity. Existing platforms will be pummeled in the market if they try to convert to this type of site, as their DAU will likely drop a thousandfold, just due to the eliminated bots.

But Digg could relaunch this way. And as exhibited, this is now the only way.

The age of the anonymous internet is over, it's done. People not realizing this are living in the past.

Note, I don't like this, but acknowledging reality is vital. Issues with leaked databases, users, and hacking of PII are all technical and legislative issues, and not relevant to whether or not this happens.

Because it will happen, and is happening.

It should be noted that falsifying ID is a crime. Fake IDs coupled with computer fraud laws will eventually result in hefty jail time. This is sensible, if people want a world where ecommerce and discourse are online... and the general public does, having exhibited a complete lack of care about privacy regardless.

visarga 2 hours ago | parent | prev | next [-]

> Though it’ll be interesting to see what happens to ChatGPT and the like once the amount of quality content for them to consume slows to a trickle.

The creative loop moves inside the agentic chat room, where we do learning, work, art, research, leisure, planning, and other activities. OpenAI is already close to 1B users and puts multiple trillions of tokens per day into our heads, while we put our own tokens into their logs: an experience flywheel, or extended-cognition wheel, of planetary size. LLMs can reflect on which of their responses compound better in downstream activities and derive RLHF/RLVR signal from all our interactions. One good thing is that a chat room is less about posing than a forum, though LLMs have taken to sycophancy, so they are not immune, just easier to deal with than forums. And you can more easily find another LLM than a replacement speciality forum.

shellfishgene 5 hours ago | parent | prev | next [-]

This could be positive. So far things were gamed and manipulated to some extent, with some fake content, but it was never too obvious, and it was a bit of a cat and mouse game with filters and whatnot. Now it's so easy to fake content that robust systems will have to evolve, or most social media sites will become worthless, and advertisers will eventually catch on that they are paying for bot-only sites. The downside, of course, is that these robust systems are hard to imagine without complete loss of anonymity for the users.

armchairhacker 3 hours ago | parent [-]

Web of trust weakens anonymity, but doesn’t eliminate it.

- You know who your online invitees are, but not your invitees-of-invitees-of-…

- You can create an account, get it invited, then create an alt account and invite it. Now the alt account is still linked to you, but others don’t know whether it’s your friend or yourself. (Importantly, you can’t evade bans with alts; if your invited users keep getting banned, you’ll be prevented from inviting more if not banned yourself)
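
A toy sketch of that accountability mechanism, under my own illustrative assumptions (a flat invite map and a simple "two banned invitees and you're cut off" rule):

```python
class InviteGraph:
    """Web-of-trust invites: you know your direct invitees,
    and repeated bans among them reflect back on you."""

    def __init__(self):
        self.inviter = {}   # user -> who invited them
        self.banned = set()

    def banned_invitees(self, user: str) -> int:
        return sum(1 for u, inv in self.inviter.items()
                   if inv == user and u in self.banned)

    def invite(self, inviter: str, invitee: str) -> bool:
        # Banned users, or users whose invitees keep getting banned,
        # lose the ability to invite: alts can't be used to evade bans.
        if inviter in self.banned or self.banned_invitees(inviter) >= 2:
            return False
        self.inviter[invitee] = inviter
        return True

    def ban(self, user: str) -> None:
        self.banned.add(user)

g = InviteGraph()
assert g.invite("root", "alice")
g.invite("alice", "bot1")
g.invite("alice", "bot2")
g.ban("bot1")
g.ban("bot2")
assert g.invite("alice", "bot3") is False  # alice can no longer invite
```

Note that the graph only records who invited whom, not who anyone is, which is how the scheme weakens anonymity (your inviter knows you) without eliminating it (nobody else does).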

diacritical 13 hours ago | parent | prev | next [-]

> And there’s really not much point in publishing good content anymore, since AI is just going to slurp it up and regurgitate it without driving you any traffic.

You just published good content knowing AI will slurp it up and not give you any traffic in return. I'm now replying to you with more content with the same expectations about AI and traffic. Why care about AI or traffic or recognition? Isn't the content the thing that matters?

It's like answering technical questions in an anonymous/pseudonymous chat or forum, which I'm sure you've done, too. We do it to help others. If an AI can take my answer and spread it around without paying me or mentioning one of my random usernames I change every month or so, I would be happy. And if the AI gives me credit like "coffeecup543 originally posted that on IRC channel X 5 years ago", I couldn't care less. It would be noise to the reader. Even if the AI uses my real name, so what?

The people who cared about traffic and money from their posts rarely made good content anyway. Listicles, affiliate marketing BS, SEO optimizations, stretching a video that could be 1 minute into 10 minutes or text that could've been 5 articles into a long book: all of this existed before AI. With AI I actually get less of this crap; I can either skip it or condense it.

wibbily 9 hours ago | parent | next [-]

It's two different problems. People who run review sites and blogs and such care about traffic, and not getting attribution will kill their desire to participate. People who post here and on Reddit etc. care about talking with other human beings, and feeling ignored in a sea of botspam will kill *their* desire to participate.

NitpickLawyer 7 hours ago | parent [-]

> feeling ignored in a sea of botspam will kill their desire to participate.

The bots are not really that bad; they're (still) pretty easy to spot and not engage with. I'm more perplexed by the negativity-filled comment sections, and I'm pretty sure most of those posters are real grass-fed certified humans.

I don't get why negative posts get so upvoted and so popular on the front page, and why people still debate with outdated arguments in them. People come in and fight imagined demons, make straw-man arguments, and in general promote negative stuff like there's no tomorrow. I think you can get so much more signal from positive examples, from "hey I did a thing" type posts, and so on. Even overhyped stuff like the claw-mania can still be useful. Yet the "I did a thing" posts get so overwhelmed by negativity, nitpicking, and "haha, not perfect means DOA" type messages. That makes me want to participate less...

Defletter 6 hours ago | parent [-]

Oh that's just human nature: there's a reason why trashy tabloids continue to exist despite how public sentiment seems to universally agree that they're awful spreaders of rumour and insecurity. More people are Skankhunt42 than we'd like to admit.

Terr_ 4 hours ago | parent | prev | next [-]

That's a little bit apples to oranges, because I'm not monetizing this content, or paying to host it, or trying to make a personal brand, etc.

intended 5 hours ago | parent | prev [-]

Yes and no.

In the most simple sense - Yes, it is the content that matters.

In the more practical sense - cognitive and emotional resources are limited and our brains are not content agnostic.

We have different behaviors, expectations and capacities for talking to machines and talking to humans.

For example, if I am engaging with a human I can expect to potentially change their minds.

For a machine? Why bother even responding? It's of no utility to me to respond.

Furthermore, all human communication comes with a human emotional context. There are vast amounts of information implied through tone, and through what we choose not to say. Sometimes people say things in one emotional state that they would not say in another.

To move the conversation forward, addressing the emotional payload behind the words used, matters more than the words used themselves.

There are myriad reasons why humans are practically poorer for these tools.

nicbou 3 hours ago | parent | prev | next [-]

Every website that was driven by traffic is also dying. I have put nearly a decade of work into mine, and AI overviews and ChatGPT have reduced traffic by over 60%. At some point I will need to give up and find a job, and that corner of the internet will get no new original information, just rehashed slop.

nativeit 3 hours ago | parent | prev | next [-]

As someone who came of age before “the internet as you know it”, I am looking forward to all of the cancerous Web 2.0 OG slop and narcissism factories succumbing to their own fates. Let me tell you, the internet as we know it sucks, and the internet it ate 25 years ago is a marked improvement. We should be so lucky. Now go write a personal blog in plain text, and rejoice.

dartharva 2 hours ago | parent | prev | next [-]

You mean a complete collapse of social media, not the whole internet. The internet is a telecom ecosystem and has a lot more to it than just forums and link aggregators.

I honestly believe it might not even be such a bad thing. People were arguably better without social networks and media, and it's perhaps better to let the cancerous thing just die and keep the internet just as a utility powering boring things like banking and academia.

dana321 15 hours ago | parent | prev | next [-]

That and most of the news being behind a paywall, which they can scrape anyway.

The internet archive is my safe haven these days, i can go back and remember the old internet.

mikeocool 15 hours ago | parent [-]

Ha yeah, I quite like the 2003 vintage.

bobsmooth 13 hours ago | parent | prev [-]

Unless you're allowed to say slurs without being banned, your forum will be overrun with bots. The sanitization of the internet is the perfect breeding ground for brand-safe AI promotion bots.

georgeburdell 10 hours ago | parent | next [-]

4chan has bots too.

seattle_spring 6 hours ago | parent | prev [-]

Curious how you came to that conclusion. Anecdotally, places where you can slur to your heart's content like /r/conservative seem far more inundated with bots than other areas of Reddit. I feel like that's really saying something too, because Reddit has a really bad bot problem overall.