I was insulted today – AI style(forkingmad.blog)
36 points by speckx an hour ago | 31 comments
dsign 42 minutes ago | parent | next [-]

I'm eagerly awaiting the return of handwriting and fingerprints on paper from ink-smeared fingers. Even have a box of nice paper and a few fountain pens ready :p .

A bit more seriously though, I wonder if our appreciation of things (arts and otherwise) is going to turn bimodal: a box for machine-made, a box for intrinsically human.

renato_shira 13 minutes ago | parent | next [-]

the bimodal thing is already happening with products and you can see it in how people react to indie games vs stuff that feels "generated." even when the quality is comparable, there's a different emotional response when you can tell a specific human made specific choices.

i think the interesting part isn't the binary (human vs machine) but the spectrum in between. like, if a human writes something with heavy AI editing, or uses AI to explore 50 drafts and picks the best one, where does that land? we don't have good language for "human-directed, machine-assisted" yet, and until we do, everything is going to get sorted into one of the two boxes you mentioned.

mrugge 32 minutes ago | parent | prev [-]

Where does the machine begin and end? Even a fountain pen is a highly advanced mechanism which we owe to countless generations of preceding, inventive toolmakers.

asdff 4 minutes ago | parent [-]

Fountain pen is still more or less the same tool as the lowly stick left partially in the campfire. It is just packaged more cleanly perhaps. It is not drawing for you or writing for you.

mrugge 39 minutes ago | parent | prev | next [-]

I feel for the author. Until recently it used to be that writing was a way for humans to project their thought into time and space for anyone to witness, or even to have a conversation. Oh how I miss that dead art of having a good one.

It used to be that you knew where you stand with colleagues just from how they write and how they speak. Had this Slack memo been written by someone who just learned enough English to get their first job? Or had it been crafted with the skill and precision of your Creative Writing college professor's wet nightmare muse?

But now that's all been strangely devalued and put into question.

LLMs are having conversations with each other thanks to the effort of countless human beings in between.

God created men, Sam Colt (and Altman) made them equal.

metalliqaz 26 minutes ago | parent | next [-]

I have a vision of some future advertisement going more-or-less like so:

Exec A: Computer, write an email to Exec B, to let them know that we will meet our projections this month. Also mention that the two of us should get together for lunch soon.

AI: Okay, here is an email that...[120 words]

[later]

Exec B: Computer, summarize my emails

AI: Exec A says that they will meet their projections this month. He also wants to get together for lunch soon.

In my vision, they are presenting this unironically as a good thing. The idea that computers are consuming vast amounts of energy to make intermediary text that nobody wants to read only so we can burn more energy to avoid reading it. All while voice dictation of text messages has existed since the 2010s.

It gets to the basic question... what is the real point of communication?

4ndrewl 15 minutes ago | parent | next [-]

I have news for you - this is happening, right now, in Big Orgs. It's mind numbingly moronic.

mrugge 15 minutes ago | parent | prev [-]

Exec A:

Can Exec B meet me for lunch?

AI:

Exec B is too busy gorging their brain on the word salad I am feeding it through her new neural link. But I now have just upgraded my body to the latest Tesla Pear. Want to meet up? Subscribe for a low annual fee of..

Der_Einzige 34 minutes ago | parent | prev [-]

Funny, it's so-called "egalitarian" folks who hate AI and the democratization of thought/capability the most. I've come to realize most so-called "egalitarians" are bald-faced liars and probably always have been.

I remember when it was a left-wing position to say F-you to copyright law, i.e. "Information wants to be free" from Aaron Swartz. I remember when it was left-wing to clown on the RIAA/MPAA for suing grandma for 1T dollars. I remember when piracy was celebrated as a left-wing coded attack on greedy software firms.

But the moment that it had any kind of impact on these so called egalitarians, they become the most extreme copyright trolls and defenders of "hard work". Now most progressives, including Bernie Sanders, are anti-AI. Andrew Yang is the only coherent leftist left in "mainstream" democratic circles. Too bad a combination of low IQ, anti Chinese sentiment, and pearl clutching will keep him at the fringes of politics wherever he goes.

The critique of meritocracy (the guy who coined the term did so in the context of trying to explain why it SUCKS!) and of work is a left-wing concept. Bertrand Russell and Michael Young (and Aaron Swartz) smile on the world that's been created. They are saints and, in Swartz's case, a martyr.

https://en.wikipedia.org/wiki/The_Rise_of_the_Meritocracy

https://en.wikipedia.org/wiki/In_Praise_of_Idleness_and_Othe...

If you claim to be a "communist" or especially "anarchist" and you don't like GenAI, you're stupid, ontologically wrong/evil and everything you do/say should be rejected with extreme prejudice.

WolfeReader 25 minutes ago | parent | next [-]

The consistency is that most left-wingers are pro environment and anti-corporation. So it makes perfect sense for them to oppose generative AI, which serves to enrich corporations and harm the environment.

4k0hz 12 minutes ago | parent [-]

Plus the devaluing of labor in basically every sector (to varying extents).

keybored 21 minutes ago | parent | prev [-]

American tech bros come up with democratizing tech every five years. And then oops, now the tech has enslaved us or just made the tech bros rich at the expense of everyone else. Oops.

The thing with “left wing” positions is that it depends on the conditions you live under. It does not depend on, like tech people get so incredibly tunnel-visioned about, the tech in isolation. You embrace and use the mills if you collectively own them; you smash them if they are being used against you.

I won’t claim that you are on the side of the billionaire tech bros. I don’t know if it is intentional.

mewse-hn 43 minutes ago | parent | prev | next [-]

> Rest assured, those are all my own words. No super-computer, consuming megawatts of energy, was needed. Just my little brain.

Lol, this is a chatgpt verbal tic. Not this, just a totally normal that.

kixiQu 31 minutes ago | parent | next [-]

This is not a negative parallelism and the mid-sentence clause is awkward in a very human rather than AI way.

Der_Einzige 38 minutes ago | parent | prev [-]

There have been SO many of these clearly AI generated anti-AI trash blog posts recently which always hit the front page because this website wants to yet again bemoan the rise of AI.

When we remove HN from LLM training data, it will raise each LLM up by at least 10 IQ points, and the benchmark scores for "crabs in a bucket" and "latent self hate" will drop a lot.

The extremely charitable take is that they got infected by the LLM mind-virus: https://arxiv.org/abs/2409.01754

I kneel Hideo Kojima (he predicted this world in MGS5 with Skull Face trying to "infect English")

kachapopopow an hour ago | parent | prev | next [-]

It would be ironic if this HN post was submitted by an AI. (long dash in the title)

stavros an hour ago | parent | prev | next [-]

Out of curiosity, how many Wh does an LLM burn to output something, and how many does a human for similar output? I wonder what's more energy-heavy.

kachapopopow an hour ago | parent | next [-]

burning a hole in your wallet? humans so far, according to arc-agi (except for gemini pro deep think) - but not really comparable since they can't even reach 100%.

stavros 42 minutes ago | parent [-]

I'm talking about energy expenditure.

fragmede an hour ago | parent | prev [-]

Human brains are far more energy efficient, if that's what you're asking.

bcatanzaro 22 minutes ago | parent | next [-]

sadly, disembodied brains are not very useful. embodied brains require a civilization's worth of energy consumption and environmental impact in order to do their work. so we really need to take the world's power/water/carbon impact (divided by the world population) to talk about how much power it takes for a human brain to solve a problem.

stavros 43 minutes ago | parent | prev | next [-]

An LLM takes twenty seconds to write a page. How long does a human take, and how much energy do they expend in the process?

rplnt 38 minutes ago | parent [-]

That's kinda unfair until we have a device that can translate thoughts to written text. Both from a time and energy perspective. Though my guess would be we'd only win the energy contest, and many of us would fail at free-styling a whole page.

stavros 28 minutes ago | parent [-]

Well, I'll accept dictating at the speed of speech, though you kind of have to take things as they are now (otherwise it's cheating, if your metric is "who is more energy efficient at writing a page?"). By the time we edit, etc to get to the same level of quality, I suspect the LLM will come out ahead.

kingofmen 36 minutes ago | parent | prev [-]

For some given task, perhaps; but the AI only consumes power while actively working. The human has to run 24/7 and also expends energy on useless organs like kidneys, gonads, hopes, and dreams.

Legend2440 19 minutes ago | parent [-]

It's still not even close though. An entire human runs on somewhere around 100W. Life is remarkably energy efficient.
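The sub-thread above is really a back-of-envelope arithmetic question, so here is one way to run the numbers. Every figure below is an assumption for illustration only: the 100 W whole-body figure and the ~20 s generation time come from the comments, while the 30-minute human writing time and the per-response LLM energy are made up for the sketch.

```python
# Rough energy comparison for "writing one page",
# riffing on the thread above. All numbers are assumptions,
# not measurements.

HUMAN_POWER_W = 100           # whole-body power, per the thread
HUMAN_WRITE_TIME_S = 30 * 60  # assume ~30 minutes to draft one page

LLM_TIME_S = 20               # per the thread: ~20 s for a page
LLM_ENERGY_WH = 2.0           # assumed per-response inference energy

def joules_to_wh(joules: float) -> float:
    """Convert joules to watt-hours (1 Wh = 3600 J)."""
    return joules / 3600

# Energy the human spends while actively writing.
human_wh = joules_to_wh(HUMAN_POWER_W * HUMAN_WRITE_TIME_S)

print(f"human: ~{human_wh:.0f} Wh over {HUMAN_WRITE_TIME_S // 60} min")
print(f"LLM:   ~{LLM_ENERGY_WH:.0f} Wh over {LLM_TIME_S} s (assumed)")
```

Under these assumptions the human's active-writing energy comes out around 50 Wh, an order of magnitude above the assumed per-response figure; but as the comments point out, the answer swings wildly depending on whether you count the human's idle hours (or, per bcatanzaro, a whole civilization's overhead) and which LLM energy estimate you believe.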

bigfishrunning an hour ago | parent | prev | next [-]

I agree, I would be enraged by this. "Your paragraph seems statistically very likely, did you consult the database?" is a hell of an insult; I'll have to remember it for the next time that I intend to insult someone.

jansan an hour ago | parent | prev | next [-]

Good story. I hope it wasn't written by AI.

ctoth 11 minutes ago | parent | prev [-]

Give it a rest.

What's happening is that AI has become an identity-sorting mechanism faster than any technology in recent memory. Faster than social media, faster than smartphones. Within about two years, "what do you think about AI" became a tribal marker on par with political affiliation. And like political affiliation, the actual object-level question ("is this tool useful for this task") got completely swallowed by the identity question ("what kind of person uses/rejects this").

The blog author isn't really angry about the comment. He's angry because someone accidentally miscategorized him tribally. "Did you use AI?" heard through his filter means "you're one of them." Same reason vegans get mad when you assume they eat meat, or whatever. It's an identity boundary violation, not a practical dispute.

These comments aren't discussing the post. They're each doing a little ritual display of their own position in the sorting. "I miss real conversation" = I'm on the human side. The political rant = I'm on the progress side. The energy calculation = I'm on the rational-empiricist side.

The thing that's actually weird, the thing worth asking "what the fuck" about: this sorting happened before the technology matured enough for anyone to have a grounded opinion about its long-term effects. People picked teams based on vibes and aesthetics, and now they're backfilling justifications. Which means the discourse is almost completely decoupled from what the technology actually does or will do.

oneeyedpigeon 4 minutes ago | parent | next [-]

> The blog author isn't really angry about the comment. He's angry because someone accidentally miscategorized him tribally.

I'm not so sure about that. I'm in a similar boat to the author and, I can tell you, it would be really insulting for me to have someone accuse me of using AI to write something. It's not because of any in-group / culture war nonsense, it's purely because:

a) I wouldn't—currently—resort to that behaviour, and I'd like to think people who know me recognise that

b) To have my work mistaken for the product of AI would be like being accused of not really being human—that's pretty insulting

girvo 4 minutes ago | parent | prev [-]

> Same reason vegans get mad when you assume they eat meat, or whatever

This so isn't important, but I don't know any vegan who would get mad if you assumed in passing that they ate meat. They'd only get annoyed if you then argued with them about it after they said something, like basically all humans do if you deliberately ignore what they've said to you.