keiferski 4 days ago

Just to further elaborate on this with another example: the writing industry. (Technical, professional, marketing, etc. writing - not books.)

The default logic is that AI will just replace all writing tasks, and writers will go extinct.

What actually seems to be happening, however, is this:

- obviously written-by-AI copywriting is perceived very negatively by the market

- companies want writers that understand how to use AI tools to enhance productivity, but understand how to modify copy so that it doesn’t read as AI-written

- the meta-skill of knowing what to write in the first place becomes more valuable, because the AI is only going to give you a boilerplate plan at best

And so the only jobs that seem to have been replaced by AI directly, as of now, are the ones producing basically forgettable content, report-style tracking content, and other low-level work. Not great for the jobs lost, but also not a death sentence for the entire profession of writing.

jaynetics 4 days ago | parent | next [-]

As someone who used to be in the writing industry (a whole range of jobs), this take strikes me as a bit starry-eyed. Throw-away snippets, good-enough marketing, generic correspondence, hastily compiled news items, flowery filler text in books, etc.: all of this used to be a huge chunk of the work, in so many places. The average customer had only a limited ability to judge the quality of texts, to put it mildly. Translators and proofreaders already had to prioritize mass over flawless output back when Google Translate was hilariously bad and spell checkers very limited. Nowadays, even the translation of legal texts in the EU parliament is done by a fraction of the former workforce. Very few of the writers I knew, and none of the proofreaders, are still in the industry.

Addressing the wider point, yes, there is still a market for great artists and creators, but it's nowhere near large enough to accommodate the many, many people who used to make a modest living, doing these small, okay-ish things, occasionally injecting a bit of love into them, as much as they could under time constraints.

anon191928 4 days ago | parent | next [-]

What I understand is that AI makes certain markets smaller in economic terms. Way smaller, actually. Only a few industries will keep growing because of this.

cj 4 days ago | parent [-]

Specifically markets where “good enough” quality is acceptable.

Translation is a good example. You still need humans for perfect quality, but most use cases arguably don’t require perfect.

And for the remaining translators, the job has now morphed into quality control.

nostrademons 4 days ago | parent | prev [-]

I think this is a key point, and one that we've seen in a number of other markets (eg. computer programming, art, question-answering, UX design, trip planning, resume writing, job postings, etc.). AI eats the low end, the portion that is one step above bullshit, but it turns out that in a lot of industries the customer just wants the job done and doesn't care or can't tell how well it is done. It's related to Terence Tao's point about AI being more useful as a "red team" member [1].

This has a bunch of implications that are positive and also a bunch that are troubling. On one hand, it's likely going to create a burst of economic activity as the cost of these marginal activities goes way down. Many things that aren't feasible now because you can't afford to pay a copywriter or an artist or a programmer are suddenly going to become feasible because you can pay ChatGPT or Claude or Gemini at a fraction of the cost. It's a huge boon for startups and small businesses: instead of needing to raise capital and hire a team to build your MVP, just build it yourself with the help of AI. It's also a boon for DIYers and people who want to customize their life: I've already used Claude Code to build custom programs for a couple of household organization tasks, where previously I would have had to settle for an off-the-shelf program that doesn't really do what I want, because the time cost of programming was too high.

But this sort of low-value junior work has historically been what people use to develop skills and break into the industry. And juniors become seniors, and typically you need senior-level skills to be able to know what to ask the AI and prompt it on the specifics of how to do a task best. Are we creating a world that's just thoroughly mediocre, filled only with the content that a junior-level AI can generate? What happens to economic activity when people realize they're getting shitty AI-generated slop for their money and the entrepreneur who sold it to them is pocketing most of the profits? At least with shitty human-generated bullshit, there's a way to call the professional on it (or at least the parts that you recognize as objectionable) and have them do it again to a higher standard. If the business is structured on AI and nobody knows how to prompt it to do better, you're just stuck, and the shitty bullshit world is the one you live in.

[1] https://news.ycombinator.com/item?id=44711306

zarzavat 4 days ago | parent | prev | next [-]

The assumption here is that LLMs will never pass the Turing test for copywriting, i.e. AI writing will always be distinguishable from human writing. Given that models that produce intelligible writing didn't exist a few years ago, that's a very bold assumption.

keiferski 4 days ago | parent | next [-]

No, I’m sure they will at some point, but I don’t think that eliminates the actual usefulness of a talented writer. It just makes unique styles more valuable, raises the baseline acceptable copy to something better (in the way that Bootstrap increased website design quality), and shifts the role of writer to more of an editor.

Someone still has to choose what to prompt and I don’t think a boilerplate “make me a marketing plan then write pages for it” will be enough to stand out. And I’d bet that the cyborg writers using AI will outcompete the purely AI ones.

(I also was just using it as a point to show how being identified as AI-made is already starting to have a negative connotation. Maybe the future is one where everything is an AI but no one admits it.)

zarzavat 4 days ago | parent [-]

Why couldn't an AI do all of that?

> And I’d bet that the cyborg writers using AI will outcompete the purely AI ones.

In the early days of chess engines there were similar hopes for cyborg chess, whereby a human and an engine would team up to be better than an engine alone. What actually happened was that the engines quickly got so good that the expected value of human intervention was negative, the engine crunching far more information than the human ever could.

Marketing is also a kind of game. Will humans always be better at it? We have a poor track record so far.

CuriouslyC 4 days ago | parent | next [-]

Chess is objective, stories and style are subjective. Humans crave novelty, fresh voices, connection and layers of meaning. It's possible that the connection can be forged and it can get smart enough to bake layers of meaning in there, but AI will never be good at bringing novelty or a fresh voice just by its very nature.

dingnuts 4 days ago | parent [-]

LLMs are frozen in time and do not have experiences so there's nothing to relate to.

I'd pay extra for writing with some kind of "no AI used" certification, especially for art or information

cobbzilla 4 days ago | parent | prev | next [-]

No matter what you ask AI to do, it’s going to give you an “average” answer. Even if you tell it to use a very distinct specific voice and write in a very specific tone, it’s going to give you the “average” of the specific voice and tone you’ve asked for. AI is the antithesis of creativity and originality. This gives me hope.

IX-103 4 days ago | parent | next [-]

That's mostly true of humans, though. They almost always give average answers. That works out because 1) most of the work that needs to be done is repetitive, not new, so average answers are okay, and 2) the solution space that has been explored by humans is not convex, so average answers will still hit unexplored territory most of the time.
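
The non-convexity point can be made concrete: the average of a set of explored points can itself lie outside the set. A toy sketch in Python (purely illustrative, nothing from this thread):

```python
import numpy as np

# "Explored" solutions sit on a ring, a non-convex set.
angles = np.linspace(0, 2 * np.pi, 100, endpoint=False)
explored = np.stack([np.cos(angles), np.sin(angles)], axis=1)

# The average of all explored points is the center of the ring...
centroid = explored.mean(axis=0)

# ...which is roughly distance 1 from every explored point:
# averaging over a non-convex set lands somewhere nobody has been.
distances = np.linalg.norm(explored - centroid, axis=1)
```

So even a process that only averages what humans have already done can, in this loose geometric sense, land on something outside the explored set.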

cobbzilla 4 days ago | parent [-]

Absolutely! You can communicate without (or with minimal) creativity. It’s not required in most cases. So AI is definitely very useful, and it can ape creativity better and better, but it will always be “faking it”.

chuckadams 4 days ago | parent | prev [-]

What is creative or original thought? You are not the first person to say this after all.

cobbzilla 4 days ago | parent [-]

Not being 100% algorithmically or mathematically derived is a good start. I’m certain there’s more but to me this is a minimum bar.

int_19h 3 days ago | parent [-]

If your brain is not running algorithms (which are ultimately just math regardless of the compute substrate), how do you imagine it working then, aside from religious woo like "souls"?

chuckadams 2 days ago | parent | next [-]

I dunno, I think artificiality is a pretty reasonable criterion to go by, but it doesn't seem at all related to originality, nor does originality really stack up when we, too, are repeating and remixing what we were previously taught. Clearly we do a lot more than that as well, but when it comes to defining creativity, I don't think we're any closer to nailing that Jello to the tree.

cobbzilla 2 days ago | parent | prev [-]

Sure, and physics is all strings and pure math at the most fundamental level; that kind of misses the point.

int_19h a day ago | parent [-]

If that's your take, then you need to explain how the gap between this fundamental level and the level that you're concerned with is different in those two cases.

slowlyform 4 days ago | parent | prev | next [-]

I tried asking ChatGPT for brainrot speech and all the examples it gave me sounded very different from what the new kids on the internet are using. Maybe language will always evolve faster than whatever amount of data OpenAI can train their models on :).

oblio 4 days ago | parent | prev [-]

Intellectuals have a strong fetish for complete information games such as chess.

Reality and especially human interaction are basically the complete opposite.

wybiral 4 days ago | parent | prev | next [-]

AI will probably pass that test. But art is about experience and communicating the more subtle things that we humans experience. AI will not be out in society being a person and gaining experience to train on. So unless we're writing it down somewhere for it to regurgitate, it will always feel lacking in the subtlety of a real human writer. It depends on us creating content with context in order to mimic someone who can create those stories.

EDIT: As in, it can make really good derivative works. But it will always lag behind a human that has been in real life situations of the time and experienced being a human throughout them. It won't be able to hit the subtle notes that we crave in art.

int_19h 3 days ago | parent [-]

> AI will not be out in society being a person and gaining experience to train on.

It can absolutely do that, even today - you could update the weights after every interaction. The only reason why we don't do it is because it's insanely computationally expensive.
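
Updating weights after every interaction is just online learning: one gradient step per example. A minimal sketch with a toy linear model (illustrative scale only; for an LLM the same loop means a full backward pass over billions of parameters per interaction, which is where the cost comes from):

```python
import numpy as np

def online_step(w, x, y, lr=0.1):
    """One SGD step on squared error: 'learn from this one interaction'."""
    pred = w @ x
    grad = 2 * (pred - y) * x  # gradient of (pred - y)^2 w.r.t. w
    return w - lr * grad

# Each (x, y) pair stands in for one interaction with the model.
w = np.zeros(2)
interactions = [(np.array([1.0, 0.0]), 3.0),
                (np.array([0.0, 1.0]), -2.0)] * 50
for x, y in interactions:
    w = online_step(w, x, y)
# w drifts toward [3, -2] as interactions accumulate.
```

The mechanism is not exotic; only the per-step cost at LLM scale is.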

j45 4 days ago | parent | prev [-]

Today’s models are tuned to output the average quality of their corpus.

This could change with varying results.

What is average quality? For some it’s a massive upgrade. For others it’s a step down. For the experienced it’s seeing through it.

zarzavat 4 days ago | parent [-]

You're absolutely right, but AIs still have their little quirks that set them apart.

Every model has a faint personality, but since that personality gets "mass produced", any personality or writing style makes the text easier to detect as AI rather than harder (e.g. em dashes).

But reducing personality doesn't help either because then the writing becomes insipid — slop.

Human writing has more variance, but it's not "temperature" (i.e. token level variance), it's per-human variance. Every writer has their own individual style. While it's certainly possible to achieve a unique writing style with LLMs through fine-tuning it's not cost effective for something like ChatGPT, so the only control is through the system prompt, which is a blunt instrument.
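
The "temperature" referred to here only flattens or sharpens the token distribution the model already has; it adds within-voice noise, not a new voice. A minimal sampling sketch (hypothetical logits, not a real model's output):

```python
import numpy as np

def sample_token(logits, temperature=1.0, rng=None):
    """Sample a token index from temperature-scaled logits.

    Higher temperature flattens the distribution (more token-level
    randomness); lower temperature sharpens it. Either way it is the
    same underlying distribution, i.e. the same 'writer'.
    """
    rng = rng or np.random.default_rng(0)
    scaled = np.asarray(logits, dtype=float) / temperature
    scaled -= scaled.max()  # numerical stability before exponentiating
    probs = np.exp(scaled)
    probs /= probs.sum()
    return int(rng.choice(len(probs), p=probs))
```

Per-writer variance, by contrast, would correspond to different weights (or at least a different fine-tune) per writer, not a different setting of this sampling knob.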

j45 4 days ago | parent [-]

It’s not a personality. There is no concept of replicating a person, personality or behaviours because the software is not the simulation of a living being.

It is a query/input and response format, which can be modeled to simulate a conversation.

It can act like a search engine that responds to the inputs provided, plus the system, account, project, and user prompts (as constraints/filters) that precede the current turn.

The result can sure look like magic.

It’s still a statistically plausible response format based on the average of its training corpus.

Take that average, then add a user with their own varying range, and the quality of the result varies with them.

LLMs can have many ways to explain the same thing; more than one can be valid sometimes, other times not.

Scarblac 4 days ago | parent | prev [-]

Seems a bit optimistic to me. Companies may well accept a lower quality than they used to get if it's far cheaper. We may just get shittier writing across the board.

(and shittier software, etc)