keiferski 4 days ago

There’s a simple flaw in this reasoning:

Just because X can be replaced by Y today doesn't imply that it will be in a future where we are aware of Y and factor it into the background assumptions about the task.

In more concrete terms: if “not being powered by AI” becomes a competitive advantage, then AI won’t be meaningfully replacing anything in that market.

You can already see this with YouTube: AI-generated videos are a mild amusement, not a replacement for video creators, because made by AI is becoming a negative label in a world where the presence of AI video is widely known.

Of course this doesn't apply to every job, and indeed many jobs have already been “replaced” by AI. But any analysis that doesn't reflectively factor the reception of AI into the background assumptions is too simplistic.

keiferski 4 days ago | parent | next [-]

Just to further elaborate on this with another example: the writing industry. (Technical, professional, marketing, etc. writing - not books.)

The default logic is that AI will just replace all writing tasks, and writers will go extinct.

What actually seems to be happening, however, is this:

- copywriting that is obviously written by AI is perceived very negatively by the market

- companies want writers who understand how to use AI tools to enhance productivity, but who also know how to modify copy so that it doesn't read as AI-written

- the meta-skill of knowing what to write in the first place becomes more valuable, because the AI is only going to give you a boilerplate plan at best

And so the only jobs that seem to have been replaced by AI directly, as of now, are the ones writing basically forgettable content, report-style tracking content, and other low level things. Not great for the jobs lost, but also not a death sentence for the entire profession of writing.

jaynetics 4 days ago | parent | next [-]

As someone who used to be in the writing industry (a whole range of jobs), this take strikes me as a bit starry-eyed. Throw-away snippets, good-enough marketing, generic correspondence, hastily compiled news items, flairful filler text in books etc., all this used to be a huge chunk of the work, in so many places. The average customer had only a limited ability to judge the quality of texts, to put it mildly. Translators and proofreaders already had to prioritize mass over flawless output, back when Google Translate was hilariously bad and spell checkers very limited. Nowadays, even the translation of legal texts in the EU parliament is done by a fraction of the former workforce. Very few of the writers and none of the proofreaders I knew are still in the industry.

Addressing the wider point, yes, there is still a market for great artists and creators, but it's nowhere near large enough to accommodate the many, many people who used to make a modest living, doing these small, okay-ish things, occasionally injecting a bit of love into them, as much as they could under time constraints.

anon191928 4 days ago | parent | next [-]

What I understand is that AI leads certain markets to become smaller in economic terms. Way smaller, actually. Only a few industries will keep growing because of this.

cj 4 days ago | parent [-]

Specifically markets where “good enough” quality is acceptable.

Translation is a good example. You still need humans for perfect quality, but most use cases arguably don't require perfection.

And for the remaining translators, the job has now morphed into quality control.

nostrademons 4 days ago | parent | prev [-]

I think this is a key point, and one that we've seen in a number of other markets (e.g. computer programming, art, question-answering, UX design, trip planning, resume writing, job postings, etc.). AI eats the low end, the portion that is one step above bullshit, but it turns out that in a lot of industries the customer just wants the job done and doesn't care or can't tell how well it is done. It's related to Terence Tao's point about AI being more useful as a "red team" member [1].

This has a bunch of implications that are positive and also a bunch that are troubling. On one hand, it's likely going to create a burst of economic activity as the cost of these marginal activities goes way down. Many things that aren't feasible now because you can't afford to pay a copywriter or an artist or a programmer are suddenly going to become feasible because you can pay ChatGPT or Claude or Gemini at a fraction of the cost. It's a huge boon for startups and small businesses: instead of needing to raise capital and hire a team to build your MVP, just build it yourself with the help of AI. It's also a boon for DIYers and people who want to customize their life: I've already used Claude Code to build a custom program for a couple of household organization tasks where I would otherwise have had to settle for an off-the-shelf program that doesn't really do what I want, because the time cost of programming was previously too high.

But this sort of low-value junior work has historically been what people use to develop skills and break into the industry. And juniors become seniors, and typically you need senior-level skills to be able to know what to ask the AI and prompt it on the specifics of how to do a task best. Are we creating a world that's just thoroughly mediocre, filled only with the content that a junior-level AI can generate? What happens to economic activity when people realize they're getting shitty AI-generated slop for their money and the entrepreneur who sold it to them is pocketing most of the profits? At least with shitty human-generated bullshit, there's a way to call the professional on it (or at least the parts that you recognize as objectionable) and have them do it again to a higher standard. If the business is structured on AI and nobody knows how to prompt it to do better, you're just stuck, and the shitty bullshit world is the one you live in.

[1] https://news.ycombinator.com/item?id=44711306

zarzavat 4 days ago | parent | prev | next [-]

The assumption here is that LLMs will never pass the Turing test for copywriting, i.e. AI writing will always be distinguishable from human writing. Given that models that produce intelligible writing didn't exist a few years ago, that's a very bold assumption.

keiferski 4 days ago | parent | next [-]

No, I’m sure they will at some point, but I don’t think that eliminates the actual usefulness of a talented writer. It just makes unique styles more valuable, raises the baseline acceptable copy to something better (in the way that Bootstrap increased website design quality), and shifts the role of writer to more of an editor.

Someone still has to choose what to prompt and I don’t think a boilerplate “make me a marketing plan then write pages for it” will be enough to stand out. And I’d bet that the cyborg writers using AI will outcompete the purely AI ones.

(I also was just using it as a point to show how being identified as AI-made is already starting to have a negative connotation. Maybe the future is one where everything is an AI but no one admits it.)

zarzavat 4 days ago | parent [-]

Why couldn't an AI do all of that?

> And I’d bet that the cyborg writers using AI will outcompete the purely AI ones.

In the early days of chess engines there were similar hopes for cyborg chess, whereby a human and engine would team up to be better than an engine alone. What actually happened was that the engines quickly got so good that the expected value of human intervention was negative - the engines crunch so much more information than a human ever could.

Marketing is also a kind of game. Will humans always be better at it? We have a poor track record so far.

CuriouslyC 4 days ago | parent | next [-]

Chess is objective, stories and style are subjective. Humans crave novelty, fresh voices, connection and layers of meaning. It's possible that the connection can be forged and it can get smart enough to bake layers of meaning in there, but AI will never be good at bringing novelty or a fresh voice just by its very nature.

dingnuts 4 days ago | parent [-]

LLMs are frozen in time and do not have experiences so there's nothing to relate to.

I'd pay extra for writing with some kind of "no AI used" certification, especially for art or information

cobbzilla 4 days ago | parent | prev | next [-]

No matter what you ask AI to do, it's going to give you an "average" answer. Even if you tell it to use a very distinct, specific voice and write in a very specific tone, it's going to give you the "average" version of that specific voice and tone. AI is the antithesis of creativity and originality. This gives me hope.

IX-103 4 days ago | parent | next [-]

That's mostly true of humans though. They almost always give average answers. That works out because 1) most of the work that needs to be done is repetitive, not new, so average answers are okay, and 2) the solution space that has been explored by humans is not convex, so average answers will still hit unexplored territory most of the time.
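
A toy illustration of that second point, with made-up numbers (the set and threshold are purely hypothetical): when the explored set is not convex, the average of two well-explored solutions can still land outside it.

    # Explored solutions form a non-convex set: two clusters with a gap between them.
    explored = [0.0, 1.0, 9.0, 10.0]

    def is_explored(x, tol=1.5):
        # "Explored" here just means close to something humans have already done.
        return any(abs(x - e) <= tol for e in explored)

    a, b = 1.0, 9.0      # both well within explored territory
    avg = (a + b) / 2    # the "average answer"
    print(is_explored(a), is_explored(b), is_explored(avg))  # True True False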

cobbzilla 4 days ago | parent [-]

Absolutely! You can communicate without (or with minimal) creativity. It's not required in most cases. So AI is definitely very useful, and it can ape creativity better and better, but it will always be “faking it”.

chuckadams 4 days ago | parent | prev [-]

What is creative or original thought? You are not the first person to say this after all.

cobbzilla 4 days ago | parent [-]

Not being 100% algorithmically or mathematically derived is a good start. I’m certain there’s more but to me this is a minimum bar.

int_19h 3 days ago | parent [-]

If your brain is not running algorithms (which are ultimately just math regardless of the compute substrate), how do you imagine it working then, aside from religious woo like "souls"?

chuckadams 2 days ago | parent | next [-]

I dunno, I think artificiality is a pretty reasonable criterion to go by, but it doesn't seem at all related to originality, nor does originality really stack up when we too are also repeating and remixing what we were previously taught. Clearly we do a lot more than that as well, but when it comes to defining creativity, I don't think we're any closer to nailing that Jello to the tree.

cobbzilla 2 days ago | parent | prev [-]

Sure, and physics is all strings and pure math at the most fundamental level; that kind of misses the point.

int_19h a day ago | parent [-]

If that's your take, then you need to explain how the gap between this fundamental level and the level that you're concerned with is different in those two cases.

slowlyform 4 days ago | parent | prev | next [-]

I tried asking ChatGPT for brainrot speech and all the examples it gave me sound very different from what the new kids on the internet are using. Maybe language will always evolve faster than whatever amount of data OpenAI can train their model with :).

oblio 4 days ago | parent | prev [-]

Intellectuals have a strong fetish for complete information games such as chess.

Reality and especially human interaction are basically the complete opposite.

wybiral 4 days ago | parent | prev | next [-]

AI will probably pass that test. But art is about experience and communicating more subtle things that we humans experience. AI will not be out in society being a person and gaining experience to train on. So if we're not writing it somewhere for it to regurgitate... It will always feel lacking in the subtlety of a real human writer. It depends on us creating content with context in order to mimic someone that can create those stories.

EDIT: As in, it can make really good derivative works. But it will always lag behind a human that has been in real life situations of the time and experienced being a human throughout them. It won't be able to hit the subtle notes that we crave in art.

int_19h 3 days ago | parent [-]

> AI will not be out in society being a person and gaining experience to train on.

It can absolutely do that, even today - you could update the weights after every interaction. The only reason we don't do it is that it's insanely computationally expensive.
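
A minimal sketch of the idea, with a toy model standing in for an LLM (everything here is illustrative, not how any production system actually works):

    import torch
    import torch.nn as nn

    # Toy "language model": predicts the next token id from the previous one.
    vocab = 100
    model = nn.Sequential(nn.Embedding(vocab, 32), nn.Linear(32, vocab))
    opt = torch.optim.SGD(model.parameters(), lr=1e-3)

    def learn_from_interaction(prev_tokens, next_tokens):
        # One gradient step on the latest exchange, i.e. "updating the weights
        # after every interaction". For a real LLM this is the expensive part:
        # a full backward pass over billions of parameters per conversation.
        logits = model(prev_tokens)
        loss = nn.functional.cross_entropy(logits, next_tokens)
        opt.zero_grad()
        loss.backward()
        opt.step()

    # "Train on" one tiny interaction.
    learn_from_interaction(torch.tensor([1, 5, 7]), torch.tensor([5, 7, 2]))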

j45 4 days ago | parent | prev [-]

Today’s models are tuned to output the average quality of their corpus.

This could change with varying results.

What is average quality? For some it’s a massive upgrade. For others it’s a step down. For the experienced it’s seeing through it.

zarzavat 4 days ago | parent [-]

You're absolutely right, but AIs still have their little quirks that set them apart.

Every model has a faint personality, but since that personality gets "mass produced", any personality or writing style makes it easier to detect the output as AI rather than harder (e.g. em dashes, etc.).

But reducing personality doesn't help either because then the writing becomes insipid — slop.

Human writing has more variance, but it's not "temperature" (i.e. token-level variance), it's per-human variance. Every writer has their own individual style. While it's certainly possible to achieve a unique writing style with LLMs through fine-tuning, it's not cost-effective for something like ChatGPT, so the only control is through the system prompt, which is a blunt instrument.
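
A toy sketch of the distinction (all numbers are made up; this only illustrates token-level temperature versus a persistent per-writer shift):

    import numpy as np

    rng = np.random.default_rng(0)
    logits = np.array([2.0, 1.0, 0.5, 0.1])  # model's raw scores for 4 candidate words

    def sample(logits, temperature):
        # Temperature rescales the same distribution: more or less randomness per token.
        p = np.exp(logits / temperature)
        p /= p.sum()
        return rng.choice(len(logits), p=p)

    # Token-level variance: the same "writer", just noisier choices.
    hot = [sample(logits, 1.5) for _ in range(5)]

    # Per-writer variance: a persistent bias toward certain words, like an individual
    # style. Fine-tuning changes this directly; a system prompt only crudely nudges it.
    writer_bias = np.array([0.0, 1.2, 0.0, -0.5])
    styled = [sample(logits + writer_bias, 0.7) for _ in range(5)]

    print(hot, styled)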

j45 4 days ago | parent [-]

It’s not a personality. There is no concept of replicating a person, personality or behaviours because the software is not the simulation of a living being.

It is a query/input-and-response format, which can be modeled to simulate a conversation.

It can be thought of as a search engine that responds to the inputs provided, plus the system, account, project, and user prompts (as constraints/filters) applied before the current turn is input.

The result can sure look like magic.

It’s still a statistically present response format based on the average of its training corpus.

Take that average and then add a user to it with their varying range and then the beauty varies.

LLMs can have many ways to explain the same thing; more than one can be valid sometimes, other times not.

Scarblac 4 days ago | parent | prev [-]

Seems a bit optimistic to me. Companies may well accept a lower quality than they used to get if it's far cheaper. We may just get shittier writing across the board.

(and shittier software, etc)

jhbadger 4 days ago | parent | prev | next [-]

>You can already see this with YouTube: AI-generated videos are a mild amusement, not a replacement for video creators, because made by AI is becoming a negative label in a world where the presence of AI video is widely known.

But that's because, at present, AI generated video isn't very good. Consider the history of CGI. In the 1990s and early 2000s, it was common to complain about how the move away from practical sets in favor of CGI was making movies worse. And it was! You had backgrounds and monsters that looked like they escaped from a video game. But that complaint has pretty much died out these days as the tech got better (although Nolan's Oppenheimer did weirdly hype the fact that its simulated Trinity blast was done by practical effects).

morsecodist 4 days ago | parent | next [-]

I don't agree that it is because of the "quality" of the video. The issue with AI art is that it lacks intentional content. I think people like art because it is a sort of conversation between the creator and the viewer. It is interesting because it has a consistent perspective. It is possible AI art could one day be indistinguishable, but for people to care about it I feel they would need to lie and say it was made by a particular person, or create some sort of persona for the AI. But there are a lot of people who want to do the work of making art. People are not the limiting factor; in fact we have way more people who want to make art than there is a market for it. What I think is more likely is that AI becomes a tool in the same way CGI is a tool.

ninetyninenine 4 days ago | parent | next [-]

[flagged]

morsecodist 4 days ago | parent | next [-]

I honestly can't tell if you're being facetious. Maybe I suck at writing and don't properly understand sarcasm but unfortunately I'm only human.

nprateem 4 days ago | parent | prev [-]

It's obviously not AI written.

tbrownaw 4 days ago | parent | prev [-]

> The issue with AI art is that it lacks intentional content. I think people like art because it is a sort of conversation between the creator and the viewer.

Intent is in the eye of the beholder.

nprateem 4 days ago | parent [-]

The trouble with AI shit is it's all contaminated by association.

I was looking on YT earlier for info on security cameras. It's easy to spot the AI crap: under 5 minutes and just stock video in the preview or photos.

What value could there be in me wasting time to see if the creators bothered to add quality content if they can't be bothered to show themselves in front of the lens?

What an individual brings is a unique brand. I'm watching their opinion which carries weight based on social signals and their catalogue etc.

Generic AI will always lack that until it can convincingly be bundled into a persona... only then the cycle will repeat: search for other ways to separate the lazy, generic content from the meaningful original stuff.

keiferski 4 days ago | parent | prev | next [-]

CGI is a good analogy because I think AI and creators will probably go in the same direction:

You can make a compelling argument that CGI operators outcompeted practical effects operators. But CGI didn't somehow replace the need for filmmakers, scriptwriters, cinematographers, etc. entirely – it just changed the skillset.

AI will probably be the same thing. It’s not going to replace the actual job of YouTuber in a meaningful sense; but it might redefine that job to include being proficient at AI tools that improve the process.

tomrod 4 days ago | parent [-]

I think they are evolving differently. Some very old CGI holds up because they invested a lot of money to make it so. Then they tried to make it cheaper, and people started complaining because the output was worse than all prior options.

Melatonic 4 days ago | parent [-]

Jurassic Park is a great example - they also had excellent compositing to hide any flaws (compositing never gets mentioned in casual CGI talk but is one of the most important steps)

The dinosaurs were also animated by oldschool stop motion animators who were very, very good at their jobs. Another very underrated part of the VFX pipeline.

Doesn't matter how nice your 3D modelling and texturing are if the above two are skimped on!

djtango 4 days ago | parent | prev | next [-]

That's a Nolan thing like how Dunkirk used no green screen.

I think Harry Potter and Lord of the Rings embody the transition from old school camera tricks to CGI as they leaned very heavily into set and prop design and as a result have aged very gracefully as movies

silvestrov 4 days ago | parent [-]

I think the first HP movie was more magical than the later ones, as they felt too "Marvel CGI" to me.

Marvel movies have become tiresome for me, too much CGI that does not tell any interesting story. Old animated Disney movies are more rewatchable.

djtango 3 days ago | parent | next [-]

I like to see Marvel as the state of the art/tech demo for CGI - this is what is achievable with a near-limitless budget.

I still find Infinity War and Endgame visually satisfying spectacles but I am a forgiving viewer for those movies

player1234 2 days ago | parent | prev [-]

And they cost 300 million to make because of the CGI fest they are, hence they need close to a billion in profits when considering marketing and the theater cut. So the cost of CGI and the enshittification of movies seems to be a good analogy to the usefulness of LLM/AI.

Not a flex.

yoz-y 4 days ago | parent | prev | next [-]

That said, the complaint is coming back. Namely because most new movies use an incredible amount of CGI and due to the time constraints the quality suffers.

As such, CGI is once again becoming a negative label.

I don’t know if there is an AI equivalent of this. Maybe it's that, as vendors move away from a single big generalist model at launch towards a multitude of smaller expert models (while retaining the branding, aka GPT-4), the quality goes down.

player1234 2 days ago | parent | next [-]

The equivalent is the massive cost of CGI and LLMs in comparison to the lackluster end result.

not_the_fda 3 days ago | parent | prev [-]

Now they just make the whole scene dark and you can't see anything. Saves money on CGI though.

__MatrixMan__ 4 days ago | parent | prev | next [-]

Do you get the feeling that AI generated content is lacking something that can be incrementally improved on?

Seems to me that it's already quite good in any dimension that it knows how to improve on (e.g. photorealism) and completely devoid of the other things we'd want from it (e.g. meaning).

tomrod 4 days ago | parent | next [-]

It's missing random flaws. Often the noise has patterns as a result of the diffusion or generation process.

arcane23 3 days ago | parent [-]

Yeah, I was thinking about this. Humans vary depending on a lot of factors. Today they're happy, tomorrow they're a bit down. This makes for some variation which can be useful. LLMs are made to be reliable/repeatable, as general experience. You know what you get. Humans are a bit more ... -ish, depending on ... things.

keiferski 4 days ago | parent | prev [-]

Yeah if you look at many of the top content creators, their appeal often has very little to do with production value, and is deliberately low tech and informal.

I guess AI tools can eventually become more human-like in terms of demeanor, mood, facial expressions, personality, etc. but this is a long long way from a photorealistic video.

Barrin92 4 days ago | parent | prev | next [-]

>But that's because, at present, AI generated video isn't very good.

It isn't good, but that's not the reason. There's a paper from about 10 years ago where people used a computer system to generate Bach-like music that even Bach experts couldn't reliably tell apart, yet nobody listens to bot music. (Similarly, nobody except engine programmers watches computer chess, despite its superiority; chess is thriving more now, including commercially, than it ever has.)

In any creative field what people are after is the interaction between the creator and the content, which is why compelling personalities thrive more, not less in a sea of commodified slop (be that by AI or just churned out manually).

It's why we're in an age where twitch content creators or musicians are increasingly skilled at presenting themselves as authentic and personal. These people haven't suffered from the fact that mass production of media is cheap, they've benefited from it.

thefaux 4 days ago | parent | next [-]

The wonder of Bach goes much deeper than just the aesthetic qualities of his music. His genius almost forces one to reckon with his historical context and wonder, how did he do it? Why did he do it? What made it all possible? Then there is the incredible influence that he had. It is easy to forget that music theory as we know it today was not formalized in his day. The computer programs that simulate the kind of music he made are based on that theory that he understood intuitively and wove into his music and was later revealed through diligent study. Everyone who studies Bach learns something profound and can feel both a kinship for his humanity and also an alienation from his seemingly impossible genius. He is one of the most mysterious figures in human history and one could easily spend their entire life primarily studying just his music (and that of his descendants). From that perspective, computer generated music in his style is just a leaf on the tree, but Bach himself is the seed.

> These people haven't suffered from the fact that mass production of media is cheap, they've benefited from it.

Maybe? This really depends on your value system. Every moment that you are focused on how you look on camera and trying to optimize an extractive algorithm is a moment you aren't focused on creating the best music that you can in that moment. If the goal is maximizing profit to ensure survival, perhaps they are thriving. Put another way, if these people were free to create music in any context, would they choose content creation on social media? I know I wouldn't, but I also am sympathetic to the economic imperatives.

oinfoalgo 3 days ago | parent | prev | next [-]

I am a Bach fiend and the problem is BWV 1 to 1080.

Why would I listen to algorithmic Bach compositions when there are so many of Bach's own work I have never listened to?

Even if you did get bored of all JS music, Carl Philipp Emanuel Bach has over 1000 works himself.

There are also many genius baroque music composers outside the Bach family.

This is true of any composer really. Any classical composer that the average person has heard of has an immense catalog of works compared to modern recording artists.

I would say I have probably not even listened to half the works of all my favorite composers because it is such a huge amount of music. There is no need for some kind of classical-music-style LoRA.

vidarh a day ago | parent [-]

I don't question Bach's genius, but most baroque music doesn't interest me. Some of Bach's music I still enjoy enough that I could see myself listening to AI-generated tracks made to be specifically similar to those pieces of Bach (and others) that I like. Though not enough that I'd seek that out in particular, and so I think the combination of what you say and the generally low level of interest from those of us who would only listen if it happened to appear in a playlist explains why it's not really a thing.

There are many artists, across the spectrum, like that for me, from geniuses that are just outside my own taste, to mediocre "one hit wonders" where I realise why they only had one hit when I listened to the rest of their catalogue, but really would like more like that one hit (or handful).

And even when you like a broader selection of a composer's music, there are times you might want "more of the same" as a specific piece. E.g. I quite like Beethoven, but I love the Moonlight Sonata, not just for what it is in itself, but for its systematic exploration of repetitive, slowly shifting patterns.

There are other pieces by wildly different composers that invoke a similar systematic exploration of patterns [1], but I'd also love to be able to hear more new "improvisations" over specific instances, tuned very specifically to the aspects I like.

[1] On the extreme other "end" of these types of shifting repetitive patterns, I love Rob Hubbard's Delta in-game theme: 11+ minutes of patterns repeated and iterated over, an illustration of the wide range that I like for much the same reason: https://www.youtube.com/watch?v=jOpIbm_XX-k

You can also find a slightly less shrill modern remake, though it also adds a bit too much for my taste: https://www.youtube.com/watch?v=-WE6av3g_8I&list=RD-WE6av3g_...

Or a somewhat more faithful arrangement: https://www.youtube.com/watch?v=AHpYBGW41gw&list=RDAHpYBGW41...

vidarh 4 days ago | parent | prev | next [-]

That's interesting, because after ElevenLabs launched their music generation I decided I really quite want to spend some time having it generate background tracks for me to have on while working.

I don't know the name of any of the artists whose music I listened to over the last week because it does not matter to me. What mattered was that it was unobtrusive and fit my general mood. So I have a handful of starting points that I stream music "similar to". I never care about looking up the tracks, or albums, or artists.

I'm sure lots of people think like you, but I also think you underestimate how many contexts there are where people just don't care.

pfdietz 4 days ago | parent | prev [-]

Authenticity and sincerity are very important. When you can fake those, you've got it made.

danielbln 4 days ago | parent | prev [-]

Ironically, while the non-CGI SFX in e.g. Interstellar looked amazing, that sad fizzle of a practical explosion in Oppenheimer did not do the real thing justice and would've been better served by proper CGI VFX.

danny_codes 3 days ago | parent [-]

Totally agree. Nolan is a perfectionist though, so I don’t think he could let himself go for broke on the actual boom boom.

antirez 4 days ago | parent | prev | next [-]

To understand why this is too optimistic, you have to look at things where AI is already almost human-level. Translations are more and more done exclusively with AI or with massive AI help (with the effect of destroying many jobs anyway) at this point. Now ebook reading is switching to AI. Book and music album covers are often done with AI (even if this is most of the time NOT advertised), and so forth. If AI progresses more in a short timeframe (the big "if" in my blog post), we will see a lot of things done exclusively (and even better 90% of the time, since most humans doing a given work do not excel at what they do) by AI. This will be fine if governments immediately react and the system changes. Otherwise there will be a lot of people to feed without a job.

keiferski 4 days ago | parent | next [-]

I can buy the idea that simple specific tasks like translation will be dramatically cut down by AI.

But even then – any serious legal situation (like a contract) is going to want a human in the loop to verify that the translation is actually correct. This will require actual translator skills.

AI art seems to basically only be viable when it can’t be identified as AI art. Which might not matter if the intention is to replace cheap graphic design work. But it’s certainly nowhere near developed enough to create anything more sophisticated: work sophisticated enough to both read as human-made and have the imperfect artifacts of a human creator. A lot of the modern arts are also personality-driven, where the identity and publicity of the artist is a key part of their reception. There are relatively few totally anonymous artists.

Beyond these very specific examples, however, I don’t think it follows that all or most jobs are going to be replaced by an AI, for the reasons I already stated. You have to factor in the sociopolitical effects of technology on its adoption and spread, not merely the technical ones.

int_19h 3 days ago | parent | next [-]

It's kinda hilarious to see "simple task ... like translation". If you are familiar with the history of the field, or if you remember what automated translation looked like even just 15 years ago, it should be obvious that it's not simple at all.

If it were simple, we wouldn't need neural nets for it - we'd just code the algorithm directly. Or, at least, we'd be able to explain exactly how they work by looking at the weights. But now that we have our Babelfish, we still don't know how it really works in detail. This is ipso facto evidence that the task is very much not simple.

oinfoalgo 3 days ago | parent | prev | next [-]

I use AI as a tool to make digital art but I don't make "AI Art".

Imperfection is not the problem with "AI Art". The problem is that it is really hard to get the models not to produce the same visual motifs and cliches. People can spot AI art so easily because of the motifs.

I think Midjourney took this to another level with their human feedback. It became harder and harder not to produce the same visual motifs in the images, to the point that it is basically useless for me now.

dekimir 3 days ago | parent | prev | next [-]

> any serious legal situation (like a contract) is going to want a human in the loop to verify that the translation is actually correct

I hope you're right, but when I think about all those lawyers caught submitting unproofread LLM output to a judge... I'm not sure humankind is wise enough to avoid the slopification.

Davidzheng 4 days ago | parent | prev | next [-]

Isn't "But even then – any serious legal situation (like a contract) is going to want a human in the loop to verify that the translation is actually correct. This will require actual translator skills." only true if the false positive rate of the verifier is not much higher than the failure rate of the AI? At some point it's like asking a human to double check a calculator

crote 4 days ago | parent | prev | next [-]

> But even then – any serious legal situation (like a contract) is going to want a human in the loop to verify that the translation is actually correct.

The usual solution is to specify one language as binding, with that language taking priority if there turn out to be discrepancies between the multiple versions.

griffzhowl 4 days ago | parent | prev [-]

You might still need humans in the loop for many things, but it can still have a profound effect if the work that used to be done by ten people can now be done by two or three. In the sectors that you mention, legal, graphic design, translation, that might be a conservative estimate.

There are bound to be all kinds of complicated sociopolitical effects, and as you say there is a backlash against obvious AI slop, but what about when teams of humans working with AI become more skillful at hiding that?

Wowfunhappy 4 days ago | parent | prev | next [-]

> Now ebook reading is switching to AI.

IMO these are terrible, I don't understand how anyone uses them. This is coming from someone who has always loved audiobooks but has never been particularly precious about the narrator. I find the AI stuff unlistenable.

evanelias 4 days ago | parent | prev | next [-]

> Book and music album covers are often done with AI (even if this is most of the times NOT advertised)

This simply isn't true, unless you're considering any minor refinement to a human-created design to be "often done with AI".

It certainly sounds like you're implying AI is often the initial designer or primary design tool, which is completely incorrect for major publishers and record labels, as well as many smaller independent ones.

spenrose 4 days ago | parent | prev | next [-]

Look at your examples. Translation is a closed domain; the LLM is loaded with all the data and can traverse it. Book and music album covers _don't matter_ and have always been arbitrary reworkings of previous ideas. (Not sure what “ebook reading” means in this context.) Math, where LLMs also excel, is a domain full of internal mappings.

I found your post “Coding with LLMs in the summer of 2025 (an update)” very insightful. LLMs are memory extensions and cognitive aides which provide several valuable primitives: finding connections adjacent to your understanding, filling in boilerplate, and offloading your mental mapping needs. But there remains a chasm between those abilities and much work.

apwell23 4 days ago | parent | prev | next [-]

> Book and music album covers are often done with AI

These suck. Things made with AI just suck big time. Not only are they stupid but they have negative value on your product.

I cannot think of a single purely AI-made video, song, or any other form of art that is any good.

All AI has done is falsely convince people that they can now create things that they had no skill to create before AI.

antirez 4 days ago | parent | next [-]

This is not inherent to AI, but a result of how the AI models were recently trained (by preference agreement of many random users). Look for the latest Krea / Black Forest Labs paper on AI style. The "AI look" can be removed.

Songs right now are terrible. For videos, things are going to be very different once people can create full movies on their computers. Many will have access to the ability to create movies, a few will be very good, and this will likely change many things. Btw, this stupid "AI look" is only transient and isn't needed anywhere. It will be fixed, and AI image/video generation will be impossible to stop.

nprateem 4 days ago | parent [-]

The trouble is, I'm perfectly well aware I can go to the AI tools, ask them to do something, and they'll do it. So there's no point in me wasting time, e.g., reading AI blog posts, as they'll probably just tell me what I've already read. The same goes for any media.

It'll only stand on its own when significant work is required. This is possible today with writing, provided the AI is directed to incorporate original insights.

And unless it's immediately obvious to consumers a high level of work has gone into it, it'll all be tarred by the same brush.

Any workforce needs direction. Thinking an AI can creatively execute when not given a vision is flawed.

Either people will spaff out easy to generate media (which will therefore have no value due to abundance), or they'll spend time providing insight and direction to create genuinely good content... but again unless it's immediately obvious this has been done, it will again suffer the tarring through association.

The issue is really one of deciding to whom to give your attention. It's the reason an ordinary song produced by a megastar is a hit vs when it's performed by an unsigned artist. Or, as in the famous experiment, the same world class violinist gets paid about $22 for a recital while busking vs selling out a concert hall for $100 per seat that same week.

This is the issue AI, no matter how good, will have to overcome.

HDThoreaun 4 days ago | parent | prev | next [-]

I've made a ton of songs I enjoy with Suno. They're not the greatest, but they're definitely not the worst either.

apetresc 4 days ago | parent | prev | next [-]

I mean, test after test has shown that the vast, vast majority of humans are woefully unable to distinguish good AI art made by SOTA models from human art, and in many/most cases actively prefer it.

Maybe you’re a gentleman of such discerningly superior taste that you can always manage to identify the spark of human creativity that eludes the rest of us. Or maybe you’ve just told yourself you hate it and therefore you say you always do. I dunno.

apwell23 4 days ago | parent | next [-]

You could've given me an example instead of this butthurt comment :)

apetresc 2 days ago | parent [-]

Sure! Here's the results of Scott Alexander's fairly large-scale "AI Art Turing Test": https://www.astralcodexten.com/p/how-did-you-do-on-the-ai-ar...

Note that this was already almost a year ago, and the results should be even more one-sided now.

4 days ago | parent | prev | next [-]
[deleted]
varelse 4 days ago | parent | prev [-]

[dead]

jordanpg 4 days ago | parent | prev [-]

Of course, your opinion may be subject to selection bias (i.e., you are only judging the art that you became aware was AI generated).

WesleyLivesay 4 days ago | parent [-]

Reminds me of the issue with bad CGI in movies. The only CGI you notice is the bad CGI; the good stuff just works. Same for AI-generated art: you see the bad stuff but don't realize it when you see a good piece.

apwell23 4 days ago | parent [-]

Care to give me some examples from YouTube? I am talking about videos that people on YouTube connected to for the content in the video (not AI demo videos).

crote 4 days ago | parent | prev [-]

> Translations are more and more done exclusively with AI or with a massive AI help

As someone who speaks more than one language fairly well: We can tell. AI translations are awful. Sure, they have gotten good enough for a casual "let's translate this restaurant menu" task, but they are not even remotely close to reaching human-like quality for nontrivial content.

Unfortunately I fear that it might not matter. There are going to be plenty of publishers who are perfectly happy to shovel AI-generated slop when it means saving a few bucks on translation, and the fact that AI translation exists is going to put serious pricing pressure on human translators - which means quality is inevitably going to suffer.

An interesting development I've been seeing is that a lot of creative communities treat AI-generated material like it is radioactive. Any use of AI will lead to authors or even entire publishers getting blacklisted by a significant part of the community - people simply aren't willing to consume it! When you are paying for human creativity, receiving AI-generated material feels like you have been scammed. I wouldn't be surprised to see a shift towards companies explicitly profiling themselves as anti-AI.

int_19h 3 days ago | parent [-]

As someone whose native language isn't English, I disagree. SOTA models are scary good at translations, at least for some languages. They do make mistakes, but at this point it's the kind of mistake that someone who is non-native but still highly proficient in the language might make - very subtle word order issues or word choices that betray that the translator is still thinking in another language (which for LLMs almost always tends to be English because of its dominance in the training set).

I also disagree that it's "not even remotely close to reaching human-like quality". I have translated large chunks of books into languages I know, and the results are often better than what commercial translators do.

onlyrealcuzzo 4 days ago | parent | prev | next [-]

It's becoming a negative label because they aren't as good.

I'm not saying it will happen, but it's possible to imagine a future in which AI videos are generally better, and if that happens, almost by definition, people will favor them (otherwise they aren't "better").

glhaynes 4 days ago | parent | next [-]

I'm not on Facebook, but, from what I can tell, this has arguably already happened for still images on it. (If defining "better" as "more appealing to/likely to be re-shared by frequent users of Facebook.")

techpineapple 4 days ago | parent | prev [-]

I mean, I can imagine any future, but the problem with “created by AI” is that, because it's relatively inexpensive, it seems like it will necessarily become noise rather than signal. If a person can pop out a high-quality video in a day, the signal will revert to the celebrity marketing the video rather than the video itself.

yoavm 4 days ago | parent | prev | next [-]

Perhaps this will go the way the industrial revolution did? A knife handcrafted by a Japanese master might have a very high value, but 99.9% of the knives are mass produced. "Creators" will become artisans - appreciated by many, consumed by few.

danielvaughn 4 days ago | parent | prev | next [-]

Another flaw is the assumption that humans won't find other things to do. I don't see the argument for that idea. If I had to bet, I'd say that if AI continues getting more powerful, then humans will transition to working on more ambitious things.

johnecheck 4 days ago | parent | next [-]

This is very similar to the 'machines will do all the work, we'll just get to be artists and philosophers' idea.

It sounds nice. But to have that, you need resources. Whoever controls the resources will get to decide whether you get them. If AI/machines are our entire economy, the people that control the machines control the resources. I have little faith in their benevolence. If they also control the political system?

You'll win your bet. A few humans will work on more ambitious things. It might not go so well for the rest of us.

treis 4 days ago | parent | next [-]

>This is very similar to the 'machines will do all the work, we'll just get to be artists and philosophers' idea

We've come a long way toward that goal. The amount of work, both economic and domestic, that humans do has dropped dramatically.

PKop 4 days ago | parent [-]

There are more mouths to feed and less territory per capita for each person (thus real estate inflation in desired locations). Like lanes on a highway, the population just fills the capacity with growth, without any selective pressure for skill or ability. The gains seem mostly front-loaded: population takes time to grow, while the immediate low-hanging fruit of eliminating domestic drudgery was picked quite a while ago. Meanwhile, the "work" that has filled much of that obligation in the home has expanded to the point of necessitating two full-time-income households.

msgodel 4 days ago | parent | prev [-]

And it's very similar to "slaves will do all the work" which was actually possible but never happened.

4 days ago | parent | prev | next [-]
[deleted]
bamboozled 4 days ago | parent | prev [-]

If it became magic smart, then I don’t see why we couldn’t use it to enhance ourselves and become Transhuman?

johnecheck 4 days ago | parent [-]

There are a number of reasons you might not be able to.

Most likely? It's ridiculously expensive and you're poor.

cesarvarela 4 days ago | parent [-]

Technology has been deflationary so far; the rich get it first, but eventually it reaches everyone.

crote 4 days ago | parent [-]

Only when it is profitable for the rich.

bamboozled 3 days ago | parent [-]

Agreed, insulin is a prime example.

gopalv 4 days ago | parent | prev | next [-]

> because made by AI is becoming a negative label in a world

The negative label is the old world pulling the new one back; it rarely sticks.

I'm old enough to remember the folks saying "We used to have to paint the background blue" and "All music composers need to play an instrument" (or turn into a symbol).

d3nj4l 4 days ago | parent | prev | next [-]

> AI-generated videos are a mild amusement, not a replacement for video creators

If you seriously think this, you don’t understand the YouTube landscape. Shorts - which have incredible view times - are flooded with AI videos. Most thumbnails these days are made with AI image generators. There’s an entire industry of AI “faceless” YouTubers who do big numbers with nobody in the comments noticing. The YouTuber Jarvis Johnson made a video about how his feed has fully AI generated and edited videos with great view counts: https://www.youtube.com/watch?v=DDRH4UBQesI

What you’re missing is that most of these people aren’t going onto Veo 3, writing “make me a video” and publishing that; these videos are a little more complex in that they have separate models writing scripts, generating voiceover, and doing basic editing.

keiferski 4 days ago | parent [-]

These videos and shorts are a fraction of the entire YouTube landscape, and actual creators with identities are making vastly, vastly more money - especially once you realize how YouTube and video content in general is becoming a marketing channel for other businesses. Faceless channels have functionally zero brand, zero longevity, and no real way to extend that into broader products in the way that most successful creators have done.

That was my point: someone that has an identity as a YouTuber shouldn’t worry too much about being replaced by faceless AI bot content.

variadix 4 days ago | parent | prev | next [-]

Re: YT AI content. That is because AI video is (currently) low quality. If AI video generators could spit out full length videos that rivaled or surpassed the best human made content people wouldn’t have the same association. We don’t live in that world yet, but someday we might. I don’t think “human made” will be a desirable label for _anything_, videos, software, or otherwise, once AI is as good or better than humans in that domain.

MichaelZuo 4 days ago | parent | prev | next [-]

That’s the fundamental issue with most “analysis”, and most discussions really, on HN.

Since the vast vast majority of writers and commentators are not literal geniuses… they can’t reliably produce high quality synthetic analysis, outside of very narrow niches.

Even though for most comment chains on HN to make sense, readers certainly have to pretend some meaningful text was produced beyond happenstance.

Partly because quality is measured relative to the average, and partly because the world really is getting more complex.

nprateem 4 days ago | parent | next [-]

Oh come on. I may not be a genius but I can turn my mind to most things.

"I may not be a gynecologist, but I'll have a look."

MichaelZuo 4 days ago | parent [-]

Turning your mind to something doesn’t automatically lead to producing high quality synthetic analysis?

It doesn’t even seem relevant how good you are at step 1 for something so many steps later.

inquirerGeneral 4 days ago | parent | prev [-]

[dead]

j45 4 days ago | parent | prev | next [-]

Poorly made videos are poorly made videos.

Whether the videos are made poorly by a human directly, or made poorly by a human using AI.

The use of software like AI to create videos with sloppy quality and results reflects on the creator's skill.

Currently, the use of AI leans towards sloppy because of the lower digital literacy of content creators with AI; once they get into it, they realize how much goes into videos.

andai 4 days ago | parent | prev | next [-]

This only works in a world where AI sucks and/or can be easily detected. I've already found videos where on my 2nd or 3rd time watching I went, "wait, that's not real!" We're starting to get there, which is frankly beyond my ability to reason about.

It's the same issue with propaganda. If people say a movie is propaganda, that means the movie failed. If a propaganda movie is good propaganda, people don't talk about that. They don't even realize. They just talk about what a great movie it is.

jostylr 4 days ago | parent | prev [-]

One thing to keep in mind is not so much that AI would replace the work of video creators for general video consumption, but rather it could create personalized videos or music or whatever. I experimented with creating a bunch of AI music [1] that was tailored to my interests and tastes, and I enjoy listening to them. Would others? I doubt it, but so what? As the tools get better and easier, we can create our own art to reflect our lives. There will still be great human art that will rise to the top, but the vast inundation of slop to the general public may disappear. Imagine the fun of collaboratively designing whole worlds and stories with people, such as with tabletop role-playing, but far more immersive and not having to have a separate category of creators or waiting on companies to release products.

1: https://www.youtube.com/playlist?list=PLbB9v1PTH3Y86BSEhEQjv...