OptionOfT 4 hours ago
I remember when Netflix took out a full-page ad for their show Orange Is the New Black. John Oliver had a piece on it: https://www.youtube.com/watch?v=E_F5GxCwizc This is a natural extension of that. What's revolutionary is the scale at which it is now possible.

We have so many people out there who blindly trust the output of an LLM (how many colleagues have proudly told you: "I asked Claude and this is what it says" <paste>). This is an advertiser's wet dream. For now the ads sit at the bottom, but slowly they'll become part of the text itself. And the worst part: you won't be able to tell, bar the fact that the link has a refer(r)er attached to it.

The internet before and after LLMs is like steel before and after the atomic bombs: anything made after is contaminated.
Quarrelsome 4 hours ago | parent
> slowly they'll become more part of the text

Wouldn't that be quite challenging in terms of engineering? Given that these companies have been chasing AGI, it would be a considerable distraction to pivot into hacking the guts of the output to dynamically push a particular product. It would also degrade their product: you could likely keep prodding the LLM until it dissed the product being advertised, especially since many advertised products are not the best on the market (which is why the money is spent on marketing instead of R&D or process). And even if you managed to bodge the output successfully, it would create an incentive for traffic to migrate to less corrupted LLMs.
| ||||||||||||||||||||||||||||||||||||||