emsign 8 hours ago

You are missing the author's point. He literally said no court should have rendered a judgement, which is the exact opposite of guilty until proven innocent. Guilty means a court has made a judgement.

He is proposing not to make a judgement at all. If an AI company claims something, they have to prove it, like in science. Any claim is treated as just that: a claim. The trick is to not claim anything at all and let the users come to the conclusion on their own that it's magic. And it's true that LLMs by design cannot cite sources. Thus they cannot, by design, tell you whether they made something up with no regard to it making sense or working, whether they just copied and pasted something that either works or is crap, or whether they somehow created something new that is fantastic.

All we ever see are the success stories. The success after the n-th try, after tweaking the prompt and handling your agents the right way. The hidden cost is out there, barely hidden.

This ambiguity is benefiting the AI companies, and they are exploiting it to the maximum: going as far as illegally obtaining pirated intellectual property from an entity that is banned in many countries at one end of their pipeline, and selling it as the biggest thing ever at the other end. And yes, all the doomsday stories of AI taking over the world are part of the marketing hype.

anilgulecha 6 hours ago | parent [-]

sure, no "court" should render it, but then

>AI output should be treated like a forgery

Who's passing this judgement, then? The author? Civil society?

thwarted 13 minutes ago | parent [-]

A forgery isn't a subjective assessment. A forgery is an intentionally inaccurate claim about the origin of something. If the by-line claims it was made by someone who didn't make it, it's a forgery no matter how good a copy it is judged to be.