fooker 4 hours ago

Do not try to solve an unsolvable problem; you'll end up hurting real users quite a bit more than you might imagine. Imagine new, enthusiastic users trying your platform and getting hit with an AI label because of inevitable false positives.

'Detecting AI' is not a problem that has real solutions; the only avenue is something supply-side like SynthID. But that harms users too, by introducing further barriers for indie users.

zaptrem 4 hours ago | parent [-]

I train music generation models. They are trivial to detect. In fact, detecting them and then training them to evade the detection model is a big part of training them! But the detectors win instantly without some hardcore regularization. Simply turn that off and you've instantly got a perfect classifier.
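A toy sketch of that cat-and-mouse game (assuming NumPy; the 1-D "feature" clusters, learning rates, and logistic detector are all stand-ins, not the parent's actual pipeline). Real alternating adversarial training can oscillate, so here the detector trains first and is then frozen while the generator learns to evade it:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical 1-D stand-in: "real" audio features cluster at +1.0,
# the naive generator's outputs cluster at -1.0.
REAL_MEAN, NOISE = 1.0, 0.1
gen_mean = -1.0                  # generator's only parameter
w, b = 0.0, 0.0                  # detector: P(real | x) = sigmoid(w*x + b)
lr_d, lr_g = 0.1, 0.02

# Phase 1: train the detector alone. On well-separated features it
# "wins instantly".
for _ in range(200):
    real = rng.normal(REAL_MEAN, NOISE, 64)
    fake = rng.normal(gen_mean, NOISE, 64)
    p_r, p_f = sigmoid(w * real + b), sigmoid(w * fake + b)
    w += lr_d * ((1 - p_r) * real - p_f * fake).mean()
    b += lr_d * ((1 - p_r) - p_f).mean()

test = np.concatenate([rng.normal(REAL_MEAN, NOISE, 500),
                       rng.normal(gen_mean, NOISE, 500)])
labels = np.array([1] * 500 + [0] * 500)
acc = ((sigmoid(w * test + b) > 0.5) == labels).mean()
print(f"detector accuracy vs. naive generator: {acc:.2f}")  # effectively 1.00

# Phase 2: freeze the detector and let the generator climb its gradient --
# it drifts toward (and past) the real cluster to evade detection.
for _ in range(2000):
    fake = rng.normal(gen_mean, NOISE, 64)
    p_f = sigmoid(w * fake + b)
    gen_mean += lr_g * (w * (1 - p_f)).mean()  # d/dmean of log D(fake)
print(f"generator mean after evasion training: {gen_mean:.2f}")
```

Alternating these two phases, as the parent describes, is the real training loop; the frozen-detector version just keeps the toy stable enough to show both claims.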

This isn't like text classification: the signal is many orders of magnitude higher in bitrate, so many more corners need to be cut. It's likely going to be nearly impossible, or at least not remotely worth it, to generate an audio signal that is truly undetectable in the foreseeable future.
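The "corners" tend to show up as measurable spectral artifacts. A minimal sketch of one such tell (assuming NumPy; the band-limited "generated" signal and the 12 kHz cutoff are illustrative stand-ins, not a real model's output): a signal whose spectrum was truncated up high is separable from full-band audio with a single FFT feature.

```python
import numpy as np

def highband_energy_ratio(signal, sample_rate, cutoff_hz=12_000.0):
    """Fraction of spectral energy above cutoff_hz (a crude corner-cutting tell)."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    return spectrum[freqs >= cutoff_hz].sum() / spectrum.sum()

rng = np.random.default_rng(0)
sr, n = 44_100, 44_100

# "Real" audio stand-in: full-band white noise.
real = rng.standard_normal(n)

# "Generated" stand-in: the same kind of noise low-passed at 10 kHz,
# mimicking a model that cuts corners in the high frequencies.
spec = np.fft.rfft(rng.standard_normal(n))
freqs = np.fft.rfftfreq(n, d=1.0 / sr)
spec[freqs > 10_000] = 0.0
generated = np.fft.irfft(spec, n=n)

print(highband_energy_ratio(real, sr))       # substantial high-band energy
print(highband_energy_ratio(generated, sr))  # essentially zero
```

Real detectors use learned features rather than one hand-picked threshold, but the high bitrate means there are many such dimensions for artifacts to leak into.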

fooker 3 hours ago | parent | next [-]

We are talking about entirely different things.

You are right, the output of a model that generates music directly is, for now, easy to categorize as AI.

But this big flux of AI-generated music online isn't really that. It's a tiny bit of autogenerated stuff and a whole lot of automatically remixed stuff. The reason it cannot be easily classified as AI is that quite a bit of human-produced music is also that, and you'd just shut out real users.

MetaWhirledPeas 3 hours ago | parent | prev [-]

> They are very trivial to detect.

Today. Trying to detect AI is like extracting water from puddles in a lake that is quickly drying up. What is the point in the short term if it's impractical in the long term? It will catch some low-hanging fruit in the best case, and flag false positives in the worst.