baby 11 hours ago
It solves some problems! For example, if you want to run a camgirl website based on AI models and want to also prove that you're not exploiting real people.
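Concretely, the "prove" step here would presumably be provenance attestation: the operator signs every frame with a key that is publicly pinned as signing only model output (C2PA-style content credentials work roughly this way). A minimal sketch using the Python cryptography package; the site key and the frame framing are hypothetical, not any shipping scheme:

    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    # Hypothetical site key; the public half would be published alongside a
    # claim that it only ever signs model-generated frames.
    site_key = Ed25519PrivateKey.generate()
    public_key = site_key.public_key()

    frame = b"...encoded bytes of an AI-generated frame..."
    signature = site_key.sign(frame)

    # Any viewer holding the public key can check the attestation.
    public_key.verify(signature, frame)  # raises InvalidSignature if tampered

Note the limit, though: a signature only proves that the key holder signed these bytes, not how the pixels were produced; that gap is exactly what the reply below points out.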
dragonwriter 10 hours ago
> It solves some problems! For example, if you want to run a camgirl website based on AI models and want to also prove that you're not exploiting real people.

So, you exploit real people, but run the footage through a realtime AI video transformation model that does either a close-to-noop transformation or something like swapping the background (so the actual location can't be identified if people do figure out you're exploiting real people). Now your real exploitation is watermarked as AI fakery.

I don't think this is solving a problem, unless you mean a problem for the would-be exploiter.
echelon 11 hours ago
Your use case doesn't even make sense. What customers are clamoring for that feature? I doubt any paying customer in the market for (that product) cares. If the law cares, the law has tools to inquire. All of this is trivially easy to circumvent ceremony.

Google is doing this to deflect litigation and to preserve their brand in the face of negative press. They'll do this (1) as long as they're the market leader, (2) as long as there aren't dozens of other similar products - especially ones available as open source, (3) as long as the public is still freaked out by / new to the idea that anyone can make images and video of whatever, and (4) as long as the signing compute doesn't eat into the bottom line once everyone in the world has uniform access to the tech.

The idea here is that {law enforcement, lawyers, journalists} find a deep fake {illegal, porn, libelous, controversial} image and go to Google to ask who made it. That only works for so long, if at all. Once everyone can do this and the lookup hit rates (or even inquiries) are < 0.01%, it'll go away.

It's really so you can tell journalists "we did our very best" so that they shut up and stop writing bad articles about "Google causing harm" and "Google enabling the bad guys".

We're just in the awkward phase where everyone is freaking out that you can make images of Trump wearing a bikini, Tim Cook saying he hates Apple and loves Samsung, or the South Park kids deep faking each other into silly circumstances. In ten years, this will be normal for everyone.

Writing the sentence "Dr. Phil eats a bagel" is no different than writing the prompt "Dr. Phil eats a bagel". The former has been easy to do for centuries and required the brain to do some work to visualize. Now we have tools that previsualize and get those ideas as pixels into the brain a little faster than ASCII/UTF-8 graphemes. At the end of the day, it's the same thing.

And you'll recall that various forms of written text - and indeed, speech itself - have been illegal in various times, places, and jurisdictions throughout history. You didn't insult Caesar, you didn't blaspheme the medieval church, and you don't libel in America today.
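The "trivially easy to circumvent" point is easiest to see with a toy. The sketch below is a deliberately naive least-significant-bit watermark (nothing like SynthID, which is embedded during generation and is far more robust to mild edits); a few units of pixel noise, standing in for a single lossy re-encode, is enough to destroy it:

    import numpy as np

    def embed_lsb(img: np.ndarray, bits: np.ndarray) -> np.ndarray:
        # Hide the bit string in the least significant bit of the first pixels.
        flat = img.flatten()  # flatten() copies, so the original is untouched
        flat[:bits.size] = (flat[:bits.size] & 0xFE) | bits
        return flat.reshape(img.shape)

    def extract_lsb(img: np.ndarray, n: int) -> np.ndarray:
        return img.flatten()[:n] & 1

    rng = np.random.default_rng(0)
    img = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)  # stand-in "generated" image
    mark = rng.integers(0, 2, size=128, dtype=np.uint8)        # 128-bit watermark

    marked = embed_lsb(img, mark)
    assert np.array_equal(extract_lsb(marked, 128), mark)      # decodes perfectly

    # One pass of mild noise (think: one re-encode) and the mark is gone.
    noise = rng.integers(-2, 3, size=marked.shape)
    noisy = np.clip(marked.astype(int) + noise, 0, 255).astype(np.uint8)
    print((extract_lsb(noisy, 128) == mark).mean())            # far below 1.0

Production schemes are much harder to strip than this toy, but the framing is the same arms race: the mark has to survive every transformation an adversary is willing to apply.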