thisisit 2 hours ago

Laws will be passed to make it "safer", just like is happening with the ID verification systems. Every image or video generation will require a watermark: either something visible that cannot easily be removed, or something hidden that can be detected and blocked. Access to models that don't comply will be made harder through ID verification checks or something similar.

There will be some regulatory capture in between.

The world will kick into gear only when something really bad happens. Maybe an influential person, rich or a politician, gets fooled into doing something catastrophic by a deepfake video or image. Until then, ordinary people being affected isn't going to move the needle.

red-iron-pine 43 minutes ago | parent | next [-]

> Laws will be passed to make it "safer", just like is happening with the ID verification systems. Every image or video generation will require a watermark: either something visible that cannot easily be removed, or something hidden that can be detected and blocked. Access to models that don't comply will be made harder through ID verification checks or something similar.

I've thought about this off and on, including how to implement it. Not easily, was my general takeaway.

Or rather, it's easy to implement, but you're in an adversarial relationship with bad actors, and easy implementations may be easily broken.

E.g. your certs have to come from somewhere and stay protected, and how do you update and revoke them? Key management for every single camera on every phone, etc.

Miraste an hour ago | parent | prev [-]

Verification needs to work the other way around: some kind of verifiable chain of trust for photos and videos from real cameras. Watermarking all generated media is impossible.

SirMaster 42 minutes ago | parent | next [-]

I don't really understand why this is so hard, or why it wasn't just done from the get-go.

Just have Apple and Google digitally sign videos and photos recorded on phones, and then have Google, Meta, etc. display that they are authentic when shown on their platforms.
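A minimal sketch of that flow, assuming the device holds a key and the platform verifies against it. Note this uses Python's stdlib `hmac` as a stand-in for the asymmetric signature a real scheme would need (a shared secret would obviously not survive contact with the real world; all names here are hypothetical):

```python
import hashlib
import hmac

# Hypothetical per-device secret. A real scheme would use an asymmetric
# key pair in the phone's secure enclave, with only the public key
# registered with the platform.
DEVICE_KEY = b"hypothetical-device-secret"

def sign_capture(media_bytes: bytes) -> str:
    """Device side: produce a signature over the recorded bytes."""
    return hmac.new(DEVICE_KEY, media_bytes, hashlib.sha256).hexdigest()

def verify_capture(media_bytes: bytes, signature: str) -> bool:
    """Platform side: check the bytes still match the signature."""
    expected = hmac.new(DEVICE_KEY, media_bytes, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signature)

photo = b"...raw sensor bytes..."
sig = sign_capture(photo)
print(verify_capture(photo, sig))         # True: untouched since capture
print(verify_capture(photo + b"x", sig))  # False: modified after capture
```

The hard parts the thread points at all live outside this sketch: provisioning a key per camera, protecting it from extraction, and revoking it when it leaks.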

alpha_squared 21 minutes ago | parent | next [-]

You're talking about the metadata of the files, which can always be edited; someone will inevitably build software to do exactly that. Also, Adobe's proposal for handling generated content is exactly this, and they haven't been able to get buy-in from other companies.

SirMaster 19 minutes ago | parent [-]

Edit the metadata in what way? It's a cryptographic hash.

If the bits that make up the video as recorded by the camera no longer match the hash, then you know it was modified. That doesn't mean it's fake; it just means you should view it with skepticism. On the other hand, files that have not been modified and still match can be trusted.
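The check being described is just "recompute the hash and compare". A minimal illustration with stdlib SHA-256 (illustrative only; note that the hash itself must also be signed, otherwise whoever edits the file can simply recompute it, which is alpha_squared's point about editable metadata):

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """Cryptographic hash of the media bytes."""
    return hashlib.sha256(data).hexdigest()

original = b"frame data as recorded by the camera"
recorded_hash = fingerprint(original)  # stored alongside the file at capture

# An unmodified copy still matches.
assert fingerprint(original) == recorded_hash

# Flip a single bit and the hash no longer matches: the viewer should
# treat the file with skepticism, though not necessarily as fake.
tampered = bytearray(original)
tampered[0] ^= 1
assert fingerprint(bytes(tampered)) != recorded_hash
```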

Miraste 21 minutes ago | parent | prev [-]

It becomes a hard problem quickly when you introduce editing, and most photos and videos on social media are edited. I'm not sure how it would work. It seems more feasible than universal watermarks, though.
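The usual answer to the editing problem is a provenance chain: each edit produces a new claim that references the previous one, so a verifier can walk back to the original capture. This is roughly the shape of what provenance proposals like C2PA describe; the sketch below is illustrative and not any real format:

```python
import hashlib
import json
from typing import Optional

def h(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

def make_claim(media: bytes, action: str, prev_claim: Optional[dict]) -> dict:
    """Record what was done to the media, linking back to the prior claim."""
    prev = h(json.dumps(prev_claim, sort_keys=True).encode()) if prev_claim else None
    return {"media_hash": h(media), "action": action, "prev": prev}

raw = b"raw camera bytes"
capture = make_claim(raw, "captured", None)

cropped = raw[:8]  # stand-in for an actual crop edit
edit = make_claim(cropped, "cropped", capture)

# A verifier walks the chain from the edited file back to the capture
# claim. In a real scheme each claim would also be signed by the device
# or editing tool that produced it, or the chain proves nothing.
assert edit["prev"] == h(json.dumps(capture, sort_keys=True).encode())
```

This still leaves the adversarial problems from upthread: every editor in the chain needs a trusted key, and a stripped or rebuilt chain is indistinguishable from media that never had one.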

petesergeant 22 minutes ago | parent | prev [-]

You can bootstrap some of it. I wrote the following ~9 years ago while working on this problem. Kinda wish I'd done the PhD now: https://github.com/pjlsergeant/multimedia-trust-and-certific...