| ▲ | nickandbro 3 hours ago |
| These image gen models are getting so advanced and life like that increasingly the general public are being duped into believing AI images are actually real (ex Facebook food images or fake OF models). Don't get me wrong I will enjoy the benefits of using this model for expressing myself better than ever before, but can't help feeling there's something also very insidious about these models too. |
|
| ▲ | WarmWash 3 hours ago | parent | next [-] |
It's more likely than not that every single person who uses the internet has viewed an AI image and taken it as real by now. The obvious ones stand out, but there are so many that are indiscernible without spending lots of time digging into them. Even then, there are some where the best you can do is guess that it's maybe AI generated. |
| |
| ▲ | WD-42 2 hours ago | parent | next [-] | | People will continue to retreat into walled, trusted networks where they can have more confidence in the content they see. I can’t even be sure I’m responding to a real person right now. | |
| ▲ | versk 3 hours ago | parent | prev | next [-] | | We're at the point now where basically any photo that isn't shared by someone I trust or a reputable news organisation is essentially unverifiable as real or not. The positive aspect of this advance is that I've basically stopped using social media because of the creeping sense that everything is slop. | |
| ▲ | tokai 3 hours ago | parent | prev | next [-] | | Maybe not an actual argument for anything, but even before these image models everyone that used the internet had seen a doctored image they believed to be real. There was a reason that 'i can tell by the pixels' was a meme. | |
| ▲ | yieldcrv 3 hours ago | parent | prev [-] | | People only notice when they are prompted to look for AI or to scrutinize it. A lot of these accounts mix old clips with new AI clips, or tag onto something emotional like a fake Epstein file image with your favorite politician, and pointing out that it's AI has people thinking you're deflecting because you support the politician. Meanwhile the engagement farmer is completely exempt from scrutiny. It's fascinating how fast and unexpectedly things have gone in this direction. |
|
|
| ▲ | whynotmaybe 3 hours ago | parent | prev | next [-] |
>fake OF models Soon many real OF models will be out of a job, once everyone is able to produce content to their personal taste from a few prompts. |
| |
| ▲ | mjr00 22 minutes ago | parent | next [-] | | Even ignoring the model censorship that makes high-quality sexual imagery/video impossible, this is a crazy take. You think OF models are making money because it's the only way to see a nude man/woman with particular characteristics on the internet? You're completely misunderstanding what the product being sold is. | | |
| ▲ | mfkp 8 minutes ago | parent [-] | | If you don't think that OF models are using AI to reply to incoming chats from users, well I've got a bridge to sell ya. |
| |
| ▲ | sodacanner 3 hours ago | parent | prev | next [-] | | People already have access to every form of niche pornography they could dare to imagine (for absolutely free!), I really doubt that 'personal taste' is the part that makes OF models their money. They'll be fine. | | |
| ▲ | sosodev 3 hours ago | parent [-] | | I think you're underestimating how much personal taste applies in that industry. Yes, there's a lot of free content, but it's often low quality and/or difficult to find for a particular niche. The OF pages, and other paid sites, are curated collections of high quality stuff that can satisfy particular cravings repeatedly with minimal effort. A big part of it is also the feeling of "connection" with the creator via messages and whatnot, but that too can be replicated (arguably better) by AI. In fact, a lot of those messages are already being generated haha. | | |
| ▲ | deklesen 2 minutes ago | parent | next [-] | | For a podcast on this topic (niche pornography and how it was affected by the advent of pornhub and the likes) check out "the butterfly effect" | |
| ▲ | sodacanner 2 hours ago | parent | prev [-] | | I was mostly hinting towards the 'connection' part of it, yes - I think that's really where the money is made more than anything else. That's the part that'll start killing the industry once some company tunes it in. |
|
| |
| ▲ | pousada 3 hours ago | parent | prev | next [-] | | You can’t really, because these powerful models are censored.
You can create lewd pictures with open models but they aren’t nearly as good or easy to use. | | |
| ▲ | dragonwriter 2 hours ago | parent | next [-] | | Because models can be used to alter existing images, you can use open and commercial models together in content creation workflows (and also the available finetunes of open models, and the ability to further tune them for very specific uses, are quite powerful on their own), so the censorship on the commercial models has a lot less effect on what motivated people can produce than you might think. I still think, even with that, that like most predictions of AI taking over content industries, the short-term predictions are overblown. | |
| ▲ | coffeebeqn 3 hours ago | parent | prev | next [-] | | I’ve seen some very high quality NSFW AI video in the last few months. Those models are not far behind and the search and training space for porn is smaller than being able to generate anything | |
| ▲ | sosodev 3 hours ago | parent | prev | next [-] | | Doesn't Grok allow users to create lewd content or did they roll that back? Also, I suspect that we'll soon see the same pattern of open weights models following several months behind frontier in every modality not just text. It's just too easy for other labs to produce synthetic training data from the frontier models and then mimic their behavior. They'll never be as good, but they will certainly be good enough. | |
| ▲ | infecto 2 hours ago | parent | prev [-] | | Just a matter of time until open models get there. Not once have we seen a moat anywhere across the model spectrum. |
| |
| ▲ | baal80spam 3 hours ago | parent | prev | next [-] | | And this can't come soon enough. | | | |
| ▲ | sekai 2 hours ago | parent | prev | next [-] | | > Soon many real OF models will be out of job when everyone will be able to produce content to their personal taste from a few prompts. net positive to society | | |
| ▲ | fwip an hour ago | parent [-] | | In what way? Certainly not for the models, who lose their income/job. Probably not better for the consumer, either. | | |
| ▲ | blibble 21 minutes ago | parent [-] | | or the taxpayer the high end probably pay the same sort of tax as professional footballers |
|
| |
| ▲ | dfxm12 2 hours ago | parent | prev | next [-] | | I don't think so. Talking to people in this space, I've found a few broad camps. There are probably more: -They simply aren't into real women/men (so you couldn't even pay a model to do what they're looking for). -They want to play out fantasies that would be hard to coordinate even if you could pay models (I guess this is more on the video side of things, but a string of photos can be put together into a comic) -They want to generate imagery that would be illegal Based on this, I would guess fetish artists (as in illustrators) are more at risk than OF models. However, AI isn't free. Depending on what you're looking for, commissions might still be cheaper for quite a while... | | | |
| ▲ | coldtea 3 hours ago | parent | prev [-] | | And they might have to gasp! get an honest job! | | |
| ▲ | switchbak 3 hours ago | parent | next [-] | | I don't know much about that side of things, but I presume that's hard work! Maybe not always so honest though. | |
| ▲ | xfeeefeee 3 hours ago | parent | prev [-] | | That's a pretty wide brush you are painting with there |
|
|
|
| ▲ | kevincox 3 hours ago | parent | prev | next [-] |
I actually think this is a good thing. Manipulating images incredibly convincingly was already possible, but the cost was high (many hours of highly skilled work), so many people assumed that most images they were seeing were "authentic" without much consideration. By making fake images ubiquitous, we are forcing people to quickly learn that they can't believe what they see on the internet, and that tracking down sources and deciding who to trust is critically important. People have always said that you can't believe what you see on the internet, but unfortunately many people have managed to ignore this advice without major issue. This wave will force them to take that advice to heart by default. |
| |
| ▲ | slfnflctd 2 hours ago | parent | next [-] | | I remember telling my parents at a young age that I couldn't be sure Ronald Reagan was real, because I'd only ever seen him on TV and never in real life, and I knew things on TV could be fake. That was the beginning of my journey into understanding what proper verification/vetting of a source is. It's been going on for a long time and there are always new things to learn. This should be taught to every child, starting early on. | |
| ▲ | arkmm 27 minutes ago | parent | prev | next [-] | | I used to also have this optimistic take, but over time I think the reality is that most people will instead just distrust unknown online sources and fall into the mental shortcuts of confirmation bias and social proof. Net effect will be even more polarization and groupthink. | |
| ▲ | manuelabeledo 3 hours ago | parent | prev | next [-] | | > By making these fake images ubiquitous we are forcing people to quickly learn that they can't believe what they see on the internet and tracking down sources and deciding who you trust is critically important. Has this thought process ever worked in real life? I know plenty of seniors who still believe everything that comes out of Facebook, be AI or not, and before that it was the TV, radio, newspapers, etc. Most people choose to believe, which is why they have a hard time confronting facts. | | |
| ▲ | rootusrootus 2 hours ago | parent [-] | | > I know plenty of seniors And not just seniors. I see people of all ages who are perfectly happy to accept artificially generated images and video so long as it plays to their existing biases. My impression is that the majority of humanity is not very skeptical by default, and unwilling to learn. |
| |
| ▲ | lm28469 3 hours ago | parent | prev [-] | | I feel like there are one or two generations of people who are tech savvy and not 100% gullible when it comes to online things. Older and younger generations are both completely lost imho; in a blind test you couldn't tell a monkey from a human scrolling tiktok &co | | |
| ▲ | manuelabeledo 2 hours ago | parent [-] | | How so? This "tech savvy and not 100% gullible" generation gave birth to a political landscape dominated by online ragebait. | | |
| ▲ | lm28469 2 hours ago | parent [-] | | Boomers used to tell us to never trust anything online, and now they send their life savings to "Brad Pitt". New generations get unlimited brain rot delivered through infinite scroll, don't know what a folder is, think everything is "an app", and keep falling for the "technology will free us from work and cure cancer" line. There was a sweet spot during which you could grow alongside the internet at a pace that was still manageable, when companies and scammers weren't trying so hard to rob you of your time, money, and attention. |
|
|
|
|
| ▲ | Havoc 3 hours ago | parent | prev | next [-] |
| Don’t think the demand for real OF is going anywhere |
| |
|
| ▲ | vunderba 3 hours ago | parent | prev | next [-] |
| Jaded, but if I knew there was a possibility of a bunch of incriminating footage of me (images, video, etc.) out there in the pre-AI days, I would do my absolute best to flood the internet with as many related deepfakes (including of myself) as possible. |
|
| ▲ | neogodless 2 hours ago | parent | prev | next [-] |
| > Facebook food images or fake OF models What in the world is a fake OF model? Does "OF" stand for "of food"? |
| |
| ▲ | bena 2 hours ago | parent [-] | | It stands for "OnlyFans" a website originally for creators to engage directly with their audiences but quickly became a website where women sold explicit pictures of themselves to subscribers. | | |
| ▲ | sebzim4500 2 hours ago | parent [-] | | TIL it wasn't created to be a porn site | | |
| ▲ | bena an hour ago | parent [-] | | They still run ads trying to push the narrative that it's for comedians and musicians. But at this point, OnlyFans is so synonymous with egirls that suggesting someone has an account is used as a way to insinuate they sell pictures of themselves. |
|
|
|
|
| ▲ | pancakeguy 2 hours ago | parent | prev | next [-] |
| Surely this is a problem that we will never be able to solve. |
|
| ▲ | techpression 3 hours ago | parent | prev | next [-] |
Oh, we’ve seen nothing yet of the chaos that generative AI will unleash on the world. Looking at Meta's platforms, it's already a multi-million-dollar industry of selling something or someone that doesn't exist. And that's just the benign stuff. |
|
| ▲ | dfxm12 3 hours ago | parent | prev | next [-] |
| This has been true for a while with digital art, photoshop, etc. Over time, people's BS detectors get tuned. I mean, scrolling by quickly in a feed, yeah, you might miss if an image is "real" or not, but if you see a series of photos side by side of the same subject (like an OF model), you'll figure it out. Also, using AI will not allow you to better express yourself. To use an analogy, it will not put your self-expression into any better focus, but just apply one of the stock IG filters to it. |
| |
| ▲ | itintheory 2 hours ago | parent [-] | | > a series of photos side by side of the same subject Cameras are now "enhancing" photos with AI automatically. The contents of a 'real' photo are increasingly generated. The line is blurring and it's only going to get worse. |
|
|
| ▲ | fortyseven 3 hours ago | parent | prev | next [-] |
| It's shitty, but I think it's almost as bad that people are calling everything AI. And I can't even blame them, despite how infuriating it is. It's just as insidious that even mundane things literally ARE AI now. I've seen at least twice now (that I'm aware of) where some cute, harmless, otherwise non-outrageous animal video was hiding a Sora watermark. So the crazy shit is AI. The mundane shit is AI. You wonder why everyone is calling everything AI now. :P |
| |
| ▲ | switchbak 3 hours ago | parent [-] | | It seems like a low-level paranoia - now I find myself double checking that the youtube video I'm watching isn't some AI slop. The creators all use Getty b-roll and increasingly AI generated stuff, so much that it's not a stretch for the voice and script to be auto generated too. I suppose if the AI were able to tell me a true and compelling story, I might not even mind so much. I just don't want to be spoon-fed drivel for 15 minutes only to find it was all made-up BS. |
|
|