AI, bots, and the constant need to fuel training turn the internet forest into a place where anything a person publishes will be vampirically consumed, mimicked to see whether it can replicate whatever value could be had, and capitalized on if possible or ignored if not. It's an interesting contrast to actual dark forest theory: AI doesn't want to destroy us, so we don't need to hide for existence's sake. But imagine walking outside and, as soon as you do, innumerable copies of you spring up, and each action and sound you make is replicated and amplified. Like a weird Phantom Tollbooth meets Alice in Wonderland on DMT.
> AI, bots, and the constant need to fuel training turn the internet forest into a place where anything a person publishes will be vampirically consumed, mimicked to see whether it can replicate whatever value could be had, and capitalized on if possible or ignored if not.

That's honestly not a problem, unless someone believes they're entitled to 100% of the value generated by their interactions with the world - but that's a level of greed way beyond Scrooge McDuck, and the kind of thinking that defines the dreaded "late stage capitalism". It really is an option to publish something and not care about knowing exactly who read it, and what they're doing with it.
| |
throwway120385 | 3 days ago

Why do you think that business value is the only thing to look at here? If someone starts copying my likeness, mannerisms, writing style, etc., they could also use that to damage my reputation or harm my relations with other people. I think those two possibilities represent irreparable harms with no associated business value.
TeMPOraL | 3 days ago

> If someone starts copying my likeness, mannerisms, writing style, etc., they could also use that to damage my reputation or harm my relations with other people. I think those two possibilities represent irreparable harms with no associated business value.

There are 8 billion people on the planet. Unless you're a celebrity, this is not a real problem for you at this point (and if you are, it's a business problem). There's no way for a large model learning on the entirety of the Internet to somehow convert "copying likeness, mannerisms, writing style, etc." into damaging your reputation; doing that would be something hyper-targeted, and at this point (and in any conceivable future) there's no middle ground between "plausible deniability" and "someone targeting you specifically, which they could do just as well in pre-AI times".

You're also not as unique as you think. There are many people with the same mannerisms, many people with the same writing style, and so on. Nor are those things constant over time. The flip side of not being a unique snowflake is that anyone's contributions to the public Internet are, for purposes of AI training, worth approximately $0 on the margin, and impact the model just as little. The patterns LLMs learn emerge from the sheer volume of data, a volume too big for any one person to be entitled to a meaningful part of it.
throwway120385 | 13 hours ago

You could also make the same argument about check forgery or wire fraud. "If someone passes a bad check in your name it's not a big deal, because you're only one of 8 billion." However, when it happens to you, specifically, it's a big deal, because you have to fight the bank and the merchant over it. "You're not special" is a bad argument for why someone impersonating you isn't a problem. Even Black Mirror explained why impersonation is a problem in its very first episode, many years ago. You don't even have to be a celebrity; you only have to be in someone else's way.
TeMPOraL | 3 hours ago

> You could also make the same argument about check forgery or wire fraud.

You couldn't, because unlike all the things we've discussed about AI, those actually harm the victim in real, direct terms. Having some model that doesn't know or care about you talking just like you, without claiming it's actually you, to a bunch of people who don't know or care that you exist? That's zero actual damage to you.

> "You're not special" is a bad argument for why someone impersonating you isn't a problem. Even Black Mirror explained why impersonation is a problem in its very first episode, many years ago.

I remember that episode. However, my argument isn't "you're not special, therefore impersonating you isn't a problem" - it's "you're not special, so what you think is impersonation probably isn't, and even if it is, it doesn't hurt you in real terms", combined with "anything that is actual impersonation and/or hurts you directly was already possible, and AI currently doesn't impact this at all". Someone wants to screw with you? You're being targeted. AI might make the attacker's job a bit easier, but it's still someone going to the effort, as opposed to the background process on the Internet everyone seems to think LLMs are.

Also: there's an inverse relationship between the weight of accusations and social proximity. A specific person you know (and other people know you know) accusing you of something? That's a problem. Some random comments from random accounts, accusing you and 100 other people of something? Most people won't believe it. (Except when it's about child abuse. People are extremely sensitive to this - just bringing up the term and a name in the same sentence can ruin the victim's life.)
| |
PrismCrystal | 3 days ago

> It really is an option to publish something and not care about knowing exactly who read it, and what they're doing with it.

In a world with UBI (or in a world where creators are an aristocracy who don't have to worry about money), I would agree. In a world of precarity, obviously many people are going to want to limit reuse of their creations unless they get rewarded for it somehow.
TeMPOraL | 2 days ago

Except all those people performatively declaring their refusal to comment on forums and write blog posts aren't actually making their money this way. LLMs take away nothing from them, except maybe an inflated sense of uniqueness.

I can understand why all kinds of artists, who do make money from their creative work, have an issue with generative models. But I feel this is unavoidable - graphics, writing, and video creation will follow the same path music already did, decades ago. For a long time now, bands and singers haven't made their living from their songs and recordings; they make it from live performances, branded merch, and related things. Music is too cheap to meter, but human connection isn't, which is why musicians make money by being entertainers. This is the fate that awaits the other creative arts too.
PrismCrystal | 2 days ago

The artists who don't mind not making money from the work itself have built a personal brand to obtain other sources of income. But the future the OP describes, and which feels quite believable, is one where AI defeats humans even at building a monetized brand around obtained content, so there isn't anything left for humans. I am much less optimistic than you that people will demand human connection; I think they will be satisfied with a convincing enough simulacrum.
|
stryan | 3 days ago

It's the same reason you can't use pictures of people in public, or their quotes or tweets, for commercial purposes without permission: there's a general concept of human dignity and privacy under which people's thoughts, words, appearances, etc. belong to them. It's not greedy to want respect for basic human autonomy.

Plus, pragmatically speaking, this sounds suspiciously like something that's "not a problem" until it is a problem. If a poorly written LLM regurgitates my name and lines from a blog post (or, god forbid, hallucinates a blog post by me as a citation) in some defense of Pol Pot or something, that's going to become a problem for me very quickly.

EDIT: Fridge thought: quite frankly, if someone's making money off my interactions in this economy, you can bet your last dollar I want my fair share of it. It's not greedy to gain value from your interactions with the world: you already do, just by interacting, since it's a bilateral relationship, i.e. it affects each party equally. It IS greedy for a third party who did not participate in the interaction to expect value from it.
ozim | 3 days ago

It is not about being entitled to 100% of the value. It is about the right not to have my own value diluted. It is not about caring to know who read each one of my words. It is about the fact that people can later say „xyz wrote that” instead of „oh well, another AI-generated crap comment” when in reality it was someone's original thought. It is not capitalism, but basic respect for another human being.
TeMPOraL | 3 days ago

"There are no fully original thoughts, everything is a remix." -- Socrates, ~370 BC.

"We attribute a thought to a person through a chain of custody. Without it, the thought could be attributed to anyone, or to a generative language model. Not that it matters anyway." -- Abraham Lincoln, ~1862.