| ▲ | flessner 21 hours ago |
| Everyone is talking about theft - I get it, but there's a subtler point being made here. The current generation of AI models can't think of anything truly new. Everything is simply a blend of prior work. I'm not saying this has no economic value, but it means these AI models are closer to lossy compression algorithms than they are to AGI. The following quote from Sam Altman, from about 5 years ago, is interesting: "We have made a soft promise to investors that once we build this sort-of generally intelligent system, basically we will ask it to figure out a way to generate an investment return." That's a statement I wouldn't even dream of making today. |
|
| ▲ | nearbuy 19 hours ago | parent | next [-] |
| > Current generation of AI models can't think of anything truly new. How could you possibly know this? Is this falsifiable? Is there anything we could ask it to draw where you wouldn't just claim it must be copying some image in its training data? |
| |
| ▲ | mjburgess 14 hours ago | parent [-] | | Novelty in one medium arises from novelty in others, from shifts in the external environment. We got brass bands with brass instruments, synth music from synths. We know therefore, necessarily, that there can be nothing novel from an LLM -- it has no live access to novel developments in the broader environment. If synths were invented after its training, it could never produce synth music (and so on). The claim here is trivially falsifiable, so obviously so that credulous fans of this technology bake it into their misunderstanding of novelty itself: have an LLM produce content on developments which had yet to take place at the time of its training. It obviously cannot do this. Yet an artist who paints with a new kind of black pigment can, trivially so. | | |
| ▲ | nearbuy 8 hours ago | parent | next [-] | | Kind of a weird take that excludes the vast majority of human artwork that most people would consider novel. For all the complaints one might have of cubism, few would claim it's not novel. And yet it's not based on any new development in the external world but rather on mashing together different perspectives. Someone could have created the style 100 years earlier if they were so inclined, and had Picasso never existed, someone could create the novel style today just by "remixing" ideas from past art in that very particular way. | | |
| ▲ | pesus 7 hours ago | parent [-] | | I would argue that Picasso's life experiences, the environments he grew up and lived in, the people he interacted with, and the world events that took place in his life (like the world wars) were the external developments that led to the development of cubism. Sure, an AI could take in and analyze the works that existed prior, but it couldn't have the emotional reaction that occurred en masse after WWI and started the breakdown of more classical forms of art and the development/rise of more abstract forms of art. Or, as the kids might say, AI couldn't feel the vibe shift occurring in the world at the time. | | |
| ▲ | nearbuy 2 hours ago | parent [-] | | Of course Picasso's life experience influenced what he chose to make. That isn't what the parent comment is talking about. The claim was that current LLMs (though I assume they meant generative AI in general, since we're talking about image generation rather than text) are unable to produce anything novel. Meaning they either don't think Picasso's work is novel, or they don't think a human could have prompted an AI to make a new type of abstract art without it having been trained on such art. Whether the AI would want to do this is irrelevant. AIs don't want anything; they do what a human prompts. And while WWI may have shaped Picasso, training data from WWI isn't necessary in order to make a cubist painting when prompted to stitch multiple perspectives into one image. It's blending perspectives that are available from old data. And blending things in a new way is novelty. Most novel art falls into that category. |
|
| |
| ▲ | moffkalast 12 hours ago | parent | prev [-] | | > arises from novelty in others, shifts to the external environment > Everything is simply a blend of prior work. I generally consider these two to be the same thing. If novelty is based on something else, then it's highly derivative and its novelty is very questionable. A quantum random number generator is far more novel than the average human artist. > have an LLM produce content on developments which had yet to take place at the time of its training. It obviously cannot do this. Put someone in jail for 15 years, then ask them to make a smartphone. They obviously cannot do it either. | | |
| ▲ | mjburgess 10 hours ago | parent [-] | | So if your point is that an LLM is something like a person kept in a coma inside solitary confinement -- sure? But I don't believe that's where we set the bar for art: we aren't employing comatose inmates to do anything. > I generally consider these two to be the same thing. Sure, words themselves bend and break under the weight of hype. Novelty is randomness. Everything is a work of art. For a work of art to be non-novel it can only incorporate randomness. The fallacies of ambiguity abound to the point where speaking coherently becomes impossible. An artist who finds a cave half-collapsed for the first time has an opportunity to render that novel physical state of the universe into art. Every moment which passes has a near infinite number of such novel circumstances. Since an LLM cannot do that, we must wreck and ruin our ability to describe this plain and trivial situation. Poke our eyes and skewer our brains. |
|
|
|
|
| ▲ | jedimastert 21 hours ago | parent | prev | next [-] |
| The problem with generating genuinely new art is that it requires "inputs" that aren't art. It requires life experiences. |
|
| ▲ | Davidzheng 21 hours ago | parent | prev | next [-] |
| I beseech you, in the bowels of Christ, think it possible that you may be mistaken. |
| |
| ▲ | kubanczyk 16 hours ago | parent [-] | | Oliver Cromwell, a letter to the General Assembly of the Church of Scotland, 3 August 1650 |
|
|
| ▲ | bbor 21 hours ago | parent | prev [-] |
| Disregarding the (common!) assumption that AGI will consist of one monolithic LLM instead of dozens of specialized ones, I think your comment fails to invoke an accurate, consistent picture of creativity/"truly new" cognition. To borrow Chomsky's framework: what makes humans unique and special is our ability to produce an infinite range of outputs that nonetheless conform to a set of linguistic rules. When viewed in this light, human creativity necessarily depends on the "linguistic rules" part of that; without a framework of meaning to work within, we would just be generating entropy, not meaningful expressions. Obviously this applies most directly to external language, but I hope it's clear how it indirectly applies to internal cognition and--as we're discussing here--visual art. TL;DR: LLMs are definitely creative, otherwise they wouldn't be able to produce semantically-meaningful, context-appropriate language in the first place. For a more empirical argument, just ask yourself how a machine that can generate a poem or illustration depicting [CHARACTER_X] in [PLACE_Y] doing [ACTIVITY_Z] in [STYLE_S] without being creative! [1] Covered in the famous Chomsky v. Foucault debate, for the curious: https://www.youtube.com/watch?v=3wfNl2L0Gf8 |
| |
| ▲ | flessner 7 hours ago | parent | next [-] | | This may not be apparent to an English speaker, as the language has a rather fixed set of words, but in German, where creating new words is common, the lack of linguistic creativity is obvious. As an example, let's talk about "vibe coding" - it's a new term describing heavy LLM usage in programming, usually associated with Generation Z. If I ask an LLM to generate a German translation for "vibe coder", it comes up with the neutral "Vibe-Programmierer". When asked to be more creative, it came up with "Schwingungsschmied" ("vibration smith"?) - What? I personally came up with the following words: * Gefühlsprogrammierer ("A programmer who focuses on intuition and feeling.") * Freischnauzeprogrammierer ("Free-mouthed programmer - highlighting the straightforwardness and creative expression of vibe coding." - colloquial) Interestingly, LLMs can describe both these terms; they just can't create them naturally. I tested this on all major LLMs and the results were similar. Generating a picture of a "vibe coder" also yields more of a moody atmosphere than the Generation Z aspects that are associated with the term on social media nowadays. | |
| ▲ | Peritract 11 hours ago | parent | prev [-] | | > a machine that can generate a poem or illustration depicting [CHARACTER_X] in [PLACE_Y] doing [ACTIVITY_Z] in [STYLE_S] without being creative Your example disproves itself; that's a madlib. It's not creative, it's just rolling the dice and filling in the blanks. Complex dice and complex blanks are a difference of degree only, not of creativity. | | |
| ▲ | bbor 2 hours ago | parent [-] | | It's not filling in the blanks that's impressive, it's meaningfully combining them all into an objectively unique narrative, building upon those blanks at length. Definitions are always up for debate on instrumental grounds, but I'm dubious of any definition of "creative" that excludes truly unique yet meaningful artifacts. The only thing past that is ineffable stuff, which is inherently not very helpful for scientific discussion. |
|
|