simonw 12 hours ago

Just noticed this notice added at the top of the Blender announcement of their funding from Anthropic: https://www.blender.org/press/anthropic-joins-the-blender-de...

> Notice: This announcement is causing a lot of feedback. We are actively evaluating it.

Presumably a lot of Blender users work in roles that feel threatened by AI being used for computer graphics work.

Lots of negative replies on Blursky here: https://bsky.app/profile/blender.org/post/3mkkuyq3ijs2q

hgoel 12 hours ago | parent | next [-]

I don't really get the backlash about Blender here, this isn't generative art, it's basically a natural language means of scripting blender.

This feels like the proper way to have AI act as a tool to make artists' jobs easier without taking away their creativity?

Edit: I guess they might want absolutely no AI of any sort in their tools (which seems like a strange line to draw), or is it about the data it's been trained on?

ehnto 11 hours ago | parent | next [-]

It's really clear that businesses are hoping to replace people with AI. In an industry that is already very difficult to make a stable living in, and troubled with regular plagiarism, is it really that surprising that any encroachment of AI into that space would be met with backlash?

Even if you can see how individual circumstances could be beneficial to your workflow, it's a general direction I think many take issue with quite fairly.

locknitpicker 7 hours ago | parent | next [-]

> It's really clear that businesses are hoping to replace people with AI. In an industry that is already very difficult to make a stable living in, and troubled with regular plagiarism, is it really that surprising that any encroachment of AI into that space would be met with backlash?

But what's the plan, then? Prevent any third party from downloading Blender and integrate it in any way with an agent?

ehnto 2 hours ago | parent | next [-]

An actual plan would involve regulation, otherwise we are just complaining loudly while things march on anyway.

I fully expect things to march on anyway. I have no idea how it plays out for creative industries, I am still thinking and observing in that regard.

intended 7 hours ago | parent | prev [-]

At this stage there is just protest and reaction.

Joel_Mckay 11 hours ago | parent | prev [-]

Businesses have already replaced several background artists gambling on the uncopyrightable status of "AI" output being ignored. In a commercial setting, one can't sell what they never owned in the first place.

Without a constant stream of stolen training data, the "AI" piracy bleed-through and isomorphic plagiarism business model is unsustainable.

We look forward to liquidating the GPU data-centers at a heavy discount. =3

locknitpicker 7 hours ago | parent [-]

> Businesses have already replaced several background artists gambling on the uncopyrightable status of "AI" output being ignored. In a commercial setting, one can't sell what they never owned in the first place.

I'm skeptical of this line of reasoning. Major content providers have no problem with copyright, even when content is completely produced by anonymous contributors. Is this supposed to become an issue when you eliminate some anonymous contributors?

Joel_Mckay 6 hours ago | parent [-]

>Major content providers have no problem with copyright

Besides getting sued for piracy, settling out-of-court with Disney, and/or externalizing DMCA/RIAA take-down liabilities onto users.

A human may transfer rights or "license" to another party in many circumstances, but may not re-sell a codified Coca-Cola logo trademark out of convenience.

All levels of the US courts concluded an "AI" can't transfer nor actually create content rights. Most WIPO members also seemed to follow the same consensus.

https://www.bbc.com/future/article/20260414-the-monkey-selfi...

There was a similar issue with folks selling marginally pitch-shifted audio assets on the Unity and Web stores. Note, they didn't have the original legal right to license this content, and customers would get their content flagged eventually.

Some kids are cheekily pirating Sony and BBC libraries... exploiting people's assumption that buying an old CD set somehow magically gives the holder broadcast or game distribution rights.

Keep being skeptical, as it will keep you in business. =3

yorwba 5 hours ago | parent [-]

Not owning the rights to some content and somebody else owning those rights are not the same thing. If someone else owns the copyright and you redistribute their stuff without permission, they have grounds to sue you. If nobody owns the copyright, because it expired long ago or because it came into being without human creative input, you can sell it just fine. So can everyone else, of course. Now, if you put your own stuff on top, that you own the copyright to, those other people can no longer redistribute it without your permission, but you can. So there's hardly any risk in using uncopyrightable background art.

Joel_Mckay 5 hours ago | parent [-]

Except that "AI" content output is fundamentally unable to avoid piracy of other people's content (demonstrably, it couldn't even on a CEO live stream). Most models will happily spew any statistically salient trademarked, copyrighted, and/or patented code/music/images/video. Note too that GPL/LGPL is a contaminating license, so legal submarines will surface sooner or later if such code is injected into closed-source projects.

The "how" it happens part is just legally irrelevant "[piracy] with extra steps", but if you are interested in the details, see below. =3

https://www.youtube.com/watch?v=YhgYMH6n004

https://www.cbsnews.com/news/taylor-swift-ai-voice-likeness-...

Here is a simplified explanation of how vector search is done in many models:

https://www.youtube.com/watch?v=YDdKiQNw80c

And a more detailed toy implementation to learn how to build your own:

https://www.youtube.com/watch?v=OUE3FSIk46g
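Those videos aside, the core idea of vector search is small enough to sketch directly: represent each item as a vector and rank stored items by cosine similarity to a query. A toy illustration in plain Python; the 3-d vectors here are made up for the example, not real model embeddings:

```python
import math

def cosine_similarity(a, b):
    # Cosine of the angle between two vectors: 1.0 means same direction.
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def nearest(query, index):
    # Rank every stored (label, vector) pair by similarity to the query.
    return sorted(index, key=lambda item: cosine_similarity(query, item[1]),
                  reverse=True)

# Made-up 3-d "embeddings" standing in for a real model's output.
index = [
    ("brick texture", [0.9, 0.1, 0.0]),
    ("moss texture",  [0.1, 0.9, 0.2]),
    ("jazz sample",   [0.0, 0.2, 0.9]),
]

query = [0.8, 0.2, 0.1]
results = nearest(query, index)
print(results[0][0])  # most similar item
```

Real systems replace the made-up vectors with model embeddings and the linear scan with an approximate index, but the ranking step is the same idea.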

swatcoder 12 hours ago | parent | prev | next [-]

Regardless of the purported upside, many people in the arts feel betrayed by the commercial interests that built this technology on their work without their consent and threatened by the explicit intent of these vendors to devalue their work by saturating the art and design market with cheap automated substitution.

A lot of artists who would love to be able to direct their professional software in natural language have to reconcile that with how this technology came to be and what the aims are of the company now delivering it to them.

dmarcos 11 hours ago | parent | next [-]

I spent most of my career in the open source world and it doesn't bother me that models are trained on my output. Should I feel differently? It seems there's a kind of ego or emotional attachment to the output that is more common among artists than devs? Perhaps abundance vs scarcity mindsets?

hgoel 11 hours ago | parent | next [-]

Regarding generative images, it's more of an issue because the effects are different.

Software tends to be a "living" project, so just vibe coding with 0 software knowledge is not yet fully sustainable for maintaining a project. But with art, the AI just spits out a completed image.

The generated images compete directly with the people the data was sourced from, and there have also been many cases of abuse, e.g. people using AI to impersonate a popular artist and selling commissions under that artist's name.

The copyright situation for generated imagery is also tricky, so people pretending to be artists only to be sharing work that isn't copyrightable can cause a ton of trouble and financial loss for customers.

Most of these issues don't apply to software in the same way. That's why I was surprised by the backlash here, as it only touches the software side; I don't see it as threatening artists' work.

When I was dabbling in image generation (~StyleGAN2 era), my vision for image generation models was as a support tool for artists (back then I was generating small character thumbnails to help me brainstorm ideas for drawing), believing that people valued art for the human effort. Even then I would have considered what Anthropic are trying to do here as the preferable way to use AI in art workflows.

figassis 6 hours ago | parent [-]

It threatens because we aren’t just talking about selling your art. Artists get hired at companies to produce all kinds of work that will now be replaced by AI.

nailk 4 hours ago | parent [-]

Artists get hired at companies because companies have the technology that made artists' work profitable, starting from book printing (public performance -> book printing -> cinema -> TV -> internet; similarly, drawing -> photo -> digital). In the public performance / drawing era, artists were mostly poor, low-class rogues. The technology made them what they are now.

They are protesting against natural technology development. To me it looks similar to taxi drivers protesting against Uber (protecting their right to scam tourists).

Did drawing artists protest against photography? Do celebrities protest against photographers selling photos of them taken in public places?

They are right to be afraid, though. What's most probably happening here is that Anthropic is buying the rights to collect user trajectory data, in order to replace Blender users later.

neya 10 hours ago | parent | prev | next [-]

I'm an artist turned CTO. My perspective is really simple - theft is theft. You (not you specifically per se) can sugar-coat it however you like, but copying open source codebases/work is different from stealing proprietary/licensed work without permission. It would have been one thing if stealing/sharing copyrighted work were heavily normalized, but no, a lot of people have gone to prison for simply pirating DVDs and CDs, and now you're telling me it's somehow ok if a corporation does it?

JimDabell 6 hours ago | parent | next [-]

Theft is theft, but learning is not theft.

Fair use isn't theft either, and neither is copyright infringement. But learning most definitely is not theft.

dmarcos 9 hours ago | parent | prev | next [-]

How come? We give IP law / copyright legitimacy but it’s not clear to me the more I think about it. If you draw something you def own the physical drawing but owning the idea of the drawing during your lifetime feels strange to me. It’s also a very recent invention and humans created art before and will create after.

soundworlds 9 hours ago | parent | next [-]

I agree that copyright is foundationally wrong, but the way out has to be through a culture shift of people putting their work in Public Domain. It's not up to a private company to decide everyone else's work is public commons.

neya 8 hours ago | parent | prev | next [-]

The issue is not stealing the idea itself. The issue is stealing the work in its entirety - as is - with all its flaws and character intact. That's what makes art unique, right?

I would think the same goes for codebases too. On a personal note, I wrote a CMS in Elixir from scratch way before even AI was a thing. It uses a lot of proprietary flows to make it scale, helping it serve millions of requests efficiently. I certainly did not give OpenAI nor Microsoft permission to steal my code. And yet they did. Is that not theft of my Intellectual Property?

_aavaa_ 9 hours ago | parent | prev [-]

> but owning the idea of the drawing during your lifetime feels strange to me

Oh, I wish it was limited to lifetime.

The USA is currently lifetime + 70 years, and work for hire is 95 years from publication (or 120 from creation, whichever is shorter).

locknitpicker 6 hours ago | parent | prev [-]

> It would have been ok if stealing/sharing copyrighted work was heavily normalized, but no, a lot of people have gone to prison for simply pirating DVDs and CDs and now you're telling me it's somehow ok if a corporation does it?

There is no such thing as "stealing" copyrighted work. Either you have unauthorized access and/or distribution, or you don't.

Unauthorized access to copyrighted work is perfectly legal in a big chunk of the world, including Western Europe. Read up on the French tradition of copyright law, particularly the provisions for personal use.

This brings us to how "people have gone to prison for simply pirating DVDs and CDs". The bulk of the cases were focused on mass commercial distribution of verbatim copies of third-party content. I'm talking about DVD-burning factories.

intended 6 hours ago | parent | prev [-]

Yes?

For example, you could at least feel that the world is large enough to hold people with other needs, drives, and levels of ownership over their work.

You could also consider that this is not an even trade; artists had all their works ingested and didn't get a commensurate stake in OpenAI.

You can consider that you had a choice to share when you contributed to open source. Then imagine how a counter culture artist, who despises corporate culture, must feel to have their work consumed by another rapacious tech entity.

Or you can be the filmmaker whose clients are now showing up with entire ad clips, and then decide they would rather not spend the money on CGI to complete the video - essentially demolishing work overnight.

This isn’t to say that there are not artists who are excited by this, or artist who are happy to have their art ingested. Just that the way you phrased your question evoked this answer.

MrScruff 5 hours ago | parent | prev | next [-]

Speaking as someone who works in the industry, I haven't really heard this sentiment. Artists are predominantly hostile to diffusion models, but optimistic about LLMs and their ability to help them write tools and scripts even if they're non-technical.

rcarr an hour ago | parent [-]

So basically artists are cool with developers not getting paid so long as they do...

hgoel 12 hours ago | parent | prev [-]

Yeah, I can understand being upset with their work being stolen to train these models. Anthropic doesn't seem to be working on image/video generation, but they are still training on text-based creative works of questionable sourcing.

Makes me think that there's some room in the model lineup for one that doesn't do as well on benchmarks, but is trained on "ethically sourced" data (though they'd need to somehow prove that they aren't "accidentally" including other data).

11 hours ago | parent [-]
[deleted]
mediaman 9 hours ago | parent | prev | next [-]

The funny thing is that Allegorithmic (now part of Adobe) was far more devastating to certain classes of game artist than stuff like this will be in its current form.

It almost totally automated vast swaths of texture generation by creating algorithmic systems that technical artists could use to create textures.

Want a brick texture? Sure, you connect some nodes and set parameters and you have great looking bricks. Want the mortar to be a little more widely spaced? Done. Want some moss on the brick? Want some chipping on the brick? Want some color variation? Done, done, done.

It probably reduced the amount of time to iterate textures by more than 100x.
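The "connect nodes, set parameters" workflow is easy to picture as code: a texture becomes a function of a few exposed parameters, so iterating means changing a number rather than repainting. A toy sketch with hypothetical parameters (nothing like Substance's actual node graph), rendering a running-bond brick pattern as ASCII:

```python
def brick_mask(x, y, brick_w=4, brick_h=2, mortar=1):
    # True for a brick pixel, False for mortar, on an integer grid.
    row = y // (brick_h + mortar)
    # Offset every other row by half a brick, like a running bond.
    xo = x + (brick_w + mortar) // 2 * (row % 2)
    in_brick_y = y % (brick_h + mortar) < brick_h
    in_brick_x = xo % (brick_w + mortar) < brick_w
    return in_brick_x and in_brick_y

def render(width, height, **params):
    # ASCII "render": '#' = brick, '.' = mortar.
    return "\n".join(
        "".join("#" if brick_mask(x, y, **params) else "." for x in range(width))
        for y in range(height)
    )

# Want wider mortar? Change one parameter and re-render.
print(render(12, 6))
print(render(12, 6, mortar=2))
```

Swap the boolean mask for noise-driven color, wear, and moss layers and you have the shape of the real thing: every variation is a parameter tweak, not a repaint.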

Now, talented technical artists make OK money because they are good at using these tools. Photoshop jockeys are gone.

LLM manipulation of Blender will be interesting, but it's very, very hard to see something like Claude having nearly as big an impact. It'll be helpful for automating some common tasks and building internal tooling. But Allegorithmic single-handedly changed the way 3D games look, because you could be so much more ambitious.

You didn't really hear about it, though, because it wasn't part of the cultural zeitgeist.

mbgerring 8 hours ago | parent | prev | next [-]

People who built a career on their mastery of Blender are going to lose their livelihoods. Why is this difficult to understand?

bayarearefugee 6 hours ago | parent [-]

> Why is this difficult to understand?

It'll be way easier to understand for developers when it starts happening in earnest to our profession, which is coming soon.

It's already here to some extent, but so far mostly on the junior end, so it hasn't hit many people who are already established in an industry that has provided relatively easy, stable livelihoods for the past 30+ years, though it soon won't.

movedx01 4 hours ago | parent [-]

> coming soon

Developers are literally on the bleeding edge here; it might be the most developed of the AI use cases right now. The most advanced tooling for LLMs revolves around SWE work: there are multiple prolific benchmarks that the labs actively target in this area, new ones are being built, whole product categories are being spawned, and software companies are bleeding money for tokens.

It's the other professions that will follow, once the training data is in place to reach for their livelihoods. SWEs got an early taste of what is coming. And the Blender news is exactly that.

simonw 11 hours ago | parent | prev | next [-]

I think it's mainly anti-AI sentiment in general.

consumer451 11 hours ago | parent [-]

I am a huge beneficiary of agentic dev tools. They completely changed my life and my income. However, I totally get the general anti-AI sentiment. The ultra-bear case is that it somehow kills all of us; the bull case is that those who own the inference get all the spoils.

Even myself, while I am currently extremely empowered by these tools... I could see my role (Founder/PM/builder) disappearing in the next couple years.

I respect you a lot, so if you have a moment, I would really like to get talked down from my take.

shinryuu 6 hours ago | parent [-]

How and why did it change your income?

consumer451 3 minutes ago | parent [-]

My dev skills atrophied long ago. I still had B2B product ideas but was never able to raise funding. Now I was able to get well past MVPs on my own and land paying users as a co-founder.

blurbleblurble 12 hours ago | parent | prev | next [-]

People are guzzling the amygdala control juice these days

make3 12 hours ago | parent [-]

Say that again in five years, when you can't find a job except megayacht toilet cleaner because Claude is distinguished-engineer level at one millionth of your cost and thousands of times faster, and can be instantly parallelized into tens or hundreds of thousands of copies, then spun down arbitrarily as needed at any time.

oompydoompy74 12 hours ago | parent [-]

[flagged]

nozzlegear 10 hours ago | parent | next [-]

It may be hyperbole, but it's how people genuinely feel about AI.

Quinnipiac in March found that voters like AI less than ICE. They also found that over half of Americans think AI will do more harm than good: https://poll.qu.edu/poll-release?releaseid=3955

A Gallup poll in February found only 18% of Gen Z participants surveyed were hopeful about AI: https://www.gallup.com/analytics/651674/gen-z-research.aspx

Maybe those AI doomers all need to touch grass. Or, maybe, the reverse is true and the minority of people who are optimistic about AI are suffering from software brain.

https://www.theverge.com/podcast/917029/software-brain-ai-ba...

esafak 11 hours ago | parent | prev | next [-]

He exaggerates but several studies show that AI is depressing junior hiring, and models are only going to get more competitive with humans.

make3 11 hours ago | parent | prev [-]

The transformer paper was 9 years ago. 9 years between barely managing to translate between two very closely related languages (English and French, which share a huge fraction of words thanks to William the Conqueror, cultural proximity, etc.) and what we have now.

The thing is able to code up full, pretty competent thousand-line projects in an hour. Even hardcore engineers use it now, as of this year. My senior front-end friends already can't find jobs.

You're crazy if you think things won't change dramatically, at the scale of all of society.

https://arxiv.org/abs/1706.03762

rimliu 6 hours ago | parent [-]

https://xkcd.com/605/

bryanrasmussen 6 hours ago | parent [-]

It's funny because you're arguing that 1 month showing 1 variable increasing by 1 point is as reliable as 9 years with continual increase among multiple variables by multiple points when trying to extrapolate a trend.

torginus 8 hours ago | parent | prev | next [-]

It's not artist replacement yet, because the models don't have the necessary training or sophistication.

I doubt the current state shows the end of their ambitions.

make3 12 hours ago | parent | prev | next [-]

There is no acceptable use of AI for most people in the artistic field. They see it as an extreme betrayal, and I understand. They're under incredible threat.

They are consciously trying to prevent momentum in a bad direction.

If they don't fight it hyper hard, a huge fraction of them will be out of a job instantly.

hgoel 12 hours ago | parent | next [-]

That's a strange position to take. I can understand not wanting models that have been trained on questionably sourced data, but otherwise they're opposing essentially a UX change, not based on UX concerns but on ideological fears.

Given how much software and other AI/computer vision improvements 3D content often relies on, it's weird to decide that the algorithm itself is unallowable.

mbgerring 8 hours ago | parent | next [-]

Do you have any idea how hard it already is to make a living in a creative field?

make3 11 hours ago | parent | prev | next [-]

This is a very first degree analysis.

AI is seen as an oppressor and a threat, and AI providers are seen as oppressors. It's understandable that people don't want to collaborate with their oppressors, either direct or by association. If you were a Jew, would you buy shoes from the Nazis just because you were individually safe from them at that moment? Or would you if you were of a minority they hadn't started exterminating yet? Or if they were not exactly the Nazis killing your people but some affiliated group?

This sounds extreme until you realize they are under threat of losing their livelihood for good.

They are right not to accept your inevitability point without a fight; this is a human thing that can be fought. Revolutions have happened and will continue to happen.

I don't necessarily agree with this but I do understand it.

FireBeyond 11 hours ago | parent | prev [-]

> I can understand not wanting models that have been trained on questionably sourced data, but otherwise they're opposing essentially a UX change, not based on UX concerns but on ideological fears.

"If you ignore their biggest, their primary, concern, their other concerns seem almost trivial".

hgoel 10 hours ago | parent [-]

I literally said I understand if the training data sourcing is their primary concern.

make3 8 hours ago | parent | next [-]

He meant that that's not the primary concern. The sourcing of the data is a red herring; they care about losing their ability to make a living doing the thing they love, which is so central to their identity.

FireBeyond 10 hours ago | parent | prev [-]

I think I'm not sure how to parse your statement... I don't think there'd be much care for (or need for) the UX change if it wasn't for the whole ideological/valid fear about training AI on creative works? But it has been a long day, so I apologize.

hgoel 10 hours ago | parent [-]

I've been all over the place with my thoughts, so it's fair for you to be unsure of how to parse what I said. When making my initial post, I was thinking "this is a coding model, it isn't an image/3d model generation model, so why do they care?". I further interpreted make3 as saying that 3d artists were opposed to AI in general because they view any AI use as trending towards taking away their jobs.

So, what I meant when I said '... otherwise ...' wasn't trying to dismiss the data sourcing concern, but more like "I understand if the data sourcing is the concern, but you (make3) seem to be saying it's about the use of AI in general (ie even if, hypothetically, an ethically sourced training dataset was used for a model), which feels like a weird restriction to me". That was when I added the edit to my initial post.

2001zhaozhao 7 hours ago | parent | prev | next [-]

This is the best phrasing of the issue I've seen online anywhere.

You can find AI useful and still be against its introduction into your field for entirely understandable reasons.

Unfortunately this does create uphill friction for any good-intentioned people trying to use AI to improve art by empowering people to take on more ambitious projects. (This is a general statement and not related to the case of Anthropic. Of course Anthropic here is just trying to sell their product, which is a fair thing to do in isolation, but I also understand the opposition to it on the grounds of its downstream effects.)

simianwords 7 hours ago | parent | prev [-]

Completely false, and I hate this puritan gatekeeping. Artists who hate AI are the type to put more importance on the craft than on the end product itself. Art is a means of communicating something personal. It's not meant to show off skills in how well you can move a pencil or how many fricking tools you know in Adobe.

AI removes all these hurdles and directly presents you with the end problem - communication. Artists hate that because most artists don’t have anything to communicate. These people deserve to be automated away. I don’t wanna see more derivative shit. Artists who have something special to communicate won’t feel threatened by AI but feel more freedom.

javascriptfan69 6 hours ago | parent [-]

>AI removes all these hurdles and directly presents you with the end problem - communication.

Which is why 99.9% of AI art is worthless. There's literally nothing personal or interesting about getting grok to fart out some picture you thought about while sitting on the toilet in the morning.

AI art will never be good without actual artists embracing the medium.

ihsw 11 hours ago | parent | prev [-]

[dead]

giancarlostoro 8 hours ago | parent | prev | next [-]

> Lots of negative replies on Blursky

To the surprise of no one.

jarjoura 9 hours ago | parent | prev | next [-]

I really do want to support artists, but I also feel super conflicted about what is actually at stake here if an AI agent generates a scene for me. I never would have hired a 3D artist before this moment, because there was no reason for me to. However, if I can easily poop out a custom 3D rendering without much time or cost, I would absolutely love to do that. How many one-off presentations or project design sessions could I have with cheap, throwaway 3D artwork that helps explain my thought process?!

Just like AI image slop and AI book slop prove though, I highly doubt whatever Claude and Blender are cooking up will ever come close to taking a prompt like

> render a scene of a corgi sitting on a chair looking out of a window at 3 cats playing with the corgi's favorite toy.

and turning that into anything useful.

preommr 11 hours ago | parent | prev | next [-]

[flagged]

skybrian 11 hours ago | parent [-]

Bluesky also has a community of AI tool developers that are more sane. Occasionally a post escapes containment.

NetOpWibby 6 hours ago | parent | prev [-]

People on Mastodon are losing their shit too[1].

I understand being unhappy about something but people gotta relax.

---

[1]: https://social.coop/@netopwibby/116483037092383210

izacus 3 hours ago | parent [-]

Why do they "gotta relax"? Are they making you uncomfortable by voicing their opinions or why exactly?