Raed667 2 days ago

> improve foundational features like the Blender Python API, which enables developers and artists alike

So they want claude to be able to talk to blender

daemonologist 2 days ago | parent | next [-]

This might actually be quite nice - the Blender Python API is currently very useful and very touchy. Lots of differences in behavior in headless mode which are hard to debug (because you can't open the GUI to see what's happening, because that changes the behavior).

doctoboggan 2 days ago | parent [-]

Yes, the Blender API feels like it sits on top of the GUI rather than the GUI on top of the API. When you write scripts with the Blender API you basically mechanically describe the steps you would take in the UI. It can be a little fragile at times.
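A minimal sketch of the contrast (assuming typical `bpy` calls; `bpy` only exists inside Blender's bundled Python, so the import is guarded here purely for illustration):

```python
# Sketch: operator-style vs. data-style bpy usage. bpy ships inside
# Blender, so the import is guarded when run outside of it.
try:
    import bpy
    HAVE_BPY = True
except ImportError:
    HAVE_BPY = False

def add_cube_like_the_ui():
    # Operator style: replays the UI action and depends on the current
    # context (active scene, selection, mode) - fragile when headless.
    bpy.ops.mesh.primitive_cube_add(size=2.0)

def add_cube_via_data():
    # Data style: creates the datablocks directly and links them into
    # the active collection - much less dependent on UI state.
    mesh = bpy.data.meshes.new("CubeMesh")
    obj = bpy.data.objects.new("Cube", mesh)
    bpy.context.collection.objects.link(obj)
    return obj
```

The data-style calls tend to behave the same headless or not, which is why scripts built on `bpy.data` are usually less touchy than ones replaying `bpy.ops` clicks.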

I've used Claude to write some Blender scripts and it's an excellent use case. I look forward to even better Claude/Blender interaction based on this announcement.

hirako2000 2 days ago | parent [-]

I've also used genAI to write scripts. It works splendidly up to a point, then there is absolutely no way to move the needle further. And it's not even close to renders I would ever publish.

That being said, it's about the same for the code it produces for things that aren't purely creative, but for artistic work I doubt an LLM in between gives any gain. After all, we do have an interface. A human interface.

doctoboggan 2 days ago | parent [-]

Yeah I am using blender to generate models for 3d printing - no rendering and Claude doesn't have to do anything artistic for my use case.

giancarlostoro 2 days ago | parent | prev | next [-]

There's already an MCP for it, saw a post on LinkedIn the other day about it.

Not sure if this one was the one I saw, but Google gave me this one. You could use Claude Code to build things with Blender.

https://blender-mcp.com/

SyneRyder 2 days ago | parent | next [-]

Anthropic just posted a video 1 hour ago of their own official MCP integration with Blender:

https://youtu.be/LZMWsZbZU5w

modeless 2 days ago | parent [-]

Artists mad about AI art ought to welcome this. This is about making art tools better, instead of replacing them entirely. The alternative to this is AI just generating art directly and making tools like Blender obsolete.

Kye 2 days ago | parent | next [-]

Art generators need to come a long way to completely replace art tools. I dabble, but if I were doing real work with it, there have been times it would have been faster to composite in a 3D model rather than keep trying to prompt an image generator into fixing something.

reverius42 a day ago | parent | next [-]

That's why it's great that it's able to work with existing art tools, like Blender, instead of replacing them.

cpill a day ago | parent | prev [-]

or use a hybrid approach and have the best of both https://youtu.be/1vB3JXzewx0?si=DWKNgPcJcz5u4Bkp

thot_experiment 2 days ago | parent | prev [-]

[flagged]

embedding-shape 2 days ago | parent | prev [-]

You don't even need that, if your agent/harness can evaluate Python, it can trivially access the API through there and "import" basically.
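For instance, the harness can just shell out to Blender in background mode. A sketch of building that invocation (`--background`, `--python`, and the `--` pass-through separator are Blender's standard CLI conventions; the script and file names are made up):

```python
import shlex

def blender_headless_cmd(script_path, blend_file=None, script_args=()):
    """Build a headless Blender invocation:
    blender [file.blend] --background --python script.py [-- args...]
    Everything after `--` is passed through to the script via sys.argv."""
    cmd = ["blender"]
    if blend_file:
        cmd.append(blend_file)
    cmd += ["--background", "--python", script_path]
    if script_args:
        cmd += ["--", *script_args]
    return cmd

cmd = blender_headless_cmd("convert.py", script_args=("model.fbx", "model.glb"))
print(shlex.join(cmd))
# blender --background --python convert.py -- model.fbx model.glb
```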

vunderba 2 days ago | parent [-]

This is what I do. It’s been really helpful for taking existing FBX files and handing them off to the agent + Python Blender API to analyze the geometry, convert to GLBs, etc.
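That workflow can be sketched as the agent emitting a small Blender script and running it headless. A hedged example of generating such a conversion script (the `bpy.ops.import_scene.fbx` and `bpy.ops.export_scene.gltf` operators are standard; the helper and file names are made up):

```python
def fbx_to_glb_script(fbx_path, glb_path):
    """Emit the text of a Blender script that converts one FBX to GLB.
    Meant to be run as: blender --background --python <script>."""
    return "\n".join([
        "import bpy",
        "# Start from an empty scene so only the imported model is exported.",
        "bpy.ops.wm.read_factory_settings(use_empty=True)",
        f"bpy.ops.import_scene.fbx(filepath={fbx_path!r})",
        f"bpy.ops.export_scene.gltf(filepath={glb_path!r}, export_format='GLB')",
    ])

print(fbx_to_glb_script("model.fbx", "model.glb"))
```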

serf 2 days ago | parent [-]

Me too. I used Codex to convert a bunch of riggings between lots of models via the Blender API.

It felt weak at it, like the training corpus wasn't strong on Blender/Python work, but it got going fairly fast with some coaxing.

embedding-shape 2 days ago | parent [-]

What model are you using? Codex with gpt-5.4 set to xhigh (and now gpt-5.5) seems to have zero issues helping me with rigging and fixing glb/fbx models; works like a charm. One time I instructed it to iterate with screenshots because it was a gnarly task, but usually it figures everything out even when headless.

riidom 2 days ago | parent | prev | next [-]

https://www.blender.org/development/blender-lab-activity-rep...

riidom 2 days ago | parent [-]

With examples: https://www.blender.org/lab/mcp-server/

ncr100 2 days ago | parent [-]

+++ Has good examples.

JKCalhoun 2 days ago | parent | prev | next [-]

We (I) need that.

"Some software" is approaching levels of complexity where, perhaps, it gets to a point where a human is barely able to even use it.

At the same time (brave new world) LLM assisted software opens up the possibility of levels of complexity we would not have considered before.

mossTechnician 2 days ago | parent | next [-]

I disagree that anyone should need LLMs for Blender, for example, because Blender is designed by people to be understood and used by people, even if it has a learning curve. It sounds a bit dangerous to build new things we don't understand, or worse, to reduce our understanding of what we currently use because (only after studying our use of the same technology) an LLM appears able to replicate it, mostly.

I'm reminded of Sam Altman's performative helplessness on Jimmy Kimmel, when he described being unable to care for a baby without ChatGPT. That's something I believe humanity has been capable of doing for a good portion of its existence, and not something we should hand over to a yet-unproven, yet-unprofitable technology.

prox 2 days ago | parent | next [-]

It also sounds like people with little ability can use this argument as a way to say “look how difficult this is for humans”

While it’s just a “you” problem. Some folks have better skills, knowledge and comfort with difficult subjects. And that’s fine.

csoups14 2 days ago | parent | prev | next [-]

Surely there's a middle ground where improved APIs can be leveraged by both people and LLMs alike while keeping those APIs approachable? Why is it necessary that changing the python APIs would lead to "need[ing] LLMs for Blender"? I'm nowhere close to an AI maximalist but this criticism seems grounded in execution concerns. I'm definitely not saying that they won't mess this up and make the APIs overly complex, I just don't think that's necessarily going to be the case.

JKCalhoun 2 days ago | parent | prev | next [-]

I propose that, for some software, the learning curve is becoming harder to surmount.

Further, I'm suggesting "designed by people to be understood and used by people" might be a hurdle for some future software we might envision.

(Altman's performance is orthogonal as I'm suggesting a new level of software that has not yet been written/conceived.)

mossTechnician 18 hours ago | parent [-]

Regarding whether AI can/could overcome the hurdle of human understanding: I'm not sure if that's really a hurdle. Let's say in theory, a system was crafted by AI to be interacted with exclusively by AI. Broadly, I assume the outcome of the system would be for people, and it would have some purpose or value. Now my question is: how do we verify it functions? If it is a black box that nobody understands, then we can't verify it at all, and we can't debug it if there's something wrong with it. We circle back to the human understanding issue.

(I'm sorry if my tangent about Altman was taken as a personal affront, as I did not mean it to be that. It just muddied the two interesting topics you brought up.)

hirako2000 2 days ago | parent | prev [-]

[dead]

bergheim 2 days ago | parent | prev [-]

Why do we need that?

Art should demand more of the creator than the person experiencing it.

The alternative is nine billion who-cares slop things.

post-it 2 days ago | parent | next [-]

Not everything is abstract art. Sometimes I want my subsurf modifier to only target certain vertex groups, and if I can use AI to make that happen in a few seconds, that's a huge win for me.

JKCalhoun 2 days ago | parent | prev [-]

Blender (and CAD programs as well) get in the way of creativity.

I know what I want, no idea how to tool my way there.

I spent two months going through YT tutorials and mucking about in Blender in order to figure out how to put together the model I had in my head [1].

(A year later, a new project idea comes along, and it's back to YouTube, because the learning curve is not only steep but sometimes so esoteric that the knowledge is fleeting.)

[1] https://github.com/EngineersNeedArt/Space-Tug_3DModel

tayo42 a day ago | parent [-]

It would take you as long or longer to draw with a pen too. Art is hard in general.

faangguyindia a day ago | parent | prev | next [-]

Can't wait for GIMP automation, so I can finally start using it!

sailfast 2 days ago | parent | prev | next [-]

There is already a Blender MCP. It works-ish! But could be a lot better in understanding 3D space.

As an amateur this is really exciting - but not sure about folks that are real pros at this stuff.

hirako2000 2 days ago | parent [-]

I'm not a pro, but I've been unimpressed by LLMs driving Blender. It left me unexcited. It must be torture for professionals to read this thread.

sailfast 21 hours ago | parent [-]

Absolutely agree. I was not impressed, but it will be a lot easier to work with the tooling, without a 10-month crash course in UI and 3D terminology, if I can ask for what I want in plain language instead of knowing which button, buried three levels deep, to press to get my desired results.

conductr 2 days ago | parent | prev | next [-]

I want Claude to talk to Blender. I personally hate using Blender but love its outputs.

RobRivera 2 days ago | parent | prev | next [-]

Frankly, I love the idea of an automation engine printing out tangible works. I actually build spritesheets that way! Load a bunch of individual GIMP files as layers, set them offset by a given parameter, and boom, done!
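The layer-offset arithmetic for that kind of sheet is simple enough to sketch in plain Python (frame size and grid width are hypothetical parameters; the actual layer loading and flattening happens in GIMP):

```python
def spritesheet_offsets(n_frames, frame_w, frame_h, columns):
    """Return (x, y) pixel offsets for each frame, laid out row-major
    on a grid `columns` frames wide - the offsets you'd apply to each
    layer before flattening the sheet."""
    return [((i % columns) * frame_w, (i // columns) * frame_h)
            for i in range(n_frames)]

# 6 frames of 32x32 in a 4-wide grid: the second row starts at y=32.
print(spritesheet_offsets(6, 32, 32, 4))
# [(0, 0), (32, 0), (64, 0), (96, 0), (0, 32), (32, 32)]
```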

Would be rad to incorporate some statistical, procedurally generated designs based on my own apparatus.

What I do not want to see is LLM makers hijacking the decades of hard work and consideration that went into these integration channels, tailoring them toward their LLMs rather than the diligent engineer.

If they want to spread their tentacles as far as they like while making products more difficult to work with, in the name of innovation of a different color, they are making enemies of, at least, me.

zeeveener 2 days ago | parent | prev | next [-]

Honestly, I think this is a stepping stone towards replacing industry CAD modeling tools.

AI _can_ work with 3D models already, but it's really bad at it. CAD requires an extra level of control and I think this is where I could see AI companies wanting to get a foot in the door.

e.g "Let's build an adapter between 2in BSP Male and 3/4in NPT Female threads with a third Hose Barb outlet with the following properties..."

adolph 2 days ago | parent | prev [-]

Yeah, Google has MuJoCo so it seems natural to get hooks into Blender.

MuBlE: MuJoCo and Blender simulation Environment and Benchmark for Task Planning in Robot Manipulation: https://arxiv.org/abs/2503.02834