stingraycharles 2 hours ago
Python being a very bad language for LLM stuff is a hot take I haven't heard before. Your arguments sound mostly like personal preferences that apply to any problem, not just agentic / LLM work.

If we're going to throw experience around: after 30+ years of coding, I really don't care too much anymore, as long as it gets the job done and doesn't get in the way. LangChain is OK; LangGraph et al. I try to avoid like the plague, as they're too "framework"-ish and don't compose well with other things.
avaer an hour ago
I used to write web apps in C++, so I totally understand not caring as long as it gets the job done.

I guess the difference, and where I draw the line, is that LLMs are inherently random I/O, so you have to treat them like UI or the network: you really have no idea what garbage is going to come in, and you have to be defensive if you're going to build something complex. Otherwise, you as a programmer will not be able to understand or trust it, and you will get hit by Murphy's law when you take off your blinders. (If it's simple, or a prototype nobody is counting on, obviously none of this matters.)

To me, insisting that stochastic inputs be handled in a framework that provides strong typing guarantees is not too different from insisting that your untrusted sandbox be written in a memory-safe language.
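As a rough illustration of the kind of defensive handling described above, here is a minimal TypeScript sketch that validates a model's reply against a schema before anything downstream trusts it. The zod library and the callLLM helper are my own assumptions for the example, not anything named in this thread:

    // Treat the model's reply as untrusted input: parse it, validate it
    // against a schema, and only hand typed data to the rest of the program.
    import { z } from "zod";

    // Hypothetical stand-in for whatever LLM client you actually use.
    declare function callLLM(prompt: string): Promise<string>;

    const ToolCall = z.object({
      tool: z.enum(["search", "calculator"]),
      args: z.record(z.string()),
    });

    async function nextAction(prompt: string) {
      const raw = await callLLM(prompt);
      let parsed: unknown;
      try {
        parsed = JSON.parse(raw);          // the model may not even return JSON
      } catch {
        return { ok: false as const, reason: "not JSON" };
      }
      const result = ToolCall.safeParse(parsed);  // stochastic input -> typed value or error
      return result.success
        ? { ok: true as const, action: result.data }
        : { ok: false as const, reason: result.error.message };
    }

The point is only that the boundary is explicit: everything past nextAction works with a typed value, and garbage from the model fails loudly at the edge instead of deep inside the agent loop.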