| ▲ | wongarsu 10 hours ago | |
The same reason they exist now. Why spend millions of tokens designing, implementing, and debugging something, followed by years of discovering edge cases in the real world, when I can just use a library that already did all of that? Sure, leftpad and python-openai aren't hugely valuable in the age of LLMs, but redis and ffmpeg are as useful as ever. Probably even more useful now that LLMs actually know, and can use, all their obscure features. | ||
| ▲ | storystarling 7 hours ago | parent [-] | |
They know the syntax but seem to miss the architectural context. I've found that models will happily generate valid Redis commands that introduce race conditions or break state consistency the moment you have concurrency. It saves typing but you still need to design the locking strategy yourself. | ||
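The failure mode described above is the classic lost-update race: a non-atomic read-modify-write (GET followed by SET) lets a concurrent client write between the two steps. The sketch below simulates it with an in-memory stand-in for a Redis string key (the real calls would be redis-py's `r.get()`/`r.set()`; the counter key name and the `FakeStore` class are illustrative, not from the original comment), then shows the fix of making the read-modify-write atomic, standing in for Redis's server-side `INCR` or a `WATCH`/`MULTI` transaction:

```python
import threading
import time

class FakeStore:
    """In-memory stand-in for a Redis string key (hypothetical helper,
    used here only to make the race reproducible without a server)."""
    def __init__(self):
        self.data = {"counter": 0}
    def get(self, key):
        return self.data[key]
    def set(self, key, value):
        self.data[key] = value

def racy_increment(store, n):
    # GET then SET: another client can write between the two calls,
    # so one of the two updates is silently lost.
    for _ in range(n):
        v = store.get("counter")
        time.sleep(0.0001)          # widen the race window
        store.set("counter", v + 1)

def locked_increment(store, lock, n):
    # Stand-in for an atomic server-side op (INCR) or a WATCH/MULTI
    # transaction: the whole read-modify-write happens under one lock.
    for _ in range(n):
        with lock:
            v = store.get("counter")
            store.set("counter", v + 1)

def run(worker, *extra):
    store = FakeStore()
    threads = [threading.Thread(target=worker, args=(store, *extra))
               for _ in range(2)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return store.get("counter")

racy = run(racy_increment, 100)           # two workers, 100 each: < 200
safe = run(locked_increment, threading.Lock(), 100)  # always 200
print(racy, safe)
```

On Redis itself the same fix is a one-liner (`r.incr("counter")`), which is the point of the parent comment: the model will emit syntactically valid GET/SET pairs, but choosing the atomic primitive or designing the lock is still on you.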