thurn 3 hours ago

I really fundamentally do not understand what problem Gas City solves that is not already solved by normal subagent orchestration patterns. If you want to call your main LLM session the "mayor" and have it delegate its work out to planners and coders and reviewers and QA and so on, this is already a thing you can do! If you want to do this in a reusable way you can create skills and subagent definitions and use /commands, etc. Why do we need hundreds of thousands of lines of opaque Go code to accomplish any of this?
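To make the point concrete, here is a minimal sketch of the "mayor delegates to subagents" pattern the comment is describing. Everything here is illustrative: the role names and stub functions are hypothetical, and a real setup would invoke an LLM session (or a subagent definition) where `run_subagent` just returns a placeholder string.

```python
# Hypothetical sketch of "mayor" orchestration: one top-level agent
# routes a task through role-specific subagents in a fixed pipeline.
# run_subagent is a stub; a real version would call an LLM API with
# a role-specific system prompt.

def run_subagent(role: str, task: str) -> str:
    """Stand-in for invoking a subagent with a role-specific prompt."""
    return f"[{role}] handled: {task}"

def mayor(task: str) -> list[str]:
    """Top-level 'mayor' agent: planner -> coder -> reviewer -> QA."""
    plan = run_subagent("planner", task)
    code = run_subagent("coder", plan)
    review = run_subagent("reviewer", code)
    qa = run_subagent("qa", review)
    return [plan, code, review, qa]

for step in mayor("fix the login bug"):
    print(step)
```

The point being made is that this whole loop is a few dozen lines of glue around existing agent tooling, not something that obviously needs hundreds of thousands of lines of infrastructure.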

BoggleFiend 2 hours ago | parent | next [-]

I listened to his podcast on Pragmatic Engineer. I don't think he specifically addressed what it solves, but he talked about shifting the Overton window regarding what's possible with AI agents. I'm not arguing that he actually accomplishes this -- just noting that his goal seems to be less "create something useful" and more "create something that gets people's attention and maybe gets them thinking about AI in a different way".

Cynically, he published a book on vibe coding recently, so he may just be grabbing attention in an effort to boost book sales.

conception 3 hours ago | parent | prev | next [-]

Admittedly, this project started before that was possible with the standard coding agents.

DonHopkins 3 hours ago | parent [-]

No it didn't!

refulgentis 3 hours ago | parent | prev [-]

I'm trying to make the phrase "AI DDOSing" happen.

E.g., someone's GitHub repo with a ton of code and a README written by AI claiming fantastical features not present in the code.

Or, more subtly, someone "self-DDOSing via AI": that's for when "LLM psychosis" is too strong, i.e. for "I went too far down a rabbit hole with an interactive chatbot for a month and now I have 1M LOC, 95% test coverage, and an app that I don't understand".

I quit my job at Google in 2023 and have spent 2.5 years working on an LLM-based agentic app.

To me, this looks like an unfortunate case of self-AI-DDOSing by someone with even more runway than my seemingly infinite runway.

It's well-meaning. Like, by 2030 I'm fairly sure we'll have a meta-layer, and the simplistic "here's a bug, read files, edit, fix" loop will seem slow and strangled. But he's at least a couple of years ahead of the models, and whatever meta-layer does emerge won't have this bizarre UX model.

whattheheckheck 2 hours ago | parent [-]

Someone has to be bleeding edge

refulgentis an hour ago | parent [-]

This is true; without people trying, we wouldn't know what doesn't work yet. For all I know he's cracked something big, and in a month we'll see the first AI-built operating system (I'm not being sarcastic).