philipkglass 5 hours ago

The small models that I can run at home are becoming more capable, and I have replaced some API-based tasks with local inference as they improve, but large open-weights models are still a lot stronger. The nice thing about larger open-weights models is that competing providers serve them at modest margins and prices. I don't have the hardware to run the largest Qwen models, but I can get API access at low cost. Since there are only modest barriers to entry for new commercial inference providers serving these models, I'm not worried that API access to them will become drastically more expensive at some future time.

CamperBob2 5 hours ago | parent [-]

And since there are only modest barriers to new commercial inference providers for these models...

Congress: "Hold my beer and watch this"