wrxd 7 hours ago

Same here. I really hope that in the near future local models will be good enough, and hardware fast enough to run them, for them to become viable for most use cases.

vlapec 2 hours ago | parent [-]

No need to hope; it is inevitable.

Zopieux an hour ago | parent [-]

Is it inevitable though? Open-weight models large enough to come close to an API model are insanely expensive for con/prosumers to run. I'd put the "expensive" bar at ≥24GB of VRAM, since that's already well into four digits, which buys you a good many months of a subscription, not even counting the power bill for >400W of continuous draw.

Color me pessimistic, but this feels like a pipe dream.
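To make the math concrete, here's a quick back-of-envelope sketch. Only the 400W figure comes from the comment above; the GPU price, subscription price, and electricity rate are assumptions picked for illustration:

```python
# Back-of-envelope cost comparison. All figures except POWER_W are
# assumptions, not sourced from the thread.

GPU_COST_USD = 1500.0               # assumed price of a 24GB consumer card
SUBSCRIPTION_USD_PER_MONTH = 20.0   # assumed API subscription price
POWER_W = 400.0                     # continuous draw, from the comment above
ELECTRICITY_USD_PER_KWH = 0.15      # assumed residential electricity rate
HOURS_PER_MONTH = 730.0             # average hours in a month

# Monthly electricity cost of running the card flat-out 24/7:
power_cost_per_month = POWER_W / 1000.0 * HOURS_PER_MONTH * ELECTRICITY_USD_PER_KWH

# How many months of subscription the hardware price alone would cover:
months_hardware_only = GPU_COST_USD / SUBSCRIPTION_USD_PER_MONTH

print(f"Monthly power cost at {POWER_W:.0f}W continuous: ${power_cost_per_month:.2f}")
print(f"Hardware price alone buys {months_hardware_only:.0f} months of subscription")
```

Under these assumptions the card alone covers roughly six years of subscription before the electricity bill even enters the picture, which is the gist of the "insanely expensive" point.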