corysama 5 days ago

Working on a rented GPU would not be local. But renting a low-end GPU might be cheap enough to use for hobbyist creative work. I'm just musing on lots of different routes to make hobby AI use economically feasible.

simonw 5 days ago | parent | next [-]

The gpt-oss-20b model has demonstrated that a machine with ~13GB of available RAM can run a very decent local model - if that RAM is GPU-accessible (as seen on Apple silicon Macs for example) you can get very usable performance out of it too.

I'm hoping that within a year or two machines like that will have dropped further in price.
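The ~13GB figure lines up with a quick back-of-the-envelope calculation. A minimal sketch, assuming gpt-oss-20b has roughly 21B total parameters stored at MXFP4's ~4.25 bits per parameter, plus a rough allowance for KV cache and activations (all three numbers are assumptions for illustration, not from the comment above):

```python
def model_memory_gb(params: float, bits_per_param: float, overhead_gb: float = 1.5) -> float:
    """Approximate RAM needed to run a quantized model:
    weight storage plus a guessed overhead for KV cache and activations."""
    weights_gb = params * bits_per_param / 8 / 1e9  # bits -> bytes -> GB
    return weights_gb + overhead_gb

# ~21e9 params at ~4.25 bits/param (assumed MXFP4 figures)
print(round(model_memory_gb(21e9, 4.25), 1))  # roughly 12.7
```

With those assumed inputs the estimate lands in the 12-13GB range, consistent with the "~13GB of available RAM" claim.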

hyghjiyhu 5 days ago | parent | prev [-]

You are absolutely right that a rented GPU is not local, but even so it brings you many of the benefits of a local model. Rented hardware is a commodity: if one provider goes down, there will be another, or in the worst case you can buy your own hardware. This ensures continuity and control. You know exactly which model you are using, you will be able to keep using it tomorrow, and you can ask it whatever you want.