fg137 (2 hours ago):
More like "you need to sign up for our website and pay for a subscription", and I'd much rather do that if it's actually providing value. I am absolutely not going to run a model locally that slowly churns out words at 5 tps while making the computer hot to the touch.
jfoster (6 hours ago):
Also much better than every website downloading its own 22 GB copy rather than the 22 GB being a shared resource.
fg137 (2 hours ago):
I would very much like not to have to download 22 GB for inference that is far worse than API calls in both quality and speed. I would rather pay money than watch this thing run in my browser, printing only 5 tps on high-end consumer hardware.
_pdp_ (7 hours ago):
That is ~9% of the total available disk space on a baseline phone or laptop (22 GB of a 256 GB drive) for a model that is not that useful.
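
For reference, a quick back-of-the-envelope check of that figure; the ~9% only works out if you assume a 256 GB baseline, so a few candidate capacities are shown:

    # Model size as a share of total disk, for assumed baseline capacities.
    MODEL_GB = 22

    for disk_gb in (128, 256, 512):
        share = MODEL_GB / disk_gb * 100
        print(f"{MODEL_GB} GB / {disk_gb} GB = {share:.1f}%")

    # 22 / 256 = 8.6%, i.e. the ~9% quoted above; on a 128 GB
    # device it's closer to 17%.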