qwertox 17 hours ago
Let's see which company becomes the first to sell "coding appliances": hardware shipping with a model good enough for everyday coding. If Mistral stays this permissive, they could be the first, provided the hardware becomes fast, cheap, and efficient enough to build a small box that can sit in an office. Maybe in 5 years.
giancarlostoro 14 hours ago
My MacBook Pro with an M4 Pro chip can handle a number of these models with reasonable performance (I think it has 16GB of VRAM); my bottleneck is consistently the token caps. I assume someone with a much more powerful Mac Studio could run far more than I can, since they can allocate about 96GB of the system RAM as VRAM, iirc.
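For context, here is a minimal sketch of what querying one of these locally run models tends to look like, assuming a local server such as llama.cpp's llama-server, LM Studio, or Ollama exposing an OpenAI-compatible endpoint; the port and model name below are placeholders, not details from the thread:

    # Minimal sketch: talking to a locally served model through an
    # OpenAI-compatible endpoint, which llama.cpp's llama-server,
    # LM Studio, and Ollama all expose. Port and model name are
    # placeholders -- use whatever your local server reports.
    from openai import OpenAI

    client = OpenAI(base_url="http://localhost:8080/v1", api_key="not-needed")

    response = client.chat.completions.create(
        model="local-model",  # placeholder id; many local servers accept anything here
        messages=[{"role": "user",
                   "content": "Write a Python function that reverses a string."}],
        max_tokens=256,  # locally, the token cap is whatever you choose to set
    )
    print(response.choices[0].message.content)

The point of the sketch is the last comment: when the model runs on your own box, the output-token cap is a parameter you pick rather than a limit a hosted provider imposes.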
bakies 16 hours ago
I bought a Framework Desktop hoping to do this.
brazukadev 17 hours ago
My bet is a DeepSeek box.
baq 17 hours ago
An LLM in a box connected via USB is the dream. ...so it won't ever happen: it'll require WiFi, it'll only be accessible via the cloud, and you'll have to pay a subscription fee to access the hardware you bought. Obviously.