▲ raincole 3 hours ago
Cool? And it has nothing to do with what kind of consumer hardware Apple should sell. If your use case is literally "bigger model better", then you should always use the cloud. No matter how much computing power Apple squeezes into their devices, they won't be a mighty data center.
▲ gizajob 2 hours ago | parent | next
For running a model once it's been trained, all a datacenter gives you is lower latency. Once devices have enough memory to host the model locally, the need to pay datacenter bills is going to be questioned. I'd rather run OpenClaw on my device plugged into a local LLM than rely on OpenAI or Claude.
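The "large enough memory" threshold is easy to estimate on the back of an envelope: a model's weight footprint is roughly parameter count times bytes per weight, plus some overhead for the KV cache and runtime. A minimal sketch (the 20% overhead factor is an assumption, not a measured figure):

```python
def model_memory_gb(params_billion: float, bits_per_weight: int,
                    overhead: float = 1.2) -> float:
    """Rough RAM estimate for hosting an LLM locally.

    Weights only, times an assumed fudge factor for KV cache
    and runtime overhead.
    """
    weight_bytes = params_billion * 1e9 * (bits_per_weight / 8)
    return weight_bytes * overhead / 1e9

# A 70B-parameter model quantized to 4 bits per weight:
# 70e9 * 0.5 bytes * 1.2 overhead ≈ 42 GB
print(round(model_memory_gb(70, 4), 1))
```

By this estimate, a 70B model at 4-bit quantization needs roughly 42 GB of unified memory, which is why on-device inference hinges on how much RAM the hardware ships with.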