zozbot234 2 hours ago

That AI will have to be significantly preferable to the baseline of open models running on cheap third-party inference providers, or even on-prem. This is a bit of a challenge for the big proprietary firms.
johnvanommen an hour ago

> the baseline of open models running on cheap third-party inference providers, or even on-prem. This is a bit of a challenge for the big proprietary firms.

It's not a challenge at all. To win, all you need to do is starve your competitors of RAM. RAM is the lifeblood of AI; without RAM, AI doesn't work.