paxys 16 hours ago
The problem with all these "AI box" startups is that the product is too expensive for hobbyists, and companies that need to run workloads at scale can always build their own servers and racks and save on the markup (which is substantial). Unless someone can figure out how to get cheaper GPUs & RAM, there is really no margin left to squeeze out.
nine_k 15 hours ago
Would a hedge fund that does not want to trust a public AI cloud just buy chassis, mobos, GPUs, etc., and build an equivalent themselves? I suspect they value their time differently.
| ||||||||||||||||||||||||||
qubex 8 hours ago
They’re kickstarting a TINY device that is pocketable and aimed at consumers. I’ve backed it (full disclosure).
| ||||||||||||||||||||||||||
kkralev 15 hours ago
I think the real gap isn't at the high end, though. There's a whole segment of people who just want to run a 7-8B model locally for personal use without dealing with cloud APIs or sending their data somewhere. You don't need 4 GPUs for that; a Jetson or even a mini PC with decent RAM handles it fine. The $12k+ market feels like it's chasing a different customer than the one who actually cares about offline/private AI.
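The "decent RAM" claim checks out with simple arithmetic: weight memory is roughly parameter count times bits per weight. A minimal sketch (the helper function is hypothetical, and the figures cover weights only, ignoring KV-cache and runtime overhead):

```python
# Rough memory footprint of model weights at a given quantization level.
# footprint_bytes = n_params * bits_per_weight / 8
def weight_footprint_gb(n_params: float, bits_per_weight: float) -> float:
    return n_params * bits_per_weight / 8 / 1e9

# A 7B model at 4-bit quantization fits in well under 8 GB of RAM:
print(round(weight_footprint_gb(7e9, 4), 1))   # 3.5
print(round(weight_footprint_gb(8e9, 4), 1))   # 4.0

# The same 7B model at fp16 needs a much bigger box:
print(round(weight_footprint_gb(7e9, 16), 1))  # 14.0
```

So a quantized 7-8B model sits comfortably inside a 16 GB mini PC or Jetson, which is why the $12k multi-GPU tier overshoots this segment.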
| ||||||||||||||||||||||||||