hackyhacky | 3 hours ago
> Speed, cost, security, job/task management

All of that will inevitably be solved. 50 years ago, using a personal computer was an extravagant luxury. Until it wasn't. 30 years ago, carrying a powerful computer in your pocket was unthinkable. Until it wasn't. Right now, it's cheaper to run your accounting math on dedicated adder hardware. But LLMs will only get cheaper. When you can run massive LLMs locally on your phone, it's hard to justify not using them for everything.
esseph | 3 hours ago | parent
Not until power access/generation is MUCH cheaper. That's a long, long, long way off. If I can run 50,000 fixed tasks at $0.834/hr, but OpenAI costs $37/hr, takes 40x as long, and can make TERRIBLE errors, why the fuck would I not move to the deterministic system? Also: battery life of mobile devices.