casey2 4 hours ago
Not quite; there are still trillions of dollars to burn through. We'll probably get hardware that can accelerate LLM training and inference a million times, but that still won't be anywhere close to AGI. It's interesting to think about what emotions/desires an AI would need in order to improve.
otabdeveloper4 2 hours ago | parent
The actual business model is in local, offline, commodity consumer LLM devices (think something the size and cost of a wi-fi router). That won't happen until Chinese manufacturers have the capacity to make these cheaply. In other words, not in this bubble; you'll have to wait a decade or more.