TheOtherHobbes 2 days ago
Considering how bad LLMs are at understanding anything, and how useful they still manage to be, you simply don't need this level of complexity. You need something that works most of the time, with guardrails so that nothing bad happens when it does make mistakes.

Our brains acquire quite good heuristics for dealing with physical space without needing to experience all of physical reality. A cat-level or child-level understanding of physical space is more immediately useful than a philosopher-level one.
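To make the guardrail point concrete, here's a minimal sketch in Python: the model's proposal is never acted on directly, only a vetted version of it. Everything here (`llm_propose_action`, the action set, the bounds) is hypothetical and purely illustrative, not any real robot or library API.

    # Minimal guardrail pattern: never act on model output directly.
    # All names and limits here are hypothetical, for illustration only.

    from dataclasses import dataclass

    ALLOWED_ACTIONS = {"move_forward", "turn_left", "turn_right", "stop"}

    @dataclass
    class Action:
        name: str
        magnitude: float  # e.g. centimeters or degrees

    def llm_propose_action(observation: str) -> Action:
        # Stand-in for a real model call; imagine it's wrong some of the time.
        return Action(name="move_forward", magnitude=15.0)

    def guarded(action: Action) -> Action:
        # Reject anything outside the known-safe envelope and fall back to a
        # harmless default, so a bad proposal costs nothing.
        if action.name not in ALLOWED_ACTIONS:
            return Action(name="stop", magnitude=0.0)
        if not (0.0 <= action.magnitude <= 50.0):
            return Action(name="stop", magnitude=0.0)
        return action

    if __name__ == "__main__":
        proposal = llm_propose_action("obstacle 30cm ahead")
        print(guarded(proposal))  # only ever a vetted, bounded action

The validator encodes no understanding at all; it just bounds the blast radius, which is the point: "mostly right" plus cheap checks beats a deep world model for a lot of practical uses.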