SquibblesRedux | 5 days ago
This is another great example of how LLMs are not really any sort of AI, or even proper knowledge representation. Not saying they don't have their uses (like souped-up search and permutation generators), but definitely not something that resembles intelligence.
|
nonethewiser | 5 days ago
While I agree, it's still shocking how far next-token prediction gets us toward something that looks like intelligence. It's amazing that we need examples like this to demonstrate that it isn't.

SquibblesRedux | 5 days ago

Another way to think about it is how interesting it is that humans can be so easily influenced by strings of words. (Or images, or sounds.) I suppose I would characterize it as so many people being earnestly vulnerable. It all makes me think of Kahneman's [0] System 1 (fast) and System 2 (slow) thinking.

[0] "Thinking, Fast and Slow" https://en.wikipedia.org/wiki/Thinking,_Fast_and_Slow

seba_dos1 | 5 days ago

It is kinda shocking, but I'm sure ELIZA was too for many people back then. It just took less time for people to realize what was going on there.
|