lucaspauker 19 hours ago
To me it seems like LLMs are basically memory for humans as a whole. By interfacing with them, you can extract knowledge on demand, eliminating the need to remember things.
tines 19 hours ago
And become the perfect puppet for the ruling class! 1984's got nothing on us.
xlbuttplug2 13 hours ago
This has been the case for a while with search engines. I'm convinced our brains have adapted (atrophied?) to avoid having to remember things that we can simply look up on our phones in a matter of seconds.