qsera 6 hours ago

If you use LLMs on the underlying assumption that they are capable of "thinking" or "caring", you are going to get burned badly. That capability is an illusion, and illusions disappear when they have to bear the real weight of reality.

Sadly, LLMs push all the right buttons that lead humans into that kind of behavior, and the marketing around LLMs works overtime to reinforce it.

If instead you ignore all that and use LLMs as a search tool, you will get positive returns from them.