stared 3 days ago

A lot of debugging, of code and mind alike, benefits from rubber ducking. LLMs do it on steroids.

At the same time, if you take their output as objective truth (rather than a stimulus), it can be dangerous. People were already doing that with Google, for both physical and mental diagnoses. Now, again, it is on steroids.

And, as with the Internet itself, some will use it to gain very fine medical knowledge, while others will fall for plausible pseudoscience that fits their narrative. Sometimes because of a lack of knowledge of how to distinguish the two, sometimes because they really, really wanted something to be true.

> LLM therapists seem to spot these behaviours and give the user what they want to hear.

To be fair, I have heard the same thing over and over from people with real therapists. (A classic is learning that their parents and all of their exes were toxic or narcissists.) A good friend is more likely to tell you "you fucked up" than a therapist is.

> The trap is seeing this success a few times and assuming it’s all good advice, without realizing it’s a mirror for your inputs.

It is very true. Yet it holds for any piece of advice, not only interactions with LLMs. And yes, the less verifiable the source, the more grains of salt you need to take it with.