dkersten 5 hours ago
This is similar to why I prefer LLMs to behave less human-like and more robotic and machine-like, because they're not humans or human-like; they are robotic and machine-like. The chatbot is not my friend and it can't be my friend, so it shouldn't behave as if it's trying to be my friend. It should answer my queries and requests with machine-like, no-nonsense precision and accuracy, not try to make an emotional connection. It's a tool, not a person.
sethammons 5 hours ago | parent | next
You're absolutely right.
javier_e06 3 hours ago | parent | prev
I hear you (I am not an LLM). I can't deny that the "You are absolutely right" gives me a shot of confidence and entices me to continue the dialog. I am being manipulated. I would prefer the machine to reply: "Affirmative." Unfortunately, these billion-dollar LLM enterprises are competing for eyeballs and clicks.