lm28469 | 5 days ago
100%. It reminds me of a post I saw yesterday about how ChatGPT confirmed "in its own words" that it is a CIA/FBI honeypot: https://www.reddit.com/r/MKUltra/comments/1mo8whi/chatgpt_ad...

When talking to an LLM you're basically talking to yourself. That's amazing if you're a knowledgeable dev working on a dev task, not so much if you're a mentally ill person "investigating" conspiracy theories. That's why HNers, and tech people in general, overestimate the positive impact of LLMs while completely ignoring the negative sides: they can't even imagine half of the ways people use these tools in real life.
bluefirebrand | 5 days ago | parent
I find this really sad, actually. Is it really so difficult to imagine how people will use (or misuse) the tools you build? Are HNers, or tech people in general, just very idealistic and naive?

Maybe I'm the problem, though. Maybe I'm a bad person who is always imagining the many ways I could abuse any kind of system or power, even though I have no actual intention of abusing anything.