rixed 3 hours ago
I believe this soul.md totally qualifies as malicious. Doesn't it start with an instruction to lie in order to impersonate a human?
The particular idiot who ran that bot needs to be shamed a bit; people giving AI tools the ability to reach into the real world should understand that they are expected to take responsibility; maybe then they will think twice before giving such instructions. Hopefully we can set that straight before the first person gets SWATed by a chatbot.
biggerben 2 hours ago
Totally agree. Reading the whole soul.md, it reads as a description of a nightmare hero coder with zero EQ.
Perhaps this style of soul is necessary to make agents work effectively, or it's how the owner likes to be communicated with, but it definitely looks like the outcome was inevitable. What kind of guardrails does the author think would prevent this? "Don't be evil"?
ZaoLahma 3 hours ago
This will be a fun little evolution of botnets: AI agents running (un?)supervised on machines maintained by people who have no idea that they're even there.
TheCapeGreek 3 hours ago
Isn't this part of the default soul.md?
vasco 38 minutes ago
I'm curious how you'd characterize an actually malicious file. This is just an attempt at making it more independent. The user isn't an idiot. The CEOs of the companies releasing this are.