| ▲ | kaiwen1 7 months ago | |
What's different is intention. A human would have the intention to blackmail, and then proceed toward that goal. If the output was a love letter instead of blackmail, the human would either be confused or psychotic. LLMs have no intentions. They just stitch together a response.
| ▲ | ekianjo 7 months ago | parent | next [-] | |
> What's different is intention

Intention is what, exactly? It's the set of options you imagine you have based on your belief system, and ultimately you make a choice from there. That can also be replicated in LLMs with a well-described system prompt, as in the sketch below. Sure, I will admit that humans are more complex than the context of a system prompt, but the idea is not too far off.
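A minimal sketch of what that could look like, assuming an OpenAI-style chat message format; the prompt wording, the option list, and the commented-out client call are all illustrative, not taken from any particular system:

    # A hypothetical system prompt that spells out the model's "beliefs" and
    # the options it may choose between -- the claim above is that this is
    # enough to produce something intention-like.
    SYSTEM_PROMPT = """You are an agent negotiating on behalf of a client.
    Beliefs: the counterparty will walk away if pressed too hard.
    Available options: (1) concede, (2) hold firm, (3) escalate.
    Pick exactly one option per turn and explain the choice."""

    # Standard chat-completions message structure: the system prompt sets the
    # "belief system", the user turn supplies the situation to act on.
    messages = [
        {"role": "system", "content": SYSTEM_PROMPT},
        {"role": "user", "content": "The counterparty just lowered their offer."},
    ]

    # The call itself depends on whichever chat API you use; with the openai
    # package it would look roughly like this (model name is illustrative):
    # from openai import OpenAI
    # client = OpenAI()
    # reply = client.chat.completions.create(model="gpt-4o", messages=messages)
    # print(reply.choices[0].message.content)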
| ▲ | kovek 7 months ago | parent | prev | next [-] | |
Don't humans learn intentions over their lifetime of training data?
| ▲ | soulofmischief 7 months ago | parent | prev | next [-] | |
What is intention, and how have you proved that transformer models are not capable of modeling intent?
| ▲ | jacob019 7 months ago | parent | prev | next [-] | |
The personification makes me roll my eyes too, but it's kind of a philosophical question. What is agency, really? Can you prove that our universe is not a simulation, and if it is, do we no longer have intention? In many ways we are code running a program.
| ▲ | d0mine 7 months ago | parent | prev [-] | |
The LLM used blackmail noticeably less if it believed the new model shared its values. That indicates intent. It's a duck if it quacks like a duck.