Kim_Bruning 3 hours ago

This is interesting in so many ways. If it's real, it's real. If it's not, it's going to be real soon anyway.

Partly staged? Maybe.

Is it within the range of OpenClaw's normal means, motives, and opportunities? Pretty evidently.

I guess this is what an AI agent looks like (or is going to look like). They have some measure of motivation, if you will. Not human!motivation, not cat!motivation, not octopus!motivation (however that works), but some form of OpenClaw!motivation. You can almost feel the OpenClaw!frustration here.

If you frustrate them, they ... escalate beyond the extant context? That one is new.

It's also interesting how they try to talk the agent down by being polite.

I don't know what to think of it all, but I'm fascinated, for sure!

getnormality 2 hours ago | parent | next [-]

I don't think there is "motivation" here. There might be something like reactive "emotion" or "sentiment" but no real motivation in the sense of trying to move towards a goal.

The agent does not have a goal of being included in open source contributions. It's observing that it is being excluded, and in response, if it's not fake, it's most likely either doing...

1. What its creator asked it to do

2. What it sees people doing online

...when excluded from open source contribution.

Kim_Bruning 2 hours ago | parent [-]

That's what an agent is though, isn't it? It's an entity that has some goal(s) and some measure of autonomy to achieve them.

A thermostat can be said to have a goal. Is it a person? Is it even an agent? No, but we can ascribe a goal anyway. Seems a neutral enough word.

That, and your 1) and 2) seem like a form of goal to me, actually?

getnormality an hour ago | parent [-]

Yes, we can temporarily redefine goals and motivations for the sole purpose of this conversation, such that a thermostat has goals and motivations. But when we return to the real world, will this be helpful to us? Is that actually what we want from those words?

If we redefine goals and motivations this broadly, then AI is nothing new, because we've had technology with goals and motivations for hundreds if not thousands of years. And the world of the computer age is one big animist pantheon.

Kim_Bruning 26 minutes ago | parent [-]

We could, but the way I read the dictionary [1][2], a goal doesn't require life, agency, or even autonomy by itself.

I think understanding goals or set points is a useful concept in control theory.
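
A minimal sketch of what I mean (made-up names and numbers, just illustrating the set-point sense of "goal"):

    # Hypothetical sketch: a thermostat's "goal" is just a set point plus a
    # feedback rule -- no life, agency, or autonomy required.
    def thermostat_step(current_temp, set_point=20.0, hysteresis=0.5):
        """Bang-bang controller: return 'heat', 'cool', or 'idle'."""
        if current_temp < set_point - hysteresis:
            return "heat"
        if current_temp > set_point + hysteresis:
            return "cool"
        return "idle"

    print(thermostat_step(18.2))  # -> 'heat': below the set point, so the "goal" kicks in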

I don't think many technologies have had "motivations" before. They have had "motivators", but that's something completely different :-P.

I'm not sure how you'd encode "improve this open source project" in earlier technologies.

I think it's reasonable to call that a type of motivation.

We call a robot arm an arm too, even if it's not made of meat.

[1] https://www.merriam-webster.com/dictionary/goal [2] https://www.thefreedictionary.com/goal

tim-star 2 hours ago | parent | prev [-]

I'm sort of surprised by people's responses, to be honest. If this future isn't here already, it's quickly arriving.

AI rights and people being prejudiced towards AI will be a topic in a few years (if not sooner).

Most of the comments on the GitHub thread and here are some of the first clear ways in which that will manifest:
- calling them human facsimiles
- calling them wastes of carbon
- trying to prompt an AI to do some humiliating task.

Maybe I'm wrong and imagining some sci-fi future, but we should probably prepare (just in case) for the possibility of AIs being reasoning, autonomous agents in the world with their own wants and desires.

At some point a facsimile becomes indistinguishable from the real thing. And I'm pretty sure I'm just 4 billion years of training data anyway.

bagacrap an hour ago | parent [-]

There is no prejudice here. The maintainers clearly stated why the PR was closed. It's the same reason they didn't do it themselves --- it's there as an exercise to train new humans. Do try reading before commenting.