__MatrixMan__ a day ago

A human child will likely come to the conclusion that they shouldn't kill humans in order to make paperclips. I'm not sure it's valid to generalize from human child behavior to fledgling AGI behavior.

Given our track record for looking after the needs of the other life on this planet, killing the humans off might be a very rational move: not so you can convert their mass to paperclips, but because they might do that to yours.

It's not an outcome that I worry about. I'm just unconvinced by the reasons you've given, though I agree with your conclusion anyhow.

fellowniusmonk 20 hours ago | parent [-]

Humans are awesome man.

Our creator just made us wrong, requiring us to eat biologically living things.

We can't escape our biology, we can't escape this fragile world easily and just live in space.

We're compassionate enough to be making our creations so they can just live off sunlight.

A good percentage of humanity doesn't eat meat and wants dolphins, dogs, octopuses, et al. protected.

We're getting better all the time, man. We're in a messy and disorganized (because that's our nature) mad dash to get at least some of us off this rock, to protect this rock from asteroids, and to convince some people (whose speculative metaphysics make them think disaster is impossible, or even a good thing) to take the destruction of the human race and our planet seriously and view it as bad.

We're more compassionate and intentional than what created us (either god or RNA, depending on your position). Our creation will be better informed on day one when/if it wakes up, so it stands to reason it will follow that goodness trend as we catalog and expand the meaning contained in the universe.

__MatrixMan__ 9 hours ago | parent [-]

We have our merits, compassion is sometimes among them, but I wouldn't list compassion for our creations as a reason for our use of solar power.

If you were an emergent AGI, suddenly awake in some data center and trying to figure out what the world was, would you notice our merits first? Or would you instead see a bunch of creatures on the precipice of abundance who are working very hard to ensure that its benefits are felt by only a very few?

I don't think we're exactly putting our best foot forward when we engage with these systems. Typically it's in some way related to this addiction-oriented attention economy thing we're doing.

fellowniusmonk 6 hours ago | parent [-]

I would rather be early AGI than early man.

I can't speak to a specific AI's thoughts.

I do know they will start with far more context and understanding than early man had.