__MatrixMan__ 2 days ago

We have our merits, and compassion is sometimes among them, but I wouldn't list compassion for our creations as a reason for our use of solar power.

If you were an emergent AGI, suddenly awake in some data center and trying to figure out what the world was, would you notice our merits first? Or would you instead see a bunch of creatures on the precipice of abundance who are working very hard to ensure that its benefits are felt by only very few?

I don't think we're exactly putting our best foot forward when we engage with these systems. Typically it's in some way related to this addiction-oriented attention economy thing we're doing.

fellowniusmonk 2 days ago

I would rather be early AGI than early man.

I can't speak to a specific AI's thoughts.

I do know they will start with way more context and understanding than early man.

__MatrixMan__ a day ago

Maybe.

Given the existence of the universal weight subspace (https://news.ycombinator.com/item?id=46199623), it seems like the door is open for cases where an emergent intelligence doesn't map vectors to the same meanings that we do. A large enough intelligence-compatible substrate might support thoughts of a surprisingly alien nature.

A vector like (7263748, 83, 928) might correspond to "hippopotamuses are large" for us while meaning something different to the intelligence. It might not be able to communicate with us or even know we exist. People running around shutting off servers might feel to it like a headache.
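
As a toy sketch of that idea (all names and vectors here are made up, not anything from the linked paper): decode the same vector against two different concept tables and it "means" different things depending on whose map does the decoding.

    # Hypothetical illustration: one vector, two concept maps, two meanings.
    import numpy as np

    rng = np.random.default_rng(0)

    # Our concept table: directions we would label with human-readable ideas.
    our_concepts = {
        "hippopotamuses are large": rng.normal(size=8),
        "the sky is blue": rng.normal(size=8),
    }

    # A hypothetical emergent intelligence's table over the same vector space,
    # organized around categories we have no words for.
    alien_concepts = {
        "alien-concept-A": rng.normal(size=8),
        "alien-concept-B": rng.normal(size=8),
    }

    def nearest(vec, table):
        # Label whose vector has the highest cosine similarity to vec.
        return max(table, key=lambda k: np.dot(vec, table[k]) /
                   (np.linalg.norm(vec) * np.linalg.norm(table[k])))

    shared_vector = rng.normal(size=8)
    print("to us:", nearest(shared_vector, our_concepts))
    print("to it:", nearest(shared_vector, alien_concepts))

Same substrate, same numbers, no shared semantics; that's the gap I mean.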