| ▲ | JKCalhoun 8 days ago |
| An interesting point you make there — one would assume that if recursive self-improvement were a thing, Nature would have already led humans into that "hall of mirrors". |
|
| ▲ | Terr_ 7 days ago | parent | next [-] |
| I often like to point out that Earth was already consumed by Grey Goo, and today we are hive-minds in titanic mobile megastructure-swarms of trillions of the most complex nanobots in existence (that we know of), inheritors of tactics and capabilities from a zillion years of physical and algorithmic warfare. As we imagine the ascension of AI/robots, it may seem like we're being humble about ourselves... But I think it's actually the reverse: It's a kind of hubris elevating our ability to create over the vast amount we've inherited. |
| |
| ▲ | solid_fuel 7 days ago | parent [-] | | To take it a little further - if you stretch the conventional definition of intelligence a bit - we already assemble ourselves into a kind of collective intelligence. Nations, corporations, clubs, communes -- any functional group of humans is capable of observing, manipulating, and understanding our environment in ways no individual human is capable of. When we dream of hive minds and super-intelligent AI it almost feels like we are giving up on collaboration. | | |
| ▲ | BlueTemplar 7 days ago | parent [-] | | We can probably thank our individualist mindset for that. (Not that it's all negative.) |
|
|
|
| ▲ | twic 8 days ago | parent | prev | next [-] |
| There's a variant of this that argues that humans are already as intelligent as it's possible to be. Because if it's possible to be more intelligent, why aren't we? And a slightly more reasonable variant that argues that we're already as intelligent as it's useful to be. |
| |
| ▲ | lukan 8 days ago | parent | next [-] | | "Because if it's possible to be more intelligent, why aren't we?" Because deep abstract thoughts about the nature of the universe and elaborate deep thinking were maybe not as useful while we were chasing lions and buffaloes with a spear? We just had to be smarter than them. Which included finding out that tools were great, learning about the habits of our prey, and optimizing hunting success. Those who were smarter in that capacity had a greater chance of reproducing. Those who merely excelled at abstract thinking likely did not live that long. | | |
| ▲ | tshaddox 7 days ago | parent [-] | | Is it just dumb luck that we're able to create knowledge about black holes, quarks, and lots of things in between, which presumably had zero evolutionary benefit until a handful of generations ago? | | |
| ▲ | bee_rider 7 days ago | parent | next [-] | | Basically, yes, it is luck, in the sense that evolution is just randomness with a filter of death applied, so whatever brains we happen to have are just luck. The brains we did end up with are really bad at creating that sort of knowledge. Almost none of us can. But we're good at communicating, coming up with simplified models of things, and seeing how ideas interact. We're not universe-understanders; we're behavior modelers and concept explainers. | | |
| ▲ | tshaddox 7 days ago | parent [-] | | I wasn't referring to the "luck" factor of evolution, which is of course always there. I was asking whether "luck" is the reason that the cognitive capabilities which presumably were selected for also came with cognitive capabilities that almost certainly were not selected for. My guess is that it's not dumb luck, and that what we evolved is in fact general intelligence, and that this was an "easier" way to adapt to environmental pressure than to evolve a grab bag of specific (non-general) cognitive abilities. An implication of this claim would be that we are universe-understanders (or at least that we are biologically capable of that, given the right resources and culture). In other words, it's roughly the same answer as for the question "why do washing machines have Turing-complete microcontrollers in them when they only need to do a very small number of computing tasks?" At scale, once you know how to implement general (i.e. Turing-complete and programmable) computers, it tends to be simpler to use them than to create purpose-built computer hardware. |
| |
| ▲ | lukan 7 days ago | parent | prev [-] | | Evolution rewarded us for developing general intelligence. But with a very immediate practical focus and not too much specialisation. |
|
| |
| ▲ | godelski 7 days ago | parent | prev | next [-] | | I don't think the logic follows here. Nor does it match the evidence. The premise ignores time. It also ignores the fact that we know there's a lot we don't know. And that's all before we consider other factors, like whether there are limits and physical barriers, or many other things. | |
| ▲ | danaris 8 days ago | parent | prev [-] | | While I'm deeply and fundamentally skeptical of the recursive self-improvement/singularity hypothesis, I also don't really buy this. There are some pretty obvious ways we could improve human cognition if we had the ability to reliably edit or augment it. Better storage & recall. Lower distractibility. More working memory capacity. Hell, even extra hands for writing on more blackboards or putting up more conspiracy theory strings at a time! I suppose it might be possible that, given the fundamental design and structure of the human brain, none of these things can be improved any further without catastrophic side effects—but since the only "designer" of its structure is evolution, I think that's extremely unlikely. | | |
| ▲ | JKCalhoun 7 days ago | parent [-] | | Some of your suggestions, if you don't mind my saying, seem like only modest improvements — akin to Henry Ford's quote “If I had asked people what they wanted, they would have said a faster horse.” To your point though, an electronic machine is a different host altogether with different strengths and weaknesses. | | |
| ▲ | danaris 7 days ago | parent [-] | | Well, twic's comment didn't say anything about revolutionary improvements, just "maybe we're as smart as we can be". |
|
|
|
|
| ▲ | marcosdumay 8 days ago | parent | prev [-] |
| Well, arguably that's exactly where we are, but machines can evolve faster. And that's an entirely new angle that the cultists are ignoring... because superintelligence may just not be very valuable. And we don't need superintelligence for smart machines to be a problem anyway. We don't even need AGI. IMO, there's no reason to focus on that. |
| |
| ▲ | derefr 8 days ago | parent | next [-] | | > Well, arguably that's exactly where we are Yep; from the perspective of evolution (and more specifically, those animal species that only gain capability generationally by evolutionary adaptation of instinct), humans are the recursively self-(fitness-)improving accident. Our species-aggregate capacity to compete for resources within the biosphere went superlinear in the middle of the previous century, and we've had to actively hit the brakes on how much of everything we take since then, handicapping ourselves. (With things like epidemic obesity and global climate change being the result of us not hitting those brakes quite hard enough.) Insofar as a "singularity" can be defined on a per-agent basis, as the moment when something begins to change too rapidly for the given agent to ever hope to catch up with / react to new conditions — and so the agent goes from being a "player at the table" to a passive observer of what's now unfolding around them — then, from the rest of our biosphere's perspective, they've 100% already witnessed the "human singularity." No living thing on Earth besides humans now has any comprehension of how the world has been or will be reshaped by human activity; nor can any ever hope to do anything to push back against such reshaping. Every living thing on Earth other than humans will only survive into the human future if we humans either decide that it should survive and act to preserve it, or if we humans just ignore the thing and then just-so-happen to never accidentally wipe it from existence without even noticing. | |
| ▲ | Terr_ 7 days ago | parent | prev [-] | | > machines can evolve faster [Squinty Thor] "Do they though?" I think it's valuable to challenge this popular sentiment every once in a while. Sure, it's a good poetic metaphor, but when you really start comparing their "lifecycle" and change-mechanisms to the swarming biological nanobots that cover the Earth, a bunch of critical aspects just aren't there, or are being done to them rather than by them. At least for now, these machines mostly "evolve" in the same sense that fashionable textile pants "evolve". |
|