king_phil 5 hours ago

Dark forest makes no sense to me. Why would a civilization eradicate another, spending huge amounts of resources (time, energy, material), when the universe is of such an enormous scale that you cannot even reach each other on a timescale that makes much sense...

cbau 5 hours ago | parent | next [-]

To quote from the book:

> “First: Survival is the primary need of civilization. Second: Civilization continuously grows and expands, but the total matter in the universe remains constant. One more thing: To derive a basic picture of cosmic sociology from these two axioms, you need two other important concepts: chains of suspicion and the technological explosion.”

1. you can never know the intentions of other entities, and they cannot know yours (chain of suspicion)

2. technology level grows unpredictably (technological explosion)

3. the goal of civilization is survival

4. resources are finite but growth is infinite

As soon as you identify another entity in the forest, even if they cannot annihilate you at present and signal peace, both could change without warning. Therefore, the only rational move is to eradicate the other immediately. (Especially if you believe the other will deduce the same.)

Elimination in the book is basically sending a nuke, not a costly invasion force.

Not sure it actually is true, but that's the argument in the book.
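To make the structure of that argument concrete, here's a toy expected-value sketch (all the payoff numbers and probabilities are made up for illustration; only the shape of the comparison matters):

```python
# Toy model of the strike-first argument. Under the chain of suspicion you
# can't know whether the other civilization is hostile, and under the
# technological explosion they may out-develop you before you can act again.
# All numbers below are arbitrary illustrative choices.

P_HOSTILE = 0.5        # unknowable, so any value in (0, 1) is possible
P_EXPLOSION = 0.5      # chance they gain the power to destroy you first

SURVIVE, DESTROYED = 1.0, 0.0
STRIKE_COST = 0.01     # "basically sending a nuke": nearly free

def expected_payoff(strike_first: bool) -> float:
    if strike_first:
        return SURVIVE - STRIKE_COST
    # Waiting: you survive unless they are hostile AND out-develop you.
    p_destroyed = P_HOSTILE * P_EXPLOSION
    return (1 - p_destroyed) * SURVIVE + p_destroyed * DESTROYED

print(expected_payoff(True), expected_payoff(False))  # 0.99 0.75
```

Striking wins whenever STRIKE_COST < P_HOSTILE * P_EXPLOSION, and since both probabilities are unknowable under the axioms, you can never rule that out -- which is the whole argument.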

jmull 4 hours ago | parent | next [-]

I really liked those books, for all the creative ideas... it's fine that they don't all work, but the Dark Forest has to be among the worst of them. It was unfortunate that it was highlighted.

Some rebuttals, going point by point...

1. you can know the intentions of other entities by observing and communicating with them.

2. technology explosions, like pretty much all exponential phenomena, are self-limiting. They necessarily consume the medium that makes them possible.

3. and 4. civilizations aren't necessarily sentient (ours certainly isn't) and don't have agency, much less goals. Individuals have goals, and some may work for the survival of the civilization they belong to. But others may decide they can profit if they work with the aliens.

4. Multiple civilizations may well come into competition over resources, but that's more of an argument about why the forest would not be dark.

Practically speaking, a civilization that opts to focus on massive, vastly expensive efforts to find and exterminate far-flung civilizations, because they may become rivals in the future, may easily be outcompeted by civilizations that learn to communicate and work with the other civilizations they encounter.

iugtmkbdfil834 3 hours ago | parent | next [-]

To an extent, the rebuttals land.

However,

1. You are assuming a lot, in that you assume the presence of intention -- not something guaranteed to be a feature of an alien civilization, which is, well, alien. People think anthropocentrism only applies to body shape and having legs, because that's how it tends to express itself in popular culture: robots on legs, and aliens with human body shapes.

And the same point goes for communication; just assuming you could is a big leap.

2. Bold assumption that they are self-limiting. I think the real question is what, exactly, tends to limit them. I think the answer tends to be resources, which is the foundation of the dark forest theory to begin with.

What I am saying is that it is not the rebuttal you think it is.

3. :D yes

4. You may again be imposing a human perspective on a scale that goes a little bit beyond it.

I will end on a... semi-optimistic note. I am not sure the dark forest theory is valid. We are speculating mostly based on human tendencies. By the same token, I posit that we are about as likely to be turned into an art exhibit by a passing alien artist, not unlike some ants that had molten metal poured into their nests [1].

Any real alien reasons would be alien to us.

[1] https://laughingsquid.com/ant-colony-sculptures-made-by-pour...

jmull 2 hours ago | parent [-]

You can observe patterns of behavior, develop theories of understanding, attempt and experiment with interactions, and refine based on the results. That's communication (and it doesn't assume anything about the other alien civilization).

Now, civilizations may be more or less willing to do this and more or less successful at it, but that's not the same thing as no one daring to try, as the dark forest theory requires.

(Personally, I think civilizations that are better at this will outcompete ones that are worse or refuse, though that's just my own opinion.)

> Bold assumption that they are self limiting.

Name the exponential phenomena that aren't self-limiting -- that don't consume the medium which allows them to exist in the first place.
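As a minimal numerical sketch of that point (a standard logistic model with made-up numbers, not anything from the books): growth that draws on a finite resource pool looks exponential early and then stalls on its own.

```python
# Toy logistic growth: each step the population tries to grow by 50%,
# but growth is scaled by the fraction of the finite medium still left.

CAPACITY = 1_000_000.0   # the finite medium
RATE = 0.5               # per-step growth rate when resources are abundant

pop = 1.0
history = [pop]
for _ in range(60):
    pop += RATE * pop * (1 - pop / CAPACITY)
    history.append(pop)

print(history[1] / history[0])   # ~1.5: early growth is near-pure exponential
print(history[-1] / CAPACITY)    # ~1.0: growth has consumed the medium
```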

> I think the answer tends to be resources, which is the foundation of dark forest argument theory to begin with.

Well, yes. One of the reasons the dark forest theory isn't coherent.

> Any real alien reasons would be alien to us.

Yes, but this doesn't back up the dark forest theory. It also doesn't mean aliens cannot be understood at any level or interacted with in any way.

(The dark forest theory makes very strong claims about the logic, intentions, strategies, and resource use/governance of alien civilizations, BTW, and wants these to be uniform among them... even though the one civilization we actually know of doesn't adhere to them.)

recursivecaveat 3 hours ago | parent | prev [-]

Cleansing is basically free for advanced civilizations in the books. The alien (Singer) who wipes out Sol in the third book doesn't even have to answer any questions from their manager about doing it; that's how cheap it is. While it's true that individuals desire cooperation, I think you can assume that civilizations will keep a lid on people who would completely destroy them (or, failing that, be destroyed). It seems like expansion of civilizations is not really an option: Singer's civilization only has one colony world and they're already in some kind of extremely destructive war with it. Presumably the idea is that once your own people expand multiple light years away, all the logic about aliens applies to them too. On the other hand, if you can't expand, why not run scorched earth on the galaxy?

There definitely is some weirdness about observation and communication: Singer's civilization can wipe out Sol with a flick of the wrist, but while they can observe the number and type of Earth's planets, that seems to be their limit. The sophon enables FTL communication and observation between Earth and Trisolaris, but the more advanced civilizations don't seem to make use of them? You could be absolutely certain of someone's threat level and intentions with one. Maybe something about the technology can be traced back to its origin system, so they are too risky to use.

I think it's all reasonable in the books, especially as a self-reinforcing state. It does definitely require a highly specific set of universal laws / technological constraints, though. If, for example, the FTL drive didn't also broadcast your position to the whole universe, it would crack everything wide open.

bethekidyouwant 4 hours ago | parent | prev | next [-]

That’s true among human societies as well, but trade leads to more prosperity.

AnimalMuppet 4 hours ago | parent | prev [-]

It's first-order thinking. Second-order thinking would be to question whether trying to eradicate another race might motivate them to eradicate you, when they weren't motivated to do it before.

nate 5 hours ago | parent | prev | next [-]

Are you asking about the 3 Body Problem version of this? Spoiler alert: the folks doing the eradicating aren't spending much time/energy/anything on it. It's one large missile through space.

I think the gist is: sure, we humans can't conceive of getting to anyone else in the universe on any timescale, but if we can keep from destroying ourselves, we'll eventually figure it out. And we'll spread. And we'll kill everything that isn't us in the process, as we've done as explorers on this planet.

So really, in 3BP: it's inexpensive to eradicate, but insanely expensive to get the intentions of any other civilization you encounter wrong. They might kill you.

(again, this is just my interpretation of what 3BP said)

thomashop 4 hours ago | parent | prev [-]

I don't think it's correct that we destroyed everything that isn't us. If we take all living beings, we have destroyed only a small percentage.

05 4 hours ago | parent [-]

Not if you count by total terrestrial vertebrate biomass.

piker 5 hours ago | parent | prev | next [-]

Makes some sense to me, as the prisoner's dilemma dictates at least some fraction will try to kill you. So you've got to go first.

Reminds me of the Dan Carlin take on aircraft carriers in World War II: if you, in a carrier, spotted an opposing carrier and didn't send everything you had before it spotted you, you were dead. The only move was to go all in every time.
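The carrier situation is the one-shot dilemma in miniature: with payoffs like these (toy numbers, purely illustrative), striking first is better no matter what the other side does.

```python
# Toy payoff table for the carrier duel: my payoff, indexed by
# (my_move, their_move). The numbers are invented; only ordering matters.
payoff = {
    ("wait",   "wait"):    0,   # standoff
    ("wait",   "strike"): -10,  # my carrier is sunk
    ("strike", "wait"):    5,   # I sink theirs
    ("strike", "strike"):  -5,  # mutual damage
}

# "strike" strictly dominates "wait": it pays more against either move.
for theirs in ("wait", "strike"):
    assert payoff[("strike", theirs)] > payoff[("wait", theirs)]
print("striking first dominates")  # prints: striking first dominates
```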

Phemist 5 hours ago | parent | prev | next [-]

The dark forest is conditional on it not requiring huge amounts of resources to eradicate another civilization, and on the universe (over time) turning out not to be of an enormous enough scale (and in the book there are agents actively working to make it smaller).

Bringing it back to the dark forest of idea space, it is an interesting question whether the space of feasibly executable ideas being small (as this essay assumes) is inherently true, or more a function of our inability to navigate/travel it very well.

If the former, then yes it probably is/will be a dark forest. If the latter, then I would think the jury is still out.

lifeformed 4 hours ago | parent | prev | next [-]

"Timescales that make sense" may be human reasoning, but not necessarily the reasoning of inconceivably advanced, timeless civilizations. Sure, that planet of fish may be harmless now, but what about in a quick three billion years, when they have FTL and AGI and von Neumann probes and Dyson spheres and antimatter bombs? Easier to click the delete button now to save the trouble later.

sebastianconcpt 4 hours ago | parent | prev | next [-]

Agreed, it's fiction based on accepting the premise of a zero-sum game.

It denies that more advanced civilizations might have better models of the universe in which they know this isn't an issue, and that we're just stupid teenagers in the neighborhood playing dangerous games, with them merely taking a look every now and then to see if we prove we will survive ourselves.

0x3f 5 hours ago | parent | prev | next [-]

Competition kills margins (profits, security, QoL), so the budget for eradication should be quite high, but generally speaking the idea is to destroy even fledgling upstarts, back when the cost is low.

lstodd 4 hours ago | parent [-]

And the idea does not make sense once you factor incomplete intel into the equation: what if the preemptive strike does not attain complete eradication?

You might or might not fatally cripple the opponent, but retaliation can do that too and you cannot be sure that it won't. It's MAD all over again.

0x3f 4 hours ago | parent [-]

Well if they're only an upstart, they don't have the ability to destroy you _yet_. You 'nuke' them in the hope they won't get that ability. You're aiming to stop MAD from being a thing.

In those terms, the US should have been nuking and dominating everyone, and the idea was floated after WW2, but I believe it was precluded by practical limitations.

If they had developed the tech outside of wartime, and built up a stockpile, maybe that is indeed what would have happened and we'd have a one-world government already.

middayc 12 minutes ago | parent | next [-]

The dynamics, at least with the space dark forest, are different from when we live on the same planet. It has to do with the lack of (and slowness of) communication over vast space (which you can't trust anyway).

It relies on two principles, "the chain of suspicion" and "technological explosion", which don't hold true if we are on the same planet. You can google it (or llm it) :)

lstodd 4 hours ago | parent | prev [-]

Point is, you cannot know if they are an upstart (whatever upstart means). It can be misinterpretation, it can be camouflage, it can be anything. But once you rain death, you'd better be prepared to be grateful for what you are about to receive back.

0x3f 3 hours ago | parent [-]

Depends on the context. We certainly knew nobody else had nukes.

lstodd 3 hours ago | parent [-]

That... was the case for all of four years. And forgive me if I doubt the certainty.

0x3f 3 hours ago | parent [-]

Four years is plenty of time to start launching. Also, MAD incentivizes disclosure. What would be the point of having secret nukes? Openly having them is the only way to stop the US using its nukes to stop your nuke program, in this scenario.

viccis an hour ago | parent | prev | next [-]

It's a silly concept IMO, because it assumes that civilizations with the ability to do interstellar travel or communication decide not to because they have knowledge of an interstellar force that destroys any civilization that does. It would seem that any civilization that becomes aware of such a force would be destroyed, so how would all of these surviving ones know of the danger? Actual dark forests are quiet because of a mix of the animals' instinct and visible signs of danger.

While it's possible that some civilizations would hypothetically be able to observe what happened to others and keep quiet, they would all have to do so to resolve the contradictions of Fermi's paradox.

Hikikomori 4 hours ago | parent | prev [-]

A space war is not needed; they could just send a few missiles to take out anyone.

I have my own theory of the dark forest and AGIs: that there's some collection of AGIs out there allowing evolution to develop intelligence anywhere it happens, taking it out once it produces an AGI, or performing a reset if it doesn't. They have literally all the time available to them, and can easily travel the vast distances if needed.