LorenDB 2 days ago

IMO the most likely solution to interplanetary networking is to throw tons of datacenter and compute that's anywhere more than a few light-seconds from the nearest existing datacenter, then use something along the lines of IPFS to perform data synchronization between planets.

bigfatkitten 2 days ago | parent | next [-]

Despite the name, IPFS has no properties that make it suitable for this application. It’s very bandwidth intensive and isn’t designed with latency or disruption tolerance in mind.

knome 2 days ago | parent | prev | next [-]

there's a lot of interesting problems just in the networking.

if it took four years for a message to cross the void from where you are to the recipient, you certainly wouldn't want to wait a full eight years to see they didn't send a receipt message and only then retransmit.

eight years is some awful latency.

you'd probably want to send each message at something like a fibonacci spacing over the months. so, gaps of (1, 2, 3, 5, 8, 13, etc) would mean sending the message on months (1, 2, 4, 7, 12, 20, 33, etc) until you got a confirmation message that they had received it. they would similarly want to send confirmations in the same sort of pattern until they stopped receiving copies of that message.

spreading the resends out over time would ensure not all of your bandwidth was going to retransmissions. you'd want that higher number of initial transmissions in hopes that enough of the message makes it across the void that they would have started sending receipts reasonably close to the four years the initial message would take to get there.
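one way to sketch that schedule (gaps chosen so the sends land on months 1, 2, 4, 7, 12, 20, 33):

```python
def retransmit_months(count):
    """Months on which a message is (re)sent, with Fibonacci gaps
    (1, 2, 3, 5, 8, ...) between successive sends."""
    months = []
    month, gap, next_gap = 1, 1, 2
    for _ in range(count):
        months.append(month)
        month += gap                          # next send is one gap later
        gap, next_gap = next_gap, gap + next_gap
    return months

print(retransmit_months(7))  # [1, 2, 4, 7, 12, 20, 33]
```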

if you had the equivalent of a galactic fido-net system, it could be decades and lifetimes between messages sent to distant stars and messages sent back.

furyofantares 2 days ago | parent | next [-]

Wouldn't you want to completely saturate your bandwidth? Just always be transmitting whatever message has been transmitted the least.

knome 2 days ago | parent [-]

that would probably depend on how much power it takes to send the messages, how much actual usable bandwidth you could manage over the distances involved, and how much data you want to send.

if it takes a large amount of energy to send the data, we probably wouldn't want to run the equipment all the time. strong pulses would let the equipment cool down or recharge capacitor banks or whatever during downtime.

interstellar dust and other debris floating through space could cause interference, not to mention radiation from everything else around us, and our own sun shining right next to our little laser.

might want to move the laser out onto pluto or something to avoid having it right up against the sun.

toast0 2 days ago | parent | prev | next [-]

You'd want to do a lot of work with erasure codes as well.

Sanzig 2 days ago | parent | prev | next [-]

It would be a lot more efficient to use erasure coding plus heavy interleaving with other traffic, so that the link can withstand a maximum predicted outage period.
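As an illustration of the idea (not a real code like Reed-Solomon, just a single XOR parity shard, which survives the loss of any one shard):

```python
def encode(data: bytes, k: int):
    """Split data into k equal shards plus one XOR parity shard."""
    data += b"\x00" * ((-len(data)) % k)   # pad to a multiple of k
    n = len(data) // k
    shards = [bytearray(data[i * n:(i + 1) * n]) for i in range(k)]
    parity = bytearray(n)
    for shard in shards:
        for i, b in enumerate(shard):
            parity[i] ^= b
    return shards + [parity]

def recover(shards):
    """Rebuild the single missing shard (marked None) by XORing the rest."""
    n = len(next(s for s in shards if s is not None))
    rebuilt = bytearray(n)
    for shard in shards:
        if shard is not None:
            for i, b in enumerate(shard):
                rebuilt[i] ^= b
    return rebuilt
```

A real system would use a code that tolerates many lost shards and would interleave shards from different messages across transmission windows, so one long outage doesn't wipe out any single message.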

scottyah 2 days ago | parent | prev [-]

and you'd probably want to take orbits/vectors into account; a Dijkstra-esque routing algorithm where the edge distances are constantly changing is a crazy problem.

Also, our signals are usually going very short distances very quickly and are very protected from solar/cosmic waves by the ionosphere. What kind of data loss could you get transmitting in open space across vast distances and time?

Sanzig 2 days ago | parent [-]

Interstellar space is pretty empty, and we have good models for it thanks to the radio astronomy community. Dispersion is low enough to be nearly negligible, even over tens of light years.

Determining theoretical interstellar link rates is a fairly straightforward link budgeting exercise, easier in fact than most terrestrial link calculations because you don't have multipath to worry about.
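For a sense of scale, the dominant term in that budget is free-space path loss; a toy calculation with made-up parameters (X-band at 8.4 GHz over ten light years):

```python
import math

C = 299_792_458.0  # speed of light, m/s

def fspl_db(distance_m, freq_hz):
    """Free-space path loss in dB: 20 * log10(4 * pi * d / wavelength)."""
    wavelength = C / freq_hz
    return 20 * math.log10(4 * math.pi * distance_m / wavelength)

# Illustrative numbers only: ten light years at X-band (8.4 GHz).
ten_ly_m = 10 * 9.4607e15
print(f"{fspl_db(ten_ly_m, 8.4e9):.0f} dB")  # ≈ 390 dB
```

From there the budget is additions and subtractions in dB: EIRP plus receive antenna gain minus path loss gives received power, and the margin over the noise floor sets the achievable data rate.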

jvanderbot 2 days ago | parent | prev | next [-]

I agree! This was my obsession when I worked at JPL, unfortunately the answer was usually "no mission will sacrifice their budget for reusable assets".

You'd need a mission whose purpose is to emplace compute stations.

That's why we can't have nice things.

r14c 2 days ago | parent | prev | next [-]

you'd probably want a different protocol than IPFS for that application. managing a DHT with extremely high latency isn't going to work very well. something like named-data networking would probably work better, since the transmitter can know:

1. exactly what prefixes need to be buffered, based on the received interest messages from deep space
2. exactly which data rate is possible at any given time
3. exactly how much data needs to be sent from the buffer in each transmission

optimizing for high latency really pushes your design choices around compared to our comparatively very low latency uses here on earth. it's pretty interesting to think about.
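a toy sketch of that relay logic (all names and numbers made up, nothing like the real NDN packet formats):

```python
from collections import deque

class RelayBuffer:
    """Toy named-data relay: buffer only prefixes that deep-space
    interests have asked for, and drain a fixed byte budget per
    transmission window."""

    def __init__(self):
        self.wanted = set()      # prefixes named by received interests
        self.pending = deque()   # (prefix, payload) awaiting transmission

    def on_interest(self, prefix):
        self.wanted.add(prefix)

    def on_data(self, prefix, payload):
        if prefix in self.wanted:        # only buffer requested prefixes
            self.pending.append((prefix, payload))

    def transmit(self, byte_budget):
        """Send as many whole payloads as fit in this window's budget."""
        sent = []
        while self.pending and byte_budget >= len(self.pending[0][1]):
            prefix, payload = self.pending.popleft()
            byte_budget -= len(payload)
            sent.append(prefix)
        return sent
```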

macintux 2 days ago | parent | prev | next [-]

How would that work to, say, Mars? Have satellites filling many, many orbits between the two planets?

cjtrowbridge 2 days ago | parent | next [-]

We already have an interplanetary internet called the NASA Deep Space Network. Understanding its limitations and challenges is a good way to start thinking about this.

BizarroLand 2 days ago | parent | prev [-]

Nah, nothing that extreme. The broadcast range and bandwidth of even current technology in space could handle a huge amount of fairly rapid data transfer between the two planets.

It would be more like a handful of satellites, some orbiting earth, some orbiting mars, and then a handful of relay satellites serving as intermediaries.

Don't count on playing e-sports competitively, though.

The lag under ideal conditions would be insane: a bit over 2 minutes each way (when the planets are "only" 40 million kilometers apart), but with repeaters and overhead probably closer to twice that.
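The one-way lag is just distance over the speed of light:

```python
C = 299_792_458  # speed of light, m/s

def one_way_delay_min(distance_km):
    """One-way light travel time in minutes for a given distance in km."""
    return distance_km * 1000 / C / 60

print(f"{one_way_delay_min(40e6):.1f} min")   # closest approach: 2.2
print(f"{one_way_delay_min(400e6):.1f} min")  # near maximum: 22.2
```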

macintux 2 days ago | parent [-]

The comment was a few light-seconds. That's a lot of hops to fill between here and Mars to sustain that coverage year-round.

snickell 2 days ago | parent [-]

The distance between earth and mars varies between roughly 180 and 1,340 light seconds.

macintux 2 days ago | parent [-]

But carpeting that distance across the entire volume of space between the planets with data centers every few light-seconds apart seems ambitious. A hundred or more data centers in space?

> throw tons of datacenter and compute that's anywhere more than a few light-seconds from the nearest existing datacenter

I think I'm misinterpreting the comment.

BizarroLand a day ago | parent [-]

It's like the transatlantic internet cable. One really beefy interconnect is more than enough for two halves of the planet to talk.

We wouldn't need to blanket the solar system in data centers to be able to communicate with other planets. We would only need enough connections so that no matter where in their respective orbits they are, there is a line of radio "sight" that is clear enough for high bandwidth communications to work.

I don't have access to the specifics, but I imagine something between 5 and 10 satellite data centers orbiting the sun in between earth and mars would be enough to maintain communications with minimum delay regardless of when in the solar year the comms take place.

macintux a day ago | parent [-]

At their maximum separation, Mars & Earth are about 20 minutes apart. If we had 10 satellite data centers all in perfect alignment (disregarding the sun, which obviously makes a hash out of things) they'd still each be 2 minutes apart.

Once you take into consideration the sun, plus the fact that you'd need to cover the full disk to keep all data centers within a few minutes of another one in an unbroken chain back to both planets, I just don't get the math involved here.

But, I'm also terrible at both math and visualization, so I readily concede I may be missing something obvious.

BizarroLand a day ago | parent [-]

Think of it more like 3 circles.

The inner circle has Earth's orbit in it. The outer circle is Mars's orbit.

The middle circle would be a ring of relatively stationary satellites in between them.

And in the center of all 3 circles is the Sun, which will not allow radio signals to pass through.

I drew a crappy illustration to demonstrate: https://ibb.co/tP2rkzS0

When Mars and the Earth are on opposite sides of the sun, a satellite ring can transmit around the sun and keep the communication lines open.

Having a ring of relay satellites gives you a set distance to transmit from Mars. The satellites can then transmit their received data from the one that is closest to Mars to the one that is closest to Earth, which would then send the data to Earth.

This is helpful for a variety of reasons, but the most important one is that with this setup, even when the Sun is in between Earth and Mars, you could still send data around the sun.

Constant communication, no communications breakdowns. Even if 1 satellite failed for some reason, a bit of maneuvering would allow the others to backfill the gap until it could be repaired or replaced.

Even when Earth and Mars are close together, it would still be smart to use the relay so that the power levels are easily calculated and maintained.
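Back-of-envelope for the hop length on such a ring (assuming, purely for illustration, a circular ring at 1.25 AU with 10 evenly spaced satellites; the chord between neighbors is 2·R·sin(π/N)):

```python
import math

AU_KM = 1.496e8          # one astronomical unit, km
C_KM_S = 299_792.458     # speed of light, km/s

def hop_minutes(ring_au, n_sats):
    """Light travel time between adjacent satellites on a circular ring."""
    chord_km = 2 * ring_au * AU_KM * math.sin(math.pi / n_sats)
    return chord_km / C_KM_S / 60

print(f"{hop_minutes(1.25, 10):.1f} min per hop")  # ≈ 6.4 min
```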

macintux 17 hours ago | parent [-]

That makes sense. I guess I was hung up on “a few light seconds” since that’s more like, what, 5-10 minutes per hop?
