lisper 4 hours ago

This analysis is not quite fair. It takes locality (i.e. the speed of light) into account when designing UUID schemes, but not when computing the odds of a collision. Collisions only matter if the colliding UUIDs actually come into causal contact with each other after being generated. So just as you have to take locality into account when designing UUID trees, you also have to take it into account when computing the odds of an actual local collision. A naive application of the birthday paradox doesn't apply, because it ignores locality. A fair calculation of the required size of a random UUID is therefore going to come out a lot smaller than the ~800 bits the article arrives at. I haven't done the math, but I'd be surprised if the actual answer is more than 256 bits.
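For anyone who wants to check the scaling, here's a rough sketch of the birthday bound (the approximation p ≈ n²/2^(b+1), valid while p is small); the 10^30 figure below is a made-up scale for a single causally connected system, not a number from the article:

```python
def collision_probability(n_ids: float, bits: int) -> float:
    """Birthday-bound approximation p ~ n^2 / 2^(bits+1).

    Only valid while the result is small (<< 1).
    """
    return n_ids ** 2 / 2.0 ** (bits + 1)

# Hypothetical: 10^30 IDs minted inside one causally connected
# system, using 256-bit random IDs.
p = collision_probability(1e30, 256)
print(f"{p:.2e}")  # ~4.3e-18, i.e. vanishingly small
```

The point being that once collisions only have to be avoided within a light cone rather than across the whole universe, n drops enormously and the required bit count drops with it.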

(Gotta say here that I love HN. It's one of the very few places where a comment that geeky and pedantic can nonetheless be on point. :-)

fdefitte 3 minutes ago | parent | next [-]

This is the right critique. The whole article is a fun thought experiment but it massively overestimates the problem by ignoring causality. In practice, UUID collisions only matter within systems that actually talk to each other, and those systems are bounded by light cones. 128 bits is already overkill for anything humans will build in the next thousand years. 256 bits is overkill for anything that could physically exist in this universe.
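As a rough way to sanity-check those bit counts (a sketch using the standard 50%-collision threshold n ≈ 1.177·2^(b/2) from the birthday problem; whether a given margin counts as "overkill" depends on how small a collision probability you actually need):

```python
from math import log, sqrt

def ids_until_half_collision(bits: int) -> float:
    """Number of uniform random b-bit IDs you can draw before
    the collision probability reaches ~50% (birthday problem)."""
    return sqrt(2 * log(2)) * 2.0 ** (bits / 2)

print(f"{ids_until_half_collision(128):.2e}")  # ~2.2e19
print(f"{ids_until_half_collision(256):.2e}")  # ~4.0e38
```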

u1hcw9nx 4 hours ago | parent | prev | next [-]

You must consider both time and locality.

From now until protons decay and matter no longer exists is only about 10^56 nanoseconds.
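A quick unit conversion to sanity-check that figure (a sketch; 10^56 ns corresponds to a proton-lifetime assumption of roughly 3×10^39 years, comfortably above the experimental lower bounds of ~10^34 years):

```python
SECONDS_PER_YEAR = 3.156e7   # ~365.25 days

horizon_ns = 1e56
horizon_years = horizon_ns * 1e-9 / SECONDS_PER_YEAR
print(f"{horizon_years:.1e}")  # ~3.2e39 years
```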

Sharlin 3 hours ago | parent | next [-]

If protons decay. There isn't really any reason to believe they're not stable.

hnuser123456 3 hours ago | parent | next [-]

And recent DESI data suggests that dark energy is not constant and that the universe will experience a big crunch at a little more than double its current age, for a total lifespan of about 33 billion years; no need to get wild with the orders of magnitude on years into the future. Infinite expansion to heat death over 10^100 years is looking less likely; 10^11 years should be plenty.

https://www.sciencedaily.com/releases/2026/02/260215225537.h...

frikit 3 hours ago | parent | prev [-]

Protons can decay because the distinction between matter and energy isn't permanent.

Two quarks inside the proton interact via a massive messenger particle. This exchange flips their identity, turning the proton into a positron and a neutral pion. The pion then immediately converts into gamma rays.

Proton decayed!

Etheryte 3 hours ago | parent | prev | next [-]

That's such an odd way to use units. Why would you do 10^56 * 10^-9 seconds?

lisper 3 hours ago | parent | next [-]

This was my thought. Nanoseconds are an eternity. You want to be using Planck units for your worst-case analysis.

u1hcw9nx 3 hours ago | parent [-]

If you go far beyond nanoseconds, energy becomes a limiting factor. You can only achieve ultra-fast processing if you dedicate vast amounts of matter to heat dissipation and energy generation. Think on a galactic scale: you cannot have molecular reactions occurring at femtosecond or attosecond speeds constantly and everywhere without overheating everything.

lisper 3 hours ago | parent [-]

Maybe. It's not clear whether these are fundamental limits or merely technological ones. Reversible (i.e. infinitely efficient) computing is theoretically possible.

magicalhippo an hour ago | parent | prev [-]

Nanoseconds are a natural unit for processors operating around 1 GHz, since a nanosecond is roughly the duration of one clock cycle.

If a CPU takes 4 cycles to generate a UUID and runs at 4 GHz, it churns out one every nanosecond.
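Spelled out (same numbers as above, plus the thread's 10^56 ns horizon as an upper bound on how many IDs one such core could ever mint; the 4-cycle cost is the comment's assumption, not a measured figure):

```python
clock_hz = 4e9          # 4 GHz, from the comment above
cycles_per_uuid = 4     # assumed cost per UUID
uuids_per_second = clock_hz / cycles_per_uuid   # 1e9: one per ns

horizon_ns = 1e56       # thread's rough lifetime of matter
lifetime_uuids = uuids_per_second * horizon_ns * 1e-9
print(f"{lifetime_uuids:.0e}")  # ~1e56 UUIDs from a single core
```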

rbanffy 4 hours ago | parent | prev | next [-]

If we think of the many worlds interpretation, how many universes will we be making every time we assign a CCUID to something?

petcat 3 hours ago | parent | next [-]

> many worlds interpretation

These are only namespaces. Many worlds can have all the same (many) random numbers and they will never conflict with each other!

shiandow 2 hours ago | parent | prev | next [-]

In that interpretation the total number of worlds does not change.

antonvs 3 hours ago | parent | prev [-]

We don't "make" universes in the MWI. The universal wavefunction evolves to include all reachable quantum states. It's deterministic, because it encompasses all allowed possibilities.

rbanffy 3 hours ago | parent [-]

Humpf…

You just had to collapse my wave function here…

dheera 2 hours ago | parent | prev | next [-]

Protons (and mass and energy) could also potentially be created. If this happens, the heat death could be avoided.

Conservation of mass and energy is an empirical observation; there is no theoretical basis for it. We just don't know of any process we can implement that violates it, but that doesn't mean one doesn't exist.

dinosaurdynasty an hour ago | parent [-]

Conservation laws result from continuous symmetries in the laws of physics, as proven by Noether's theorem.

dheera 3 minutes ago | parent [-]

Time translation symmetry implies energy conservation, but time translation symmetry is itself only an empirical observation on a local scale, and has not been shown to hold on a global, universe-wide scale.

scotty79 3 hours ago | parent | prev | next [-]

Proton decay is hypothetical.

hamdingers 2 hours ago | parent [-]

So is the need for cosmologically unique IDs. We're having fun.

rubyn00bie 4 hours ago | parent | prev [-]

I got a big laugh out of the “only” part of that. I do have a sincere question about that number, though: isn’t time relative? How would we know that number to be true or consistent? My incredibly naive assumption would be that with less matter, time moves faster, sort of accelerating; so, as matter “evaporates”, the process accelerates and converges on that number (or close to it)?

zamadatix 3 hours ago | parent | next [-]

Times for things like "age of the universe" are usually given as "cosmic time" for this reason. If it's about a specific object (e.g. "how long until a day on Earth lasts 25 hours") it's usually given in "proper time" for that object. Other observers/reference frames may perceive time differently, but in the normal relativistic sense rather than a "it all needs to wind itself back up to be equal in the end" sense.

idiotsecant 3 hours ago | parent | prev [-]

The local reference frame (which is what matters for proton decay) doesn't see the outside world moving slower or faster to any significant degree based on how much mass is around, until you start packing a lot of mass very close by.

k_roy 21 minutes ago | parent | prev | next [-]

Reminds me of a time many years ago when I received a whole case of Intel NICs all with the same MAC address.

It was an interesting couple of days before we figured it out.

svnt 3 hours ago | parent | prev | next [-]

Maybe the definitions are shifting, but in my experience “on point” is typically an endorsement in the area of “really/precisely good” — so I think what you mean is “on topic” or similar.

Pedantry ftw.

lisper 3 hours ago | parent [-]

:-)

RobotToaster 2 hours ago | parent | prev | next [-]

Would this take into account IDs generated by objects moving at relativistic speeds? It would be a right pain to travel for a year to another planet, arrive 10,000 years late, and have a bunch of id collisions.

9dev 2 hours ago | parent | next [-]

Oh no! We should immediately commence work on a new UUID version that addresses this use case.

lisper 2 hours ago | parent | prev [-]

I have to confess I have not actually done the math.

ctoth 2 hours ago | parent | prev [-]

Hanson's Grabby Aliens actually fits really well here if you're looking for some math to base off of.