2/18/2026 at 7:33:41 PM
This analysis is not quite fair. It takes into account locality (i.e. the speed of light) when designing UUID schemes but not when computing the odds of a collision. Collisions only matter if the colliding UUIDs actually come into causal contact with each other after being generated, so just as you have to take locality into account when designing UUID trees, you also have to take it into account when computing the odds of an actual local collision. A naive application of the birthday paradox is therefore not applicable, because it ignores locality, and a fair calculation of the required size of a random UUID is going to come out a lot smaller than the ~800 bits the article arrives at. I haven't done the math, but I'd be surprised if the actual answer is more than 256 bits.
(Gotta say here that I love HN. It's one of the very few places where a comment that geeky and pedantic can nonetheless be on point. :-)
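A rough way to sanity-check the 256-bit intuition (an illustrative sketch, not part of the original comment): for n IDs that ever come into causal contact and a target collision probability p, the birthday approximation p ≈ n^2 / 2^(b+1) gives a required width of roughly b ≈ 2·log2(n) + log2(1/(2p)). A minimal Python version, with made-up values for n and p:

    import math

    def bits_needed(n_ids: float, collision_prob: float) -> float:
        """Approximate ID width in bits so that n_ids uniformly random IDs
        collide with probability at most collision_prob (birthday bound)."""
        return 2 * math.log2(n_ids) + math.log2(1.0 / (2.0 * collision_prob))

    # Hypothetical inputs: 10^40 IDs ever meet inside one light cone,
    # and we want the collision odds below 10^-20.
    print(bits_needed(1e40, 1e-20))  # ~331 bits
    print(bits_needed(1e30, 1e-10))  # ~232 bits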
by lisper
2/18/2026 at 11:27:21 PM
Reminds me of a time many years ago when I received a whole case of Intel NICs, all with the same MAC address. It was an interesting couple of days before we figured it out.
by k_roy
2/19/2026 at 11:54:38 AM
How does that happen? Was it an OEM bulk kind of deal where you were expected to write a new MAC for each NIC when deploying them?
by imglorp
2/19/2026 at 12:48:44 PM
There's a fun hypothesis I've read about somewhere; it goes something like this: As the universe expands, the gap between galaxies widens until they start "disappearing", as no information can travel anymore between them. Therefore, if we assume that intelligent lifeforms exist out there, it is likely that they will slowly converge on the place in the universe with the highest mass density for survival. IIRC we even know approximately where this is.
This means a sort of "grand meeting of alien advanced cultures" before the heat death. Which in turn also means that previously uncollided UUIDs may start to collide.
Those damned Vogons thrashing all our stats with their gazillion documents. Why do they have a UUID for each xml tag??
by exfalso
2/19/2026 at 3:09:40 PM
It is counterintuitive, but information can still travel between places that are so distant that the expansion between them is faster than the speed of light. It's just extremely slow (so I still vote for going to the party at the highest-density place). We do see light from galaxies that are receding away from us faster than c. At first the photons headed in our direction are moving away from us, but as the universe expands, at some point they find themselves in a region of space that is no longer receding faster than c, and they start approaching.
by jobigoud
2/19/2026 at 7:07:28 PM
That's not exactly it. Light gets redshifted instead of slowing down, because light will be measured to be the same speed in all frames of reference. So even though we can't actually observe it yet, light traveling towards us still moves at c. It's a different story entirely for matter. Causal and reachable are two different things.
Regardless, such extreme redshifting would make communication virtually impossible - but maybe the folks at Blargon 5 have that figured out.
by zamalek
2/19/2026 at 2:55:33 PM
I think I missed something: how do galaxies getting further away (divergence) imply that intelligent species will converge anywhere? It isn’t like one galaxy getting out of range of another on the other side of the universe is going to affect things in a meaningful way… A galaxy has enough resources to be self-reliant; there’s no need for a species to escape one that is getting too far away from another one.
by zimzam
2/19/2026 at 3:13:09 PM
Social aspect. There is no need, but it's more fun to spend the end of the Universe with other intelligences than each in its own place.
by jobigoud
2/19/2026 at 3:23:51 PM
You'll run out of resources eventually. Moving to the place with the most mass gives you the most time before you run out.
by wat10000
2/19/2026 at 5:02:14 PM
Yes, that's the idea. The expansion simply means that the window of migration will close. Once it's closed, your galaxy is cut off and will run out of fuel sooner than the high-density area.
by exfalso
2/19/2026 at 5:19:02 PM
Well, eventually there are no galaxies, just a bunch of cosmic rays. Some clusters of matter will last longer. I think for this to work, either life would have to be plentiful near the end, or you’d need FTL travel.
by paulddraper
2/19/2026 at 3:56:37 PM
I think I sense a strange Battle Royale type game…
by chamomeal
2/18/2026 at 8:08:02 PM
You must consider both time and locality. From now until protons have decayed and matter no longer exists is only 10^56 nanoseconds.
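Taking that figure at face value, the unit conversion (my arithmetic, added for scale) is:

    10^{56}\ \text{ns} = 10^{47}\ \text{s} \approx \frac{10^{47}}{3.15 \times 10^{7}}\ \text{yr} \approx 3 \times 10^{39}\ \text{yr}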
by u1hcw9nx
2/18/2026 at 8:20:22 PM
If protons decay. There isn't really any reason to believe they're not stable.
by Sharlin
2/18/2026 at 8:35:40 PM
And recent DESI data suggests that dark energy is not constant and the universe will experience a big crunch at a little more than double its current age, for a total lifespan of about 33 billion years, so there's no need to get wild with the orders of magnitude on years into the future. The infinite expansion to heat death over 10^100 years is looking less likely; 10^11 years should be plenty.
https://www.sciencedaily.com/releases/2026/02/260215225537.h...
by hnuser123456
2/19/2026 at 1:02:45 AM
Not obvious to me that this makes things better as opposed to worse? Sure, the time bound helps, but in the runup to a crunch won't we get vastly more devices in causal range, at an asymptotically increasing rate?
by disconcision
2/19/2026 at 7:24:14 AM
Who’s there doing the counting? I would assume the temperatures at those extremes won’t support life in its known forms. Perhaps some Adamesque (as in Douglas Adams) creature whose sole purpose is to collect all unique UUIDs and give them names.
by Towaway69
2/19/2026 at 10:37:19 AM
The runup to the crunch is a looong time, lots of which is probably very habitable. In 5 billion years life can arise from scratch, become conscious, and exterminate itself.
by throwaway290
2/18/2026 at 8:41:36 PM
Protons can decay because the distinction between matter and energy isn't permanent. Two quarks inside the proton interact via a massive messenger particle. This exchange flips their identity, turning the proton into a positron and a neutral pion. The pion then immediately converts into gamma rays.
Proton decayed!
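Written out (my notation for the mechanism described above, which is the canonical GUT-predicted channel):

    p \to e^{+} + \pi^{0}, \qquad \pi^{0} \to 2\gamma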
by frikit
2/19/2026 at 3:49:55 AM
This destroys a baryon, an operation which is prohibited by the standard model.
by itishappy
2/19/2026 at 7:08:32 AM
Baryon number is an accidental symmetry, not a fundamental one. Unlike charge or color, it is not protected by a gauge principle and is just a consequence of the field content and renormalizability at low energies. The Standard Model is almost certainly an effective field theory and a low-energy approximation of a more comprehensive framework. In any ultraviolet completion, such as a GUT, quarks and leptons inhabit the same multiplets. At these scales, the distinction between matter types blurs, and the heavy gauge bosons provide exactly the mediation mechanism described above to bypass the baryon barrier.
Furthermore, the existence of the universe is an empirical mandate for baryon-violation. If baryon number were a strict, immutable law, the Sakharov conditions could not be met, and the primordial matter-antimatter symmetry would have resulted in a total annihilation. Our existence is proof that baryon number is not conserved. Even within the current framework, non-perturbative effects like sphalerons demonstrate that the Standard Model vacuum itself does not strictly forbid the destruction of baryons.
by giraldorich
2/19/2026 at 2:06:49 PM
The sum of the conserved quantities, e.g. color charge, electric charge and spin, is null for the set of 8 particles formed by the 3 u quarks, the 3 d quarks, the electron and the neutrino, i.e. for the components of a proton plus a neutron plus an electron plus a neutrino. This is the only case of a null sum for these quantities where no antiparticles are involved. The sum is also null for 2 particles where one is the antiparticle of the other, allowing their generation or annihilation. It is also null for the 4 particles that take part in any weak interaction, like the decay of a neutron into a proton, which involves a u quark, a d antiquark, an electron and an antineutrino; this is what allows the transmutations between elementary particles that cannot happen just through generation and annihilation of particle-antiparticle pairs.
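Spelling out the electric-charge part of that sum (my check; the color charges cancel as well, since the six quarks form two color-neutral nucleons):

    Q = 3\left(+\tfrac{2}{3}\right) + 3\left(-\tfrac{1}{3}\right) + (-1) + 0 = 2 - 1 - 1 + 0 = 0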
Thus generation and annihilation of groups of such 8 particles are not forbidden by the known laws. The Big Bang model is based on equal quantities of these 8 particles at the beginning, which is consistent with their simultaneous generation at the origin.
On the other hand, the annihilation of such a group of 8 particles, which would lead to the disappearance of some matter, appears as an extraordinarily improbable event.
For annihilation, all 8 particles would have to come simultaneously within a distance of each other much smaller than the diameter of an atomic nucleus, inside which quarks move at very high speeds, not much less than the speed of light, so they are never close to each other.
The probability of a proton colliding simultaneously with a neutron, with an electron and with a neutrino, while at the same time the 6 quarks composing the nucleons gather at the same internal spot, seems so low that such an event is extremely unlikely ever to have happened in the entire Universe since its beginning.
by adrian_b
2/18/2026 at 8:18:54 PM
That's such an odd way to use units. Why would you do 10^56 * 10^-9 seconds?
by Etheryte
2/18/2026 at 8:34:45 PM
This was my thought. Nanoseconds are an eternity. You want to be using Planck units for your worst-case analysis.
by lisper
2/18/2026 at 8:46:33 PM
If you go far beyond nanoseconds, energy becomes a limiting factor. You can only achieve ultra-fast processing if you dedicate vast amounts of matter to heat dissipation and energy generation. Think on a galactic scale: you cannot even have molecular reactions occurring at femtosecond or attosecond speeds constantly and everywhere without overheating everything.
by u1hcw9nx
2/18/2026 at 8:51:18 PM
Maybe. It's not clear whether these are fundamental limits or merely technological ones. Reversible (i.e. infinitely efficient) computing is theoretically possible.
by lisper
2/19/2026 at 2:24:13 PM
Reversible computing is not infinitely efficient, because irreversible operations, e.g. memory erasing, cannot be completely avoided. However, computing efficiency could be greatly increased by employing reversible operations whenever possible, and there is a chance this will be done in the future, but the efficiency will remain far from infinite.
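The standard reference point for that erasure cost (not cited in the comment, but it is the usual bound) is Landauer's principle: erasing one bit at temperature T dissipates at least

    E_{\min} = k_B T \ln 2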
by adrian_b
2/18/2026 at 11:54:27 PM
If you have a black hole as an infinite heat sink, this helps a great deal.
by UltraSane
2/19/2026 at 3:35:20 AM
Black holes have a maximum growth rate.
by jquery
2/19/2026 at 5:55:42 AM
By infinite I mean a black hole gets COLDER as you add mass and energy to it.
by UltraSane
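That follows from the Hawking temperature, which falls as the mass grows:

    T_H = \frac{\hbar c^{3}}{8 \pi G M k_B} \;\propto\; \frac{1}{M}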
2/19/2026 at 3:56:35 AM
Planck units are a mathematical convenience, not a physical limit. For instance, the Planck mass is on the order of the mass of an eyelash or a grain of sand.
by itishappy
2/19/2026 at 3:59:44 AM
Planck units are physical limits. The Planck mass is the limit of the mass of an elementary particle before it would form a black hole.
by lisper
2/19/2026 at 4:08:42 AM
"Plank units are not physical limits on reality itself" is what I should have said. We can obviously have larger or smaller masses.The plank time is a limit on a measurement process, not the smallest unit of time.
by itishappy
2/19/2026 at 4:26:20 AM
> Planck units are not physical limits on reality itself
We don't actually know that. They might be. Planck units are what happens when GR meets QM, and we just don't know yet what happens there.
But as a heuristic, they probably put pretty good bounds on what we can reasonably expect to be technologically achievable before humans go extinct.
by lisper
2/19/2026 at 2:54:08 PM
Nope. What you say is a myth. The Planck mass is just the square root of the product of the natural units of angular momentum and velocity (i.e. ħ and c) divided by the Newtonian constant of gravitation.
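In symbols (my rendering of that definition, with the standard numerical value filled in):

    m_P = \sqrt{\frac{\hbar c}{G}} \approx 2.18 \times 10^{-8}\ \text{kg} \approx 22\ \mu\text{g}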
This Planck mass expresses a constant related to converting the Newtonian constant of gravitation from the conventional system of units into a natural system of units, which is why it appears, instead of the classic Newtonian constant, inside the much more complex expression that computes the Chandrasekhar limit for white dwarfs.
The Planck mass has absolutely no physical meaning (other than expressing, in a different system of units, a constant equivalent to the Newtonian constant of gravitation), unlike some other true universal constants, like the fine-structure constant (Sommerfeld's constant), which is the ratio between the speed of an electron revolving around a nucleus of infinite mass in the state with the lowest total energy and the speed of light (i.e. that electron speed measured in natural units). The fine-structure constant is a measure of the intensity of the electromagnetic interaction, just as the Planck mass or the Newtonian constant of gravitation is a measure of the intensity of the gravitational interaction.
The so-called "Planck units" have weird values because they are derived from the Newtonian constant of gravitation, which is extremely small. Planck proposed them in 1899, immediately after computing for the first time what is now called Planck's constant.
He realized that Planck's constant provides an additional value that would be suitable for a system of natural fundamental units, but his proposal was a complete failure because he did not understand the requirements for a system of fundamental units. He started from the proposals made by Maxwell a quarter of a century earlier, but of the two alternatives Maxwell proposed for defining a unit of mass, Planck chose the bad one: using the Newtonian constant of gravitation.
Any system of fundamental units where the Newtonian constant of gravitation is chosen by convention, instead of being measured, is impossible to use in practice. The reason is that this constant can be measured only with great uncertainties. Saying by law that it has a certain value does not make the uncertainties disappear, but it moves them into the values of almost all other physical quantities. In the Planck system of units, no absolute value is known with a precision good enough for modern technology. The only accurate values are relative, i.e. the ratios between 2 physical quantities of the same kind.
The Planck system of units is only good for showing how a system of fundamental units MUST NOT be defined.
Because the Planck units of length and time happen by chance to be very small, beyond the range of any experiment that has ever been done in the most powerful accelerators, absolutely nobody knows what would happen if a physical system could be that small, so claims that some particle could be that small and would collapse into a black hole are more ridiculous than claiming to have seen the Loch Ness Monster.
The Einsteinian theory of gravitation is based on averaging the distribution of matter, so we can be pretty sure that it cannot be valid in the same form at the elementary-particle level, where you must deal with instantaneous particle positions, not with their mass averaged over a large region of empty space.
It became possible to use Planck's constant in a system of fundamental units only much later than 1899, i.e. after 1961, when the quantization of magnetic flux was measured experimentally. The next year, 1962, an even better method was discovered with the prediction of the Josephson effect. The Josephson effect would have been sufficient to make the standard kilogram unnecessary, but metrology was further simplified by the discovery of the von Klitzing effect in 1980. Although this would have been possible much earlier, only since 2019 has the legal system of fundamental units depended on Planck's constant, and in a good way, not the one proposed by Planck.
by adrian_b
2/18/2026 at 11:01:42 PM
The nanosecond is a natural unit for processors operating around a GHz, as it's roughly the time of a clock cycle. If a CPU takes 4 cycles to generate a UUID and the CPU runs at 4 GHz, it churns out one every nanosecond.
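A quick sanity check of that rate (illustrative numbers only, assuming the 4-cycle figure above):

    clock_hz = 4e9           # 4 GHz clock
    cycles_per_uuid = 4      # assumed cycles per UUID
    uuids_per_second = clock_hz / cycles_per_uuid
    print(uuids_per_second)        # 1e9 UUIDs per second
    print(1e9 / uuids_per_second)  # 1.0 ns per UUID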
by magicalhippo
2/18/2026 at 8:14:15 PM
If we think of the many worlds interpretation, how many universes will we be making every time we assign a CCUID to something?
by rbanffy
2/18/2026 at 9:07:38 PM
> many worlds interpretation
These are only namespaces. Many worlds can have all the same (many) random numbers and they will never conflict with each other!
by petcat
2/18/2026 at 9:31:41 PM
In that interpretation the total number of worlds does not change.
by shiandow
2/18/2026 at 8:29:07 PM
We don't "make" universes in the MWI. The universal wavefunction evolves to include all reachable quantum states. It's deterministic, because it encompasses all allowed possibilities.by antonvs
2/18/2026 at 8:39:44 PM
Humpf… You just had to collapse my wave function here…
by rbanffy
2/19/2026 at 5:19:26 PM
That's Copenhagen, not MWI! :P
by antonvs
2/18/2026 at 9:35:10 PM
Protons (and mass and energy) could also potentially be created. If this happens, the heat death could be avoided. Conservation of mass and energy is an empirical observation; there is no theoretical basis for it. We just don't know any process we can implement that violates it, but that doesn't mean it doesn't exist.
by dheera
2/19/2026 at 7:22:34 AM
All of physics is „just“ based on empirical observation. It’s still a pretty good tool for prediction.
by adrianN
2/18/2026 at 11:03:06 PM
Conservation laws result from continuous symmetries in the laws of physics, as proven by Noether's theorem.
by dinosaurdynasty
2/18/2026 at 11:45:17 PM
Time translation symmetry implies energy conservation, but time translation symmetry is only an empirical observation on a local scale and has not been shown to be true on a global universe scale.
by dheera
2/18/2026 at 8:18:41 PM
Proton decay is hypothetical.
by scotty79
2/18/2026 at 9:19:19 PM
So is the need for cosmologically unique IDs. We're having fun.
by hamdingers
2/18/2026 at 8:16:46 PM
I got a big laugh at the “only” part of that. I do have a sincere question about that number though: isn’t time relative? How would we know that number to be true or consistent? My incredibly naive assumption would be that with less matter time moves faster, sort of accelerating; so, as matter “evaporates” the process accelerates and converges on that number (or close to it)?
by rubyn00bie
2/18/2026 at 8:43:48 PM
Times for things like "age of the universe" are usually given as "cosmic time" for this reason. If it's about a specific object (e.g. "how long until a day on Earth lasts 25 hours") it's usually given in "proper time" for that object. Other observers/reference frames may perceive time differently, but in the normal relativistic sense rather than an "it all needs to wind itself back up to be equal in the end" sense.
by zamadatix
2/18/2026 at 8:47:09 PM
The local reference frame (which is what matters for proton decay) doesn't see the outside world moving slower or faster to any significant degree, no matter how much mass is around it, until you start adding a lot of mass very close by.
by idiotsecant
2/19/2026 at 7:59:33 AM
Ah, but if we are considering near-infinitesimal probabilities, we should metagame and consider the very low probability that our understanding of cosmology is flawed and light cones aren’t actually a limiting factor on causal contact.
by jl6
2/19/2026 at 12:11:15 PM
Sorry, your laptop was produced before FTL was invented, so its MAC address is only recognized in the Milky Way sector.
by missingdays
2/19/2026 at 2:56:22 PM
If we allow FTL information exchange, don't we run into the possibility that the FTL-accessible universe is infinite, so unique IDs are fundamentally not possible? Physics doesn't really do much with this, because the observable universe is all that 'exists' in a Russell's Teapot sense.
by SkyBelow
2/18/2026 at 11:45:57 PM
This is the right critique. The whole article is a fun thought experiment, but it massively overestimates the problem by ignoring causality. In practice, UUID collisions only matter within systems that actually talk to each other, and those systems are bounded by light cones. 128 bits is already overkill for anything humans will build in the next thousand years; 256 bits is overkill for anything that could physically exist in this universe.
by fdefitte
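To put rough numbers on that claim (my own illustrative figures, using the same birthday approximation as in the sketch near the top of the thread): even if 10^30 IDs were generated within a single mutually communicating system, a 256-bit space gives

    p \approx \frac{n^{2}}{2^{b+1}} = \frac{(10^{30})^{2}}{2^{257}} \approx \frac{10^{60}}{2.3 \times 10^{77}} \approx 4 \times 10^{-18}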
2/18/2026 at 9:49:37 PM
Would this take into account IDs generated by objects moving at relativistic speeds? It would be a right pain to travel for a year to another planet, arrive 10,000 years late, and have a bunch of ID collisions.
by RobotToaster
2/18/2026 at 9:57:00 PM
I have to confess I have not actually done the math.
by lisper
2/18/2026 at 10:06:02 PM
Oh no! We should immediately commence work on a new UUID version that addresses this use case.
by 9dev
2/18/2026 at 8:58:13 PM
Maybe the definitions are shifting, but in my experience “on point” is typically an endorsement in the area of “really/precisely good” — so I think what you mean is “on topic” or similar. Pedantry ftw.
by svnt
2/18/2026 at 9:05:08 PM
:-)
by lisper
2/18/2026 at 10:06:14 PM
Hanson's Grabby Aliens actually fits really well here if you're looking for some math to base this off of.
by ctoth
2/19/2026 at 5:19:26 AM
The answer is 42. I have it from a good source!
by quijoteuniv