alt.hn

2/18/2026 at 6:37:22 PM

Cosmologically Unique IDs

https://jasonfantl.com/posts/Universal-Unique-IDs/

by jfantl

2/18/2026 at 7:33:41 PM

This analysis is not quite fair. It takes locality (i.e. the speed of light) into account when designing UUID schemes, but not when computing the odds of a collision. Collisions only matter if the colliding UUIDs actually come into causal contact with each other after being generated, so just as you have to take locality into account when designing UUID trees, you also have to take it into account when computing the odds of an actual local collision. A naive application of the birthday paradox doesn't apply because it ignores locality, which means a fair calculation of the required size of a random UUID will come out a lot smaller than the ~800 bits the article arrives at. I haven't done the math, but I'd be surprised if the actual answer is more than 256 bits.
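A quick sketch of the bound being argued over, using the standard birthday approximation (the `bits_needed` helper and the 10^30 locality-bounded count are my assumptions; the 10^80-atoms case reproduces the 532-bit figure quoted from the article elsewhere in the thread):

```python
import math

def bits_needed(n_ids: float, p_collision: float) -> float:
    """ID width b such that n_ids random IDs collide with probability ~p.

    Birthday approximation: p ~ n^2 / 2^(b+1)  =>  b ~ 2*log2(n) - log2(2p).
    """
    return 2 * math.log2(n_ids) - math.log2(2 * p_collision)

# One ID per atom in the observable universe, 50% collision odds:
print(math.ceil(bits_needed(1e80, 0.5)))  # 532

# A (hypothetical) locality-bounded count of 10^30 IDs that ever
# come into causal contact needs far fewer bits:
print(math.ceil(bits_needed(1e30, 0.5)))  # 200
```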

(Gotta say here that I love HN. It's one of the very few places where a comment that geeky and pedantic can nonetheless be on point. :-)

by lisper

2/18/2026 at 11:27:21 PM

Reminds me of a time many years ago when I received a whole case of Intel NICs all with the same MAC address.

It was an interesting couple of days before we figured it out.

by k_roy

2/19/2026 at 11:54:38 AM

How does that happen? Was it an OEM bulk kind of deal where you were expected to write a new MAC for each NIC when deploying them?

by imglorp

2/19/2026 at 12:48:44 PM

There's a fun hypothesis I've read about somewhere, goes something like this:

As the universe expands, the gap between galaxies widens until they start "disappearing", since no information can travel between them anymore. Therefore, if we assume that intelligent lifeforms exist out there, it is likely that they will slowly converge on the place in the universe with the highest mass density for survival. IIRC we even know approximately where this is.

This means a sort of "grand meeting of alien advanced cultures" before the heat death. Which in turn also means that previously uncollided UUIDs may start to collide.

Those damned Vogons thrashing all our stats with their gazillion documents. Why do they have a UUID for each xml tag??

by exfalso

2/19/2026 at 3:09:40 PM

It is counterintuitive, but information can still travel between places that are so distant that the expansion between them is faster than the speed of light. It's just extremely slow (so I still vote for going to the party at the highest-density place).

We do see light from galaxies that are receding from us faster than c. At first, the photons headed in our direction are moving away from us, but as the universe expands over time, they eventually find themselves in a region of space that is no longer receding faster than c, and they start approaching.

by jobigoud

2/19/2026 at 7:07:28 PM

That's not exactly it. Light gets redshifted instead of slowing down, because light will be measured to be the same speed in all frames of reference. So even though we can't actually observe it yet, light traveling towards us still moves at c.

It's a different story entirely for matter. Causal and reachable are two different things.

Regardless, such extreme redshifting would make communication virtually impossible - but maybe the folks at Blargon 5 have that figured out.

by zamalek

2/19/2026 at 2:55:33 PM

I think I missed something: how do galaxies getting further away (divergence) imply that intelligent species will converge anywhere? It isn’t like one galaxy getting out of range of another on the other side of the universe is going to affect things in a meaningful way…

A galaxy has enough resources to be self-reliant, there’s no need for a species to escape one that is getting too far away from another one.

by zimzam

2/19/2026 at 3:13:09 PM

Social aspect. There is no need but it's more fun to spend the end of the Universe with other intelligences than each in its own place.

by jobigoud

2/19/2026 at 3:23:51 PM

You'll run out of resources eventually. Moving to the place with the most mass gives you the most time before you run out.

by wat10000

2/19/2026 at 5:02:14 PM

Yes that's the idea. The expansion simply means that the window of migration will close. Once it's closed, your galaxy is cut off and will run out of fuel sooner than the high-density area.

by exfalso

2/19/2026 at 5:19:02 PM

Well, eventually there are no galaxies, just a bunch of cosmic rays. Some clusters of matter will last longer.

I think for this to work, either life would have to be plentiful near the end, or you'd need FTL travel.

by paulddraper

2/19/2026 at 3:56:37 PM

I think I sense a strange Battle Royale type game…

by chamomeal

2/18/2026 at 8:08:02 PM

You must consider both time and locality.

From now until protons decay and matter no longer exists is only about 10^56 nanoseconds.
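For scale, the conversion is straightforward (a sketch; the 10^40-year proton-decay horizon is an assumed order of magnitude, not a measured value):

```python
import math

# Rough conversion of an assumed proton-decay horizon to nanoseconds.
SECONDS_PER_YEAR = 3.156e7   # ~365.25 days
NS_PER_SECOND = 1e9

years = 1e40                 # assumed order of magnitude; GUT estimates vary hugely
nanoseconds = years * SECONDS_PER_YEAR * NS_PER_SECOND
print(round(math.log10(nanoseconds), 1))  # 56.5 -> on the order of 10^56 ns
```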

by u1hcw9nx

2/18/2026 at 8:20:22 PM

If protons decay. There isn't really any reason to believe they're not stable.

by Sharlin

2/18/2026 at 8:35:40 PM

And recent DESI data suggests that dark energy is not constant and that the universe will experience a big crunch at a little more than double its current age, for a total lifespan of about 33 billion years. No need to get wild with the orders of magnitude on years into the future: the infinite expansion to heat death over 10^100 years is looking less likely, and 10^11 years should be plenty.

https://www.sciencedaily.com/releases/2026/02/260215225537.h...

by hnuser123456

2/19/2026 at 1:02:45 AM

Not obvious to me this makes things better as opposed to worse? Sure, the time bound helps, but in the runup to a crunch won't we get vastly more devices in causal range at an asymptotically increasing rate?

by disconcision

2/19/2026 at 7:24:14 AM

Who’s there doing the counting? I would assume the temperatures at those extremes won’t support life in its known forms.

Perhaps some Adamsesque (as in Douglas Adams) creature whose sole purpose is to collect all unique UUIDs and give them names.

by Towaway69

2/19/2026 at 10:37:19 AM

Runup to the crunch is a looong time, lots of which is probably very habitable. In 5 billion years, life can arise from scratch, become conscious, and exterminate itself.

by throwaway290

2/18/2026 at 8:41:36 PM

Protons can decay because the distinction between matter and energy isn't permanent.

Two quarks inside the proton interact via a massive messenger particle. This exchange flips their identity, turning the proton into a positron and a neutral pion. The pion then immediately converts into gamma rays.

Proton decayed!

by frikit

2/19/2026 at 3:49:55 AM

This destroys a baryon, an operation which is prohibited by the standard model.

by itishappy

2/19/2026 at 7:08:32 AM

Baryon number is an accidental symmetry, not a fundamental one. Unlike charge or color, it is not protected by a gauge principle and is just a consequence of the field content and renormalizability at low energies.

The standard model is almost certainly an effective field theory and a low-energy approximation of a more comprehensive framework. In any ultraviolet completion, such as a GUT, quarks and leptons inhabit the same multiplets. At these scales, the distinction between matter types blurs, and the heavy gauge bosons provide the exact mediation mechanism described to bypass the baryon barrier.

Furthermore, the existence of the universe is an empirical mandate for baryon-violation. If baryon number were a strict, immutable law, the Sakharov conditions could not be met, and the primordial matter-antimatter symmetry would have resulted in a total annihilation. Our existence is proof that baryon number is not conserved. Even within the current framework, non-perturbative effects like sphalerons demonstrate that the Standard Model vacuum itself does not strictly forbid the destruction of baryons.

by giraldorich

2/19/2026 at 2:06:49 PM

The sum of the conserved quantities, e.g. chromatic charge, electric charge and spin, is null for the set of 8 particles formed by the 3 u quarks, the 3 d quarks and the electron and the neutrino, i.e. for the components of a proton plus a neutron plus an electron plus a neutrino.

This is the only case of a null sum for these quantities, where no antiparticles are involved. The sum is also null for 2 particles, where one is the antiparticle of the other, allowing their generation or annihilation, and it is also null for the 4 particles that take part in any weak interaction, like the decay of a neutron into a proton, which involves a u quark, a d antiquark, an electron and an antineutrino, and this is what allows the transmutations between elementary particles that cannot happen just through generation and annihilation of particle-antiparticle pairs.

Thus generation and annihilation of groups of such 8 particles are not forbidden by the known laws. The Big Bang model is based on equal quantities of these 8 particles at the beginning, which is consistent with their simultaneous generation at the origin.

On the other hand, the annihilation of such a group of 8 particles, which would lead to the disappearance of some matter, appears as an extraordinarily improbable event.

For annihilation, all 8 particles would have to come simultaneously at a distance from each other much smaller than the diameter of an atomic nucleus, inside which quarks move at very high speeds, not much less than the speed of light, so they are never close to each other.

The probability of a proton colliding simultaneously with a neutron, with an electron and with a neutrino, while at the same time the 6 quarks composing the nucleons would also be gathering at the same internal spot seems so low that such an event is extremely unlikely to ever have happened in the entire Universe, since its beginning.

by adrian_b

2/18/2026 at 8:18:54 PM

That's such an odd way to use units. Why would you do 10^56 * 10^-9 seconds?

by Etheryte

2/18/2026 at 8:34:45 PM

This was my thought. Nanoseconds are an eternity. You want to be using Planck units for your worst-case analysis.

by lisper

2/18/2026 at 8:46:33 PM

If you go far beyond nanoseconds, energy becomes a limiting factor. You can only achieve ultra-fast processing if you dedicate vast amounts of matter to heat dissipation and energy generation. Think on a galactic scale: you cannot even have molecular reactions occurring at femtosecond or attosecond speeds constantly and everywhere without overheating everything.

by u1hcw9nx

2/18/2026 at 8:51:18 PM

Maybe. It's not clear whether these are fundamental limits or merely technological ones. Reversible (i.e. infinitely efficient) computing is theoretically possible.

by lisper

2/19/2026 at 2:24:13 PM

Reversible computing is not infinitely efficient, because irreversible operations, e.g. memory erasing, cannot be completely avoided.

However, the computing efficiency could be greatly increased by employing reversible operations whenever possible and there are chances that this will be done in the future, but the efficiency will remain far from infinite.

by adrian_b

2/18/2026 at 11:54:27 PM

If you have a black hole as an infinite heat sink this helps a great deal.

by UltraSane

2/19/2026 at 3:35:20 AM

Black holes have a maximum growth rate

by jquery

2/19/2026 at 5:55:42 AM

By infinite I mean a black hole gets COLDER as you add mass and energy to it.

by UltraSane

2/19/2026 at 3:56:35 AM

Planck units are a mathematical convenience, not a physical limit. For instance, the Planck mass is on the order of an eyelash or grain of sand.

by itishappy

2/19/2026 at 3:59:44 AM

Planck units are physical limits. The Planck mass is the limit of the mass of an elementary particle before it would form a black hole.

by lisper

2/19/2026 at 4:08:42 AM

"Planck units are not physical limits on reality itself" is what I should have said. We can obviously have larger or smaller masses.

The Planck time is a limit on a measurement process, not the smallest unit of time.

by itishappy

2/19/2026 at 4:26:20 AM

> Planck units are not physical limits on reality itself

We don't actually know that. They might be. Planck units are what happens when GR meets QM and we just don't know yet what happens there.

But as a heuristic, they probably put pretty good bounds on what we can reasonably expect to be technologically achievable before humans go extinct.

by lisper

2/19/2026 at 2:54:08 PM

Nope. What you say is a myth.

The Planck mass is just sqrt(ħc/G): the square root of the product of the natural units of angular momentum and velocity, divided by the Newtonian constant of gravitation.

This Planck mass expresses a constant related to converting the Newtonian constant of gravitation from the conventional system of units to a natural system of units, which is why it appears in place of the classic Newtonian constant inside a much more complex expression that computes the Chandrasekhar limit for black holes.

The Planck mass has absolutely no physical meaning (other than expressing, in a different system of units, a constant equivalent to the Newtonian constant of gravitation), unlike some other true universal constants, like the fine-structure constant (Sommerfeld's constant), which is the ratio between the speed of an electron revolving around a nucleus of infinite mass in the state with the lowest total energy and the speed of light (i.e. that electron speed measured in natural units). The fine-structure constant is a measure of the intensity of the electromagnetic interaction, just as the Planck mass or the Newtonian constant of gravitation are measures of the intensity of the gravitational interaction.

The so-called "Planck units" have weird values because they are derived from the Newtonian constant of gravitation, which is extremely small. Planck proposed them in 1899, immediately after computing for the first time what is now called Planck's constant.

He realized that Planck's constant provides an additional value that would be suitable for a system of natural fundamental units, but his proposal was a complete failure because he did not understand the requirements for a system of fundamental units. He started from the proposals made by Maxwell a quarter of a century earlier, but of the 2 alternatives Maxwell proposed for defining a unit of mass, Planck chose the bad one: using the Newtonian constant of gravitation.

Any system of fundamental units where the Newtonian constant of gravitation is chosen by convention, instead of being measured, is impossible to use in practice. The reason is that this constant can be measured only with great uncertainties. Saying by law that it has a certain value does not make the uncertainties disappear, but it moves them into the values of almost all other physical quantities. In the Planck system of units, no absolute value is known with a precision good enough for modern technology. The only accurate values are relative, i.e. the ratios between 2 physical quantities of the same kind.

The Planck system of units is only good for showing how a system of fundamental units MUST NOT be defined.

Because the Planck units of length and time happen by chance to be very small, beyond the range of any experiments that have ever been done in the most powerful accelerators, absolutely nobody knows what would happen in a physical system that small, so claims that some particle could be that small and would collapse into a black hole are more ridiculous than claiming to have seen the Loch Ness Monster.

The Einsteinian theory of gravitation is based on averaging the distribution of matter, so we can be pretty sure that it cannot be valid in the same form at elementary particle level, where you must deal with instantaneous particle positions, not with their mass averaged over a great region of empty space.

It became possible to use Planck's constant in a system of fundamental units only much later than 1899, i.e. after 1961, when the quantization of magnetic flux was measured experimentally. However, the next year, in 1962, an even better method was discovered through the prediction of the Josephson effect. The Josephson effect would have been sufficient to make the standard kilogram unnecessary, but metrology was further simplified by the discovery of the von Klitzing effect in 1980. Although this would have been possible much earlier, it is only since 2019 that the legal system of fundamental units has depended on Planck's constant, but in a good way, not the one proposed by Planck.

by adrian_b

2/18/2026 at 11:01:42 PM

Nanoseconds is a natural unit for processors operating around a GHz, as it's roughly the time of a clock cycle.

If a CPU takes 4 cycles to generate a UUID and runs at 4 GHz, it churns out one every nanosecond.

by magicalhippo

2/18/2026 at 8:14:15 PM

If we think of the many worlds interpretation, how many universes will we be making every time we assign a CCUID to something?

by rbanffy

2/18/2026 at 9:07:38 PM

> many worlds interpretation

These are only namespaces. Many worlds can have all the same (many) random numbers and they will never conflict with each other!

by petcat

2/18/2026 at 9:31:41 PM

In that interpretation the total number of worlds does not change.

by shiandow

2/18/2026 at 8:29:07 PM

We don't "make" universes in the MWI. The universal wavefunction evolves to include all reachable quantum states. It's deterministic, because it encompasses all allowed possibilities.

by antonvs

2/18/2026 at 8:39:44 PM

Humpf…

You just had to collapse my wave function here…

by rbanffy

2/19/2026 at 5:19:26 PM

That's Copenhagen, not MWI! :P

by antonvs

2/18/2026 at 9:35:10 PM

Protons (and mass and energy) could also potentially be created. If this happens, the heat death could be avoided.

Conservation of mass and energy is an empirical observation; there is no theoretical basis for it. We just don't know any process we can implement that violates it, but that doesn't mean one doesn't exist.

by dheera

2/19/2026 at 7:22:34 AM

All of physics is „just“ based on empirical observation. It’s still a pretty good tool for prediction.

by adrianN

2/18/2026 at 11:03:06 PM

Conservation laws result from continuous symmetries in the laws of physics, as proven by Noether's theorem.

by dinosaurdynasty

2/18/2026 at 11:45:17 PM

Time translation symmetry implies energy conservation, but time translation symmetry is only an empirical observation on a local scale and has not been shown to be true on a global universe scale.

by dheera

2/18/2026 at 8:18:41 PM

Proton decay is hypothetical.

by scotty79

2/18/2026 at 9:19:19 PM

So is the need for cosmologically unique IDs. We're having fun.

by hamdingers

2/18/2026 at 8:16:46 PM

I got a big laugh at the "only" part of that. I do have a sincere question about that number though: isn't time relative? How would we know that number to be true or consistent? My incredibly naive assumption would be that with less matter, time moves faster, sort of accelerating; so as matter "evaporates" the process accelerates and converges on that number (or close to it)?

by rubyn00bie

2/18/2026 at 8:43:48 PM

Times for things like "age of the universe" are usually given as "cosmic time" for this reason. If it's about a specific object (e.g. "how long until a day on Earth lasts 25 hours") it's usually given in "proper time" for that object. Other observers/reference frames may perceive time differently, but in the normal relativistic sense rather than a "it all needs to wind itself back up to be equal in the end" sense.

by zamadatix

2/18/2026 at 8:47:09 PM

The local reference frame (which is what matters for proton decay) doesn't see the outside world moving slower or faster to any significant degree, regardless of how much mass is around, until you start adding a lot of mass very close by.

by idiotsecant

2/19/2026 at 7:59:33 AM

Ah but if we are considering near-infinitesimal probabilities, we should metagame and consider the very low probability that our understanding of cosmology is flawed and light cones aren’t actually a limiting factor on causal contact.

by jl6

2/19/2026 at 12:11:15 PM

Sorry, your laptop was produced before FTL was invented, so its MAC address is only recognized in the Milky Way sector.

by missingdays

2/19/2026 at 2:56:22 PM

If we allow FTL information exchange, don't we run into the possibility that the FTL-accessible universe is infinite, so unique IDs are fundamentally not possible? Physics doesn't really do much with this, because the observable universe is all that 'exists' in a Russell's Teapot sense.

by SkyBelow

2/18/2026 at 11:45:57 PM

This is the right critique. The whole article is a fun thought experiment but it massively overestimates the problem by ignoring causality. In practice, UUID collisions only matter within systems that actually talk to each other, and those systems are bounded by light cones. 128 bits is already overkill for anything humans will build in the next thousand years. 256 bits is overkill for anything that could physically exist in this universe.

by fdefitte

2/18/2026 at 9:49:37 PM

Would this take into account IDs generated by objects moving at relativistic speeds? It would be a right pain to travel for a year to another planet, arrive 10,000 years late, and have a bunch of id collisions.

by RobotToaster

2/18/2026 at 9:57:00 PM

I have to confess I have not actually done the math.

by lisper

2/18/2026 at 10:06:02 PM

Oh no! We should immediately commence work on a new UUID version that addresses this use case.

by 9dev

2/18/2026 at 8:58:13 PM

Maybe the definitions are shifting, but in my experience “on point” is typically an endorsement in the area of “really/precisely good” — so I think what you mean is “on topic” or similar.

Pedantry ftw.

by svnt

2/18/2026 at 9:05:08 PM

:-)

by lisper

2/18/2026 at 10:06:14 PM

Hanson's Grabby Aliens actually fits really well here if you're looking for some math to base off of.

by ctoth

2/19/2026 at 5:19:26 AM

The answer is 42. I have it from a good source!

by quijoteuniv

2/18/2026 at 9:03:02 PM

A more realistic estimate of the total number of addressable things should take into account that for anything to be addressable, its address should be stored somewhere at least once.

If it takes at least Npb particles to store one bit of information, then the number of addressable things would decrease with the number of bits of the address.

So let's call Nthg the number of addressable things, and assume the average number of bits per address grows as Nb = f(Nthg).

Then the maximum number of addressable things is the number that satisfies Nthg = Np/(Npb*f(Nthg)), where Np is the total number of particles.
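A numerical sketch of this fixed point (the 10^80 particle count, Npb = 1, and the log2 address-length function f are all illustrative assumptions, not from the comment):

```python
import math

def solve_addressable(n_particles: float, particles_per_bit: float,
                      bits_per_address=lambda n: math.log2(n)) -> float:
    """Fixed-point iteration for Nthg = Np / (Npb * f(Nthg))."""
    n = n_particles  # start from the no-overhead upper bound
    for _ in range(100):
        n_next = n_particles / (particles_per_bit * bits_per_address(n))
        if abs(n_next - n) < 1e-9 * n:  # converged to relative tolerance
            break
        n = n_next
    return n

# With ~10^80 particles, 1 particle per stored bit, and log2-length
# addresses, storage overhead costs a bit over two orders of magnitude:
print(f"{solve_addressable(1e80, 1.0):.2e}")
```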

by m4nu3l

2/19/2026 at 3:36:19 AM

Heh. I once had to make an argument that 256 bit randomly assigned identifiers are good enough without explicit collision checking. People wanted me to add complex and expensive collision checks.

My argument was that 2^256 actually approaches the number of atoms in the observable universe (within 1 to 3 orders of magnitude), and that collisions are so unlikely that we'll have millions of datacenter meltdowns first (all assuming we have a good source of random numbers, of course). In the end I convinced everybody that even 128 bits are good enough, without any collision checking required.

I thought my argument was clever, but this is so much better. :)

by linuxhansl

2/19/2026 at 8:53:15 AM

If the mechanism for generating those 256 random bits is distributed and untrusted parties generate IDs, then you need collision detection, because they may be malicious.

If it's not distributed, you can just use a counter.

If it's distributed but coordinated by a single party (say, it's all your servers), you can shard incremented counters: every server is assigned a region of IDs.

by nextaccountic

2/19/2026 at 4:36:30 AM

Nah, it's much easier than that.

The total amount of computer data across all of humanity is less than 1 yottabyte. We're expected to reach 1 yottabyte within the next decade, probably before 2030. That's all data, everywhere, including nation-states.

The birthday paradox says that you'll reach a 50% chance of at least one collision (as a conservative first order approximation) at the square root of the domain size. sqrt(2^256) is 2^128.

Now, a 256 bit identifier takes up 32 bytes of storage. 2^128 * 32 bytes = 10^16 yottabytes. That's 10 quadrillion yottabytes just to store the keys. And it's even odds whether you'll have a collision or not.

And if the 50% number scares them, well, you'll have a 1% chance of a collision at around... 2^128 * 0.1. Yeah, so you don't reach a 1% over the whole life of the system until you get to a quadrillion yottabytes.

Because you're never getting anywhere near the square root of the size, the chances of any collision occurring are flatly astronomical.
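The arithmetic above checks out; a quick sketch (the 1% step uses the standard birthday approximation n ≈ sqrt(2N·ln(1/(1−p))), an assumption on my part, not from the comment):

```python
import math

YOTTABYTE = 1e24          # bytes
ID_BYTES = 32             # one 256-bit identifier

# ~50% collision odds near sqrt(2^256) = 2^128 IDs; storage for the keys alone:
n_50 = 2.0 ** 128
storage_yb = n_50 * ID_BYTES / YOTTABYTE
print(f"{storage_yb:.1e}")   # ~1.1e16 yottabytes

# For collision probability p, n ~ sqrt(2 * 2^256 * ln(1/(1-p))):
ratio = math.sqrt(2 * math.log(1 / 0.99))   # fraction of n_50 for p = 1%
print(round(ratio, 2))       # 0.14, i.e. roughly 2^128 * 0.1
```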

by da_chicken

2/19/2026 at 4:46:51 AM

> A more reasonable upper limit might be to assume that every atom in the observable universe will get one ID (we assume atoms won’t be assigned multiple IDs throughout time, which is a concession). There are an estimated atoms in the universe. Using the same equation as above, we find that we need 532 bits to avoid (probabilistically) a collision up to that point.

This doesn't take into account that you will inevitably want to assign unique IDs to various groups of atoms (e.g. this microchip, that car, etc.). And don't even get me started on assigning unique IDs to each subatomic particle.

by kmoser

2/19/2026 at 7:53:05 AM

> unique IDs to each subatomic particle

You only need one ID for each type of particle, since the laws of physics dictate that the particles themselves are indistinguishable from each other.

by wavemode

2/19/2026 at 3:34:04 PM

Just because a given particle is indistinguishable from another of the same type doesn't mean they are the same actual thing. If you are assigning IDs to each item in the universe for accounting/inventory purposes, you'll still want a separate ID for each particle.

by kmoser

2/19/2026 at 8:44:29 AM

Don't they have different x,y,z positions?

by Drakim

2/19/2026 at 5:16:22 PM

Yes, but the issue is that they don't have identity. The idea of assigning unique identifiers to particles is doomed because, basically, "there are no particles, there are only fields" (https://arxiv.org/abs/1204.4616).

Particles are how quantized fields present themselves when probed by localized interactions. In general, they're also observer-dependent.

The idea of assigning an "ID" to an object reflects a macro-level notion of re-identifiable objects persisting through time. But at the quantum level, that kind of classical individuality - object identity - doesn't exist.

by antonvs

2/19/2026 at 5:21:32 AM

> This doesn't take into account that you will inevitably want to assign unique IDs to various groups of atoms (e.g. this microchip, that car, etc.).

Sure it does. Those are not going to add up to a single extra bit.

by Dylan16807

2/19/2026 at 5:55:53 AM

Even every possible permutation of every single subatomic element in the universe? Even if we just consider atoms, at 10^80 atoms in the entire universe, there are (10^80)! possible permutations, which is many, many, many orders of magnitude larger.

And this isn't even counting sets that include multiples of the same item; once you get into that territory, there really is no upper bound.

by kmoser

2/19/2026 at 6:15:37 AM

New atoms form all the time, through fusion or fission. The latter is happening right now all around you: potassium in plants breaking down, radon gas that seeped up from the ground, even carbon itself. All of these have unstable isotopes with half-lives short enough to have at least a little activity near you.

Given that constant change to the available combinations of sets, it would seem that a truly capable system would need to be practically infinite, no?

by zdragnar

2/19/2026 at 10:22:47 AM

No, not every permutation. An atom gets to be in one larger object (recursively).

Definitely no multiples. What would that even mean? Also, you would need unbounded space for multiples of just two atoms.

by Dylan16807

2/19/2026 at 3:42:43 PM

> Definitely no multiples. What would that even mean, also you would need unbounded space for multiples of just two atoms.

I have a list and I want to assign a unique ID to each list item. Each list item itself contains one or more items:

  1. My umbrella [ID "a"] and my sunglasses [ID "b"]
  2. My umbrella ["a"], my sunglasses ["b"], and my umbrella ["a"]

List item 2 contains two references to my umbrella [ID "a"].

by kmoser

2/19/2026 at 7:12:11 PM

So in a physical sense of identifying things, that's nonsense. And again, if you allow that, then lists of just one or two atoms can take up infinite storage space. That's a super obvious reject; it makes no sense to even try to accommodate it in an ID system.

by Dylan16807

2/19/2026 at 5:16:13 AM

So UUIDv∞ will be at least 536 bit long?

And with group IDs, timestamp, etc. - 1024 bit long?

by nivertech

2/19/2026 at 5:14:48 AM

>And don't even get me started on assigning unique IDs to each subatomic particle.

If a neutrino oscillates between flavors, does it get 3 IDs? Or does it get a new ID with each oscillation?

Thankfully, we only need one electron ID at all.

by NoMoreNicksLeft

2/18/2026 at 7:16:15 PM

Great insights and visualisations!

I built a whole database around the idea of using the smallest plausible random identifiers, because that seems to be the only "golden disk" we have for universal communication, except for maybe some convergence property of latent spaces with large enough embodied foundation models.

It's weird that they are really underappreciated in the scientific data management and library science communities; many issues that currently require large organisations could have just been solved by better identifiers.

To me the ship of Theseus question is about extrinsic (random / named) identifiers vs. intrinsic (hash / embedding) identifiers.

https://triblespace.github.io/triblespace-rs/deep-dive/ident...

https://triblespace.github.io/triblespace-rs/deep-dive/tribl...

by j-pb

2/18/2026 at 10:43:45 PM

Entity identity can be intrinsic. Why not consistency contracts?

by ctoth

2/18/2026 at 7:13:58 PM

Just past page 281 of Becky Chambers's delightful "The Galaxy, and the Ground Within".

  Received Message
  Encryption: 0
  From: GC Transit Authority --- Gora System (path: 487-45411-479-4)
  To: Ooli Oht Ouloo (path: 5787-598-66)
  Subject: URGENT UPDATE

Man, I love this series.

Looks like this multispecies universe has a centrally-agreed-upon path addressing system.

by adityaathalye

2/18/2026 at 8:05:06 PM

You should check out Vernor Vinge's A Fire Upon The Deep for more fun examples of how intra-galactic communication would be labeled, with routes & such.

by pavel_lishin

2/19/2026 at 4:44:50 AM

In fact, it is right here in my stack of to-reads! Picking it up now due to your recommendation. Cheers!

by adityaathalye

2/18/2026 at 7:30:00 PM

From this book in particular, I love the scene with everyone sitting around talking about how horrifying the concept of cheese is. The rest of the quartet is wonderful, with the second book (A Closed and Common Orbit) being the MVP IMO.

by Octoth0rpe

2/19/2026 at 4:14:58 PM

Before replying, I waited for a moment to re-read those couple of pages once more. Cracked up again... Oh the utter incomprehension. And I can relate to the bit about eating the enzyme so that one can eat the cheese without getting sick. Cheese is horrifyingly great.

by adityaathalye

2/19/2026 at 12:53:56 PM

Use a deck of cards for representation: 52 digits, where 'K♠' for king-of-spades would be one character in Unicode. It isn't just cosmologically unique; it's easier to read, harder to edit manually, and easier for our pattern recognition to keep track of.

And the best feature: anyone can generate a random ID in this representation by getting a deck of cards and shuffling it properly. Playing cards are ubiquitous. I can see a camera "reading" a deck after it's been splayed on a table following a shuffle. This might even make for better random number seeds.

You're not sure if there is any demand for this sort of stuff? Look at dicekeys:

https://dicekeys.com/
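For what it's worth, the entropy claim checks out: a properly shuffled deck is one of 52! orderings, which carries more entropy than a standard 128-bit UUID. A quick check in plain Python:

```python
import math

# A uniformly shuffled 52-card deck is one of 52! equally likely orderings.
deck_entropy_bits = math.log2(math.factorial(52))
print(f"{deck_entropy_bits:.1f} bits")  # ~225.6 bits, well above a 128-bit UUID
```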

by notepad0x90

2/19/2026 at 1:54:03 PM

> shuffling it properly.

I think you glossed over the big weakness in the idea.

by nkrisc

2/18/2026 at 8:15:54 PM

I forget the context but the other day I also learned about Snowflake IDs [1] that are apparently used by Twitter, Discord, Instagram, and Mastodon.

Timestamp + random seems like a good tradeoff to reduce ID sizes while keeping reasonable characteristics; I'm surprised the article didn't explore that (then again, "timestamps" are a lot more nebulous at universal scale, I suppose). Just spitballing here, but I wonder if it would be worthwhile to reclaim ten bits of the Snowflake timestamp and use the low 32 bits for a random number. Four billion IDs for each second.

There's a Tom Scott video [2] that describes Youtube video IDs as 11-digit base-64 random numbers, but I don't see any official documentation about that. At the end he says how many IDs are available but I don't think he considers collisions via the birthday paradox.

[1]: https://en.wikipedia.org/wiki/Snowflake_ID

[2]: https://youtu.be/gocwRvLhDf8
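A sketch of the spitballed variant above; the layout (32-bit seconds timestamp plus 32 random bits) is the commenter's idea, not Twitter's actual Snowflake format (which uses a 41-bit millisecond timestamp, 10-bit machine ID, and 12-bit sequence):

```python
import secrets
import time

def spitball_id() -> int:
    """64-bit ID: high 32 bits = Unix-seconds timestamp, low 32 bits = random.

    Hypothetical layout from the comment above, not a real Snowflake.
    """
    ts = int(time.time()) & 0xFFFFFFFF  # wraps in 2106; fine for a sketch
    return (ts << 32) | secrets.randbits(32)
```

IDs generated in the same second share a prefix, so they still sort roughly by creation time; with 32 random bits you get up to about four billion distinct values per second before the birthday effect bites.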

by ekipan

2/18/2026 at 8:59:33 PM

> [1]: https://en.wikipedia.org/wiki/Snowflake_ID

Isn't this just the same scheme as version 1 UUID, except with half the bits? I guess they didn't want to dedicate 128 bits to their IDs.

by swiftcoder

2/18/2026 at 8:53:38 PM

Getting the entire universe to agree on a single clock for creating timestamps sounds absurdly difficult. Probably impossible?

by buzzerbetrayed

2/18/2026 at 9:54:45 PM

"Agreement" of time is probably nonsense, yeah. I realized that after posting, so I edited in the parenthetical; but as [3] notes, locality probably makes this less of a real issue.

Apparently with the birthday paradox 32 bit random IDs only allow some tens of thousands per second before collision chance passes 50%. Maybe that's acceptable?

[3]: https://news.ycombinator.com/item?id=47065241
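The tens-of-thousands figure matches the usual birthday approximation p ≈ 1 − exp(−n²/2N); a quick check for 32-bit IDs:

```python
import math

def collision_prob(n: int, bits: int) -> float:
    """Birthday approximation: chance of any collision among n random IDs."""
    return 1 - math.exp(-n * n / 2 ** (bits + 1))

# n where collision probability crosses 50%: sqrt(2 ln 2 * 2^bits)
n50 = math.isqrt(int(2 * math.log(2) * 2**32))
print(n50)                      # ~77,000 IDs
print(collision_prob(n50, 32))  # ~0.5
```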

by ekipan

2/19/2026 at 4:50:24 AM

You don't need the universe to agree. You need your ID system to agree within a reasonable margin of error.

by Zambyte

2/18/2026 at 10:58:04 PM

The temperature of the cosmic microwave background can be used as a universal clock.

by speakeron

2/18/2026 at 11:59:01 PM

So can neutron star spin rates

by UltraSane

2/18/2026 at 11:58:40 PM

Neutron star spins collectively can be used as a pretty accurate clock.

by UltraSane

2/18/2026 at 8:53:36 PM

For anyone else interested: that also looks like the widely used BSON ObjectIDs.

by drchickensalad

2/19/2026 at 12:49:22 PM

Is it possible to construct an ID using some kind of commonly observable phenomenon? Given how time and distance distinguish things, would such IDs always be unique? Like, only one person will ever simultaneously observe stars in certain positions, intensities, colors, etc. Similar to how I've heard some companies use lava lamps or other noisy processes to generate entropy.

I guess I'm wondering if there is a way to construct a universal coordinate frame for the whole universe? If so, then it's possible to trivially assign local time + x + y + z + salt to make unique IDs.

by program_whiz

2/18/2026 at 7:29:39 PM

Fun read.

One upside of the deterministic schemes is that they include provenance/lineage: you can literally "trace up" the path of the history back to the original ID giver.

Kinda has me curious about how much information is required to represent any arbitrary provenance tree/graph on a network of N-nodes/objects (entirely via the self-described ID)?

(Thinking in the comment: I guess in the worst case of a linear chain, if the full provenance should be recoverable from the ID, that scales as O(N × id_size), which is quite bad. But assuming the "best case" (any node is expected to be log(N) steps from the root, depth log(N)), it feels like global_id_size = log(N) × local_id_size is roughly the optimal limit, so the global_id size grows as log(N)². Would that mean the 399-bit number, with lineage, becomes a lower limit for global_id_size of roughly (400 bits)² ≈ 20 kB (because it carries the ordered-local-ID provenance information, rather than relying on local shared knowledge)?)

by bluecoconut

2/18/2026 at 8:13:12 PM

The ATProto underlying BlueSky social network is similar. It uses a content-addressed DAG.

Each “post” has a CID, which is a cryptographic hash of the data. To “prove” ownership of a post, a witness hash is sent along that can be verified all the way up the tree to the repo root hash, which is signed with the root key.

Neat way of having data say “here’s the data, and if you care to verify it, here’s an MST”.

by montyanne

2/18/2026 at 8:40:39 PM

Two ways to frame it:

Provenance is a DAG, so you get a partial order for free via topological sort, which can be extended to a compatible total order. Provenance for a node is then just its position in that order. This kind of mapping from objects to the first N consecutive naturals is also a minimal perfect hash function, which has n log n overhead. We can't navigate the tree to track ancestry, but equality implies identical ancestry.

Alternatively, we could track the whole history in somewhat more bits with a succinct encoding, 2N if it's a binary tree.

In practice, deterministic IDs usually accept a 2^-N collision risk to get log n.
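The topological-sort idea in miniature, using the standard library; the provenance graph here is made up:

```python
from graphlib import TopologicalSorter

# Hypothetical provenance DAG: each node maps to the nodes it derives from.
provenance = {
    "root": [],
    "a": ["root"],
    "b": ["root"],
    "c": ["a", "b"],  # merged lineage
}

# Any topological order is compatible with the provenance partial order,
# so a node's position in it works as a compact integer ID.
order = list(TopologicalSorter(provenance).static_order())
ids = {node: i for i, node in enumerate(order)}

# Every node's ID is greater than all of its ancestors' IDs.
assert all(ids[n] > ids[p] for n, parents in provenance.items() for p in parents)
```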

by AlotOfReading

2/18/2026 at 8:21:32 PM

From real life we know that people prefer to have multiple anonymous IDs, or self-selected handles; either makes fully deterministic generation schemes moot.

Also, network routing requires that objects have multiple addresses.

The physics side of the whole thing is funny too: AFAIK quantum particles require fungibility, i.e. by doxxing atoms you unavoidably change the behavior of the system.

by rini17

2/18/2026 at 9:15:42 PM

> From real life we know that people prefer to have multiple anonymous IDs

There's nothing stopping an entity from requesting multiple IDs from one of the "devices"!

by pavel_lishin

2/19/2026 at 8:20:06 AM

Random UUID selection is far superior because of lifespan: you can only have so many functioning devices at the same time, and contrary to tree-based UUIDs, once a device is decommissioned its UUID can be reclaimed. Practically, though, it would probably be a mixed algorithm, where positioning gives the ID root and the rest is selected randomly.

by moktonar

2/18/2026 at 11:26:32 PM

Specifying a CSPRNG as an entropy source to avoid collisions is incorrect.

CSPRNGs make prediction of the next number difficult (cracking-AES difficulty) but do not add entropy, and they must be seeded uniquely or they will output the same numbers. Unless the author is proposing having the same machine generate a single universe-scale list in one run.

Also, "banning" IDs that are all 1s or 0s is silly; they are just as valid and unique as any other number if you're generating them properly. Although I might suggest purchasing a lottery ticket if you get a UUID with all settable bits set to 1.
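The seeding point is easy to demonstrate; Python's non-cryptographic `random` is enough to show the failure mode, and `secrets` shows the OS-entropy alternative:

```python
import random
import secrets

# Two "independent" generators that accidentally share a seed emit
# identical "unique" IDs, forever.
a = random.Random(42)
b = random.Random(42)
assert a.getrandbits(128) == b.getrandbits(128)

# OS-entropy-backed generation needs no seed management; a repeat here
# would be a ~2^-128 event.
assert secrets.randbits(128) != secrets.randbits(128)
```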

by stonegray

2/18/2026 at 11:41:54 PM

Banning 0s might be to avoid conflicts with testing? Kind of like how you'd want to block logins with emails on the domain example.com. Idk, I'm grasping at straws.

by left-struck

2/19/2026 at 2:01:48 PM

It’s good to have some known-invalid identifiers. There are times when you want one that can’t possibly be valid. Having them be easily memorable and obviously invalid is good too.

Imagine if example.com were freely available for anyone to register; think of all the email they could get.

by nkrisc

2/19/2026 at 6:22:33 AM

One could take something like a cell and split it into genes, molecules, atoms, sub-atomic wave functions (with an infinite value range), and take time, which can be split into another infinity even within a finite interval. How does this analysis account for that?

I could split this object into 10^500 or 10^50^500^5000 parts, etc., with imagination being the limit.

These values, ID'd at whatever imaginable resolution, are far from practically useful, but at a cosmic scale there is no telling what a useful value is.

So this framework seems limiting because we define a resolution?

by cuttothechase

2/19/2026 at 4:36:08 AM

The Dewey section and the Elias omega encoding were fun, but they reminded me of Project Xanadu's tumblers[1]: a variable-length dotted notation where each segment is unbounded.

Tumblers are modeled using transfinite numbers which makes me wonder: what are the similarities and differences between transfinite numbers and Elias omega encoding? I'm not well versed in either, so I expect it's either a question from ignorance or I may have a lot to learn. :)

1. https://www.artandpopularculture.com/Tumbler_%28Project_Xana...
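One point of contrast: Elias omega itself is perfectly finite. It recursively prefixes each binary group with an encoding of that group's length. A minimal encoder, in case it helps ground the comparison:

```python
def elias_omega(n: int) -> str:
    """Elias omega code: repeatedly prepend n's binary form, then encode
    (length - 1) the same way, ending with a terminating '0'."""
    assert n >= 1
    code = "0"  # terminating zero
    while n > 1:
        group = bin(n)[2:]  # binary representation of n
        code = group + code
        n = len(group) - 1  # next round encodes the group's length
    return code

print(elias_omega(1))    # '0'
print(elias_omega(100))  # '1011011001000'
```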

by kelseyfrog

2/19/2026 at 12:31:40 AM

The practical punchline buried in this analysis: at human scale, the real tradeoff isn't uniqueness vs collision risk — it's uniqueness vs legibility.

Pure random IDs are theoretically optimal but operationally hostile. When something breaks at 3am, you want the ID to whisper where it came from. That's why ULIDs embed timestamps, DNS is hierarchical, and git hashes are content-addressed. You're trading a negligible increase in collision probability for debuggability.

The article's proof that deterministic schemes can't beat linear worst-case growth is elegant. But at any scale where humans are in the loop, it doesn't matter — you're generating maybe 10^12 IDs total, not 10^120. The interesting design space is the middle: enough randomness for uniqueness, enough structure for humans.

by hifathom

2/18/2026 at 11:16:18 PM

Chiming in from the decentralized world: there's an adversarial/cooperative dynamic in the assignment of these IDs, and in the selection of parents, that isn't discussed in the original. I think you could possibly get sub-linear by allowing a small number of cooperative nodes to assign new IDs.

On the contrary, having the right to assign IDs is powerful; on balance, to my mind the right thing to do is some sort of a ZK verifiable random function, e.g. sunspot-based transformations combined with some proof of ‘fair’ random choice. In that case, I think the 800 bit number seems like plenty. You could also do some sort of epoch-based variable length, where for the next billion years or so, we use 1/256 of the ID space, (forced first bit to 0), and so on.

by vessenes

2/19/2026 at 3:59:43 PM

Hmm. There might be 10^80 atoms in the universe, but there are 2^(10^80) possible combinations of them, far more than 2^800.

by tenthirtyam

2/19/2026 at 12:05:56 AM

The obvious solution is a system like IP addresses. Every system has an address like universe.galaxy.region.system or whatever, and the system itself is subdivided in whatever way is logical for it.

That way you can route ships or data or whatever to a specific system in a logical way. Each system decides how to allocate its own addresses. Since most systems won't have anything or anyone there to care, something like NASA or a registrar would just allocate a block to the system and give large things like planets an address.

by MagicMoonlight

2/19/2026 at 3:05:23 AM

I’m going to vibe code an app that lets you register a computronium unique id (is that name taken?) I’ll corner the market.

I’m also going to devise a standard that arbitrarily breaks it into groups of hexadecimal digits of arbitrary length in the spirit of UUIDs, and reserve a prefix space for Planck-unit timestamps (computronium-ID-7) so that you can lexicographically sort your COMPID7s.

Man I got to get out in front of this.

by efitz

2/18/2026 at 7:58:37 PM

It is interesting how much of our infrastructure relies on the assumption that 'close enough' is actually 'good enough' for uniqueness. When we move from UUIDs to things like ULIDs or Snowflake IDs, we are really just trading off coordination cost for a slightly higher collision risk that we will likely never hit in several lifetimes. Thinking about it on a 'cosmological' scale makes you realize how much of a luxury local generation is without needing a central authority. It is that tiny bit of entropy that keeps the whole distributed system from grinding to a halt.

by alex_tech92

2/18/2026 at 11:29:46 PM

>the assumption that 'close enough' is actually 'good enough' for uniqueness

i'm pretty sure it's "far enough" that makes it "good enough"

by fsckboy

2/18/2026 at 8:00:49 PM

> In order to fix this, we might start sending out satellites in every direction

Minor correction: Satellites don't go in every direction; they orbit. Probes or spaceships are more appropriate terms.

by factotvm

2/18/2026 at 8:07:57 PM

Maybe they meant at every inclination. ;)

by fluoridation

2/18/2026 at 7:27:09 PM

Quite off-topic, but: I've found UUIDs to be overused in many cases. People then abuse them to store data, making them effectively "speaking IDs" or "multi-column indices".

by ktpsns

2/18/2026 at 7:54:39 PM

Unless it's a key that needs to be sortable (e.g. insertion order) or a metric/descriptor of some kind, I'm not sure why UUID would be overused or inappropriate for use.

by jmole

2/19/2026 at 3:22:44 AM

Random UUIDs are not compressible. They are also frequently stored as 38-character strings.

by efitz

2/18/2026 at 9:54:50 PM

I really love everything related to cosmology, but I always struggle with two contrary concepts that lead to a paradox (for me):

- Infinity: in school, we learn our universe is infinite.

- We often do calculations with an upper limit like this one: 10^240. This is a big number, buttttt it's not infinite, you know. 10^240+1, 10^240+2...

So:

1. If it's infinite, why do upper-limit calculations?

2. If it's limited, what is there outside that limit?

Extremely paradoxical

by QuiCasseRien

2/19/2026 at 11:44:55 AM

People say the universe is "infinite" because spacetime's curvature is, as far as we can tell, flat, and so it should continue in all directions without ever wrapping back on itself (unlike, say, the Earth, which has spherical curvature).

But practically it's finite because we are only in causal contact with things up to 13.7b ly from us, and given space appears to be expanding at an accelerating rate, we probably will never get into causal contact with (almost all of) the part of the infinite universe outside of our light cone, even though things ought to exist over the "horizon". So only a tiny infinitesimal sliver of the infinite universe is knowable by us.

by brainwad

2/18/2026 at 7:56:42 PM

I'd propose using our current view of physical reality to own a subset of the UUID space, plus a version field in case new physics is discovered.

10-20 bits: version/epoch

10-20 bits: cosmic region

40 bits: galaxy ID

40 bits: stellar/planetary address

64 bits: local timestamp

This avoids the potentially pathological long chain of provenance, and also encodes coordinates into it.

Every billion years or so it probably makes sense to re-partition.
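A sketch of packing such a layout into a single integer; the field names follow the proposal, the widths are pinned to its upper bounds, and every value below is a placeholder:

```python
import time

# (name, bit width): the widest variants of the proposed layout, 184 bits total.
FIELDS = [("epoch", 20), ("region", 20), ("galaxy", 40),
          ("stellar", 40), ("timestamp", 64)]

def pack(values: dict) -> int:
    uid = 0
    for name, width in FIELDS:
        v = values[name]
        assert 0 <= v < 1 << width, f"{name} overflows {width} bits"
        uid = (uid << width) | v
    return uid

uid = pack({
    "epoch": 1,       # current-physics epoch (placeholder)
    "region": 7,      # hypothetical cosmic-region code
    "galaxy": 42,     # hypothetical Milky Way registry number
    "stellar": 3,     # hypothetical address for Earth
    "timestamp": time.time_ns() & (2**64 - 1),
})
print(f"{uid:#x}")  # one 184-bit ID
```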

by manofmanysmiles

2/18/2026 at 8:16:28 PM

As for coordinates, don’t forget galaxies are clouds of stars flowing around and interacting with each other.

by rbanffy

2/18/2026 at 8:33:31 PM

That's the problem with address-type systems: they expect the object at a location to always be at that location. How do you encode the orbital speed and orbital radius, not just for the object, but for the object it orbits, which needs the same info since it is also in motion, and then that object's parent galaxy's motion? Ugh, now I need a nap to calm down a bit.

by dylan604

2/18/2026 at 8:35:44 PM

You could estimate when the object was labelled by the coordinates used.

But where is the Greenwich meridian for the Milky Way?

by rbanffy

2/18/2026 at 11:25:09 PM

  offset length
  00     04:    Version + Flags
  04     08:    Timestamp (uint64)
  12     16:    Node/Agent Hash
  28     16:    Namespace Hash
  44     32:    Random Entropy
  76     20:    Extra / Extension
  96     32:    Integrity Hash
Total: 128 bytes
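That layout maps cleanly onto a fixed-size struct. A sketch with made-up field contents (the hash choices and node names here are placeholders, not part of the proposal):

```python
import hashlib
import secrets
import struct
import time

def make_id(version: int = 1) -> bytes:
    """Pack the 128-byte layout above: 96 bytes of fields, then a
    32-byte integrity hash over them."""
    body = struct.pack(
        ">I Q 16s 16s 32s 20s",                  # 4+8+16+16+32+20 = 96 bytes
        version,                                 # version + flags
        time.time_ns(),                          # timestamp (uint64)
        hashlib.md5(b"node-0").digest(),         # node/agent hash (placeholder)
        hashlib.md5(b"example-ns").digest(),     # namespace hash (placeholder)
        secrets.token_bytes(32),                 # random entropy
        b"\x00" * 20,                            # extra / extension
    )
    return body + hashlib.sha256(body).digest()  # integrity hash

assert len(make_id()) == 128
```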

by skvmb

2/18/2026 at 8:43:06 PM

We will probably end up with something like each planet having its own local addressing, with the big router in the sky doing NAT; each solar system has a router, and so on.

by small_model

2/19/2026 at 1:48:10 AM

800 bits is an incomprehensible number of possibilities…yet tiny in comparison to the number of .gifs that could be drawn.

by WhitneyLand

2/19/2026 at 12:27:48 AM

but can you have an id for every id?

by dietsche

2/19/2026 at 11:11:34 AM

"372 bits for 1-gram nanobots"... smh, this is why people call us nerds

by let_tim_cook_

2/18/2026 at 9:30:14 PM

Note that they almost immediately contract from 'the universe' to 'the visible universe', which isn't the same thing at all.

by philipwhiuk

2/18/2026 at 11:26:31 PM

It's the observable universe, and that's the only thing that matters. Events outside the observable universe are causally disconnected. We will never interact with anything outside the observable universe. For all practical purposes, it's the same thing.

by mr_mitm

2/18/2026 at 11:02:14 PM

I was going to read this, but it starts with an AI slop header image for no purpose, so I intuited that the article was similarly ill constructed.

by eudamoniac

2/19/2026 at 12:16:08 PM

[dead]

by kittbuilds

2/19/2026 at 12:34:51 AM

[dead]

by indiekitai

2/19/2026 at 3:21:24 AM

It seems like it would be a killer for any sharding system.

by efitz

2/19/2026 at 3:24:56 AM

Why? Sharding on lower bits seems fine to me.

by linolevan

2/19/2026 at 2:20:09 AM

[dead]

by kittbuilds

2/18/2026 at 9:49:12 PM

Another blow to the "all electrons are the same electron" theory. Why have only 1 electron with so many possible ids /s

by dvh

2/19/2026 at 2:00:40 AM

xhxxhxhxhxhxhxhx

by qewartysuc

2/18/2026 at 8:47:26 PM

The best way to solve this is not to solve it at all, and just give up on the idea of identification.

If you have an infinite multiverse of infinite universes, and perhaps layers on that, with different physics, etc., you can’t have identity outside of all existence.

In Judaism, one/the name of God is translated as “I am”. I believe this is because God’s existence is all, transcending whatever concepts you have of existence or of IDs. That ID is the only ID.

So, the cosmic solution to IDs is the name of God.

by frikit

2/18/2026 at 8:52:56 PM

Which name of God, though? There are hundreds, and we're right back at the same place of struggling to come up with a unique identifier.

by mock-possum

2/18/2026 at 8:56:45 PM

gotta be careful:

https://hex.ooo/library/nine_billion_names_of_god.html

by roywiggins

2/18/2026 at 11:42:46 PM

So we need to be careful. We do not know what happens after we assign all UUIDs.

by ccozan

2/19/2026 at 7:22:08 PM

The Nine Quintillion UUIDs of God

by roywiggins