alt.hn

2/23/2026 at 12:04:27 PM

Dictionary Compression is finally here, and it's ridiculously good

https://httptoolkit.com/blog/dictionary-compression-performance-zstd-brotli/

by pimterry

2/23/2026 at 11:13:38 PM

Seems like this only helps large (heavy) websites with consistent content. The real-world examples are all large, like YouTube, Amazon, etc...

Small JSON responses that compress to <1k would fit in a single packet, so I don't see the advantage of going from "65 bytes with normal Zstandard compression, vs 28 bytes when using the past response as a dictionary - 57% smaller."
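The zstd shared-dictionary mechanism from the article isn't in Python's standard library, but zlib's DEFLATE supports the same preset-dictionary idea via `zdict`, which is enough to sketch the "past response as dictionary" effect on a small JSON response (the payloads below are hypothetical, not from the article):

```python
import json
import zlib

# Two hypothetical API responses that differ in a single field value.
old = json.dumps({"user": "alice", "status": "active", "unread": 3}).encode()
new = json.dumps({"user": "alice", "status": "active", "unread": 4}).encode()

# Baseline: compress the new response with no shared state.
plain = zlib.compressobj()
baseline = plain.compress(new) + plain.flush()

# Dictionary: seed the compressor's window with the previous response.
seeded = zlib.compressobj(zdict=old)
with_dict = seeded.compress(new) + seeded.flush()

print(len(new), len(baseline), len(with_dict))
```

The dictionary-seeded output is much smaller than the baseline, but as the comment above notes, if both fit in one packet the wire-level difference may not matter.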

by irq-1

2/24/2026 at 12:00:26 AM

Yes, I was under the same impression, but I think this LUT/dictionary solution is counterintuitive to both of our current understandings of the web.

The "aha" moment for me was that, without this dict, the user is always going to request a full download of the data. For instance, let's say the NYT publishes an article and you read it. Then an editor's note is added to the article. When you go back to read the article, the data transfer is minuscule. Now that is an edge case, but imagine a website that allows comments: Twitter, Reddit, small text-based pages that at first seem inconsequential until you think about how we use the web, with millions of users returning to pages over and over again.

For me, my mental model of this structure is a LUT (key/value pairs) wrapped in version control (a hash).

Now I think your comment is correct once we factor in how many requests the webpage is receiving and how frequently changes are happening to said webpage. My blog would receive no benefit from implementing this tech; using napkin math, my blog would need 1000 days to break even. Microsoft's blog, however... less than a day, in theory.
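One rough way to formalize that napkin math: break-even arrives once the cumulative savings from repeat views outweigh the one-time cost of shipping the dictionary. All numbers below are made up purely to illustrate the shape of the calculation:

```python
def days_to_break_even(dict_cost_bytes, saving_per_view_bytes, repeat_views_per_day):
    """Days until repeat-view savings repay the one-time dictionary download."""
    return dict_cost_bytes / (saving_per_view_bytes * repeat_views_per_day)

# A quiet blog: a sizable dictionary, almost no returning readers.
print(days_to_break_even(50_000, 25_000, 0.002))    # ~1000 days

# A high-traffic site: the same dictionary pays for itself almost instantly.
print(days_to_break_even(50_000, 25_000, 500_000))  # a tiny fraction of a day
```

The model ignores CDN caching, dictionary updates, and per-visitor state, but it captures why traffic volume dominates the break-even point.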

by gnotstic

2/24/2026 at 12:22:47 AM

If the version-control hash changes, you have to re-download the dictionary, which is similar to re-downloading the whole page.

Reddit/NYT would have to publish their changes without changing the dictionary, meaning some portions would be largely absent from the dictionary and would compress worse than with gzip. That's probably fine for the NYT, but something like Reddit might actually end up with worse ratios than gzip in that case.

by everforward

2/23/2026 at 11:30:12 PM

So this is just LZ with a pre-populated window? Any backreferencing compression can be used this way: just pre-populate the backreference history on both client and server up front and off you go. Why is this new?
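That intuition is easy to demonstrate with plain DEFLATE (nothing zstd-specific): zlib's `zdict` parameter pre-populates the backreference window, and it works only if both ends seed the identical bytes. The payloads here are hypothetical:

```python
import zlib

# Both sides agree on this history up front.
shared_window = b'{"user": "alice", "items": [], "theme": "dark"}'
message = b'{"user": "alice", "items": [1, 2], "theme": "dark"}'

# "Server": compress with the backreference history pre-seeded.
co = zlib.compressobj(zdict=shared_window)
wire = co.compress(message) + co.flush()

# "Client": must seed the identical bytes, or decompression fails.
dec = zlib.decompressobj(zdict=shared_window)
assert dec.decompress(wire) == message
```

The novelty the article describes is not this mechanism but the standardized negotiation of which dictionary both sides hold, so browsers can do it interoperably.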

by dmitrygr

2/24/2026 at 12:18:51 AM

Per the article, it's new to browsers, not to compression generally, due to the lack of standardization. The future is already here, just not evenly distributed.

by setr