5/10/2026 at 9:47:40 AM
This is a huge achievement for Debian and the free software world. It took a while, though, until this was understood. In 2007, when I pointed out on debian-devel that this was needed, I was still told what a huge waste of time it would be. And indeed it took a huge amount of work by many people to get there, but it is well worth it.
by uecker
5/10/2026 at 10:09:37 AM
There was no bug or attack on Debian since 2007 that reproducible packages would prevent. "Well worth it" is not correct. And it just raises the contribution barrier to Debian higher. I have already heard a lot of people complaining that contributing to Debian is hard, and while in the past I defended it with "they need all the checks and balances to make sure packages play with each other nicely", this is a step that makes it hard for no reason and little benefit.
by PunchyHamster
5/10/2026 at 10:34:25 AM
“If you are wondering why we are doing this at all, then hopefully the Reproducible Builds website will explain why this is useful.”

https://reproducible-builds.org/
Could you perhaps respond to the argumentation here?
by savolai
5/10/2026 at 12:43:52 PM
(Not OP, but...) I still fail to see the current value in confirming that a reproducing builder also included the same compromised dependency that I did when I built it. I understand that reproducible builds guard against dynamic attacks within build infrastructure. However, I just don't see those happening. Compromised source dependencies are a 100x more common problem.
by dvogel
5/10/2026 at 2:24:47 PM
I agree that compromised source dependencies are the bigger problem, but that doesn't mean compromised build infrastructure isn't one. Just this last week, we had two Linux kernel LPEs that could have been leveraged to implement just such an attack, for example.

Another thing to consider is that Debian has quite a few derivatives who may also rebuild packages from source, so you have a multiplier there.
by ckastner
5/10/2026 at 2:00:37 PM
https://en.wikipedia.org/wiki/XZ_Utils_backdoor
by fsflover
5/10/2026 at 3:15:31 PM
That's an example of an attack reproducible packages do not protect from. Why are you linking it?
by PunchyHamster
5/10/2026 at 3:23:27 PM
A distro automatically verifying that installed packages are reproducible would protect the user?
by fsflover
5/10/2026 at 4:49:06 PM
No, it wouldn't. The xz-utils attacker compromised the source repository. The build pipeline portions were used to obscure the purpose of the exploit embedded in the source code repository.
by dvogel
5/11/2026 at 12:40:48 AM
You're wrong. It was both. The payload was embedded in a binary blob test file. The mechanism to pull it into the build was added to the release tarball only.

Here's the quote from the person who discovered it, in the initial public disclosure [1]:
After observing a few odd symptoms around liblzma (part of the xz package) on Debian sid installations over the last weeks (logins with ssh taking a lot of CPU, valgrind errors) I figured out the answer. The upstream xz repository and the xz tarballs have been backdoored. At first I thought this was a compromise of debian's package, but it turns out to be upstream. One portion of the backdoor is *solely in the distributed tarballs* and debian's import of the tarball ... it is also present in the tarballs for 5.6.0 and 5.6.1.
[1]: https://www.openwall.com/lists/oss-security/2024/03/29/4
by theteapot
5/11/2026 at 5:42:16 AM
You're mistaking a compromised build pipeline for a compromised source repo that only triggers in some build pipelines. You can do reproducible builds from compromised source tarballs; nothing about reproducible builds necessarily requires source control. Yes, if some people who built from source control compared their builds to the builds from the tarballs, it could detect the xz-utils compromise. However, I have yet to see a reproducible build project that includes such cross-build checks.
by dvogel
5/11/2026 at 6:27:42 AM
> Yes, if some people who built from source control compared their builds to the builds from the tarballs it could detect the xzutils compromise.

Good. Then we are on the same page.
by theteapot
5/11/2026 at 6:16:29 AM
Nowadays you would work in git, and then you would be able to easily detect any discrepancy between the upstream tarball and the upstream source imported via git. But yes, better support for securing more of the process is needed.
by uecker
5/11/2026 at 12:33:20 AM
In the xz-utils hack, the attacker slipped changes into the GitHub release tarball that were not present in the git commit history. The Debian maintainer built from the release tarball instead of just pulling from the git repo directly. He shouldn't have been doing that, but good luck convincing him not to use the workflow he's been using for the last X years (I tried). With reproducible builds, we can clone the git repo directly and confirm we get the same build.
by theteapot
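The tarball-vs-git discrepancy described in this subthread can be checked mechanically. Below is a minimal sketch of the idea, simulated with local directories so it is self-contained; with a real project the two trees would instead be populated via `tar -x` on the release tarball and `git archive <tag> | tar -x` on the tagged source.

```shell
set -eu

# Simulate the two trees: the tagged git source vs. the unpacked
# release tarball. File names here are illustrative only.
mkdir -p demo/git-tag demo/tarball
echo 'AC_INIT([xz])' > demo/git-tag/configure.ac
cp demo/git-tag/configure.ac demo/tarball/
# A file that exists only in the tarball, like the modified
# build-to-host.m4 in the actual xz attack:
echo 'payload loader' > demo/tarball/build-to-host.m4

# `diff -rq` reports any file present on only one side:
diff -rq demo/git-tag demo/tarball || true
```

Running this prints `Only in demo/tarball: build-to-host.m4` — exactly the sort of discrepancy that would have flagged the xz release tarball.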
5/10/2026 at 2:40:44 PM
https://news.ycombinator.com/item?id=48083768
by mmooss
5/10/2026 at 3:14:01 PM
I know why they are useful. I am arguing they are a waste of time for the effort involved.

Forcing devs to use hardware keys to sign commits/CI requests would be an actual security improvement, thwarting many supply chain attacks that only worked because the attacker got hold of developer credentials. Hardware keys at least have the option to make some operations require physically pressing the key, so there is a chance the developer will notice.

But thanks to reproducible builds, at least someone can... validate that the binary code of a vulnerable package can be reproduced. Very fucking useful.

I am not saying it is useless. I am saying it is one of the highest-hanging fruits on the security tree.
by PunchyHamster
5/10/2026 at 4:11:24 PM
A hardware key does not help if the developer's machine is compromised, as there is no chance to understand what is being signed anymore. Or do you think the hardware key will show all the source code on its little display before signing?

With reproducible builds, you do not need to trust that the system that built the binary was not compromised, because this would be detected immediately.

Source compromises are still an issue, but there is a much bigger chance that they are detected. Also, if there is a compromise, reproducible builds allow you to later track it to the source. For an infected binary it is much more difficult to understand how it got there and what else might be compromised.
by uecker
5/10/2026 at 5:33:48 PM
The way at least a YubiKey works is that you can set it up so that pressing a key is required for signing, so at least the "silently steal creds and send malicious code" case (which is the vast majority of compromises) gets handled.

> Also if there is a compromise, reproducible builds allow you to later track it to the source.

They do not. Git logs and build logs allow for that.

Reproducible builds only have value after the source. They protect build servers from being compromised (and then only if some other uncompromised environment is also running verification passes). If the bug is at the source, reproducible builds are exactly as valuable as writing the commit that was used for the build into the app's code/package metadata.
by PunchyHamster
5/10/2026 at 6:53:12 PM
If your compiler (or other tool or automatic build environment) is compromised and inserts a backdoor into the binary during building, the fact that you need to hold a key while signing is completely irrelevant.

Git logs and build logs do not help you at all if you cannot even determine that the compromised binary has any relation to the build log or the source you may want to look at. This is what reproducible builds give you. You are right that they do not protect against compromised sources.
by uecker
by uecker
5/10/2026 at 10:10:51 PM
> I know why they are useful. I am arguing they are waste of time for effort involved.

Not being reproducible is a bug. There is no reason for a build to not be reproducible, but somehow we let the built binaries become infested with timestamps, login names, and file system paths. We recently moved to reproducible builds at work, and discovered that our login names and local home directory paths were being shipped in every release. No one was very happy about leaking PII like that.

You may not consider it worth the effort, but you aren't the one putting in the effort, so I'm not sure why that matters to you. It is very much worth the effort to those people doing the work. Debian is a do-ocracy, and so the people doing the work get to make the decisions.
by rstuart4133
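Leaks like the ones described above are easy to check for once you know to look. A tiny sketch, using a simulated artifact rather than a real build (the path and file name are made up):

```shell
# Simulate a built binary with an embedded home-directory path,
# the kind of PII leak described above.
printf 'ELF\0/home/alice/build/app\0binary-data' > artifact.bin

# `strings` pulls printable runs out of a binary; grepping for
# home paths (or login names) is a quick smoke test for leaks.
strings artifact.bin | grep '/home/'
```

In a real setting you would run a check like this over every shipped artifact; tools like diffoscope from the Reproducible Builds project do a far more thorough job of explaining why two builds differ.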
5/11/2026 at 8:14:04 AM
Sure. The site appears to be a bunch of warm fuzzies that could apply to almost any other measure you take; nothing is specific to reproducible builds. As the original poster said, "There was no bug or attack on Debian since 2007 that reproducible packages would prevent". In fact, it could be argued that reproducible builds lead to a reduction in security, not an improvement: they give an attacker an exact fixed memory layout for all of the binaries, so if you develop something like a ROP exploit for a copy on your local system, you know that exploit will work on every other system as well, because the binary layout is identical. It's a perfect monoculture where everything is vulnerable to the same exploit. It seems to have been something created by geeks to impress other geeks, without any consideration for whether it has any actual benefit or not.
by pseudohadamard
5/11/2026 at 8:48:00 AM
This comment is misinformed. Non-deterministic builds would also result in one tarball redistributed to all distro users. The ROP exploits don't work because of ASLR.
by stabbles
5/12/2026 at 4:27:40 AM
ASLR makes ROP attacks harder, it doesn't stop them, as a great many successful attacks have demonstrated. Heck, bypassing ASLR is taught to students at MIT... can't find the direct link ATM, but here's a student assignment: https://csg.csail.mit.edu/6.S983/labs/aslr/
by pseudohadamard
5/13/2026 at 5:36:08 AM
This does not make your comment above less wrong.
by uecker
5/10/2026 at 11:02:46 AM
Reproducible builds are applicable not only as a response to ‘attacks’, a subject you seem to be bikeshedding, but for other reasons too.

Anyone having to maintain a code base or a distributed fleet of devices will gain from this decision, immensely, as their operational periods come and go.
Reproducible builds are about longevity as much as they are about security.
Please don’t make bold claims about ‘no reason and little benefit’ while demonstrating ignorance of this hard fact: reproducible builds should have been the norm, in computing, from the get-go.
by MomsAVoxell
5/10/2026 at 11:29:46 AM
Longevity is harmed, though. Your certs need to expire in a few years, and by then your toolchain may not be downloadable.
by bluGill
5/10/2026 at 2:37:03 PM
> your toolchain will not be downloadable.

Why not? Debian has a fantastic track record of providing old versions. For instance, here's the build tools from Debian 2.0 from 1998:
https://mirrors.accretive-networks.net/debian-archive/debian...
by yjftsjthsd-h
5/10/2026 at 12:13:03 PM
Those problems need to be solved as well.by MomsAVoxell
5/10/2026 at 12:16:22 PM
I don't think they do, actually. Longevity sounds good, but in reality anything that's old probably has critical security holes and so you shouldn't use it anyway.by bluGill
5/11/2026 at 2:53:38 PM
You probably don’t know how industrial computing works. Toolchains absolutely need to be maintained with some degree of longevity.

The whole world doesn’t march to your consumer-user beat. Sometimes it functions at industrial-user tempos, too.
by MomsAVoxell
5/10/2026 at 1:23:50 PM
A warning is sufficient. Old tech should continue to work, for preservation and archival reasons.by iamnothere
5/10/2026 at 10:10:53 PM
I've long ago realized that archival needs to be a separate task left to archivists and archive systems. If you take it into account when designing a live system, it's liable to seriously compromise your system design.

Say you're making a chat app: you wouldn't incorporate a delete feature, and you might be tempted to use some kind of blockchain to prove all messages were delivered without gaps. But if you ignore archival needs, you design something similar to IRC, which is much simpler.
by tardedmeme
5/11/2026 at 1:59:24 PM
That depends on your user base. If your group of target users includes professional, corporate, or governmental use, then you absolutely need to build in signing and archival for legal reasons. If your users include people who may only connect occasionally or with flaky connections, then you need a robust way to ensure that messages are held for delivery and that all messages are delivered in order. Basic chat (without delivery guarantees or archival) is already solved by IRC, long ago.
by iamnothere
5/10/2026 at 5:37:34 PM
Just archiving binary artifacts and source packages is enough. Reproducibility adds nothing here.
by PunchyHamster
5/11/2026 at 6:13:20 AM
It sure does: no need to keep the binaries around if they are reproducible.
by MomsAVoxell
5/11/2026 at 6:12:39 AM
You’re not thinking like an industrial user but rather as a consumer. Maybe you should extend your scope a little bit.
by MomsAVoxell
5/11/2026 at 1:07:36 PM
Industry is learning, often the hard way, that out-of-date software is only acceptable if the device is not connected to a network at all. Even government labs with a separate top-secret network that isn't supposed to be connected to anything else get hacked from the internet.

Not that you are wrong: industry keeps thinking they can make themselves immune, and so long-term reproducibility is useful, but I submit they are wrong.
by bluGill
5/11/2026 at 2:55:39 PM
Disclaimer: I work in the safety-critical/industrial sector of software. Literally none of your statements are applicable to that realm, sorry.

Rail operators have long been operating their air-gapped infrastructure with 99.999% safety results, literally not adhering to any of the policies you claim are endemic to the industry.
by MomsAVoxell
5/10/2026 at 5:36:48 PM
> Anyone having to maintain a code base or a distributed fleet of devices will gain from this decision, immensely, as their operational periods come and go.

Just baking in a build ID and commit is enough. What do you think reproducible builds add?

> Please don’t make bold claims about ‘no reason and little benefit’ while demonstrating ignorance of this hard fact: reproducible builds should have been the norm, in computing, from the get-go.

So far not a single person in the thread has given me a concrete example (as in: existing project, existing problem, no other solution can solve it). Just claiming it's better based on their feelings. Come on, be the first one.
by PunchyHamster
5/11/2026 at 6:15:54 AM
I already gave you an example; you dismissed it because you know better. But it is clear that you haven’t thought this through from the perspective of systems designers who have to deploy a base OS, with expected lifetimes of years, across a large fleet of devices.

Think industrial applications, such as rail and heavy industry transportation. We use reproducible builds here as part of a wider safety-critical protocol which guarantees that what we are running is what we expect to run - nothing more, nothing less.

Reproducible builds are certifiable. They can be relied on in environments where certification costs millions and takes years.

Think outside your consumer box for a minute.
by MomsAVoxell
5/10/2026 at 10:36:11 AM
Reproducible builds reduce the need for trusted parties. Have many organizations produce the binaries independently and post the artifacts.

Once n of m parties agree on the artifact hash, take that as the trusted build.

If every party arrives at a different hash, then we cannot build consensus.
by azkalam
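The n-of-m rule described above can be sketched in a few lines of shell. The builder names, hashes, and quorum below are made up for illustration:

```shell
# Three hypothetical builders report the hash they got for the same package.
cat > reports.txt <<'EOF'
builder-a 3a7bd3e2360a3d29eea436fcfb7e44c7
builder-b 3a7bd3e2360a3d29eea436fcfb7e44c7
builder-c deadbeef000000000000000000000000
EOF

QUORUM=2
# Find the most commonly reported hash and how many builders reported it.
top=$(awk '{print $2}' reports.txt | sort | uniq -c | sort -rn | head -n 1)
count=$(echo "$top" | awk '{print $1}')
hash=$(echo "$top" | awk '{print $2}')

if [ "$count" -ge "$QUORUM" ]; then
  echo "consensus on $hash ($count of 3 builders)"
else
  echo "no consensus; refusing artifact"
fi
```

With the sample data, two of the three builders agree, so the quorum of 2 is met and the matching hash is accepted; the disagreeing builder is outvoted.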
5/10/2026 at 12:25:41 PM
To move away from organizational dependence, there should be an installable project for Debian where I can dedicate some configurable small percentage of my compute, when idle, to reproducibly building Debian components, to make a robust verification system, starting with the most critical code.

Obviously, it would be a ton of work to make such a system resistant to gaming by malicious actors (see GNU Guix for useful efforts), but it would provide valuable diversity in architecture and (political or other) control.

It would be even cooler if we could have independent projects that could run on various distros and OSes, and build packages for any of them. Having packages for BSD verified on Linux and vice versa, with statistical logging (this code has been verified x times on y OSes), would be reassuring.
by sgc
5/10/2026 at 3:16:19 PM
I think that project is called Ubuntu.
by PunchyHamster
5/10/2026 at 5:21:14 PM
I don't know of anything Ubuntu is doing that is significantly beyond what Debian is doing in this regard, nor do they have a distributed reproduction system set up?
by sgc
5/10/2026 at 3:38:34 PM
Building Ubuntu does not produce identical binaries to Debian, so no, that's not what they're asking for.
by saghm
5/10/2026 at 1:15:04 PM
Is the "Jia Tan" XZ Utils compromise not a good example? That relied on code snuck into a release that was not in source.

(It was caught before being promoted into a stable Debian release, yes, but this sort of relied on a happy accident; too close for comfort.)
by benregenspan
5/10/2026 at 1:21:43 PM
The xz hack was still reproducible, because it was included in the distribution archive, which did not match the upstream source. Even then, it was so obfuscated it likely would have gone unnoticed; but nevertheless it only lived in the uploaded tarball and not in the repo. Reproducibility is a good thing, but the next step is build provenance.

Still, there are lots of good non-security benefits to reproducible builds too.
by chuckadams
5/10/2026 at 2:54:32 PM
The xz-utils compromise is a very good example... of why reproducible packages don't actually solve anything security-wise!

The backdoor relied first on a difference between building a package in a packaging environment versus building the package on your own. It also relied on the very common practice of checking unreviewable artifacts into the source tree (e.g., the configure script, malicious binary test artifacts).

Reproducible builds guarantee that two people can follow the same instructions and get the same, bit-identical outcome. They do nothing to guarantee that those instructions have not been compromised, and all of the great packaging security failures of my lifetime that I can think of have relied on those instructions being compromised (e.g., xz-utils, the Debian OpenSSL keygen issue).
by jcranmer
5/10/2026 at 3:52:41 PM
An attack would be far easier without reproducible packages. One could upload a compromised binary to Debian by becoming a Debian developer, blackmail a Debian developer into doing so, or compromise the computer of a Debian developer or the distribution.

At the time of the xz attack, the package was already reproducible.
by uecker
5/10/2026 at 6:15:39 PM
I'll give an analogy to email and spam. A lot of effort has been spent making sure that if an email is from x@example.com, it actually came from x@example.com, giving us things like SPF, DKIM, and DMARC. And it turns out that the most eager adopters of the newest technology are... the spammers themselves! Because they don't need to lie about their email address; they can have that be completely honest, and instead resort to other tricks to mislead users as to who they are (e.g., the display name, which most email clients blindly trust and happily display).

Similarly for package managers, the biggest issues are typo-squatting or maintainer credential compromise. And in neither case does the attacker have any incentive to take advantage of it in a way that breaks reproducibility; they can be completely honest about what they're doing. Even if I were an attacker who had compromised a maintainer's machine, I'd still probably go for compromising the source rather than the final artifact-generation process, simply because compromising the source makes the exploit live longer.

As xz shows, once you have a compromised maintainer, there's basically nothing you can do to fix it except have someone else discover the compromise and lock out the maintainer.
by jcranmer
5/10/2026 at 7:00:07 PM
Typo-squatting attacks are more of an issue for non-curated software collections, not so much for Debian. If you use npm or cargo or similar, then you have indeed far bigger worries. Compromising the source has the disadvantage, for the attacker, that it is much easier to detect. Again, if you always install the newest things from a non-curated collection, that may not matter much, but for something such as Debian this increases the probability of detection a lot. One can argue that xz shows that it is possible to hide things in the source, but it also shows how much effort was needed to do this. (And the xz package was reproducible, so compromising the Debian system and uploading a binary would have had a high risk of detection. That this was not done can therefore not serve as evidence that binary attacks are not an issue.)
by uecker
5/10/2026 at 3:18:32 PM
No, reproducible builds make such backdoors more difficult to sneak in, together with other checks.
by goodpoint
5/10/2026 at 10:46:10 AM
It makes shipping backdoors a whole lot harder, yes.
by eptcyka
5/10/2026 at 5:38:16 PM
Unless someone spins up entirely separate infrastructure dedicated just to verifying Debian packages, it doesn't.
by PunchyHamster
5/10/2026 at 11:50:54 AM
Hmm, it prevents Trojan binaries, which are a small subset of backdoors, IMHO.

Defense in depth obviously is a good thing.
by zaphirplane
5/10/2026 at 10:12:29 AM
There was perhaps no detected bug or attack. There have most likely been bugs or attacks that reproducible builds would have prevented.
by aborsy
5/10/2026 at 1:54:01 PM
> There have most likely been bugs or attacks that reproducible builds would have prevented.

Like what exactly?
by CyberDildonics
5/10/2026 at 10:17:26 AM
And you base that on what exactly? It's "just" making sure the build process is always ordered.

If anything, it will make an attacker's job easier, as the Ubuntu package will have the same files structured exactly the same way as the Debian one.
by PunchyHamster
5/10/2026 at 3:37:56 PM
> as Ubuntu package will have same files structured exactly same way as Debian one.

As opposed to what? If Ubuntu uses the same source, of course they get the same binaries. And if Ubuntu applies patches, they'll get something different. And that's still true.
by yjftsjthsd-h
5/10/2026 at 2:00:34 PM
"mimimimi". Those people do not care about quality in open source at all. For long-lived software this is very important.

Of course, all those JavaScript and Kubernetes packages which will be irrelevant in a few years again might complain, but let them complain.
by deknos
5/10/2026 at 2:08:50 PM
> There was no bug or attack on Debian since 2007 that reproducible packages would prevent.

I'm reading this as a suggestion that the reproducible builds effort was an ineffective deterrent.
However, note that your observation could also be explained by the opposite: the reproducible builds effort was an effective deterrent, so nobody bothered with attempts.
> And it just ups the the contribution barrier to Debian higher
Until yesterday, the package just got flagged in the tracker, and you could either ignore it, or fix it yourself, or the kind people behind the reproducible builds effort supplied a patch themselves.
Now, you can no longer ignore it. But fixes are often trivial: use a (stable) timestamp provided by the build, seed RNGs with some constant (instead of, e.g., the time), etc. These are best practices anyway.
by ckastner
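The "stable timestamp provided by the build" usually means honoring SOURCE_DATE_EPOCH, the convention defined by the Reproducible Builds project; in Debian it is derived from the latest debian/changelog entry. A minimal sketch (the epoch value is illustrative, and `date -d` is the GNU coreutils form):

```shell
# Derive the build timestamp from SOURCE_DATE_EPOCH instead of the
# wall clock, so every rebuild of the same source embeds the same date.
export SOURCE_DATE_EPOCH=1715299200

# Non-reproducible: build_date=$(date -u)
# Reproducible:
build_date=$(date -u -d "@$SOURCE_DATE_EPOCH" +%Y-%m-%dT%H:%M:%SZ)
echo "built at $build_date"
```

This prints `built at 2024-05-10T00:00:00Z` on every run, no matter when the build actually happens.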
5/10/2026 at 3:19:15 PM
> However, note that your observation could also be explained by the opposite: the reproducible builds effort was an effective deterrent, so nobody bothered with attempts.

There was no attack that reproducible builds would help protect from before 2007 either.
> Until yesterday, the package just got flagged in the tracker, and you could either ignore it, or fix it yourself, or the kind people behind the reproducible builds effort supplied a patch themselves.
> Now, you can no longer ignore it. But fixes are often trivial. Use a (stable) timestamp provided by the build, seed RNGs with some constant (instead of eg: time), etc.
That's the entirety of the problem: app developers don't want to be package experts or build experts.
> These are best practices anyway.
They are not. They are best practices if you want reproducible builds. They are an entirely useless waste of time if you don't care.
by PunchyHamster
5/10/2026 at 3:35:30 PM
> that's the entirety of the problem. App developers don't want to be package experts or build experts.

App developers and Debian package maintainers are already separate groups.
by yjftsjthsd-h
5/10/2026 at 4:15:55 PM
> They are not. They are best practices if you want reproducible builds.

Or if you're writing a test suite, and you want failing test results to be actionable.
Or you have any other type of behavior that you'd like to reproduce somehow.
One of the first things app developers ask for in bug/issue templates are the steps to reproduce something. I wonder why you'd think that they would suddenly be opposed to the concept when thinking of a build process.
by ckastner
5/10/2026 at 5:44:19 PM
> Or if you're writing a test suite, and you want failing test results to be actionable.

The class of bugs would be extremely small, as the stuff that makes a build hard to reproduce is 99% of the time irrelevant to runtime: some build time embedded in the binary, some file metadata having a different timestamp, or maybe the linker putting stuff in a slightly different order.
> One of the first things app developers ask for in bug/issue templates are the steps to reproduce something. I wonder why you'd think that they would suddenly be opposed to the concept when thinking of a build peocess.
I think you will find the number of people that had problems reproducing a bug because of a non-100%-exact build is vanishingly small, possibly non-existent.

And that is because if you get a package version and want to reproduce a bug, you get the package, install it, and try to reproduce it. The package WILL be 100% the same as the one in the bug report, because you both downloaded the same artifact from the same mirror network. You don't need reproducibility to get the same binary to reproduce a bug.
by PunchyHamster
5/10/2026 at 3:12:25 PM
That’s a big logical fallacy; I’m not sure if that’s what you want to go with.
by Atotalnoob