Posted by robalni 6 hours ago
It took a while, though, until this was understood. In 2007, when I pointed out on debian-devel that this was needed, I was still told what a huge waste of time it would be. And indeed it took a huge amount of work by many people to get there, but it is well worth it.
"Well worth it" is not correct. And it just ups the the contribution barrier to Debian higher, I already heard a lot of people complaining that contributing to Debian is hard and while in past I defended it by "they need all the checks and bounds to make sure packages play with eachother nicely", this is just step that makes it hard for no reason and little benefit.
https://reproducible-builds.org/
Could you perhaps respond to the arguments made here?
Have many organizations produce the binaries independently and post the artifacts.
Once n of m parties agree on the artifact hash, take that as the trusted build.
If every party reaches a different hash, then we cannot build consensus.
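A minimal sketch of that n-of-m scheme in Python (the builder names, placeholder digests, and the consensus_hash helper are illustrative assumptions, not any existing tool's API):

    import hashlib
    from collections import Counter

    def sha256_of(path: str) -> str:
        """The digest each independent builder would publish for its artifact."""
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(1 << 16), b""):
                h.update(chunk)
        return h.hexdigest()

    def consensus_hash(submitted: dict, n: int):
        """Return the digest that at least n of the m submitting parties
        agree on, or None when no digest reaches the threshold."""
        digest, votes = Counter(submitted.values()).most_common(1)[0]
        return digest if votes >= n else None

    # Illustrative submissions from three hypothetical builders (m = 3):
    submitted = {
        "builder-a": "3f5a...",  # placeholder digests
        "builder-b": "3f5a...",
        "builder-c": "9c1e...",
    }
    print(consensus_hash(submitted, n=2)
          or "no consensus: every party reached a different hash")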
Anyone who has to maintain a code base or a distributed fleet of devices will gain immensely from this decision as their operational periods come and go.
Reproducible builds are about longevity as much as they are about security.
Please don’t make bold claims about ‘no reason and little benefit’ while demonstrating ignorance of this hard fact: reproducible builds should have been the norm in computing from the get-go.
If anything, it will make an attacker's job easier, as an Ubuntu package will have the same files structured in exactly the same way as the Debian one.
(Orange = FTBR = "failed to build reproducibly")
I'm not good at reading numbers from charts, but I'd guess it's a few percent (4-5ish?).
> Forbidden
> <p>You are not allowed to access this!</p>
(yes, with HTML tags on display) :)
EDIT: I also found an "I Challenge Thee" page in my history. Did I just get blocked by anti-bot measures? Why???
BTW, most Debian packages have reproducible builds. Those which do not (I'd say about 5%) are shown in orange in the graph there: https://wiki.debian.org/ReproducibleBuilds
It's not like the Linux world, where you have distinct projects like the kernel, GNU, and OpenSSL, and then it's the distribution's job to assemble everything.
In the BSD projects, the scope is developing and distributing an entire base system, i.e., not just the kernel but also the libc, the shell, all the POSIX utilities, and a few third-party components like OpenSSH (which are usually "soft-forked").
It's quite visible in the sources; it's a lot more than just a kernel: https://github.com/NetBSD/src
Additional packages, which you can get from pkgin/pkgsrc (NetBSD), pkgng/ports (FreeBSD), or pkg_add (OpenBSD), are clearly distinct from the base system, installed in a dedicated subtree (/usr/pkg on NetBSD, /usr/local on OpenBSD/FreeBSD), and provided on a best-effort basis.
The reproducible-build target was almost certainly only for the base system, which is a few percent of what Debian tries to achieve, and over which NetBSD has tighter control (developer + distributor rather than downstream assembler + distributor).
A reproducible base system is useful, but given how quickly you typically need to install packages from pkgsrc, it's not quite enough.
Debian has come a long way, but when Debian says reproducible, they mean they grab third-party binaries to build theirs. When we say reproducible, we mean 100% bootstrapped from source code all the way through the entire software supply chain.
We think that distinction matters.
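A hedged sketch of why that distinction is checkable in principle: walk the build graph and accept an artifact only if everything it was built from is, transitively, source. The Artifact record and its fields are invented for illustration (and cycles are not handled); this is not Debian's or anyone's actual metadata schema:

    from dataclasses import dataclass, field

    @dataclass
    class Artifact:
        name: str
        kind: str                      # "source" or "binary"
        inputs: list = field(default_factory=list)

    def fully_bootstrapped(a: Artifact) -> bool:
        """True only if every binary in the chain derives,
        transitively, from source inputs alone."""
        if a.kind == "source":
            return True
        if not a.inputs:               # opaque binary with no build record
            return False
        return all(fully_bootstrapped(dep) for dep in a.inputs)

    libc_src = Artifact("libc.tar", "source")
    cc_blob  = Artifact("prebuilt-cc", "binary")   # no recorded provenance
    libc     = Artifact("libc.so", "binary", [libc_src, cc_blob])

    print(fully_bootstrapped(libc))    # False: the toolchain is an opaque blob

A package that merely rebuilds bit-for-bit from upstream binaries passes a reproducibility check but fails this one; full-source bootstrap demands both.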
Reproducible builds are an essential method in industrial computing. Debian isn't at the forefront of this; it is merely adopting industry-wide techniques already applied to other operating systems used in long-term and safety-related applications.
Certainly, a lot of the hard work of the Yocto and Debian developers is already in your hands.
What is interesting is that the Debian developers are now applying this as a more forward-focused policy: it will be the norm rather than an option.
> You don't have permission to access this resource.
> Apache Server at lists.debian.org Port 443
:/