Posted by hornedhob 1/27/2026
See: “it’s just an init system”, where it’s now also a resolver, a log system, etc.
I can buy good intentions, but this opens up too much room for not-so-well-intended consequences, whether deliberate or emergent.
it's a buggy-as-hell resolver, buggy-as-hell log system, buggy-as-hell ntp client, buggy-as-hell network manager, ...
Atomicity means you can track every change, and every change is so small that it affects only one thing and can be traced, replayed, or rolled back. It's like going from A to B and being able to return to A (or go to B again) in a deterministic manner.
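To make that concrete, here is a minimal sketch of the pattern in Python (a generic illustration of apply-or-roll-back, not how systemd or any particular update tool implements it): the new state is prepared off to the side and only becomes visible in one indivisible step, so the system is always in a well-defined state A or B.

    import os
    import tempfile

    def apply_atomically(path, new_contents):
        """Move the file at `path` from state A to state B in one indivisible step."""
        directory = os.path.dirname(os.path.abspath(path))
        # Prepare state B off to the side, on the same filesystem as the target.
        fd, tmp = tempfile.mkstemp(dir=directory)
        try:
            with os.fdopen(fd, "w") as f:
                f.write(new_contents)
                f.flush()
                os.fsync(f.fileno())
            # os.replace() is atomic: readers only ever see A or B,
            # never a half-written state, and re-running the step is deterministic.
            os.replace(tmp, path)
        except BaseException:
            os.unlink(tmp)  # "rollback" is trivial here, because A was never touched
            raise

Image- or snapshot-based updates rely on the same property at a bigger scale: the old state stays intact until the new one is switched in as a whole.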
See the "features" list from systemd 257/258 [0].
>We are building cryptographically verifiable integrity into Linux systems
I wonder what that means? It could be a good thing, but I tend to think it could be a privacy nightmare, depending on who controls the keys.
You. The money quote about the current state of Linux security:
> In fact, right now, your data is probably more secure if stored on current ChromeOS, Android, Windows or MacOS devices, than it is on typical Linux distributions.
Say what you want about systemd the project, but they're the only ones moving foundational Linux security forward; no one else even has the ambition to try. The hardening tools they've brought to Linux are so far ahead of everything else it's not even funny.
That sort of thing.
It's not propaganda in any sense; it's recognizing that Linux is behind the state of the art compared to Windows/macOS when it comes to preventing tampering with your OS install. It's not saying you should use Windows, it's saying we should make the Linux boot process as tight, security-wise, as the Windows boot process, along with a long explanation of how we get there.
It's only secure from evil maid attacks if it can be wiped and reinitialised at any time.
Is it possible someone will eventually build a system that doesn't allow this? Yes. Is this influenced in any way by features of Linux software? No.
No, not you. Someone else for you. And that's the scary part.
In a few years, running random code on your computer will be seen as a bit unethical.
I hope this never happens. I really want my data secure and I do have something to hide. So, no Microsoft keys on my computer and only I will decide what kind of software I get to run.
Absolutely fuck that.
Turning off SecureBoot only means any rando can decide what software runs on your device and install a bootkit. Not authenticating the rest of the boot process as outlined here (what Microsoft calls Trusted Boot) only means that randos can tamper with your OS using the bits that can't be encrypted.
Literally an own-goal in every sense of the word.
I see it as exactly the opposite: turning SecureBoot on means someone else can and will decide what software runs on my device.
> spite Microsoft or something you're going to make your data less secure
We all know very well Microsoft's track record with security and with data protection measures and practice. Trusting Microsoft is... irrational, let's put it that way.
the guys that copy your BitLocker keys in the clear
(London. On some of my relatives.)
But I'm sure in this case when they achieve some kind of dominant position and Microsoft offers to re-absorb them they will do the honorable thing.
These people don't, but people you've never heard of are always doing honorable things.
Might be some sort of connection there.
Somebody will use it and eventually force it if it exists, and I don't think gaming, especially games requiring anti-cheat, is worth that risk.
If that means Linux will not be able to overtake Windows' market share, that's OK. At least the year-of-Linux memes will still be funny.
e.g. https://support.faceit.com/hc/en-us/articles/19590307650588-...
> IOMMU is a powerful hardware security feature, which is used to protect your machine from malicious software
The ring-0 anticheat IS that fucking malicious software
It will.
Then just a bit later, no movies for you unless you are running a blessed distro. Then Chrome will start reporting to websites that you are this weird guy with a dangerous unlocked distro, so no banking for you. Maybe no government services as well, because obviously you are a hacker. Why would you run an unlocked Linux if you were not?
As said above, it's about who controls the keys. It's either building your own castle or having to live with the Ultimate TiVo.
We'll see.
I have my reservations, ideas, and what it's supposed to do, but this is not a place to make speculations and to break spirits.
I'll put my criticism out politely when it's time.
Not you. This technology is not being built for you.
Look, I hate systemd just as much as the next guy - but how are you getting "DRM" out of this?
Doing complex flows like "run an app to load keys from a remote server to unlock an encrypted partition" is far easier under systemd, and it has a dependency system robust enough to trigger that mount automatically when an app that needs it starts.
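A rough sketch of what that wiring can look like with plain unit files; every unit name, path, and the key-fetching script below is hypothetical, just to show the dependency mechanism:

    # fetch-keys.service (hypothetical): fetch keys from the remote server and
    # open the encrypted volume (e.g. via cryptsetup open).
    [Unit]
    Description=Fetch unlock keys and open the encrypted volume
    Wants=network-online.target
    After=network-online.target

    [Service]
    Type=oneshot
    RemainAfterExit=yes
    ExecStart=/usr/local/bin/fetch-keys-and-unlock

    # data.mount (hypothetical): systemd derives the unit name from the mount
    # point (/data -> data.mount); it will not mount before the unlock step ran.
    [Unit]
    Requires=fetch-keys.service
    After=fetch-keys.service

    [Mount]
    What=/dev/mapper/secure-data
    Where=/data
    Type=ext4

    # myapp.service (hypothetical): RequiresMountsFor= pulls in data.mount, which
    # in turn pulls in fetch-keys.service, so starting the app triggers the chain.
    [Unit]
    RequiresMountsFor=/data

    [Service]
    ExecStart=/usr/local/bin/myapp

With that in place, starting myapp.service transitively pulls in the mount and the key-fetching step, in the right order, without any glue scripting.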
There are also bad forms of remote attestation (like Google's variant that helps them let banks block you if you are running an alt-os). Those suck and should be rejected.
Edit: bri3d described what I mean better here: https://news.ycombinator.com/item?id=46785123
No doubt. Fully agree with you on that. However, Intel ME will make sure no system is truly secure, and server vendors add their own mandatory backdoors on top of that (iLO for HP, etc.).
Having said that, we must face the reality: this is not being built for you to secure your servers.
Let's say I accept this statement.
What makes you think trusted boot == remote attestation?
No, it's not. (And for that matter, neither is remote attestation)
You're conflating the technology with the use.
I believe that you have only thought about these technologies as they pertain to DRM, now I'm here to tell you there are other valid use cases.
Or maybe your definition of "DRM" is so broad that it includes me setting up my own trusted boot chain on my own hardware? I don't really think that's a productive definition.
This company is explicitly all about implementing remote attestation (which is a form of DRM):
> Remote Attestation of Immutable Operating Systems built on systemd
> Lennart Poettering
Is there a HN full moon out?
Again, this is wrong.
DRM is a policy.
Remote attestation is a technology.
You can use remote attestation to implement DRM.
You can also use remote attestation to implement other things.
They literally don't.
For a decade, I worked on secure boot & attestation for a device that was both:
- firmware updatable
- had zero concept or hardware that connected it to anything that could remotely be called a network
The update is predicated on a valid signature.
Would love to hear more of your thoughts on how the users of the device I worked on had their freedom restricted!
I guess my company, the user of the device that I worked on, was being harmed by my company, the creator of the device that I worked on. It's too bad that my company chose to restrict the user's freedom in this way.
Who cares if the device's application was an industrial control scenario where errors are practically guaranteed to result in the loss of human life, making it an incredibly high-value target, a la Stuxnet.
No, the user's right to run any code trumps everything! Commercial device or not, ever sold outside of the company or not, terrorist firmware update or not - this right shall not be infringed.
I now recognize I have committed a great sin, and hope you will forgive me.
IMO it's pretty clear that this is a server play because the only place where Linux has enough of a foothold to make client / end-user attestation financially interesting is Android, where it already exists. And to me the server play actually gives me more capabilities than I had: it lets me run my code on cloud provided machines and/or use cloud services with some level of assurance that the provider hasn't backdoored me and my systems haven't been compromised.
It's like designing new kinds of nerve gas, "quite sure" that it will only ever be in the hands of good guys who aren't going to hurt people with it. That's powerful naïveté. Once you make it, you can't control who has it and what they use it for. There's no take-backsies, that's why it should never be created in the first place.
The "bad" version, client attestation, is already implemented on Android, and could be implemented elsewhere but is only a parallel concept.
There is unmet industrial market demand for the (IMO) "not so bad / maybe even good" version, server attestation.
Interesting choice of analogy: comparing something whose singular purpose is to destroy biological entities to a computing technology that enforces what code is run.
Can you not see there might be positive, non-destructive applications of the latter? Are you the type of person that argues cars shouldn't exist due to their negative impacts while ignoring all the positives?
For individuals, IMO the risk mostly comes from software they want to run (install scripts or supply-chain attacks). So if the end user is in control of what gets signed, I don't see much benefit. Unless you force users to use an app store...