cyphar 5 days ago

I'm Aleksa, one of the founding engineers. We will share more about this in the coming months, but this is neither the direction nor the intention of what we are working on. The models we have in mind for attestation are very much based on users having full control of their keys. This is not just a matter of user freedom; in practice, being able to do this is far preferable for enterprises with strict security controls.

I've been a FOSS guy my entire adult life, I wouldn't put my name to something that would enable the kinds of issues you describe.

  • ingohelpinger 5 days ago

    Thanks for the clarification, and to be clear: I don't doubt your personal intent or FOSS background. The concern isn't bad actors at the start; it's how projects evolve once they matter.

    History is pretty consistent here:

    WhatsApp: privacy-first, founders with principles, both left once monetization and policy pressure kicked in.

    Google: 'Don’t be evil' didn’t disappear by accident — it became incompatible with scale, revenue, and government relationships.

    Facebook/Meta: years of apologies and "we'll do better," yet incentives never changed.

    Mobile OS attestation (iOS / Android): sold as security, later became enforcement and gatekeeping.

    Ruby on Rails ecosystem: strong opinions, benevolent control, then repeated governance, security, and dependency chaos once it became critical infrastructure. Good intentions didn't prevent fragility, lock-in, or downstream breakage.

    Common failure modes:

    Enterprise customers demand guarantees - policy creeps in.

    Governments demand compliance - exceptions appear.

    Liability enters the picture - defaults shift to "safe for the company."

    Revenue depends on trust decisions - neutrality erodes.

    Core maintainers lose leverage - architecture hardens around control.

    Even if keys are user-controlled today, the key question is architectural: Can this system resist those pressures long-term, or does it merely promise to?

    Most systems that can become centralized eventually do, not because engineers change, but because incentives do. That’s why skepticism here isn't personal — it's based on pattern recognition.

    I genuinely hope this breaks the cycle. History just suggests it's much harder than it looks.

  • drdaeman 5 days ago

    Can you (or someone) please tell me what the point is, for a regular GNU/Linux user, of this thing you folks are working on?

    I can understand the corporate use case: the person with access to the machine is not its owner, and the corporation may want to ensure its property works the way it expects it to. Not something I care about, personally.

    But when it’s a person using their own property, I don’t quite get the practical value of attestation. It’s not a security mechanism anymore (protecting a person from themselves is an odd goal), and it has significant abuse potential. That happened to mobile, and the outcome was that users were “protected” from themselves, that is, in less politically correct words, denied effective control over their personal property, as larger entities exercised their power and gated access to what became de facto commonplace commodities by forcing users to surrender any rights. Paired with the awareness gap, the effects were disastrous, and not just for personal compute.

    So, what’s the point and what’s the value?

    • fc417fc802 4 days ago

      The value is being able to easily and robustly verify that my device hasn't been compromised. For example, binding disk encryption keys to the TPM so that I don't need to enter a password, but an adversary still can't get at the contents without a zero day.

      Of course you can already do the above with secure boot coupled with a CPU that implements an fTPM. So I can't speak to the value of this project specifically, only build and boot integrity in general. For example I have no idea what they mean by the bullet "runtime integrity".

      • NekkoDroid 4 days ago

        > For example I have no idea what they mean by the bullet "runtime integrity".

        This is, for example, dm-verity (e.g. `/usr/` is an erofs partition with matching dm-verity). Lennart always talks about having files be either RW (backed by encryption) or RX (backed by kernel signature verification).
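
        dm-verity's integrity model is essentially a hash tree over read-only blocks; the following is only a toy Python sketch of the principle, not the actual dm-verity on-disk format (which uses a multi-level Merkle tree and kernel-side verification):

```python
import hashlib

BLOCK_SIZE = 4096  # dm-verity checks fixed-size blocks against a hash tree

def hash_block(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def tree_root(blocks) -> bytes:
    # Single-level tree for brevity; real dm-verity uses multiple levels.
    return hash_block(b"".join(hash_block(b) for b in blocks))

def verify(blocks, trusted_root: bytes) -> bool:
    # Reads succeed only while the data still matches the trusted root hash.
    return tree_root(blocks) == trusted_root

# An RX /usr image: hashed once at build time, verified at runtime.
blocks = [b"a" * BLOCK_SIZE, b"b" * BLOCK_SIZE]
root = tree_root(blocks)
assert verify(blocks, root)

# Offline tampering with any block changes the root, so verification fails.
blocks[1] = b"x" * BLOCK_SIZE
assert not verify(blocks, root)
```

        The point of the construction is that the root hash alone (which can be signed or measured) transitively covers the whole partition.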

      • drdaeman 4 days ago

        I don’t think attestation can provide such guarantees. To the best of my understanding, it won’t protect from any RCE, and it won’t protect from malicious updates to configuration files. It won’t let me run arbitrary binaries (putting an end to any local development), or if it does, it would be temporary security theater (as attackers would reuse the same processes to sign their malware). IDSes are sufficient for this purpose, without negative side effects.

        And that’s why I said “not a security mechanism”. Attestation is for protecting against actors with local hardware access. I have FDE and door locks for that already.

      • giant_loser a day ago

        > The value is being able to easily and robustly verify that my device hasn't been compromised.

        That is impossible.

        "Secure" devices get silently tampered with every day.

        You can never guarantee that.

    • its-summertime 5 days ago

      https://attestation.app/about For mobiles, it helps make tampering obvious.

      https://doc.qubes-os.org/en/latest/user/security-in-qubes/an... For laptops, it helps make tampering obvious. (a different attestation scheme with smaller scope however)

      This might not be useful to you personally, however.

  • repstosb 5 days ago

    The "founding engineers" behind Facebook and Twitter probably didn't set out to destroy civil discourse and democracy, yet here we are.

    Anyway, "full control over your keys" isn't the issue, it's the way that normalization of this kind of attestation will enable corporations and governments to infringe on traditional freedoms and privacy. People in an autocratic state "have full control over" their identity papers, too.

  • teiferer 5 days ago

    > I've been a FOSS guy my entire adult life, I wouldn't put my name to something that would enable the kinds of issues you describe.

    Until you get acquired, receive a golden parachute and use it when realizing that the new direction does not align with your views anymore.

    But, granted, if all you do is FOSS then you will have a hard time keeping evil actors from using your tech for evil things anyway. Might as well get some money out of it, if they actually dump money on you.

    • cyphar 4 days ago

      I am aware of that; my (personal) view is that DRM is a social issue caused by modes of behaviour, and the existence or non-existence of technical measures cannot fix or avoid that problem.

      A lot of the concerns in this thread center on TPMs, but TPMs are really more akin to very limited HSMs that are actually under the user's control (I gave a longer explanation in a sibling comment but TPMs fundamentally trust the data given to them when doing PCR extensions -- the way that consumer hardware is fundamentally built and the way TPMs are deployed is not useful for physical "attacks" by the device owner).

      Yes, you can imagine DRM schemes that make use of them but you can also imagine equally bad DRM schemes that do not use them. DRM schemes have been deployed for decades (including "lovely" examples like the Sony rootkit from the 2000s[1], and all of the stuff going on even today with South Korean banks[2]). I think using TPMs (and other security measures) for something useful to users is a good thing -- the same goes for cryptography (which is also used for DRM but I posit most people wouldn't argue that we should eschew all cryptography because of the existence of DRM).

      [1]: https://en.wikipedia.org/wiki/Sony_BMG_copy_protection_rootk... [2]: https://palant.info/2023/01/02/south-koreas-online-security-...

    • mikkupikku 5 days ago

      This whole discussion is a perfect example of what Upton Sinclair said, "It is difficult to get a man to understand something, when his salary depends on his not understanding it."

      A rational and intelligent engineer cannot possibly believe that he'll be able to control what a technology is used for after he creates it, unless his salary depends on him not understanding it.

    • faust201 5 days ago

      You could tell this sort of insinuation to anyone. Including you.

      Argument should be technical.

      • teiferer 5 days ago

        Insinuation? As a software dev they don't have any agency over whether or by whom they get acquired. Their decision will be whether to leave if things change for the worse, and that's very much understandable (and arguably the ethical thing to do).

      • seanhunter 5 days ago

        That's a perfectly valid objection to this proposal. You only have to look at what happened to Hashicorp to see the risk.

      • majewsky 5 days ago

        > You could tell this sort of insinuation to anyone. Including you.

        Yes. You correctly stated the important point.

      • pseudalopex 4 days ago

        > Argument should be technical.

        Yes. Aleksa made no technical argument.

  • ahartmetz 5 days ago

    So far, that's a slick way of saying "not really". You are vague where it counts, and surely you have a better idea of the direction than you let on.

    Attestation of what, to whom, for which purpose? What freedom does letting users control their keys actually give them, and how does that square with remote attestation and the wishes of enterprise users?

    • cyphar 4 days ago

      I'm really not trying to be slick, but I think it's quite difficult to convince people about anything concrete (such as precisely how this model is fundamentally different to models such as the Secure Boot PKI scheme and thus will not provide a mechanism to allow a non-owner of a device to restrict what runs on your machine) without providing a concrete implementation and design documents to back up what I'm saying. People are rightfully skeptical about this stuff, so any kind of explanation needs to be very thorough.

      As an aside, it is a bit amusing to me that an initial announcement about a new company working on Linux systems caused the vast majority of people to discuss the impact on personal computers (and games!) rather than servers. I guess we finally have arrived at the fabled "Year of the Linux Desktop" in 2026, though this isn't quite how I expected to find out.

      > Attestation of what to whom for which purpose? Which freedom does it allow users to control their keys, how does it square with remote attestation and the wishes of enterprise users?

      We do have answers to these questions, and a lot of the necessary components already exist (lots of FOSS people have been working on problems in this space for a while). The problem is that there is still a missing ~20% (not an actual estimate) that we are building now, and the whole story doesn't make sense without it. I don't like it when people announce vapourware, so I'm trying not to contribute to that problem by describing a system that is not yet fully built, though I do understand that this comes off as evasive. It will be much easier to discuss all of this once we start releasing things, and I think that very theoretical technical discussions can often be quite unproductive.

      In general, I will say that there are a lot of unfortunate misunderstandings about TPMs that lead people to assume their only use is as a mechanism for restricting users. This is really not the case; TPMs by themselves are actually more akin to very limited HSMs with a handful of features that can (cooperatively with firmware and operating systems) be used to attest to some aspects of the system state. They are also fundamentally under the user's control, completely unlike the PKI scheme used by Secure Boot and similar systems. In fact, TPMs are really not a useful mechanism for protecting against someone with physical access to the machine -- they have to trust that the hashes they are given to extend into PCRs are legitimate, and on most systems the data is even provided over an insecure data line. This is why the security of locked-down systems like the Xbox One[1] doesn't really depend on them directly, and such systems don't use them at all in the way that they are used on consumer hardware. They are only really useful at protecting against third-party software-based attacks, which is something users actually want!
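
      The PCR-extend operation described above is just one-way hash chaining; a minimal sketch of the semantics (assuming a SHA-256 PCR bank -- this models TPM2_PCR_Extend behaviour, not any real TPM interface):

```python
import hashlib

def pcr_extend(pcr: bytes, digest: bytes) -> bytes:
    # TPM2_PCR_Extend semantics: new value = H(old value || digest).
    # The TPM only ever sees the digest it is handed, which is why it
    # has to trust that the caller measured the right data.
    return hashlib.sha256(pcr + digest).digest()

# PCRs start zeroed at boot; each boot stage measures the next one in.
pcr = bytes(32)
for stage in [b"firmware", b"bootloader", b"kernel"]:
    pcr = pcr_extend(pcr, hashlib.sha256(stage).digest())

# Extends are order-sensitive and one-way: the same measurements in a
# different order give a different value, and nothing can be "un-measured".
reordered = bytes(32)
for stage in [b"kernel", b"bootloader", b"firmware"]:
    reordered = pcr_extend(reordered, hashlib.sha256(stage).digest())
assert pcr != reordered
```

      Sealing a secret (such as a disk encryption key) against a PCR value means the TPM releases it only when the measured chain matches, without any third party holding keys.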

      All of the comments about DRM obviously come from very legitimate concerns about user freedoms, but my views on this are a little too long to fit in a HN comment -- in short, I think that technological measures cannot fix a social problem and the history of DRM schemes shows that the absence of technological measures cannot prevent a social problem from forming either. It's also not as if TPMs haven't been around for decades at this point.

      [1]: https://www.youtube.com/watch?v=U7VwtOrwceo

      • ahartmetz 4 days ago

        >I think that technological measures cannot fix a social problem

        The absence of technological measures that can be used to implement societal problems totally does help, though. Just look at social media.

        I fear the outlaw evil maid or other hypothetical attackers (good old scare-based sales tactics) much less than already powerful entities (enterprises, states) lawfully encroaching on my devices using your technology. So, I don't care about "misunderstandings" of the TPM or whatever other wall of text you are spewing to divert attention.

  • iamnothere 5 days ago

    Thanks, this would be helpful. I will follow on by recommending that you always make it a point to note how user freedom will be preserved, without using obfuscating corpo-speak or assuming that users don’t know what they want, when planning or releasing products. If you can maintain this approach then you should be able to maintain a good working relationship with the community. If you fight the community you will burn a lot of goodwill and will have to spend resources on PR. And there is only so much that PR can do!

    Better security is good in theory, as long as the user maintains control and the security is on the user end. The last thing we need is required ID linked attestation for accessing websites or something similar.

  • LooseMarmoset 5 days ago

    That’s great that you’ll let users have their own certificates and all, but the way this will be used is by corporations to lock us into approved Linux distributions. Linux will be effectively owned by RedHat and Microsoft, the signing authority.

    It will be railroaded through in the same way that systemd was railroaded onto us.

    • giant_loser a day ago

      > but the way this will be used is by corporations to lock us into approved Linux distributions. Linux will be effectively owned by RedHat and Microsoft, the signing authority.

      This is the intent of Poettering and Brauner.

    • cyphar 4 days ago

      > but the way this will be used is by corporations to lock us into approved Linux distributions. Linux will be effectively owned by RedHat and Microsoft, the signing authority.

      This is basically true today with Secure Boot on modern hardware (at least in the default configuration -- Microsoft's soft-power policies for device manufacturers actually require that you can change this on modern machines). This is bad, but it is bad because platform vendors decide which default keys are trusted for Secure Boot, and there is no clean automated mechanism to enroll your own keys programmatically (at least, without depending on the Microsoft key -- shim does let you do this programmatically with the MOK).

      The set of default keys ended up being only Microsoft's (some argue this is because of direct pressure from Microsoft, but this would've happened for almost all hardware regardless and is a far more complicated story), but in order to permit people to run other operating systems on modern machines, Microsoft signed up to being a CA for every EFI binary in the universe. Red Hat then controls which distro keys are trusted by the shim binary Microsoft signs[1].

      This system ended up centralised because the platform vendor (not the device owner) fundamentally controls the default trusted key set, and that is what caused the whole nightmare of the Microsoft Secure Boot keys and rhboot signing of shim. Getting into the business of being a CA for every binary in the world is a very bad idea, even if you are purely selfish and don't care about user freedoms (and it even makes Secure Boot a less useful protection mechanism, because machines where users only want to trust Microsoft also necessarily trust Linux and every other EFI binary Microsoft signs -- there is no user-controlled segmentation of trust, which is the classic CA/PKI problem). I don't personally know how the Secure Boot / UEFI people at Microsoft feel about this, but I wouldn't be surprised if they also dislike the situation we are all in today.

      Basically none of these issues actually apply to TPMs, which are more akin to limited HSMs where the keys and policies are all fundamentally user-controlled in a programmatic way. It also doesn't apply to what we are building either, but we need to finish building it before I can prove that to you.

      [1]: https://github.com/rhboot/shim-review

  • 5d41402abc4b 5 days ago

    What was it that the Google founders said about not adding advertisements to Google search?

  • dTal 5 days ago

    Thanks for the reassurance, the first ray of sunshine in this otherwise rather alarming thread. Your words ring true.

    It would be a lot more reassuring if we knew what the business model actually was, or indeed anything else at all about this. I remain somewhat confused as to the purpose of this announcement when no actual information seems to be forthcoming. The negative reactions seen here were quite predictable, given the sensitive topic and the little information we do have.

  • curt15 5 days ago

    > The models we have in mind for attestation are very much based on users having full control of their keys.

    If user control of keys becomes the linchpin for retaining full control over one's own computer, doesn't it become easy for a lobby or government to exert control by banning user-controlled keys? Today, such interest groups would need to ban Linux altogether to achieve such a result.

  • inetknght 5 days ago

    Can I build my own kernel and still use software that wants attestation?

    • surajrmal 5 days ago

      Do you have a way to tell the software to trust your kernel? If so, yes. Things like the web show how we can achieve distributed trust.

      • account42 5 days ago

        "Trust" has become such an orwellian word in tech.

      • cferry 4 days ago

        That's the thing. I can only provide a piece of software with the guarantee that it can run on my OS. It can trust my kernel to let it run, but shouldn't expect anything more. The software publisher is free to run code whose integrity it wants to guarantee on its own infrastructure; but whatever reaches my machine _may_, at best, run as the publisher intends.

  • wooptoo 4 days ago

    > The models we have in mind for attestation are very much based on users having full control of their keys.

    FOR NOW. Policies and laws always change. Corporations and governments somehow always find ways to work against their people, in ways which are not immediately obvious to the masses. Once they have a taste of this there's no going back.

    Please have a hard and honest think on whether you should actually build this thing. Because once you do, the genie is out and there's no going back.

    This WILL be used to infringe on individual freedoms.

    The only question is WHEN? And your answer to that appears to be 'Not for the time being'.

  • account42 5 days ago

    > I've been a FOSS guy my entire adult life, I wouldn't put my name to something that would enable the kinds of issues you describe.

    The road to hell is paved with good intentions.

  • endgame 5 days ago

    That's not the intention, but how do you stop it from being the effect?

  • trelane 5 days ago

    Glad to hear it! I am not surprised given the names and the fact you're at FOSDEM.

  • qmr 5 days ago

    What engineering discipline?

    PE or EIT?

  • [removed] 5 days ago
    [deleted]
  • michaelmrose 5 days ago

    This is extremely bad logic. The technology of enforcing trusted software has no inherent value; it is good or ill depending entirely on expected usage. Anything that is substantially open will be used according to the values of its users, not according to your values, so we ought instead to consider their values, not yours.

    Suppose a fascist state wanted to identify potential agitators by scanning all communication for indications of dissent. It could require this technology in all trusted environments, and require such an environment to bank, connect to an ISP, or use Netflix.

    One could even imagine a completely benign usage which only identified actual wrongdoing, alongside another which profiled based almost entirely on anti-regime sentiment or reasonable discontent.

    The good users would argue that the only problem with the technology is its misuse but without the underlying technology such misuse is impossible.

    One can imagine two entirely different parallel universes: in one, a few great powers went the wrong way, enabled in part by trusted computing and by pervasive surveillance, with AI doing the massive, boring task of analyzing a glut of ordinary behaviour and communication, and tech and law ensuring that said surveillance is carried out.

    Even those not misusing the tech may find themselves worse off in such a world.

    Why again should we trust this technology just because you are a good person?

    • [removed] 5 days ago
      [deleted]
    • michaelmrose 5 days ago

      TL;DR: We already know how this will be misused: to take away not people's freedom to run their own software stack, but their freedom to dissent against fascism. It's immoral to build even with the best intentions.

  • quotemstr 5 days ago

    You're providing mechanism, not policy. It's amazing how many people think they can forestall policies they dislike by trying to reject mechanisms that enable them. It's never, ever worked. I'm glad there are going to be more mechanisms in the world.

enriquto 5 days ago

Half of the founders of this thing come from Microsoft. I suppose this makes the answer to your question obvious.

  • stackghost 5 days ago

    My thoughts exactly. We're probably witnessing the beginning of the end of Linux users being able to run their own kernels. Soon:

    - your bank won't let you log in from an "insecure" device.

    - you won't be able to play videos on an "insecure" device.

    - you won't be able to play video games on an "insecure" device.

    And so on, and so forth.

    • dijit 5 days ago

      Unfortunately the parent commenter is completely right.

      The attestation portion of those systems happens on locked-down devices, and if you gain ownership of the devices they no longer attest themselves.

      This is the curse of the duopoly of iOS and Android.

      BankID in Sweden will only run on one of these devices; they used to offer a card system, but getting one seems to be impossible these days. So you're really stuck with a mobile device as your primary means of identification for banking and such.

      There's a reason that general purpose computers are locked to 720p on Netflix and Disney+; yet AppleTV's are not.

      • yxhuvud 5 days ago

        AFAIK BankID will actually run as long as you can install the Play Store (i.e. the device doesn't need Google certification), which isn't great, but a little better than what it could have been.

      • LtWorf 5 days ago

        I just received in the mail a card to replace my soon-expiring one… (not a debit card, the one to do internet banking and so on).

        However, the problem is that A LOT of things only work with the mobile app.

      • ahepp 5 days ago

        As you say, a lot of this stuff is already happening. Won’t it be good to have a FOSS attestation stack that breaks the iOS/Android duopoly?

    • seba_dos1 5 days ago

      This is already the world we live in when it comes to the most popular personal computing devices running Linux out there.

      • stefan_ 5 days ago

        This is already the world you live in just running some recent Ubuntu. Try writing, building and loading a kernel module!

        Of course it's all nonsense make-believe; the "trust root" is literally a Microsoft-signed stub. Yet for this dummy implementation you can't modify your own kernel anymore.

        • plagiarist 5 days ago

          And you cannot remove it on every motherboard because some of the firmware blobs are signed. You cannot remove their keys and leave only your own.

    • anonym29 4 days ago

      Torrenting is becoming more popular again. The alternative to being allowed to pay to watch on an "insecure" device isn't switching to an attested device, it's to stop paying for the content at all. Games industry, same thing (or just play the good older games, the new ones suck anyway).

      Finances, just pay everything by cheque or physical pennies. Fight back. Starve the tyrants to death where you can, force the tyrants to incur additional costs and inefficiencies where you can't.

    • JasonADrury 5 days ago

      Is the joke here that all of those things have already been happening for a while now?

  • blibble 5 days ago

    that's a silver lining

    the anti-user attestation will at least be full of security holes, and likely won't work at all

    • sam_lowry_ 5 days ago

      Dunno about the others, but Poettering has proven himself able to deliver software against the grain.

      • dijit 5 days ago

        You think?

        It took us nearly a decade and a half to unfuck the PulseAudio situation and finally arrive at a simple solution (PipeWire).

        SystemD has a lot more people refining it down but a clean (under the hood) implementation probably won't be witnessed in my lifetime.

      • nacozarina 5 days ago

        LP is the Thomas Midgley Jr of Computer Science.

      • wang_li 5 days ago

        I thought he had proven that he leaves before the project is complete and functioning according to all the promises made.

      • tonoto 4 days ago

        Agent Smith, the one that doesn't care at all about conforming to POSIX?

        "In fact, the way I see things the Linux API has been taking the role of the POSIX API and Linux is the focal point of all Free Software development. Due to that I can only recommend developers to try to hack with only Linux in mind and experience the freedom and the opportunities this offers you. So, get yourself a copy of The Linux Programming Interface, ignore everything it says about POSIX compatibility and hack away your amazing Linux software. It's quite relieving!" -- https://archive.fosdem.org/2011/interview/lennart-poettering...

      • mikkupikku 5 days ago

        Poettering has a track record of recognizing good ideas from Apple, then implementing them poorly. He also has a track record of closing bug reports for plain and simple bugs in his software to protect his own ego, and this kind of mentality isn't a great basis for security-sensitive software.

        Audio server for Linux: Great idea! PulseAudio: genuinely a terrible implementation of it; PipeWire is a drop-in replacement that actually works.

        Launchd but for Linux: Great idea! SystemD: generally works now at least, but packed with insane defaults, and every time this is brought up with the devs they say it's the distro packagers' job to wipe SystemD's ass and clean up the mess before users see it.

        Security bug in SystemD when the user has a digit in their username: Lennart closes the bug and says that SystemD is perfect; the distros erred by permitting such usernames. Insane ego-driven response.

      • [removed] 5 days ago
        [deleted]
qmr 5 days ago

"At long last, we have created the Torment Nexus from classic sci-fi novel Don't Create The Torment Nexus."