Comment by PunchyHamster 5 days ago
See the latest "MS just divulged disk encryption keys to govt" news to see why this is a horrid idea.
I could see this being useful if the flow were "the device is unlocked by default, there are no keys/certs on it, and it can be reset to that state (for re-use purposes)".
Then the user can put their own key there (if, say, corporate policies demand it), but there is no 3rd party that can decide what the device can do.
But having a 3rd party (and a US one at that!) as the root of all trust is a massive problem.
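To make that owner-enrolled flow concrete, here is a minimal sketch in Python (it assumes the third-party `cryptography` package; the function names and the choice of Ed25519 are illustrative, not any vendor's actual enrollment API):

```python
# Minimal sketch of a user-enrolled trust root: the device ships "open",
# the owner generates their own signing key, and the boot check verifies
# the kernel against that key alone -- no third party involved.
# Assumes the third-party `cryptography` package (pip install cryptography).
import hashlib

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import (
    Ed25519PrivateKey,
    Ed25519PublicKey,
)


def enroll_owner_key() -> Ed25519PrivateKey:
    """Owner generates a key pair; only the public half goes onto the device."""
    return Ed25519PrivateKey.generate()


def sign_kernel(owner_key: Ed25519PrivateKey, kernel_image: bytes) -> bytes:
    """Owner signs the hash of whatever kernel they choose to run."""
    return owner_key.sign(hashlib.sha256(kernel_image).digest())


def device_boot_check(enrolled_pub: Ed25519PublicKey,
                      kernel_image: bytes, signature: bytes) -> bool:
    """Device accepts any kernel signed by the enrolled owner key."""
    try:
        enrolled_pub.verify(signature, hashlib.sha256(kernel_image).digest())
        return True
    except InvalidSignature:
        return False


# "Factory reset" state: no key enrolled, device boots anything.
# After enrollment, the owner -- not a vendor -- decides what is trusted.
owner_key = enroll_owner_key()
kernel = b"vmlinuz-custom-build"          # stand-in for a real kernel image
sig = sign_kernel(owner_key, kernel)
assert device_boot_check(owner_key.public_key(), kernel, sig)
assert not device_boot_check(owner_key.public_key(), b"tampered", sig)
```

The only point of the sketch is where the trust root lives: the verifying key belongs to the device owner, and wiping it returns the device to the open state described above.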
oh hi ChatGPT
The giveaway is that LLMs love bulleted lists with a bolded attention-grabbing phrase to start each line. Copy-pasting directly to HN has stripped the bold formatting and bullets from the list, so the attention-grabbing phrase is fused into the next sentence, e.g. “Potential for abuse Attestation enables blacklisting”
Calling this a "giveaway" is kind of hilarious. LLMs use bulleted lists because humans have always used bulleted lists—in RFCs, design docs, and literally every tech write-up ever. Structure didn't suddenly become artificial in 2023. lol.
I’m skeptical about the push toward third-party hardware attestation for Linux kernels. Handing kernel trust to external companies feels like repeating mistakes we’ve already seen with iOS and Android, where security mechanisms slowly turned into control mechanisms.
Centralized trust Hardware attestation run by third parties creates a single point of trust (and failure). If one vendor controls what’s “trusted,” Linux loses one of its core properties: decentralization. This is a fundamental shift in the threat model.
Misaligned incentives These companies don’t just care about security. They have financial, legal, and political incentives. Over time, that usually means monetization, compliance pressure, and policy enforcement creeping into what started as a “security feature.”
Black boxes Most attestation systems are opaque. Users can’t easily audit what’s being measured, what data is emitted, or how decisions are made. This runs counter to the open, inspectable nature of Linux security today.
Expanded attack surface Adding external hardware, firmware, and vendor services increases complexity and creates new supply-chain and implementation risks. If the attestation authority is compromised, the blast radius is massive.
Loss of user control Once attestation becomes required (or “strongly encouraged”), users lose the ability to fully control their own systems. Custom kernels, experimental builds, or unconventional setups risk being treated as “untrusted” by default.
Vendor lock-in Proprietary attestation stacks make switching vendors difficult. If a company disappears, changes terms, or decides your setup is unsupported, you’re stuck. Fragmentation across vendors also becomes likely.
Privacy and tracking Remote attestation often involves sending unique or semi-unique device signals to external services. Even if not intended for tracking, the capability is there—and history shows it eventually gets used.
Potential for abuse Attestation enables blacklisting. Whether for business, legal, or political reasons, third parties gain the power to decide what software or hardware is acceptable. That’s a dangerous lever to hand over (a sketch of how such a check works follows this list).
Harder incident response If something goes wrong inside a proprietary attestation system, users and distro maintainers may have little visibility or ability to respond independently.
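To illustrate the blacklisting and tracking points above, here is a toy remote-attestation verifier in Python. Nothing in it is a real vendor protocol; the field names and the allow-list are invented for illustration:

```python
# Toy attestation verifier: a third party checks a (simplified) quote
# against its own allow-list and, as a side effect, obtains a stable
# per-device identifier it can log and correlate across requests.
import hashlib

# The verifier (a third party) decides unilaterally which kernels are "trusted".
APPROVED_KERNEL_HASHES = {
    hashlib.sha256(b"vendor-blessed-kernel").hexdigest(),
}


def verify_quote(quote: dict) -> dict:
    """Check a simplified attestation quote against the vendor allow-list."""
    kernel_ok = quote["kernel_hash"] in APPROVED_KERNEL_HASHES

    # The device key that signs every quote is stable across sessions, so
    # hashing it yields a persistent device identifier -- usable for
    # tracking or blacklisting even when the measurement itself is fine.
    device_id = hashlib.sha256(quote["attestation_pubkey"]).hexdigest()[:16]

    return {"device_id": device_id, "allowed": kernel_ok}


custom_build = {
    "kernel_hash": hashlib.sha256(b"my-custom-kernel").hexdigest(),
    "attestation_pubkey": b"device-unique-public-key-bytes",
}
print(verify_quote(custom_build))   # custom kernel -> allowed: False
```

The takeaway: the service's allow-list, not the device owner, decides whether a custom kernel is acceptable, and every check hands the service a stable identifier for that device.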