Lennart Poettering, Christian Brauner founded a new company
(amutable.com)
373 points by hornedhob 5 days ago
Daan here, founding engineer and systemd maintainer.
So we try to make every new feature that might be disruptive optional in systemd and opt-in. Of course we don't always succeed and there will always be differences in opinion.
Also, we're a team of people that started in open source and have done open source for most of our careers. We definitely don't intend to change that at all. Keeping systemd a healthy project will certainly always stay important for me.
Hi Daan,
Thanks for the answer. Let me ask you something related, from a blunter angle:
Considering most of the tech is already present and shipping in current systemd, what prevents our systems from becoming an immutable monolith like macOS or current Android with the flick of a switch?
Or a more grave scenario: What prevents Microsoft from mandating removal of enrollment permissions for user keychains and Secure Boot toggle, hence every Linux distribution has to go through Microsoft's blessing to be bootable?
So adding all of this technology will certainly make it easier to use for either good or bad. And it will certainly become possible to build an OS that will be less hackable than your run-of-the-mill Linux distro.
But we will never enforce using any of these features in systemd itself. It will always be up to the distro to enable and configure the system to become an immutable monolith. And I certainly don't think distributions like Fedora or Debian will ever go in that direction.
We don't really have any control over what Microsoft decides to do with Secure Boot. If they decide at one point to make Secure Boot reject any Linux distribution and hardware vendors prevent enrolling user owned keys, we're in just as much trouble as everyone else running Linux will be.
I doubt that will actually happen in practice though.
> What prevents Microsoft from mandating removal of enrollment permissions for user keychains and Secure Boot toggle
Theoretically, nothing. But it's worth pointing out that so far they have actually done the opposite. They currently mandate that hardware vendors must allow you to enroll your own keys. There was a somewhat questionable move recently where they introduced a 'more secure by default' branding in which the 3rd party CA (used e.g. to sign shim for Linux) is disabled by default, but again, they mandated there must be an easy toggle to enable it. I don't begrudge them too much for it, because there have been multiple instances of SB bypass via 3rd party signed binaries.
All of this is to say: this is not a scenario I'm worried about today. Of course this may change down the line.
> What prevents Microsoft from mandating removal of enrollment permissions for user keychains and Secure Boot toggle, hence every Linux distribution has to go through Microsoft's blessing to be bootable?
Why are you buying hardware that Microsoft controls if you're concerned about this?
Thanks Daan for your contributions to systemd.
If you were not a systemd maintainer and had started this project/company independently targeting systemd, you would have to go through the same process as everyone else, and I would have expected the systemd maintainers to look at it objectively and review it with healthy skepticism before accepting it. But we cannot rely on those basic checks and balances anymore, and that's the most worrying part.
> that might be disruptive optional in systemd
> we don't always succeed and there will always be differences in opinion.
You (including other maintainers) are still the final arbiter of what's disruptive. The differences of opinion in the past have mostly been settled as "deal with it", and that's the basis of the current skepticism.
Systemd upstream has reviewers and maintainers from a bunch of different companies, and some independent: Red Hat, Meta, Microsoft, etc. This isn't changing, we'll continue to work through consensus of maintainers regardless of which company we work at.
>We are building cryptographically verifiable integrity into Linux systems. Every system starts in a verified state and stays trusted over time.
What problem does this solve for Linux or people who use Linux? Why is this different from me simply enabling encryption on the drive?
Drive encryption only really secures your data at rest, not while the system is running. Ideally image-based systems also use the kernel's runtime integrity checking (e.g. dm-verity) to ensure that things are as they are expected to be.
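To make the dm-verity idea concrete: at image-build time a hash tree is computed over the read-only filesystem, only the root hash has to be trusted (e.g. signed or measured at boot), and blocks are checked against the tree as they are read, so runtime modification is caught. A rough Python sketch of the concept (not how the kernel or veritysetup actually implement it; real dm-verity verifies a single block via its tree path rather than rehashing everything):

    # Conceptual sketch of dm-verity-style runtime integrity checking.
    # Assumes 4096-byte blocks and a single-level hash tree for brevity.
    import hashlib

    BLOCK = 4096

    def block_hashes(image: bytes) -> list[bytes]:
        padded = image + b"\0" * (-len(image) % BLOCK)
        return [hashlib.sha256(padded[i:i + BLOCK]).digest()
                for i in range(0, len(padded), BLOCK)]

    def root_hash(image: bytes) -> bytes:
        # Only this value needs to be trusted (signed / measured at boot).
        return hashlib.sha256(b"".join(block_hashes(image))).digest()

    def read_block(image: bytes, idx: int, trusted_root: bytes) -> bytes:
        # Refuse to return data if the image no longer matches the trusted root.
        if root_hash(image) != trusted_root:
            raise IOError("integrity check failed: image was modified at runtime")
        return image[idx * BLOCK:(idx + 1) * BLOCK]

    image = b"pretend this is a read-only rootfs " * 1000
    root = root_hash(image)                    # computed at build time
    read_block(image, 0, root)                 # ok
    tampered = bytearray(image); tampered[5000] ^= 1
    read_block(bytes(tampered), 1, root)       # raises IOError

Disk encryption wouldn't catch that last case while the system is up; verity does.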
> we try to make every new feature that might be disruptive optional in systemd and opt-in
I find it hard to believe. Like, at all. Especially given that the general posture of your project leader is the exact opposite of that.
> systemd a healthy project
I can see that we share the same view that there are indeed differences in opinion.
systemd is the most well-supported init system out there.
Trusted computing and remote attestation is like two people who want to have sex requiring clean STD tests first. Either party can refuse and thus no sex will happen. A bank trusting a random rooted smartphone is like having sex with a prostitute with no condom. The anti-attestation position is essentially "I have a right to connect to your service with an unverified system, and refusing me is oppression." Translate that to the STD context and it sounds absurd - "I have a right to have sex with you without testing, and requiring tests violates my bodily autonomy."
You're free to root your phone. You're free to run whatever you want. You're just not entitled to have third parties trust that device with their systems and money. Same as you're free to decline STD testing - you just don't get to then demand unprotected sex from partners who require it.
A fundamentally flawed way to make an argument?
Yeah I know what analogies are.
Why does my bank need to know whether the machine in my hands that is accessing their internet APIs was attested by some uninvolved third party or not?
You know we used to hand people pieces of paper with letters and numbers on them to make payments, right? For some reason, calling up my bank on the phone never required complicated security arrangements.
TD Bank never needed to come inspect my phone lines to ensure nobody was listening in.
Instead of securing their systems and making it harder to have your accounts taken over (which, by the way, is a fruitful avenue of computer security with plenty of low-hanging fruit), they punish me for their failures and want to be able to coerce me into running only certain software on my equipment in order to receive banking services.
This wasn't necessary for banking for literally thousands of years.
Why now? What justification is there?
A third party attesting my device can only be used to compel me to only use certain devices from certain third parties. The bank is not at all going to care whether I attest to it or not, they are going to care that Google or Microsoft will attest my device.
And for what? To what end? To prevent what alleged harm?
In what specific way does an attested device state make interacting with a publicly facing interface more secure?
It WILL be used to prevent you from being able to run certain code that benefits you at corporation's expense, like ad blockers.
Linux is supposed to be an open community. Who even asked for this?
"Why does my bank need to know whether the machine in my hands that is accessing their internet APIs was attested by some uninvolved third party or not?"
Because there are infinite ways for a computer to be insecure and very few ways for it to be secure.
Checks were a form of attestation because they contained security features that banks would verify.
Would YOU be willing to use a bank that refused to use TLS? I didn't think so. How is you refusing to accept remote attestation and the bank refusing to connect to you any different?
So both consent to sex and now one thinks they're entitled to marriage. That's where this inevitably leads, user/customer lock-in and control.
While the bank use case makes a compelling argument, device attestation won't be used for just banks. It's going to be every god damned thing on the internet. Why? Because why the hell not, it further pushes the costs of doing business of banks/MSPs/email providers/cloud services onto the customer and assigns more of the liabilities.
It will also further the digital divide as there will be zero support for devices that fail attestation at any service requiring it. I used to think that the friction against this technology was overblown, but over the last eighteen months I've come to the conclusion that it is going to be a horrible privacy sucking nightmare wrapped in the gold foil of security.
I've been involved in tech a long, long time. The first thing I'm going to do when I retire is start chucking devices. I'm checking-out, none of this is proving to be worth the financial and privacy costs.
"It's going to be every god damned thing on the internet. Why? Because why the hell not"
This is not a persuasive argument.
You are also ignoring the fact that YOU can use remote attestation to verify remote computers are running what they say they are.
"I've been involved in tech a long, long time. The first thing I'm going to do when I retire is start chucking devices. I'm checking-out, none of this is proving to be worth the financial and privacy costs."
You actually sound like you are having a nervous breakdown. Perhaps you should take a vacation.
You are trying to portray it as an exchange between equal parties, which it isn't. I am totally entitled not to have to use a third-party-controlled device to access government services. Or my bank account.
Trusted computing boils down to restricting what software I'm allowed to run on hardware I own and use. The technical means to do so are irrelevant.
Can it set terms based on my religious and political views? I'm not speaking about race or sex, which you cannot choose (ok, sex you could in some jurisdictions, and there is a difference between sex and gender, please don't be nitpicky here), but about things I can choose, the same way I can choose which hardware and software to run.
If there were a real, effective market (there isn't in any country on Earth, especially for banks), you could say: vote with your money, choose a bank that suits you. But that is impossible even with a bakery, let alone with banks, in a market that is strictly regulated (in part as a result of lobbying by established institutions, to protect themselves!).
So, on one hand, I must use banks (I cannot pay for many things in cash; where I live most bars and many shops don't accept cash, for example, and that is a result of government policy and regulation), and on the other hand banks are not treated as essential like access to air and water, so they can dictate any terms they want.
I see this situation as completely screwed.
Frankly this disgusts me. While there are technically user-empowering ways this can be used, by far the most prevalent use will be to lock users/customers out of true ownership of their own devices.
Device attestation fails? No streaming video or audio for you (you obvious pirate!).
Device attestation fails? No online gaming for you (you obvious cheater!).
Device attestation fails? No banking for you (you obvious fraudster!).
Device attestation fails? No internet access for you (you obvious dissident!).
Sure, there are some good uses of this, and those good uses will happen, but this sort of tech will be overwhelmingly used for bad.
I just want more trustworthy systems. This particular concept of combining reproducible builds, remote attestation and transparency logs is something I came up with in 2018. My colleagues and I started working on it, took a detour into hardware (tillitis.se) and kind of got stuck on the transparency part (sigsum.org, transparency.dev, witness-network.org).
Then we discovered snapshot.debian.org wasn't feeling well, so that was another (important) detour.
Part of me wishes we had focused more on getting System Transparency in its entirety into production at Mullvad. On the other hand I certainly don't regret us creating Tillitis TKey, Sigsum, taking care of the Debian Snapshot service, and several other things.
Now, six years later, systemd and other projects have come a long way toward building several of the things we need for ST. It doesn't make sense to do the work twice, so I want to seize the moment and make sure we coordinate.
These kinds of problems are very common in certain industries.
Can you share more details at this point about what you are trying to tackle as a first step?
> Favorite color?
As per the announcement, we’ll be building a favorite color over the next months and sharing more information as it rolls out.
Probably also some of the things that were described here? https://0pointer.net/blog/fitting-everything-together.html
Are you guys hiring? I can emulate a grim smile and have no problem being diabolical if the pay is decent so maybe I am a good fit? I can also pet goats
The photos depict these people as funny hobbits :D. Photographer trolled them big time. Now, the only question left is whether their feet are hairy.
---
Making secure boot 100 times simpler would be a deffo plus.
I'm not seeing any big problems with the portraits.
Having said that, should this company not be successful, Mr Zbyszek Jędrzejewski-Szmek has potentially a glowing career as an artists' model. Think Rembrandt sketches.
I look forward to something like ChromeOS that you can just install on any old refurbished laptop. But I think the money is in servers.
this is very interesting... been watching the work around bootc coupling with composefs + dm_verity + signed UKI, I'm wondering if this will build upon that.
- How different is this from Fedora BlueFin or silverblue?
- it looks like they want to build a ChromeOS without Google.
fantastic news, congrats on launching! it's a great mission statement and a fantastic ensemble for the job
So much negativity in this thread. I actually think this could be useful, because tamper-proof computer systems are useful to prevent evil maid attacks. Especially in the age of Pegasus and other spyware, we should also take physical attack vectors into account.
I can relate to people being rather hostile to the idea of boot verification, because this is a process that is really low level and also something that we as computer experts rarely interact with more deeply. The most challenging part of installing a Linux system is always installing the boot loader, potentially setting up a UEFI partition. These are things that I don't do every day and that I don't have deep knowledge of. And if things go wrong, then it is extra hard to fix things. Secure boot makes it even harder to understand what is going on. There is a general lack of knowledge of what is happening behind the scenes and it is really hard to learn about it. I feel that the people behind this project should really keep XKCD 2501 in mind when talking to their fellow computer experts.
> I actually think this could be useful
Yeah it could be. Could. But it also could be used for limiting freedoms with general purpose computing. Guess what is it going to be?
> hostile to the idea of boot verification, because this is a process that is really low level
Not because of that.
Because it's only me who gets to decide what runs on my computer, not someone else. I don't need LP's permission to run binaries.
I personally do not worry about an evil maid attack _at all_. But I do worry about someone restricting what I can do with _my_ computer.
I mean, in theory, the idea is great. But it WILL be misused by greedy fucks.
it won't matter if you disable it. You simply won't be able to use your PC with any commercial services, in the same way that a rooted Android installation can't run banking apps unless you do things to work around that check, and what they're working on here aims to make that workaround impossible.
How do they plan to make Linux (with MLoCs...) deterministic?
Why not adopt seL4 like everybody else who is not outright delusional[0][1]?
How long until you have SIL-4 under control and can demonstrate it?
Hmph, AFAIK systemd has been struggling with TPM stuff for a while (much longer than I anticipated). It’s kinda understandable that the founder of systemd is joining this attestation business, because attestation ultimately requires far more than a stable OS platform plus an attestation module.
A reliably attestable system has to nail the entire boot chain: BIOS/firmware, bootloader, kernel/initramfs pairs, the `init` process, and the system configuration. Flip a single bit anywhere along the process, and your equipment is now a brick.
Getting all of this right requires deep system knowledge, plus a lot of hair-pulling adjustment, assuming you still have hair left.
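For intuition about why a single flipped bit is enough to brick things: each stage of the boot chain is measured into a TPM PCR with an extend operation, so the final value depends on every byte measured before it, and that final value is what disk-encryption keys get sealed against or what a remote verifier compares. A simplified Python model of the extend semantics (real TPMs keep multiple PCRs and banks, and measurements follow the TCG event-log format):

    # Simplified TPM PCR "extend": PCR = SHA256(PCR || SHA256(measurement)).
    import hashlib

    def extend(pcr: bytes, measurement: bytes) -> bytes:
        return hashlib.sha256(pcr + hashlib.sha256(measurement).digest()).digest()

    def measure_boot(stages: list[bytes]) -> bytes:
        pcr = b"\0" * 32                  # PCRs start zeroed at power-on
        for stage in stages:
            pcr = extend(pcr, stage)      # firmware, bootloader, UKI, config, ...
        return pcr

    good = [b"firmware", b"bootloader", b"kernel+initrd (UKI)", b"config"]
    expected = measure_boot(good)         # the value keys are sealed to / attested against

    evil = good.copy(); evil[1] = b"bootloadeR"   # one changed bit in one stage
    assert measure_boot(evil) != expected         # key release / attestation now fails

There is no way to "unextend", which is why recovering from a bad measurement usually means fixing the offending component and rebooting rather than patching things up from the running system.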
I think this part of Linux has been underrated. TPM is a powerful platform that is universally available, and Linux is the perfect OS to fully utilize it. The need for trust in the digital realm will only increase. Who knows, it may even integrate with cryptocurrency or even social platforms. I really wish them good luck.
Terrible idea, I hope they go bankrupt.
I can see like 100 ways this can make computing worse for 99% of people and like 1-2 scenarios where it might actually be useful.
Like, if the politicians pushing for chat control / on-device scanning of data come knocking again and it actually goes through (they can try infinitely many times), tech like this will really be "useful". Oops, your device cannot produce a valid attestation, no internet for you.
The typical HN rage-posting about DRM aside, there's no reason that remote attestation can't be used in the opposite direction: to assert that a server is running only the exact code stack it claims to be, avoiding backdoors. This can even be used with fully open-source software, creating an opportunity for OSS cloud-hosted services which can guarantee that the OSS and the build running on the server match. This is a really cool opportunity for privacy advocates if leveraged correctly - the idea could be used to build something like Apple's Private Cloud Compute but even more open.
In addition, the benefit is a bit ridiculous, like that of DRM itself. Even if it worked, your "trusted software" is literally going to be running in an office full of the most advanced crackers money can buy, with every incentive to exploit your scheme and not publish the fact that they did. The attack surface of the entire thing is so large it boggles the mind that there are people who believe in the "secure computing cloud" scenario.
WHAT is the usage and benefit for private users? This is always neglected.
Avoiding backdoors, as a private person, is something you can only solve by having the hardware at your place, because hardware can ALWAYS have backdoors, because hardware vendors do not fix their shit.
From my point of view it ONLY gives control and possibilities to large organizations like governments and companies. which in turn use it to control citizens
You're absolutely right, but considering Windows requirements drive the PC spec, this capability can be used to force Linux distributions in bad ways.
So, some of the people doing "typical HN rage-posting about DRM" are also absolutely right.
The capabilities locking down macOS and iOS and related hardware also can be used for good, but they are not used for that.
> but considering Windows requirements drive the PC spec, this capability can be used to force Linux distributions in bad ways
What do you mean by this?
Is the concern that systemd is suddenly going to require that users enable some kind of attestation functionality? That making attestation possible or easier is going to cause third parties to start requiring it for client machines running Linux? This doesn't even really seem to be a goal; there's not really money to be made there.
As far as I can tell the sales pitch here is literally "we make it so you can assure the machines running in your datacenter are doing what they say they are," which seems pretty nice to me, and the perversions of this to erode user rights are either just as likely as they ever were or incredibly strange edge cases.
Microsoft has a "minimum set of requirements" document about "Designed for Windows" PCs. You can't sell a machine with Windows or claim it's Windows compatible without complying with that checklist.
So, every PC sold to consumers is sanctioned by Microsoft. This list contains Secure Boot and TPM based requirements, too.
If Microsoft decides to eliminate enrollment of user keys and the Secure Boot toggle, they can revoke the current signing keys for "shims" and force Linux distributions to go fully immutable to get their bootloaders "signed" so they can boot. As said above, it's not something Amutable can control, but they enable it by proxy and by accident.
Look, I work in a datacenter, with a sizeable fleet. Being able to verify that fleet is desirable for some kinds of operations, I understand that. On the other hand, like every double-edged sword, this can cut both ways.
I just want to highlight that, that's all.
> there's no reason that remote attestation can't be used in the opposite direction
There is: corporate will fund this project and enforce its usage for their users not for the sake of the users and not for the sake of doing any good.
What it will be used for is to bring you a walled garden into Linux and then slowly incentivize all software vendors to only support that variety of Linux.
LP has a vast, vast experience in locking down users' freedom and locking down Linux.
> There is: corporate will fund this project and enforce its usage for their users not for the sake of the users and not for the sake of doing any good.
I'd really love to see this scenario actually explained. The only place I could really see client-side desktop Linux remote attestation gaining any foothold is to satisfy anti-cheat for gaming, which might actually be a win in many ways.
> What it will be used for is to bring you a walled garden into Linux and then slowly incentivize all software vendors to only support that variety of Linux.
What walled garden? Where is the wall? Who owns the garden? What is the actual concrete scenario here?
> LP has a vast, vast experience in locking down users' freedom and locking down Linux.
What? You can still use all of the Linuxes you used to use? systemd is open source, open-application, and generally useful?
Like, I guess I could twist my brain into a vision where each Ubuntu release becomes an immutable rootfs.img and everyone installs overlays over the top of that, and maybe there's a way to attest that you left the integrity protection on, but I don't really see where this goes past that. There's no incentive to keep you from turning the integrity protection off (and no means to do so on PC hardware), and the issues in Android-land with "typical" vendors wanting attestation to interact with you are going to have to come to MacOS and Windows years before they'll look at Linux.
> client-side desktop Linux remote attestation gaining any foothold is to satisfy anti-cheat for gaming, which might actually be a win in many ways.
It will be, no doubt. As soon as it is successfully tested and deployed for games, it will be used for movies, government services, banks, etc. And before you know you do not have control of your own computer.
> Who owns the garden?
Not you.
> everyone installs overlays over the top of that
Except this breaks cryptography and your computer is denied multiple services. Because you are obviously a hacker, why else would anyone want to compile and run programs.
> turning the integrity protection off (and no means to do so on PC hardware)
It's a flip of a switch, really. Once Microsoft decides you have had enough, the switch is flipped and in a couple of years no new Intel computer will boot your kernel.
It doesn't stop remote code injection. Protecting the boot path is frankly hardly relevant on servers compared to actual threats.
You will get 10,000 zero-days before you get a single direct attack on the hardware.
The idea is that by protecting boot path you build a platform from which you can attest the content of the application. The goal here is usually that a cloud provider can say “this cryptographic material confirms that we are running the application you sent us and nothing else” or “the cloud application you logged in to matched the one that was audited 1:1 on disk.”
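A toy sketch of what that cryptographic material looks like conceptually: the verifier sends a fresh nonce, the machine's TPM signs ("quotes") its current PCR values together with that nonce using an attestation key, and the verifier checks the signature, the nonce, and that the PCRs match the values expected for the audited build. A rough Python model of the verifier side; an HMAC with a shared key stands in for the TPM attestation key's asymmetric signature purely to keep the sketch self-contained:

    # Toy remote-attestation flow: quote = sign(nonce || PCR values), checked by the verifier.
    import hashlib, hmac, os

    AK = os.urandom(32)   # stand-in for the device's attestation key

    def device_quote(pcrs: dict[int, bytes], nonce: bytes) -> tuple[bytes, bytes]:
        digest = hashlib.sha256(nonce + b"".join(pcrs[i] for i in sorted(pcrs))).digest()
        return digest, hmac.new(AK, digest, hashlib.sha256).digest()

    def verifier_check(expected_pcrs: dict[int, bytes], nonce: bytes,
                       digest: bytes, sig: bytes) -> bool:
        if not hmac.compare_digest(hmac.new(AK, digest, hashlib.sha256).digest(), sig):
            return False                  # not signed by the expected attestation key
        expected = hashlib.sha256(
            nonce + b"".join(expected_pcrs[i] for i in sorted(expected_pcrs))).digest()
        return hmac.compare_digest(digest, expected)   # PCRs match the audited image

    pcrs = {4: hashlib.sha256(b"signed UKI").digest(), 7: hashlib.sha256(b"sb policy").digest()}
    nonce = os.urandom(16)                # fresh per request, prevents replaying old quotes
    print(verifier_check(pcrs, nonce, *device_quote(pcrs, nonce)))   # True

The interesting property is that the check is about the measured software state, not where the machine sits, which is what lets a customer verify a cloud host (or a fleet operator verify their own servers) without physical access.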
I knew they had an authoritarian streak. This is not surprising, and frankly horrifyingly dystopian.
"Those who give up freedom for security deserve neither."
Really excited to see a company investing in immutable and cryptographically verifiable systems. Two questions really:
1. How will the company make money? (You have probably been asked that a million times :).)
2. Similar to the sibling: what are the first bits that you are going to work on?
At any rate, super cool and very nice that you are based in EU/Germany/Berlin!
1. We are confident we have a very robust path to revenue.
2. Given the team, it should be quite obvious there will be a Linux-based OS involved.
Our aims are global but we certainly look forward to playing an important role in the European tech landscape.
"We are confident we have a very robust path to revenue."
I take it that you are not at this stage able to provide details of the nature of the path to revenue. On what kind of timescale do you envisage being able to disclose your revenue stream/subscribers/investors?
"Ubuntu Core" is a similar product [1]
As I understand it, the main customers for this sort of thing are companies making Tivo-style products - where they want to use Linux in their product, but they want to lock it down so it can't be modified by the device owner.
This can be pretty profitable once your customers have rolled out a fleet of hardware locked down to only run kernels you've signed.
Appreciate the clarification, but this actually raises more questions than it answers.
A "robust path to revenue" plus a Linux-based OS and a strong emphasis on EU / German positioning immediately triggers some concern. We've seen this pattern before: wrap a commercially motivated control layer in the language of sovereignty, security, or European tech independence, and hope that policymakers, enterprises, and users don't look too closely at the tradeoffs.
Europe absolutely needs stronger participation in foundational tech, but that shouldn't mean recreating the same centralized trust and control models that already failed elsewhere, just with an EU flag on top. 'European sovereignty' is not inherently better if it still results in third-party gatekeepers deciding what hardware, kernels, or systems are "trusted."
Given Europe's history with regulation-heavy, vendor-driven solutions, it's fair to ask:
Who ultimately controls the trust roots?
Who decides policy when commercial or political pressure appears?
What happens when user interests diverge from business or state interests?
Linux succeeded precisely because it avoided these dynamics. Attestation mechanisms that are tightly coupled to revenue models and geopolitical branding risk undermining that success, regardless of whether the company is based in Silicon Valley or Berlin.
Hopefully this is genuinely about user-verifiable security and not another marketing-driven attempt to position control as sovereignty. Healthy skepticism seems warranted until the governance and trust model are made very explicit.
We detached this subthread from https://news.ycombinator.com/item?id=46784719.
No personal attacks on HN, please.
This is relevant. Every project he's worked on has been a dumpster fire. systemd sucks. PulseAudio sucks. GNOME sucks. Must the GP list out all the ways in which they suck to make it a more objective attack?
My comment was not a personal attack. But I can rephrase it if you want it more in the spirit of the guidelines. Here we go:
I'm interested in what Amutable is building, but I'm personally uneasy about Lennart Poettering being involved. This isn't about denying his technical ability or past impact. My concern is more about the social/maintenance dynamics that have repeatedly shown up around some of the projects he's led in the Linux ecosystem - highly centralizing designs, big changes quickly landing in core technology, and the kind of communication/governance style that at times left downstream maintainers and parts of the community feeling steamrolled rather than brought along. I've watched enough of those cycles to be wary when the same leadership style shows up again, especially in something that might become infrastructure people depend on.
To keep this constructive: for folks who've followed his work more closely than I have, do you think those past community frictions were mostly a function of the environment (big distro politics, legacy constraints, etc), or are they intrinsic to how he approaches projects? And for people evaluating Amutable today, what signals would you look for to distinguish "strong technical leadership" from "future maintenance and ecosystem headaches" ?
If anyone from the company is reading, I'd be genuinely reassured by specifics like:
- a clear governance/decision process (who can say "no", how major changes are reviewed)
- a commitment to compatibility and migration paths (not just "it's better, switch")
- transparent security and disclosure practices
- a plan for collaboration with downstream parties and competitors (standards, APIs, interop)
I realize this is partly subjective. I’m posting because I expect I'm not the only one weighing "technical upside" against "community cost," and I'd like to hear how others are thinking about it.
If you don't think that's a community opinion, it's at least an AI's opinion, since all I prompted it with was "rewrite my comment to follow the HN guidelines".

That's a proxy metric for what we really care about: acceptance of differences, tolerance of others, diversity of perspectives, etc. In principle, you can achieve these goals with a team whose members are all one ethnicity and gender; it's just that a fair selection process won't produce such a team often. And, as it turns out, optimising for the "people who look different" proxy metric doesn't do a terrible job of optimising for the true metric, provided the "cultural fit"-type selection effects are weak enough.
The systemd crowd are perhaps worse than GNOME, as regards "my way or the highway", and designing systems that are fundamentally inadequate for the general use-case. I don't think ethnicity or gender diversity quotas would substantially improve their decision-making: all it would really achieve is to make it harder to spot the homogeneity in a photograph. A truly diverse team wouldn't make the decisions they make.
People demonize attestation. They should keep in mind that far from enslaving users, attestation actually enables some interesting, user-beneficial software shapes that wouldn't be possible otherwise. Hear me out.
Imagine you're using a program hosted on some cloud service S. You send packets over the network; gears churn; you get some results back. What are the problems with such a service? You have no idea what S is doing with your data. You incur latency, transmission time, and complexity costs using S remotely. You pay, one way or another, for the infrastructure running S. You can't use S offline.
Now imagine instead of S running on somebody else's computer over a network, you run S on your computer instead. Now, you can interact with S with zero latency, don't have to pay for S's infrastructure, and you can supervise S's interaction with the outside world.
But why would the author of S agree to let you run it? S might contain secrets. S might enforce business rules S's author is afraid you'll break. Ordinarily, S's authors wouldn't consider shipping you S instead of S's outputs.
However --- if S's author could run S on your computer in such a way that he could prove you haven't tampered with S or haven't observed its secrets, he can let you run S on your computer without giving up control over S. Attestation, secure enclaves, and other technologies create ways to distribute software that otherwise wouldn't exist. How many things are in the cloud solely to enforce access control? What if they didn't have to be?
Sure, in this deployment model, just like in the cloud world, you wouldn't be able to run a custom S: but so what? You don't get to run your custom S either way, and this way, relative to cloud deployment, you get better performance and even a little bit more control.
Also, the same thing works in reverse. You get to run your code remotely in a such a way that you can trust its remote execution just as much as you can trust that code executing on your own machine. There are tons of applications for this capability that we're not even imagining because, since the dawn of time, we've equated locality with trust and can now, in principle, decouple the two.
Yes, bad actors can use attestation technology to do all sorts of user-hostile things. You can wield any sufficiently useful tool in a harmful way: it's the utility itself that creates the potential for harm. This potential shouldn't prevent our inventing new kinds of tool.
> People demonize attestation. They should keep in mind that far from enslaving users, attestation actually enables some interesting, user-beneficial software shapes that wouldn't be possible otherwise. Hear me out.
But it won't be used like that. It will be used to take user freedoms away.
> But why would the author of S agree to let you run it? S might contain secrets. S might enforce business rules S's author is afraid you'll break. Ordinarily, S's authors wouldn't consider shipping you S instead of S's outputs.
That use case you're describing is already there and is currently being done with DRM, either in the browser or in the app itself.
You are right that it will make this easier to do, and in theory it is still a better option for video games than kernel anti-cheat. But it is still limiting user freedoms.
> Yes, bad actors can use attestation technology to do all sorts of user-hostile things. You can wield any sufficiently useful tool in a harmful way: it's the utility itself that creates the potential for harm. This potential shouldn't prevent our inventing new kinds of tool.
The majority of uses will be user-hostile, because those are the only cases where someone will decide to fund it.
> Attestation, secure enclaves, and other technologies create ways to distribute software that otherwise wouldn't exist. How many things are in the cloud solely to enforce access control? What if they didn't have to be?
To be honest, mainly companies need that; personal users do not. And additionally, companies are NOT restrained by governments from exploiting customers as much as possible.
So... I also see it as enslaving users. And tell me: where does this actually give PRIVATE persons, NOT companies, a net benefit?
additionally:
> This potential shouldn't prevent our inventing new kinds of tool.
Why can I see someone who wants to build an atomic bomb for shits and giggles using this argument too? As hyperbolic as my example is, the argument given here is not good either.
The immutable Linux people build tools without building the good tools that would actually make it easier for private people at home to adapt an immutable Linux to THEIR liking.
The atomic bomb is good example of what I'm talking about. The reason we haven't had a world war in 80 years is the atomic bomb. Far from being an instrument of misery, it's given us an age of unprecedented peace and prosperity. Plus, all the anti-nuclear activism in the world hasn't come one step closer to banishing nuclear weapons from the earth.
In my personal philosophy, it is never bad to develop a new technology.
I will put some trust into these people if they make this a pure nonprofit organization at the minimum. Building IN measures to ensure that this will not be pushed for the most obvious use case, which is to fight user freedom. This shouldn't be some afterthought.
"Trust us" is never a good idea with profit seeking founders. Especially ones who come from a culture that generally hates the hacker spirit and general computing.
You basically wrote a whole narrative of things that could be. But the team is not even willing to make promises as big as yours. Their answers were essentially just "trust us we're cool guys" and "don't worry, money will work out" wrapped in average PR speak.
> trust us we're cool guys
I'm guessing you're referencing my comment, that isn't what I said.
> But the team is not even willing to make promises as big as yours.
Be honest, look at the comment threads for this announcement. Do you honestly think a promise alone would be sufficient to satisfy all of the clamouring voices?
No, people would (rightfully!) ask for more and more proof -- the best proof is going to be to continue building what we are building and then you can judge it on its merits. There are lots of justifiable concerns people have in this area but most either don't really apply to what we are building or are much larger social problems that we really are not in a position to affect.
I would also prefer to be judged based on my actions, not on wild speculation about what I might theoretically do in the future.
Shall it be backdoorable, like systemd-enabled distros nearly had a backdoored SSH? Non-systemd distros weren't affected, after all.
Why should we trust microsofties to produce something secure and non-backdoored?
And, lastly, why should Linux's security be tied to a private company? Oooh, but it's of course not about security: it's about things like DRM.
I hope Linus doesn't get blinded here: systemd managed to get PID 1 on many distros but they thankfully didn't manage, yet, to control the kernel. I hope this project isn't the final straw that lets them finally meddle with the kernel.
Currently I'm doing:
Proxmox / systemd-less VMs / containers
But Proxmox is Debian-based and Debian really drank too much of the systemd koolaid. So my plan is:
FreeBSD / bhyve hypervisor / systemd-less Linux VMs / containers
And then I'll be, at long last, systemd-free again.

This project is an attack on general-purpose computing.
Cheating was solved before any of this rootkit level malware horseshit.
Community ran servers with community administration who actually cared about showing up and removing bad actors and cheaters.
Plenty of communities are still demonstrating this exact fact today.
Companies could 100% recreate this solution with fully hosted servers, with an actually staffed moderation department, but that slightly reduces profit margins so fuck you. Keep in mind community servers ran on donations most of the time. That's the level of profit they would lose.
Companies completely removed community servers as an option instead, because allowing you to run your own servers means you could possibly play the game with skins you haven't paid for!!! Oh no!!! Getting enjoyment without paying for it!!!
All software attempts at anti-cheat are impossible. Even fully attested consoles have had cheats and other ways of getting an advantage that you shouldn't have.
Cheating isn't defined by software. Cheating is a social problem that can only be solved socially. The status quo 20 years ago was better.
Every day the world is becoming more polarized. Technology corporations gain ever more control over people's lives, telling people what they can do on their computers and phones, what they can talk about on social platforms, censoring what they please, wielding the threat of being cut off from their data and their social circles on a whim. All over the world, in dictatorships and also in democratic countries, governments turn more fascist and more violent. They demonstrate that they can use technology to oppress their population, to hunt dissent and to efficiently spread propaganda.
In that world, authoring technology that enables this even more is either completely mad or evil. To me Linux is not a technological object, it is also a political statement. It is about choice, personal freedom, acceptance of risk. If you build software that actively intends to take this away from me to put it into the hands of economic interests and political actors then you deserve all the hate you can get.
> To me Linux is not a technological object, it is also a political statement. It is about choice, personal freedom ...
I've used Linux since the Slackware days. Poettering is the worst thing that happened to the Linux ecosystem and, of course, he went on to work for Microsoft. Just to add a huge insult to the already painful injury.
This is not about security for the users. It's about control.
At least many in this thread are criticizing the project.
And, once again of course, it's from a private company.
Full of ex-Microsofties.
I don't know why anyone interested in hacking would cheer for this. But then maybe HN should be renamed "CN" (Corporate News) or "MN" (Microsoft News).
> Poettering is the worst thing that happened to the Linux ecosystem and, of course, he went on to work for Microsoft. Just to add a huge insult to the already painful injury.
agreed, and now he's planning on controlling what remains of your machine cryptographically!
> I've used Linux since the Slackware days. Poettering is the worst thing that happened to the Linux ecosystem
Same here, Linux since about 1995. Same opinion.
> And, once again of course, it's from a private company. Full of ex-Microsofties.
And funded. And confident they will sell the product well.
For all those people being negative, please see all the comments from when Red Hat was acquired by IBM (2018):
https://news.ycombinator.com/item?id=18321884
- Linux is better now
- Nothing bad
The immediate concern on seeing this is: will the maintainers of systemd use their position to push this on everyone through systemd, like every other extended feature?
Whatever it is, I hope it doesn't go the usual path of minimal support, then optional support, and then becoming virtually mandatory through tight coupling with other subsystems.