Microsoft forced me to switch to Linux
(himthe.dev)
1859 points by bobsterlobster 4 days ago
> UI framework balkanization has always been, and remains a hideous mess
I thought you were talking about Windows there. There are 4 (5?) different UI paradigms within Windows, and doing one thing sometimes requires you to interact with each of them.
At least on Linux, with GTK/KDE, you can pick a camp and have a somewhat consistent experience, with a few outliers. Plus many apps now just use CSD and fully integrate their designs into the window, so it's hopeless to expect every window's styling to be consistent.
I never had to mind X vs Wayland when starting user applications tho.
If we're talking about mass adoption of Linux then there really has to be no concept of even "picking a camp". The vast majority of users - even techy people - will not understand what a window manager is, never mind be capable of choosing one.
Yes, there are many UI implementations in Windows but they are almost totally transparent to the user (no pun intended), and they can all run on the same system at once.
Hard disagree. You can run the same programs on any DE or window manager, or even without one (on pure X11, for example). That's not a hurdle, it's a feature.
Users who don't know about the feature can just use a pre-configured system like Mint Cinnamon and never know about any of these things.
The same is true of Linux - GTK3 apps run just fine on Plasma, and so do GTK4, Qt 5, Qt 6, and X11 apps, and so on.
Sure they all look slightly different, but it's definitely worse on Windows in that regard.
> Yes, there are many UI implementations in Windows but they are almost totally transparent to the user (no pun intended), and they can all run on the same system at once.
Just like Linux. You can run most if not all apps in any DE. Yes gnome will look ugly, but that's gnome's way of doing things. If you pick a decent DE, you will have most basic apps using the same styling, and the rest have CSD anyway.
Each GUI toolkit has its own specialties, but you'll use at most two of them, and they will be kept in separate apps. (Apart from flatpak portals which use gtk instead of the system's).
Windows has 3-5 different UI/UX layers within the same application ... And the rest have CSD anyway, so they look the same no matter the OS.
> Yes, there are many UI implementations in Windows but they are almost totally transparent to the user (no pun intended), and they can all run on the same system at once.
I mean this is a solved problem on linux using modern distributions like NixOS or even 'normal' distros with flatpak, appimage, etc. I haven't had to deal with anything like this in years.
The Windows UIs differ from one another far more than Linux toolkits ever did. There was a time in the 90s when UIs were expected to follow platform conventions. These days, most UIs don't, and the look is almost a kind of branding. So this is not as big a deal as you're making it out to be. If anything, things like the GNOME apps and GTK4 are more consistent than any Windows app.
No, it's not about users picking a camp, it's about developers.
It's been a long, long time since I've seen an application utterly fail to load because it's a GTK/QT/etc framework running under a totally different DE.
Gnome apps look ugly as hell under KDE[0], but they still work. As a user, you don't need to know or care in any way. It'll run on your machine.
[0]I don't know if they're ugly because of incompatibility or if that's just How Gnome Is. I suspect the latter
>many apps now just use CSD
If there's something I hate about Linux, it's CSD (Client-Side Decorations, in case people don't know what it is).
If I wanted all my apps to look different from each other, I'd use macOS. I want a clean desktop environment, with predictable window frames that are customizable and they all look the same. CSD destroys that.
Having no CSD at all is unacceptable on small screens IMHO, far too much real estate is taken up by a title bar, you can be competitive with SSD by making them really thin, but then they are harder to click on and impossible with touch input. At the moment I have firefox setup with CSD and vertical tabs, only 7% of my vertical real estate is taken up by bars (inc. Gnome), which is pretty good for something that supports this many niceties.
Conversely, I don't want all of my apps to look identical to each other. I want to be able to tell in a split-second glance which app I am working on or looking for, without having to cognitively engage to locate it, breaking my state of flow in the process.
Linux doesn't mean GNOME.
KDE favors server-side decorations.
I mean, most apps I use daily, no matter the OS, have CSD. Teams, Spotify, Slack, Firefox, Postman, IntelliJ etc...
Doesn't matter which OS they all have different styles. I can understand it's not liked by everyone, but that ship has sailed and no "big" app will use SSD anymore.
> UI framework balkanization has always been, and remains a hideous mess.
At least things look more or less the same over time. With commercial offerings one day you open your laptop and suddenly everything looks different and all the functions are in a different submenu because some designer thought it was cool or some manager needed a raise.
> It'll probably work fine out of the box, but if it doesn't. Hoo boy.
LLMs are actually very useful for Linux configuration problems. They might even be the reason so many users made the switch recently.
Pair-programming Nix with Gemini has taught me a lot about the assistive power of LLMs.
They're still slow and annoying at languages I'm good at. But it's really handy to be able to take one I'm not (like Nix or even C++) and say "write me a patch that does …" Applying a lot of the same thinking/structuring skills, but not tripping on the syntax.
They're pretty good for most things, yes... but man was it rough figuring out getting my IP allocation routing right on my Proxmox server. The system is issued a primary IP, and need to route my subnet through that to my VMs... wasn't too bad once I got it working... I'd also wanted a dnat for "internal" services, and that's where it got tricky.
I need to refresh myself as I'm wanting to move from a /29 to a /28 ... mostly been lazy about not getting it done, but actually making progress on some hobby stuff with Claude Code... definitely a force multiplier, but I'm not quite at a "vibe code" level of trust, so it's still a bit of a slog.
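For anyone wrestling with the same thing, the routed Proxmox shape usually looks roughly like this (a sketch only; every address, interface name, and port here is a hypothetical placeholder, substitute your own allocation):

    # /etc/network/interfaces -- routed setup (sketch; addresses are placeholders)
    auto eno1
    iface eno1 inet static
        address 203.0.113.10/24          # the host's issued primary IP
        gateway 203.0.113.1

    auto vmbr0
    iface vmbr0 inet static
        address 198.51.100.9/29          # first usable IP of the routed subnet
        bridge-ports none
        bridge-stp off
        bridge-fd 0
        post-up sysctl -w net.ipv4.ip_forward=1

    # VMs bridge to vmbr0, take 198.51.100.10-14, gateway 198.51.100.9.
    # The trickier DNAT for "internal" services is then a host rule like:
    iptables -t nat -A PREROUTING -d 203.0.113.10 -p tcp --dport 8443 \
        -j DNAT --to-destination 10.10.10.10:443   # a VM on a private bridge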
> Wayland has fractional scaling as a sort-of workaround if you can tolerate the entire screen being blurry. Every other major OS can deal with this.
I think Windows is the only other one which really does this properly, macOS also does the hack where they simulate fractional scales by rendering with an integer scale at a non-native resolution then scaling it down.
> I think Windows is the only other one which really does this properly
Windows is the only one that does this properly.
Windows handles high pixel density on a per-application, per-display basis. This is the most fine-grained. It's pretty easy to opt in on reasonably modern frameworks, too; just add in the necessary key in the resource manifest; done. [1]
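For reference, the manifest fragment from that docs page looks roughly like this (other values than PerMonitorV2 exist for the older, coarser behaviors):

    <application xmlns="urn:schemas-microsoft-com:asm.v3">
      <windowsSettings>
        <dpiAwareness xmlns="http://schemas.microsoft.com/SMI/2016/WindowsSettings">PerMonitorV2</dpiAwareness>
      </windowsSettings>
    </application>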
Linux + Xorg has a global pixel density scale factor. KDE/Qt handle this OK; GNOME/GTK break when the DPI is not an integer multiple of 96 (i.e., the scale factor is not an integer) and fall back to raster scaling.
Linux + Wayland has per-display scaling factors, but Chromium, GNOME, and GTK break the same way as the Xorg setup. KDE/Qt are a bit better, but I'm quite certain the taskbar icons are sharper on Xorg than they are on Wayland. I think this boils down to subpixel rendering not being enabled.
And of course, every application on Linux in theory can handle high pixel density, but there is a zoo of environment variables and command-line arguments that need to be passed for the ideal result.
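A taste of that zoo (a sketch; which of these matter, and the right values, depend on toolkit versions and your scale factor):

    export GDK_SCALE=2 GDK_DPI_SCALE=0.5          # GTK: integer scale, font-size compensation
    export QT_AUTO_SCREEN_SCALE_FACTOR=1          # Qt 5: pick up per-screen DPI
    export QT_ENABLE_HIGHDPI_SCALING=1            # Qt 5.14+ / Qt 6
    export _JAVA_OPTIONS='-Dsun.java2d.uiScale=2' # Java/Swing apps
    chromium --force-device-scale-factor=1.5      # Chromium/Electron flag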
On macOS, if the pixel density of the target display is at least some Apple-blessed number that they consider 'Retina', then the 'Retina' resolutions are enabled. At resolutions that are not integer multiples of the physical resolution, the framebuffer is four times the resolution of the displayed values (twice in each dimension), and then the final result is raster-scaled with some sinc/Lanczos algorithm back down to the physical resolution. This shows up as ringing artifacts, which are very obvious with high-contrast, thin regions like text.
On non-retina resolutions, there is zero concept of 'scaling factor' whatsoever; you can choose another resolution, but it will be raster-scaled (usually up) with some bi/trilinear filtering, and the entire screen is blurry. The last time Windows had such brute-force rendering was in Windows XP, 25 years ago.
[1]: https://learn.microsoft.com/en-gb/windows/win32/hidpi/settin...
> Windows is the only one that does this properly. Windows handles high pixel density on a per-application, per-display basis.
This is not our [0] experience. macOS handles things on a per-section-of-window, per-application, per-display basis. You can split a window across two monitors at two different DPIs, and it will display perfectly. This does not happen on Windows, or we have not found the right way to make it work thus far.
[0] ardour.org
> then the final result is raster-scaled with some sinc/Lanczos algorithm back down to the physical resolution. This shows up as ringing artifacts, which are very obvious with high-contrast, thin regions like text.
I don't think this is true. I use non-integer scaling on my Mac since I like the UX to be just a little bit bigger, and have never observed any kind of ringing or any specific artifacts at all around text, nor have I ever heard this as a complaint before. I assume it's just bilinear or bicubic unless you have evidence otherwise? The only complaint people tend to make is ever-so-slight additional blurriness, which barely matters at Retina resolution.
ChromeOS also does fractional scaling properly because Chrome does it properly. The scaling factor is propagated through the rendering stack so that content is rastered at the correct scale from the beginning instead of using an integer scaling factor and then downscaling later. And it takes subpixel rendering into account too, which affects things like what elements can be squashed into layers backed by GPU textures.
I think Android does it properly too because they have to handle an entire zoo of screen sizes and resolutions there. Although they don't have the issue of dealing with subpixel rendering.
This is a surprising opinion to encounter, given my experience with scaling on Windows, where simple things like taking my laptop off its dock (going from desktop monitors to laptop screen) causes applications to become blurry, and they stay blurry even when I've returned the laptop to the dock. Or how scaling causes some maximized window edges to show up on the adjacent screen. Or all manner of subtle positioning and size bugs crop up.
Is this more of an aspirational thing, like Windows supports "doing it right", and with time and effort by the right people, more and more applications may be able to be drawn correctly?
[edit] I guess so, I see your comment about setting registry keys to make stuff work in Microsoft's own programs. That aligns more closely with my experience.
> Windows is the only one that does this properly.
How can you say this when applications render either minuscule or gigantic, either way with contents totally out of proportion, seemingly at random?
I don’t have to pull out a magnifying glass to notice those issues.
> The last time Windows had such brute-force rendering was in Windows XP, 25 years ago.
To be fair, UXGA was a thing 20 years ago. I don't think it makes sense for Apple to care all that much about low DPI monitors. They don't sell any, and they wouldn't be acceptable to most Apple people, who have had crisp displays available for > 10 years now. I wouldn't be surprised if the number of Apple users on low dpi is single digit percentage.
That's roughly what I did for my ANSI console/viewer... I started with EGA resolution, and each EGA pixel renders 3x4 in its buffer, then a minor blur, then scaled to fit the render area. The effect is really good down to about 960px wide, which is a bit bigger in terms of real pixels than the original... at 640px wide, it's a little hard to make out the actual pixels... but it's the best way I could think of to handle the non-square pixels of original EGA or VGA... I went with EGA because the ratio is slightly cleaner IMO. It's also what OG RIPterm used.
> * UI framework balkanization has always been, and remains a hideous mess.
I'd take balkanization over the "we force-migrate everyone to the hot new thing where nothing works".
> It'll probably work fine out of the box, but if it doesn't.
Drivers are a pain point and will probably stay so until the market share is too large for the hardware vendors to ignore. Which probably isn't happening any time soon, sadly.
This is not a driver issue I'm talking about. It's a "best way to adjust the white balance is with this GTK+-2.0 app that hasn't seen maintenance since the Bush administration" issue.
> I'd take balkanization over the "we force-migrate everyone to the hot new thing where nothing works".
The UI framework for macOS has not changed in any substantial design-update-requiring ways since OS X was first released. They did add stuff (animations as a core concept, most notably).
The UI framework for Windows has changed even less, though it's more of a mess because there are several different ones, with an unclear relationship to each other. win32 won't hurt though, and it hasn't changed in any significant ways since dinosaurs roamed the silicon savannahs.
The UI framework for Linux ... oh wait, there isn't one.
I had to dump a perfectly fine c.2012 workstation recently because of video driver limitations. I could no longer stay current on my flavor of Linux (openSUSE) without being stuck with hideous display resolution limited to just one monitor. NVIDIA's proprietary drivers are great, but the limited support lifecycle plus poor open source coverage means Linux is now turning fine systems into trash just the way Windows used to do.
>poor open source coverage is actually making Linux turn fine systems into trash just the way Windows used to do.
I'd blame Linux for only a very small percentage of the problem here. This is on NVIDIA, ensuring their hardware doesn't last too long and forcing you to throw it away eventually. The open source drivers can make the monitor 'work' but aren't efficient, and really never can be, because NVIDIA doesn't release the needed information and the drivers compete directly with NVIDIA's proprietary one.
On UI frameworks... mostly agree, I say this as a COSMIC user even... so many apps still don't show up right in the tray, but it's getting a bit better, I always found KDE to be noisy, and don't like how overtly political the Gnome guys are. So far Wayland hasn't been bad, X apps pretty much just work, even if they don't scale right.
I'm on a very large OLED 3440x1440 display and haven't had too many issues... some apps seem to just drop out, I'm not sure if they are on a different workspace or something as I tend to just stick to single screen, single display. I need to take the time to tweak my hotkeys for window pinning. I'll usually have my browser to half the screen and my editor and terminal on the other half... sometimes stretching the editor to 2/3 covering part of the browser. I'm usually zoomed in 25-30% in my editor and browser... I'd scale the UI 25% directly, like on windows or mac, but you're right it's worse.
For webcams, I don't use anything too advanced, but the Nexigo cams I've been using lately have been working very well... they're the least painful part of my setup, and even though I tend to use a BT headset, I use the webcam mic as switching in and out of stereo/mono mode for the headset mic doesn't always work right in Linux.
On audio filtering, I can only imagine... though would assume it's finally starting to get better with whatever the current standard is (pipewire?), which from what I understand is closer to what mac's interfaces are. I know a few audio guys and they hate Windows and mostly are shy to even consider Linux.
FWIW, I can do the same with KDE on Xorg with Gentoo Linux.
Since the introduction of the XSETTINGS protocol in like 2003 or 2005 or so to provide a common cross-toolkit mechanism to communicate system settings, the absence of "non-integer" scaling support has always been the fault of the GUI toolkits.
> I think your experience comes from GNOME which lags behind in this regard.
When doesn't GNOME lag behind? Honestly, most of Wayland's problems stem from being a project that depends on protocol implementers and extenders cooperating to make it work, while setting those expectations knowing full well that GNOME would be one of the parties whose cooperation was required.
Mint/cinnamon here at 150%, X11, not blurry. It’s FUD.
The issue with X11 is that it's not dynamic. Think using a laptop, which you sometimes connect to a screen on which you require a different scale. X11 won't handle different scales, and it also won't switch from one to the other without restarting it.
The scaling and UI framework issues are by far my biggest pain point. I will inevitably end up with an app with tiny and/or blurry UI elements every few weeks and have to spend a ton of time figuring out the correct incantation to make it better.
This is on a pretty clean/fresh install of current ubuntu desktop
I'm a lifelong Mac user, but a gaming handheld has gotten me into some of these topics. I dual-boot SteamOS and Windows.
On SteamOS, my 5.1 stereo just works.
On Windows, apparently there was some software package called DTS Live (and/or Dolby Live) needed to wrap the audio stream in a container that the stereo understands. There was a time when there was a patent pool on the AC-3 codec (or something like that - I'm handwaving because I don't know all the details). So Microsoft stopped licensing the patent, and now you just can't use AC-3 on Windows. I spent an evening installing something called Virtual CABLE and trying to use it to jury-rig my own Dolby Live encoder with ffmpeg… Never got it to work.
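The capture-and-encode half of that kind of jury-rig looks roughly like this (a sketch only; the device name is whatever dshow exposes on your machine, and getting the encoded IEC 61937 bitstream back out to the receiver is exactly the part that never cooperated):

    ffmpeg -f dshow -i audio="CABLE Output (VB-Audio Virtual Cable)" \
           -c:a ac3 -b:a 640k -f spdif out.spdif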
It's easy to fall deep into the tinkerhole on Linux, which has kept me away for a long time, but as mainstream platforms get more locked down, or stop supporting things they decide should be obsolete, it's nice to have a refuge where you're still in control, and things still work.
(Insert meme about the Windows API in Proton being a more stable target than actual Windows.)
I use Linux as my daily driver, with a Mac laptop. I only use Windows when I absolutely have to (i.e., testing), and usually through a VM.
Some other rough edges in Linux I've encountered:
- a/v support in various apps. We use Slack for everything (I can't just use something else) and a/v support is pretty bad to where my video frame rate is now ~1Hz and screen share shows a black rectangle. I think that's mostly Slack's fault as Google Hangouts works fine, but it's probably low on their priority list.
- sleep / hibernation is still sometimes flaky. Occasionally it won't wake up after hibernating overnight, and I have to hard reboot (losing any open files, though that's not an issue)
- power management on laptops (and therefore battery life) is still worse than Windows, and way worse than Mac. I tried Framework + Linux for a while and really wanted to love it, but switched to a Mac and am not going back (still run Linux on desktop). There is nothing out there that compares to the M-series MacBooks.
- occasional X/Wayland issues, as mentioned
Hardware support for esoteric things such as the new generation of Wacom EMR is still awkward --- I was able to get the previous gen working on a ThinkPad X61T using Lubuntu --- wish that there was such an easy way to try out Linux on my Samsung Galaxy Book 3 Pro 360....
> * Support for non-standard DPI monitors sucks, mostly because of the previous point. Wayland has fractional scaling as a sort-of workaround if you can tolerate the entire screen being blurry. Every other major OS can deal with this.
This sounds like you're using some old software. GNOME and sway have clean fractional scaling without blurring, though that hasn't always been the case (it used to be terrible).
> Audio filtering is a pain to set up.
Like noise filtering for your microphone? It was pretty trivial to set up: https://github.com/werman/noise-suppression-for-voice
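From memory, the PipeWire filter-chain config in that repo's README looks roughly like this (a sketch; the plugin path and control names vary by version and distro):

    # ~/.config/pipewire/pipewire.conf.d/99-input-denoising.conf
    context.modules = [
      { name = libpipewire-module-filter-chain
        args = {
          node.description = "Noise Canceling source"
          filter.graph = {
            nodes = [
              { type = ladspa
                name = rnnoise
                plugin = /usr/lib/ladspa/librnnoise_ladspa.so   # path is distro-dependent
                label = noise_suppressor_mono
                control = { "VAD Threshold (%)" = 50.0 }
              }
            ]
          }
          capture.props = { node.name = "capture.rnnoise_source" node.passive = true }
          playback.props = { node.name = "rnnoise_source" media.class = Audio/Source }
        }
      }
    ]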
Fractional scaling on Wayland is only blurry for X apps, and even then, most apps have Wayland support at this point, so for the remaining apps, just turn off Xwayland scaling and use native scaling through env vars and flags, and no more blurriness.
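The usual knobs for nudging stragglers onto native Wayland (a sketch; flag support varies by app version):

    export MOZ_ENABLE_WAYLAND=1            # Firefox
    export QT_QPA_PLATFORM=wayland         # Qt apps
    export GDK_BACKEND=wayland,x11         # GTK apps, with X11 fallback
    chromium --ozone-platform-hint=auto    # Chromium/Electron apps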
- Yes. I think big players in Linux should start supporting core functionalities in GNOME and KDE, and make it polished for laptops and desktops and that would be very cool. For a long time, KDE had a problem of having too many things under its umbrella. Now, with separation of Plasma Desktop and Applications, focusing on Plasma Desktop and KDE PIM should be a good step.
- Kind of ties into the previous point: KDE on Wayland does this extremely well.
- You're back 20 years because the problems are exactly the ones from 20 years ago: vendors refusing to support Linux with drivers.
- Audio filtering? Interesting. I know people who use PipeWire + JACK quite reasonably. But maybe you have a use case I'm not aware of? Would be happy to hear some.
It would help if Gnome wasn't so hostile towards proper cross-DE interop. A famous quote by a Gnome dev goes, "I guess you have to decide if you are a GNOME app, an Ubuntu app, or an Xfce app unfortunately"
They seem to genuinely believe that their way is the right way and everyone else is "holding it wrong" so there's no need for things that would make cross-DE apps easier (or even possible).
I've also been running fractional scaling on Sway for many years now and native wayland applications are not blurry. X11 apps run through XWayland will be blurry, but I don't have any legacy X11 apps remaining on my system.
> UI framework balkanization has always been, and remains a hideous mess.
Amen.
But which OS doesn't have this problem? I'm currently running Windows on a work laptop, and even freaking first-party apps have a different look and behave differently from one another. Teams can't even be assed to use standard Windows notifications! And don't get me started on Electron apps, which most apps are nowadays, each coming with its own look and feel.
Also, have you tried switching from light to dark mode, say at night? The task manager changes only partially. The explorer copy info window doesn't even have a dark mode! On outlook the window controls don't change colour, so you end up with black on black or white on white. You can't possibly hold up windows as a model of uniform UI.
So while I agree that this situation is terrible, I wouldn't pin it on the linux ecosystem(s).
> Every other major OS can deal with [high dpi].
Don't know about mac os, but on Windows it's a shitshow. We use some very high DPI displays at work which I have to run at 200%, every other screen I use is 100%. Even the freaking start menu is blurry! It only works well if I boot the machine with the high-dpi display attached. If I plug it in after a while (think going to work with the laptop asleep), the thing's blurry! Some taskbar icons don't adapt, so I sometimes have tiny icons, or huge cropped ones if I unplug the external monitor. Plasma doesn't do this.
IME KDE/Plasma 6 works perfectly with mixed DPI (but I admit I haven't tried "fractional" scales). The only app which doesn't play ball 100% is IntelliJ (scaling works, it's sharp, but the mouse cursor is the wrong size).
> Audio filtering is a pain to set up.
What do you mean? I've been using easyeffects for more than five years now to apply either a parametric EQ to my speakers or a convolver to my headphones. Works perfectly for all the apps, or I can choose which apps it should alter. The PEQ adds a bit of a latency, but applications seem to be aware of it, so if I play videos (even youtube on firefox with gpu decoding!) it stays in sync. It detects the output and loads presets accordingly. I also don't have to reboot when I connect some new audio device, like BT headphones (well, technically, on Windows I don't anymore, either, since for some reason it can't connect to either of my headphones at all). I would love to have something similar on windows, but the best I found isn't as polished. It also doesn't support dark mode, so it burns my eyes at night.
macOS and Windows have a much smaller set of variants, and tend to ship a single UI with everything included with OS. Even the best single desktop Linux distros will ship divergent KDE and Gnome apps.
If you want essentially perfect high-DPI support out of the box and can afford higher end displays, use macOS. It just works. I see the comments above about scaling, and to that, I say: most people will never notice. However, a Win32 app being the wrong scale? They'll notice that.
But the real display weak point of Linux right now vs Windows is HDR for gaming. That's a real shitshow and it tends to just work on Windows.
I came to rely pretty heavily on Docker and WSL(2) in Windows. I was an insiders user for a bit over a decade, and worked with .Net and C# since it was "ASP+" ...
I had setup a dual boot when I swapped my old GTX 1080 for an RX 5700XT, figuring the "open source" drivers would give me a good Linux experience... it didn't. Every other update was a blank/black screen and me without a good remote config to try to recover it. After about 6 months it was actually stable, but I'd since gone ahead and paid too much for an RTX 3080, and gone back to my windows drive...
I still used WSL almost all day, relying mostly on VS Code and a Browser open, terminal commands through WSL remoting in Code and results etc. on the browser.
Then, one day, I clicked the trusty super/win menu and started typing the name of the installed program I wanted to run... a freaking ad. In the start menu search results. I mean, it was a beta channel of Windows, but the fact that anyone thought this was a good idea and it got implemented - I was out.
I rebooted my personal desktop back to Linux... ran all the updates and it's run smoothly since. My current RX 9070 XT is better still; couldn't be happier. And it does everything I want it to do, and there are enough games in Steam through Proton that I can play what I want, when I want. Even the last half year on Pop!_OS COSMIC pre-release versions was overall less painful than a lot of my Windows experiences of the past few years. Still not perfect, but at least it's fast and doesn't fail in the ways that Windows now seems to regularly.
Whoever is steering Windows development at Microsoft is clearly drunk at the wheel over something that should be the most "done" and polished product on the planet and it just keeps getting worse.
I want to chime in here. It's advertisements on my desktop that repels me. There is something deeply personal about ads in my desktop that feels like being violated. This is a computer that I paid for, with software that I pay for, that includes all my most personal files and data. Seeing ads on the OS completely eroded my trust.
Of course, I still use Windows for various things, but I have too much "ick" for it to be the system where I check my email, manage my business, keep my important files, etc.
Windows is really great for lots of things, but I don't trust it.
Yeah. The ads in the start menu are a sign that you are no longer the customer, you are the product. Windows has other similar “features”.
I do not have ads in my start menu, and no, I didn't "debloat" my PC. This is a base install where I flipped a couple of settings in the start menu options.
It was a test they ran on an Insiders channel to see how people reacted. It never made it into GA, or for that matter to the rest of the Insiders channels... They'll feature-gate things to some Insiders users and A/B test them to see how the user response looks. There was a bit of an uproar at the time among those who saw them, including myself... I ditched Windows altogether (except my assigned work laptop).
How generous of them to allow their paying users to disable the ads. It's only a matter of time until this becomes some sort of premium feature.
You're missing the point entirely.
The problem isn't that ads can be disabled. The problem is that a paid operating system ships with ads in the first place. Full stop. There's no universe where that's acceptable product design, and the fact that you can disable them (for now, at least) doesn't make it less offensive.
I don't understand why you're going to bat for a trillion-dollar corporation here. Your settings work now. Great. They won't after the next feature update, this is a well-documented pattern. Windows updates routinely re-enable telemetry, Bing integration, and promotional content that users explicitly disabled. You're not configuring your OS, you're fighting it.
The TPM2 requirement is pure planned obsolescence. Millions of perfectly good machines binned because Microsoft decided hardware from 2016 is suddenly "insecure"... whilst the actual benefit is DRM enforcement and remote attestation.
It's a corporate compliance tool, not a security feature.
The Insiders build being referenced had actual web advertisements in search results. That's where this is headed. If you're comfortable defending that trajectory, carry on flipping those settings.
For those that want to remove items: you can quickly disable these options by going into Settings > Personalization > Start and turning off "Show recommendations for tips, shortcuts, new apps, and more".
It's like a 10 second fix and basically everything is gone.
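If you'd rather script the same toggle, it's roughly this (a sketch; the value name is an assumption based on recent Windows 11 builds and may change between updates):

    Set-ItemProperty -Path 'HKCU:\Software\Microsoft\Windows\CurrentVersion\Explorer\Advanced' `
        -Name 'Start_IrisRecommendations' -Value 0   # assumed value name; verify on your build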
It's really hard to maintain a product team where the mandate is just "don't break anything and keep the quality high". Especially something with as big of an installed base as Windows.
The team will look for excuses to build new and exciting stuff and new opportunities to increase revenue. Even if the product is pretty much "done".
I disagree, I think companies mostly just don't want to spend development money on existing "finished" products. That's the smell I'm getting from microsoft.
There are plenty of easily identifiable issues with performance in windows 11. There should be people in the windows team dedicated to eliminating "jank". MS product owners, on the other hand, are much more interested in getting copilot integrations into every menu. That's an "easy" task which looks good on a scorecard when you complete it.
No, it really shouldn't be... You can reduce headcount a lot, which they did, and concentrate on bugs (including security reports), while working with hardware vendors for if/when new features need to be integrated for better usability.
If/when you decide to do a redesign, it should be limited to a specific area, or done in such a way that all functionality gets moved to its new UI/UX in a specified timeframe and released when done. Not: oh, here's a new right-click menu, and now a third of the time you need an extra click to get to the old menu that has what you're actually looking for, because the old extension interface was broken.
Want a real exercise in frustration... just for fun, because I know it's not as useful on a laptop, but it was fun on desktops: get a screensaver working in Windows that runs for an hour or so before the machine goes to sleep... just try it... oh, it's still in there, but every third update will disable it all again. I get it... but you know what, I want my Matrix screensaver to run when I'm only away for a few minutes or over lunch.
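For the record, these are the knobs involved (a sketch; the complaint above is precisely that updates like to reset them, and the .scr here is just a built-in example):

    Set-ItemProperty 'HKCU:\Control Panel\Desktop' -Name 'SCRNSAVE.EXE' -Value 'C:\Windows\System32\Mystify.scr'
    Set-ItemProperty 'HKCU:\Control Panel\Desktop' -Name 'ScreenSaveActive' -Value '1'
    Set-ItemProperty 'HKCU:\Control Panel\Desktop' -Name 'ScreenSaveTimeOut' -Value '300'   # seconds of idle
    powercfg /change standby-timeout-ac 60   # minutes before sleep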
Every month more and more people switch to Linux and I just love it. I'm tired of one company controlling the core operating system of 85% of desktop computers and users being at their whim.
You want proprietary programs? Alright, fine, one can argue for that. But the central, core operating system of general purpose computers should be free and fully controllable by the users that own them!
It's a form of Stockholm Syndrome for most people. They'll have some bit of software they imagine is irreplaceable because they have some special use case that means that they just have to tolerate the relentless abuse. Or some other excuse. Whatever. It all boils down to people being afraid of change.
Most of that fear is not all that rational. It's not unlike kidnap victims falling in love with their captors. Your mind just tries to make the most of what fundamentally is a really messed up situation. You'll tell yourself it isn't that bad or that the next update will fix it or that you can get some magic software thingy that makes it go faster. Whatever.
Once you realize you are being abused, you can make some choices and do something about it. Most tools can be replaced if you look around a bit and do a bit of research. And virtual machines on Linux can run Windows just fine if you have one or two things that really need it (been there, done that). There's also Wine and Proton, which aren't half bad these days. And they work for lots of things other than games. You can dual boot. Etc. Try it and find out. The absolute worst case is that you have to go back to being a lame abuse victim, and you'll feel extra bad because now you know. The best case gets you out of that abusive relationship for the rest of your life. Life is too short to get subjected to this kind of abuse.
This might have some merit for some people.
But the talk of abuse is also heavy-handed.
I've spent months testing and trying out RAW photo editors, and months trying out Linux gaming.
Linux is incredible, but my experience with Windows is still better. As many that still use it can attest, you can disable almost any annoyance. It's extremely stable. Things just work including brightness controls, fractional scaling, high refresh rates and high FPS gaming, and my favorite RAW photo editing. I could switch to a less enjoyable experience with Linux but I choose not to after extensive evaluation. I don't spend any money on Microsoft services, no Office or OneDrive subscription.
But my decision isn't permanent. My hobbies, software use, gaming selection, etc. can change over time, and Linux is getting better while Windows is getting worse. If it's ever "abuse" and I can have a better experience with Linux, I won't hesitate to change. But it's also a lot of effort to try out alternatives, and dual booting is slow and annoying. Plus when I dual boot to Linux Mint the kernel fails to boot every other time and I have to select an older one, reboot, select a newer one, reboot. It's a huge waste of time. A bad experience and I have chosen to avoid it and try again in another year or two.
Worthy mention of RapidRAW for photo editing on Linux: https://github.com/CyberTimon/RapidRAW
I don't think it's Stockholm Syndrome; rather it's a classic case of the sunk cost fallacy. For me at least, that's what it was. I had invested so much time in Ableton (~14 years) and didn't feel like starting from scratch with another DAW. And let's be real, no one likes that kind of friction.
It had to get worse to finally break the inertia and also make me realize that it's only going downhill.
I'll note that sunk cost is not always a fallacy, or perhaps I should phrase that as "things that look like the sunk cost fallacy aren't always that fallacy". In your specific case, for example, you didn't feel like starting from scratch because that would involve paying a cost (in time spent learning a new system) that you didn't want to pay. So it's not actually "I sunk so much time into this, I want to get my money's worth" as much as "cost of learning something new: high. Cost of sticking with what I know: zero." Not exactly the same as the sunk-cost fallacy.
> Every month more and more people switch to Linux
We've been hearing this for decades and yet the home Linux userbase is microscopic and somehow even smaller than ever. Unless we're going to count Google's Android and Chrome OS. Those are the only Linux-based distributions that have ever gained market share over desktop Windows.
Somehow I think the stars might be aligning this time though. People are genuinely fed up with Windows and governments around the world are loudly thinking about how to reduce dependence on US tech. And then there is Proton which makes it much easier for Gamers to jump ship. To me it feels like there is more momentum than ever for this.
On the other hand I am also a realist and I don't think that Linux will take over the Desktop, but it will certainly have its biggest growth year ever in 2026.
> On the other hand I am also a realist and I don't think that Linux will take over the Desktop, but it will certainly have its biggest growth year ever in 2026.
I _love_ Linux, but I agree with this as well. I don't think Linux will ever be easy enough that I could recommend it to an elderly neighbor. I hope to be proven wrong, though.
What frustrates me about this particular moment is that at the same time Windows is getting worse, I feel that OS X is _also_ getting worse. This _is_ an opportunity for Apple to put a big dent in Windows market share.
> Somehow I think the stars might be aligning this time though
> governments around the world are loudly thinking about how to reduce dependence on US tech
I am definitely sympathetic; after all, I worked for a major Linux company for quite a few years, started using Linux (RH) in 1994, and even wrote some network-related kernel modules.
However, this switch to Linux is not going to happen (apart from where it is already used heavily, from servers to many non-PC systems).
I have been in projects for large companies but also government on and off. Now, I manage the IT of a small (<50 employees) non-IT business with people in several countries.
People who actually comment in these discussions seem to be entirely focused on the OS itself. But that is what matters the least in business. Office is another matter, and even there most people who don't deal with it at scale are way too focused on some use case where individuals write documents and do some spreadsheeting. It's almost always about a very small setup, or even just a single PC.
However, the Microsoft stack is sooooo much more. ID management. Device management. An uncountable number of little helpers in the form of software and scripts that you cannot port to a Linux-based stack without significant effort. Entire mail domains are managed by Office 365 - you own the domain and the DNS records, you get licenses for Office 365 from MS, you point the DNS records to Microsoft, you are done.
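"Point the DNS records to Microsoft" amounts to roughly this (a sketch using the standard Microsoft 365 record patterns; your tenant's exact values come from the admin center):

    example.com.               MX    0  example-com.mail.protection.outlook.com.
    autodiscover.example.com.  CNAME    autodiscover.outlook.com.
    example.com.               TXT      "v=spf1 include:spf.protection.outlook.com -all"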
Sure, MS tools and the various admin websites are a mess, duplicating many things, making others hard to find. But nobody in the world would be able to provide soooo much stuff while doing a better job. The truth is, they keep continuously innovating, and I can see it - little things just conveniently showing up, like the Teams button I now have to create an AI transcript of my conversations, or that if more than one person opens an Office document stored in OneDrive, we can see each other inside the document, cursor positions and who has it open.
Nobody in their right mind will switch their entire org to Linux unless they have some really good reasons, a lot of resources to spare, and a lot of experience. Most businesses, for whom IT is not the be-all-end-all but just a tool, will not switch.
But something can be done.
The EU could, for example, start requiring other stacks for new special cases. They cannot tell the whole economy to switch, not even a fraction of it, but they could start with new government software. Maybe - depends on how it has to fit into the existing mostly Microsoft infrastructure.
They could also require more apps to be web-only. I once wrote some code for some government agency to manage business registrations, and it was web software.
The focus would have to be on creating strong niches for local businesses to start making money using other stacks, and on taking the long road: slowly replace US-based stacks over the next two or three decades. At the same time, enact policies that let local businesses grow using alternative stacks, providing a safe cash flow that does not have to compete with US-based ones.
The EU also needs some better scaling. The nice thing about the MS stack is that I can use it everywhere, in almost all countries. The alternative cannot be that a business would have to use a different local company in each country.
I read a month ago that EU travel to the US is down - by only ~3%. Just like with any calls to boycott this and that, the truth is that those commenting are a very tiny fraction. The vast majority of people and businesses are not commenting in these threads (or at all), and their focus is on their own business and domain problems first of all. Switching their IT stack will only be done by force, if the US were to do something really drastic that crashes some targeted countries' Microsoft and cloud IT.
Go and download the archives of Reddit; there are plenty of torrents out there. Filter to a sub like r/gaming and graph the relative frequency of Linux mentions. You'll see an order-of-magnitude increase over the last 12 months compared to years before. It's real.
Must admit, I'm not sure the data torrents are up to date now that Reddit anti-scrapes so hard to raise the premium on its exclusive contract with the highest bidder, OpenAI.
Calling 4-5% marketshare microscopic is not fair. I get it if it was still stuck at 1%, but it's growing, and the rate of growth has been increasing too.
Is the desktop/laptop linux market share really over 4%? What is that based on?
As phones replace desktop computers for non-technical users, leaving a concentration of "skilled" users, my suspicion is that the pattern will resemble the quote "Slowly, then all at once."
Have a look at the Steam Hardware [and software] Survey [0] results. Linux has been trending upwards whilst Windows has been trending down for a wee while. And the population this looks at is primarily interested in gaming, which means this is despite a compatibility layer being needed for a large amount of the software used. I imagine in other communities (software, old people) it's trending much faster.
E.g. I recently installed Linux Mint for my grandma so she could use email and an up-to-date web browser on her old laptop that can't run (secure) Windows anymore. The UI differences are marginal for her, and she can do everything she needs to much better than she could before (which was not at all).
I mean, this is literally false? Desktop Linux userbase is growing, it's bigger than it has ever been even without including ChromeOS, and more OEMs are shipping devices with desktop linux than ever before (Valve's suite of devices, multiple laptop vendors including major ones like Lenovo, a few SteamDeck competitors)
More and more desktop apps are just becoming websites. More and more desktop apps are using Electron rather than some native app. Windows is slowly becoming a dumpster fire in terms of usability and issues. Most games these days Just Work on Linux without any tinkering.
While I hardly think that this year will be "the year of the Linux desktop" or whatever, if these trends keep going, I really foresee Linux market share growing, slowly, each year, until it's not so microscopic anymore.
According to the Steam Hardware Survey (https://store.steampowered.com/hwsurvey/Steam-Hardware-Softw...) only ~3.6% of Steam users use Linux, and these statistics include Steam Deck users. SteamOS accounts for ~26% of Linux users, which brings the count down to ~2.6%. For comparison, macOS is at ~2.1% market share at the moment. Wake me up when Linux gets to 10%.
As a long-time Linux user who fairly recently dropped the Windows partition entirely, I do think the remaining chafing points are these:
* UI framework balkanization has always been, and remains a hideous mess. And now you don't just have different versions of GTK vs Qt to keep track of, but also X vs Wayland, and their various compatibility layers.
* Support for non-standard DPI monitors sucks, mostly because of the previous point. Wayland has fractional scaling as a sort-of workaround if you can tolerate the entire screen being blurry. Every other major OS can deal with this.
* Anything to do with configuring webcams feels like you're suddenly thrown back 20 years into the past. It'll probably work fine out of the box, but if it doesn't. Hoo boy.
* Audio filtering is a pain to set up.