Linux 6.1 will be maintained for another 10 years by the CIP. The hardware in question will be almost 40 years old at that point. I don’t have a violin small enough for users losing free support after 40 years from maintainers who most likely don’t even own the same hardware to test on…
On the other hand, these drivers were probably unchanged for decades. Did anything really change, or is this just a case of "we need to remove 500k lines of code; what's the most useless? Let's cut that."
In other words, removed because removing lines is a KPI, and it makes the number go up.
So if it’s been unchanged for decades, then you can just add it yourself and recompile the kernel. Elsewhere you argue that you can’t just add old drivers to a newer kernel, which implies these drivers require some nontrivial amount of maintenance. Which is it?
Keeping code around isn’t free. Interfaces change, regressions pop up. You have to occasionally put in work just to keep it in a working state. Usually in cases like this there are discussions on the mailing list about who is going to maintain these drivers, and nobody volunteers. You can volunteer if you’re so passionate about keeping them around.
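To make "interfaces change" concrete, here's a rough, hypothetical sketch of the kind of churn an out-of-tree driver picks up ("olddrv" and its proc file are made-up names, not a real driver): the procfs callbacks moved from struct file_operations to struct proc_ops in Linux 5.6, so a driver that wants to build on both sides of that line needs a version guard like this.

```c
/*
 * Hypothetical out-of-tree module "olddrv" (made-up name), showing one
 * real example of in-kernel API churn: procfs switched from
 * struct file_operations to struct proc_ops in Linux 5.6.
 */
#include <linux/module.h>
#include <linux/proc_fs.h>
#include <linux/seq_file.h>
#include <linux/version.h>

static int olddrv_show(struct seq_file *m, void *v)
{
	seq_puts(m, "olddrv: still alive\n");
	return 0;
}

static int olddrv_open(struct inode *inode, struct file *file)
{
	return single_open(file, olddrv_show, NULL);
}

#if LINUX_VERSION_CODE >= KERNEL_VERSION(5, 6, 0)
/* 5.6+ kernels want proc_ops for proc entries... */
static const struct proc_ops olddrv_fops = {
	.proc_open	= olddrv_open,
	.proc_read	= seq_read,
	.proc_lseek	= seq_lseek,
	.proc_release	= single_release,
};
#else
/* ...while older kernels still take file_operations. */
static const struct file_operations olddrv_fops = {
	.open		= olddrv_open,
	.read		= seq_read,
	.llseek		= seq_lseek,
	.release	= single_release,
};
#endif

static struct proc_dir_entry *olddrv_entry;

static int __init olddrv_init(void)
{
	olddrv_entry = proc_create("olddrv", 0444, NULL, &olddrv_fops);
	return olddrv_entry ? 0 : -ENOMEM;
}

static void __exit olddrv_exit(void)
{
	proc_remove(olddrv_entry);
}

module_init(olddrv_init);
module_exit(olddrv_exit);
MODULE_LICENSE("GPL");
```

Multiply that by every subsystem a real driver touches and you get the ongoing maintenance cost people are talking about.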
They were fine all this time, so what changed suddenly? I bet it’s the security nerds stirring shit, making it all a liability that’s easier to delete than to fix.
I doubt any 25+ year old hardware can even run a modern vanilla Linux kernel; you’d have to compile it yourself with some serious customization for it to even work.
To be honest, Ubuntu likely has nothing to do with it, and I therefore find the headline misleading. From how it reads, it’s mostly down to the Linux kernel.
Ubuntu 23.10 was run for providing a clean, out-of-the-box look at this common desktop/workstation Linux distribution. Benchmarks of other Linux distributions will come in time in follow-up Phoronix articles. But for the most part the Ubuntu 23.10 performance should be largely similar to that of other modern Linux distributions with the exception of Intel’s Clear Linux that takes things to the extreme or those doing non-default tinkering to their Linux installations.
Proprietary snap store backend that is controlled by Canonical: that’s it.
I used Ubuntu for years: installed it for family and friends. I moved away around a year ago.
Moving packages like Firefox to snap was what first started annoying me.
If the backend was open source, and the community could have hosted their own (like how flatpak repositories can be), I might have been slightly more forgiving.
Did a quick Google search to see if someone had elaborated; here’s a good one:
It is also a commercial distribution. If you have ever used a community distribution like Arch, Gentoo or even Debian, then you will notice that they encourage participation much more. You can contribute your ideas and work without being required to sign any CLAs.
Because Ubuntu wants to control/own parts of the system, they tend to create their own, often subpar, software that requires CLAs, rather than contributing to existing solutions. See upstart vs OpenRC and later systemd, Mir vs Wayland (Canonical later adopted both systemd and Wayland anyway), Unity vs GNOME, snap vs flatpak, MicroK8s vs k3s, Bazaar vs git or Mercurial, … The NIH syndrome is pretty strong in Ubuntu. And even where Ubuntu came first with some of these solutions, the community had to create the alternative because Canonical was controlling them.
Serving files over HTTPS is not difficult to implement if anyone cared. Even if the cloud backend were open source, you still wouldn’t use it. Downvote now!
I’ll add one more gripe: the Amazon integration. It’s been resolved for something like 7 years now, but I still hold it against them a bit for placing Amazon search results on my desktop all those years back. Not that I don’t have an Ubuntu server running as we speak, but it still taints them a tad in my eyes (and, anachronistic as it is, probably still feeds the “it’s a corporate distro” theme of dislike around here).
Ahh, okay, so nothing new under the sun: Hipsters hate normies and September never ended.
Although I’m under the impression that Mint and Pop have taken a bite out of the “beginner desktop” market, Ubuntu is most of what I observe in the office when everybody else is booting Windows.
I can understand selecting for novelty; I’m usually in that camp. But novelty shouldn’t come at the expense of an argument to IT departments that they should support at least one Linux distro.
It’s supposed to be tuned more toward heavy workflows, such as rendering and CAD. It supports more RAM (6 TB) and four-socket SMP, along with ReFS and SMB Direct.
I only found out about it because we needed a beastly setup for combining lidar and drone aerials in Autodesk.
Is there some reason to think that running Windows 11 Pro for Workstations would have made a difference in a CPU benchmark? I’m not seeing anything obvious on the feature list for that version that would make that be the case.
20% is a LOT. That’s probably because of the random shit that nobody ever asked for but Windows is always doing in the background anyway: building a search index, Windows Update (which sometimes consumes an insane amount of CPU for a completely unreasonable amount of time), other individual updater services (because there can’t be one program that updates everything, since every vendor does their own proprietary bullshit to handle updates), compressing and sending all your personal data to Microsoft, and of course the pre-installed McAfee (on a trial license) that works hard to make your system less secure (which HP probably installed for you because apparently you haven’t paid enough money for the computer, so you must pay with your patience and your privacy as well). Depending on the benchmark, the pathetic legacy file system Windows uses might also play a role.
The Windows scheduler is so stupid that chip manufacturers manipulate the BIOS/ACPI tables to force it to make better decisions (particularly with SMT) rather than wait on MS to fix it.
Linux just shrugs, figures out the thread topology anyway and makes the right decisions regardless.
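For what it’s worth, the topology Linux works out is also exposed to userspace via sysfs, so you can eyeball the SMT sibling layout yourself. A minimal sketch (not the scheduler’s actual code; the 64-CPU loop bound is an arbitrary assumption for brevity):

```c
/*
 * Print each CPU's SMT sibling list as reported by the kernel via sysfs.
 * The sibling list includes the CPU itself, e.g. "0,64" on a 2-way SMT part.
 */
#include <stdio.h>

int main(void)
{
	char path[128], siblings[64];

	for (int cpu = 0; cpu < 64; cpu++) {
		snprintf(path, sizeof(path),
			 "/sys/devices/system/cpu/cpu%d/topology/thread_siblings_list",
			 cpu);
		FILE *f = fopen(path, "r");
		if (!f)
			break;	/* no such CPU: stop scanning */
		if (fgets(siblings, sizeof(siblings), f))
			printf("cpu%d SMT siblings: %s", cpu, siblings);
		fclose(f);
	}
	return 0;
}
```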
I have to use Chocolatey, Winget, the Windows Store and individual updaters to get the tools I need on Windows; it’s ridiculous. I only use Flatpak and Zypper in my Linux partition.
While this is cool, I am interested in a comparison with a fresh Windows install. This article says it’s out of the box from HP, and I wouldn’t be surprised if they have some dumb processes running, dragging down performance… I’m confident Linux would still outperform, but this is quite an insane gap on display.
That’s a fair comment. But on the other hand, if you are spending a fortune on a CPU the size of your hand (look at that thing in the article!), then there’s a good chance you’re using it for business purposes, and either you or your IT department will be very keen to have a completely vendor-supported stack. Enthusiasts with fresh OS installs will not be representative of users of this tech; AMD haven’t really been targeting it at gamer desktops.
Of course, comparing both would be even better, see whether it is an HP crapware issue…
Totally agree, it’s two different tests and use cases. Most people will run it how it comes out of the box and that’s probably more representative of the real world.
I just think it’s not entirely fair to say “Windows is 20% slower” when we have no idea what trash HP loaded it up with. If I managed an IT dept and learned my $$$$ hardware lost 1/5 of its performance, I’d certainly be pushing HP for solutions. Or maybe they’d prefer to take 20% off the price?
Don’t most businesses cut out the bloat and put their own builds on it? Sure, they put on their own software that will hurt performance, but it seems fresh vs fresh would give better metrics.
It always did on my hardware at least. When I was using Windows, my old laptop started lagging badly and it was becoming unbearable. I could not get a new one immediately. One day I got to know about Linux and installed it to try it out, because there was not really anything else I could try.
I could not believe how buttery smooth my laptop became after that. 95% of the games that I used to play on Windows run faster on Linux.
I’m typing this on an 8-or-9-year-old laptop that used to be a Windows machine years ago. Exact same experience–it got too sluggish so I wiped it and installed Linux and it’s been fine ever since.
It’s not a “shitty title”, because Ubuntu Linux is the thing they actually tested.
Whether Debian or Fedora or Alpine or Void or whatever would do better or worse is not a given, and isn’t something the OP can comment on because they didn’t test it.
We can probably infer that gains of a similar amount would be seen on most mainstream distros (as they’re all pretty similar under the covers), but that’s not on the OP.
In particular, Ubuntu ships with various non-free drivers and kernel patches that will be present in some, but not all other distros.
Of course it’s not on the OP, it’s on Phoronix. This is a shitty title from any party, but from them at least I would have expected more, instead of just attributing the performance to a specific distribution, the most corporate-y one no less.
Linux, the kernel, doesn’t operate in isolation. The system under test was Ubuntu, which comes with specific packages, package versions, patches, kernel configuration, and so on. It is reasonable to say that the combination of this specific operating system and hardware led to the observed outcome. Different combinations of software and hardware may yield other results or replicate the same outcome. The certainty of these outcomes can only be established through testing. Therefore, your outrage seems unwarranted, and your assertion is not only baseless but incorrect.
I’m also looking forward to Bcachefs, but rather for storage of large amounts of data. Just hoping the multi-device feature works as well as advertised.
@leo KDE with Wayland was all crashy when I tried it. If Wayland windowing is as buggy and crashy as their browser, we'll all need to switch to Windows or Mac just to get any work done.
I’ve been daily driving Firefox with Wayland on KDE Plasma for years, not on Xwayland, and can’t remember it not working well. This is on two different distributions (Arch and NixOS). Not saying this is your fault, but your experience is not representative of everyone.
@Laser My experience is representative of enough people to show that the Linux desktop is a mess and is not suitable for production work. I don't identify myself by my choice of software. I just want to get work done.
@crypto @Laser The Linux desktop is not one thing. If you have a company that standardizes on GNOME, then the software you need for work will work, since it will likely have been tested in that environment. As for work, well, not everyone uses it for work.
I suppose it really depends on when you tried it. Ubuntu 23.10 has been working quite well on Wayland. I haven’t once had to fall back to X, and the only papercut I run into now is with differently scaled displays (100% and 150%), where OBS will crash the session when moving back and forth between them.
Everything else seems good; I haven’t really seen anything else break at all, and I use Firefox, Kdenlive, Audacity, lots of chat apps, and have played some games. Specifically, I played Vivaldia 2 while remotely compiling Gentoo and livestreaming with OBS.