EndeavourOS is a pretty decent setup; it has been working well for me so far, and I prefer Arch-based distros because of how quickly Linux has been moving.
Manjaro has let its SSL cert problem happen twice since I’ve been in the loop, and for a while they were unintentionally DDoSing the AUR.
Yes, I know Manjaro got bad press several times, about their SSL cert and about firing their treasurer, but as a Linux distribution Manjaro is pretty decent for the average user, in my opinion.
An expiring SSL cert cut off access to updates. That’s not just bad press, that’s poor form overall, especially for an Arch-based distro. Even worse, this happened while certbot exists, so there’s no excuse. It tells me they are less reliable as a distro, especially having let it happen twice.
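For context, the automated renewal the comment alludes to is essentially hands-off; this is a sketch assuming a certbot package that ships a systemd timer (as the Debian/Ubuntu packages do):

```shell
# Simulate a renewal run to confirm the setup works (no certs are touched):
sudo certbot renew --dry-run

# On distros that package certbot with a systemd timer, verify it is scheduled:
systemctl list-timers 'certbot*'
```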
I highly recommend Fedora (just the regular Gnome version). I used to be all Ubuntu, but they’ve shoved snaps down everyone’s throats to the point that I simply cannot recommend it to anyone, especially newcomers.
Fedora has been working really well for me. You’ll probably want to play around with GNOME Tweaks to get the maximize and minimize buttons back, and install the GNOME extension “AppIndicator and KStatusNotifierItem Support” from the GNOME Extensions website. Those I would consider the essential post-install steps.
After that you’ll have a rock-solid and enjoyable setup.
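The post-install steps above can be sketched as shell commands; the gsettings key is the standard GNOME one (the same setting Tweaks exposes under Window Titlebars), while the extension itself is easiest to install from the GNOME Extensions website:

```shell
# Install GNOME Tweaks from the Fedora repos:
sudo dnf install gnome-tweaks

# Bring back the minimize and maximize titlebar buttons:
gsettings set org.gnome.desktop.wm.preferences button-layout ':minimize,maximize,close'
```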
I had to bail from Fedora when they pulled the video codecs from RPM. It may be fixed, but the threat of pulling a tool from the repository still lingers in my mind.
Ah, they were being pulled from RPM fusion at one point if I recall. It didn’t go through, but the fact that it was even being discussed told me all I needed to know.
Since you want a “just works” deal, I’d go with a Universal Blue immutable distro; my favorite is Bazzite. You can pick between KDE and GNOME, and switch between them cleanly at any point. User apps auto-update in the background, the system also updates while it’s running, and you only need to reboot to apply. If anything ever goes wrong, you have painless rollbacks. All that with up-to-date Fedora packages and kernel.
I’ve been running it on my deck for a while now and it’s never let me down so far, really pleasant experience. It generally keeps out of your way and takes care of the chores while still allowing you to mess around if you want.
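The painless rollback mentioned above comes from the image-based update model; assuming an rpm-ostree-based image (which Universal Blue builds on), it looks roughly like this:

```shell
# Show the currently booted deployment and the previous one kept on disk:
rpm-ostree status

# Boot back into the previous deployment if an update misbehaves:
sudo rpm-ostree rollback
systemctl reboot
```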
I second Bazzite. Been running it on my gaming laptop for a few months now and loving it. My main desktop is running Garuda Linux, which I also absolutely love, but I was wary of a rolling-release Arch-based distro on my laptop, which isn’t on and running 24/7. I tried Manjaro on that laptop previously and it was broken more often than not (although I’m learning that’s likely more a Manjaro problem than an “Arch-based” problem, it gave me a reason to try Bazzite).
If you avoid Nvidia, it has been ready for many years. And to be honest, I’m not sure X11 was ever really stable with Nvidia either. My main issue is that X doesn’t have multi-DPI support… and for that I really cannot blame Wayland. Also, Skype doesn’t have screen sharing; well, they actually had it for a while, but then removed it… still, hard to blame that on Wayland.
But as a general rule, if you have Nvidia, then you are not allowed to complain about anything… that was your choice, and with Nvidia under Linux, all bets are off. I thought that was clear a long time ago, especially after Linus’ not-so-subtle outburst.
My personal experience could never agree with that. I could never use Wayland on KDE on either of my laptops with Intel graphics due to numerous glitches and incompatibilities, so Nvidia is not even the scapegoat I wish it was.
I’m looking forward to Plasma 6 next month, but at least on KDE, Wayland has not really been usable so far.
Machine learning pays my bills, and I never had a choice on my graphics card brand. To be sure, I wanted an AMD for the open source drivers, but CUDA remains essential to me. ROCm support from AMD is a joke and isn’t anywhere close to an alternative. Researchers release code that only runs on CUDA for a good reason. To say that I don’t get to complain is going too far.
Thanks to nouveau, I can still use GNOME even after dropping X11 🥳 I have a GeForce 6800M GT, I think, which would need a proprietary Nvidia driver that hasn’t been supported (only community-patched) since kernel 5, I believe. The only caveat is that you have to boot via legacy BIOS and not EFI, even on a Mac laptop that normally uses EFI to boot into macOS, and the graphics card still works. It would be nice if the nouveau team got the card running under EFI as well.
But as a general rule, if you have Nvidia, then you are not allowed to complain about anything… that was your choice, and with Nvidia under Linux, all bets are off. I thought that was clear a long time ago, especially after Linus’ not-so-subtle outburst.
See, this attitude is exactly why Linux will never become mainstream. On Windows you don’t need to research if your machine will be able to run your operating system of choice, it just works.
If you’re a user, and you can install Linux without seeing a single warning that your hardware is going to cause issues, your distro is at fault. The moment you boot the installer, it knows damn well that you’re using Nvidia hardware and what the implications are. Distros either ignore the predictable instability, or they believe there is no problem, and either way the end user isn’t to blame for taking that at face value.
The truth is, Linux on Nvidia works fine, except for some very specific laptops with stupid mux chips, and even that is something Linux should fix, not the end user. Luckily, Linux installers don’t even boot on those machines, so the end user can just ignore Linux and continue using Windows.
You just can’t use Wayland if you want your Linux system to be stable, but X11 works fine and it will continue to do so for many years. Part of the Wayland issues still come from intermediate code refusing to work around Nvidia’s bullshit, ignoring known bugs and technically-spec-compliant-but-different stuff because it’s easier to blame Nvidia for everything. Wayland also makes some weird assumptions that I disagree with (“if the Wayland socket dies, your application must crash, there is no recovery”) which make minor stability problems a lot worse in practice.
Nvidia is to blame for their shitty drivers; the core problem behind the bad experiences Nvidia owners have is those drivers, not end users buying the wrong hardware. You can’t seriously expect people trying it out for the first time to have read up on the drama and controversy Linus Torvalds has caused over the years.
And even with all that, many serious Linux users who know full well the pain they’re about to subject themselves to still need Nvidia. ROCm is great, but it’s nowhere near as efficient and well-supported as CUDA. Whatever Intel has doesn’t come close, and whatever macOS offers doesn’t work, because even Nvidia has perfect Linux support compared to Apple.
I got my Nvidia GPU before I even considered moving to Linux. I am honestly getting pretty tired of reading these gatekeeping comments telling me “I’m not allowed to complain about anything” or how I’m a trash person for buying an Nvidia card in the first place. Nvidia is the largest GPU manufacturer; people are going to own Nvidia cards, and you need to live with it. Be constructive and nice to other people.
X11 is rock solid with Nvidia, never had a single problem.
I had a lot of issues with Wayland on KDE, lots of flickering all the time. I moved to Hyprland and things are mostly fine. IntelliJ has occasional problems, but they are working on a Wayland version anyway.
Ha, your first sentence is just plain wrong. It was quite broken under “normal” use cases, with per-DE bugs.
For example, on KDE, the bug was only fixed about 1.5 years ago where your Wayland session would completely crash if your monitor lost any signal whatsoever (monitor sleep or shutting the monitor off). If you ask me, that is a very standard use case, and there is no world where a session that crashes on it could be considered ready for general use.
I think we are there now, just some visual glitches nowadays, also some recent glitches with monitor sleep, but Wayland very rarely crashes anymore.
None of those people have the slightest clue. Your options really are: vanilla Ubuntu and maybe Pop!_OS.
Everything else will very quickly require you to read through obscure docs and bash your head against the terminal.
Vanilla Ubuntu, not Kubuntu/Xubuntu/whateverbuntu, is the only polished and documented distro. After a year or two of that you’ll be ready to consider this “what distro” question.
Besides Fedora (maybe) I’m not sure other non-deb distros really are recommended for new users.
Besides that, like it or not, most third-party software nowadays is distributed as .deb files (until Flatpak fixes that). Using something not Debian-based means learning how to convert .deb files or doing manual dependency resolution for tarballs.
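For what it’s worth, cracking open a .deb on a non-Debian system doesn’t require dpkg: a .deb is just an ar archive wrapping a control tarball and a data tarball. A sketch, with a placeholder file name (the exact member names vary by package):

```shell
# List the members of the .deb (typically debian-binary, control.tar.*, data.tar.*):
ar t some-package.deb

# Extract the members, then unpack the payload into a local directory:
ar x some-package.deb
mkdir -p extracted
tar -xf data.tar.xz -C extracted   # may be data.tar.gz or data.tar.zst instead
```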
What would you suggest is a better distro for a new Linux user? I’ve found Mint to be great out of the box, and only needs minor tweaks if you want the Microsoft fonts, for example.
For something that “just works” and feels quite like home, without being KDE, I’d recommend Zorin.
It’s stable, beautiful to look at, and works as expected. I’d not recommend Arch-based distros to begin with (but if you want to go the troubleshooting-and-fixing-things way, that would be choice #1).
Maybe it’s just me, but Cinnamon, while very user friendly, feels limited. When you want to start tweaking, the options just aren’t there yet.
Ubuntu uses LTS releases with five years of support, which is why they like to keep a lot of software at older versions. Linux Mint is based on Ubuntu, I think. A PPA is something you can add to Ubuntu or Ubuntu-based distributions to get newer or specific software repositories as extras on your system. Here’s a guide on PPAs: itsfoss.com/ppa-guide/
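Adding a PPA boils down to a couple of commands; the PPA name below is a placeholder, not a real archive:

```shell
# Register a (hypothetical) PPA and pull in its package index:
sudo add-apt-repository ppa:some-team/some-app
sudo apt update

# Packages from the PPA then install like any other:
sudo apt install some-app
```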
Ubuntu used to get a lot of undeserved hate but lately the hate feels deserved. Ubuntu has been the face of the usable desktop Linux for a long time and they just keep tripping over themselves every time they try to move forward.
Their intentions are usually good. A lot of things they propose usually end up being adopted by the community at large (just not their implementation). They seem to just yank everyone’s chain a little too hard in the direction we’re eventually going to go and we all resent them for that.
Off the top of my head, there was Upstart (init system), there was Unity (desktop), and now snaps (containerized packaging). All of these were good ideas, but implemented poorly and with a general lack of support from the community. In almost every case, once they ran out of developers championing the tech, they eventually got on board with whatever Debian and RHEL were doing once those had caught up and settled.
Valve’s lack of interest in maintaining the snap makes sense. Development on the Ubuntu platform is opinionated in a way that makes the developers of the software (Valve) want nothing to do with Canonical’s snaps.
On another note: my favorite thing about Ubuntu Server was the LXD + ZFS integration. Both have been snapified. It was incredibly useful and stable. Stephane Graber has now forked the project into Incus. It looks very promising.
This might be an unpopular opinion, but I really don’t get this trend of wanting to containerize just about everything; it feels like flavor-of-the-month rather than something that makes sense.
I mean, containers are fantastic tools and can help solve compatibility problems and make things more secure, especially on servers, but putting everything into containers on the desktop doesn’t make any sense to me.
One of the big advantages Linux has always had over Windows is shared components: packages are much smaller and updating the whole system is way faster. If every single application ships with its own copies of everything (like it does on Windows), you lose that advantage.
Ubuntu’s obsession with snaps is one of the reasons I stopped using it years ago. I don’t want containers forced upon me; I want to be free to decide if and when to use them (I prefer Flatpak and AppImage).
Debian derivatives that don’t “reinvent the wheel” are the way to go for me. I’ve been using MX Linux on my gaming desktop and LMDE on my laptop for years and I couldn’t be happier, with no problem whatsoever with Steam either.
Shared components work brilliantly in a fantasy world where nothing uses new features of a library or depends on bug fixes in new versions of a library, and no library ever has releases with regressions or updates that change the API. That’s not the case, though, so often there’ll exist no single version of a dependency that makes all the software on your machine actually compile and be minimally buggy. If you’re lucky, downstream packagers will make different packages for different versions of things they know cause this kind of problem so they can be installed side by side, or maintain a collection of patches to create a version that makes everything work even though no actual release would, but sometimes they do things like remove version range checks from CMake so things build, but don’t even end up running.
Shared components work beautifully for a lot of things, though; many programs aren’t all that sensitive. Making snaps for the tricky ones makes sense. Having snaps for all of them is ridiculous.
I can count the software requiring repo pins on one hand on my desktop. For those, snaps make sense and replace the need for any pins; snaps are less confusing than pins, IMO.
It reminds me of Python programming, with requirements pinned to version ranges. Some dev teams forget to pin, and their apps don’t work out of the box. Sometimes software still works ten years later, if it only uses the most common arguments and commands from its dependencies.
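The version-range idea described above can be illustrated with a small sketch; `in_range` and the version numbers are purely illustrative, not tied to any real requirements file:

```python
def in_range(version: str, lower: str, upper: str) -> bool:
    """Return True if lower <= version < upper, comparing dotted versions numerically."""
    parse = lambda v: tuple(int(part) for part in v.split("."))
    return parse(lower) <= parse(version) < parse(upper)

# A dependency pinned to ">=2.0,<3.0" keeps accepting 2.x bug-fix releases
# but rejects a potentially incompatible 3.0:
print(in_range("2.31.0", "2.0", "3.0"))  # True
print(in_range("3.0.0", "2.0", "3.0"))   # False
```

This is the middle ground between no pin at all (the app breaks the day a dependency ships a breaking release) and an exact pin (the app never receives bug fixes).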
I agree with a lot of your points, but I do think containers are a great solution.
I’ve been a really big fan of Universal Blue lately. It presents a strong argument for containerizing everything. Your core is immutable and atomic, which makes upgrades seamless. Userland lives in a container and just gets layered back on top afterwards.
I do think the idea behind snap isn’t about pushing the Linux platform forward as such, but about specifically gaining a market advantage for Ubuntu.
Why else would documentation for changing the default store be so hard to find? And I don’t think you can even have multiple “repositories” there, quite unlike all other Linux packaging systems out there. (Corrections welcome!)
How I interpret OP’s message is that it isn’t specific to Linux, but seeing as we are the Linux community, we might be more inclined to advocate for Linux features? 🤷