As far as I know, only the kernel module was open sourced, and in doing so Nvidia moved a lot of functionality from the driver into the firmware/userspace parts of their stack instead. So you would still need those, and they are not open.
I think it is very interesting for easy deployment of specific environments and for writing recipes for new packages.
Having said that, outside of these two rather niche areas for home use, I think it is rather unintuitive and offers no real advantage over more established distributions that deliver a polished experience, like Fedora, for workstation and gaming use.
“Dev work” is not specific enough. Pip is a nightmare because it tries to modify folders that are read-only, and you never know what it wants to do to your system. Your experience may vary depending on how much the language’s package manager assumes about your system. If you’re in a container, though, it will work perfectly.
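One workaround sketch: give pip a writable per-project virtual environment so it never has to touch read-only system paths (the package name below is just an example):

```
# Create and activate a per-project virtual environment; pip then
# installs into .venv/ instead of read-only system directories.
python3 -m venv .venv
source .venv/bin/activate
pip install requests   # example package; lands in .venv/, not the system
```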
Firefox just works, and I installed Steam from nixpkgs and it worked after enabling a few settings. Then I just enabled Proton on every game, and it works okay, with the occasional weird bug (although I blame GNOME for sometimes messing up alt-tabbing).
Yup, that, or if buying new, check out older models that may be on clearance or sale. You don’t need something with a 4070 etc. to run Linux; you could potentially find something with an older-gen video card and decent/upgradeable RAM. There should also be more easily found discussion of Linux compatibility for older models.
With laptops, also watch out for models with soldered-on RAM or low maximums, which can limit upgrades.
Some of them I would recommend even to non-Linux users; apart from being entertaining, they are extremely informative about open source/tech topics in general.
While I don’t mind the BSDs, that would lead to even worse outcomes in my view. Companies wouldn’t even have to release the source code, and they routinely don’t.
What we need is more copyleft to ensure companies contribute back to the communities they leech from, not less.
I already used open source programs on Windows. The programs I do all my work with are Krita, Blender, GIMP, and LibreOffice.
They either started out on Linux or support Linux natively, so switching to Linux didn’t really change any of the programs I use. The biggest change is playing games, but Valve has made that very painless.
As others have mentioned, secondhand laptops and surplus business laptops are very affordable and probably better value for the money than a Chromebook. My understanding is that drivers for things like fingerprint sensors, SD card readers, or oddball Wi-Fi chipsets can be issues to watch out for. But personally I don’t care about the fingerprint sensor, and only the Wi-Fi would be a major issue for me.
A couple of years ago I picked up a used Acer Swift with an 8th-gen Intel CPU and a dent in the back lid for something like $200 to use as my “throw in a backpack for travel” laptop, and it has been working great. In retrospect, I would have looked for something with 16GB of RAM or upgradeable RAM (8GB soldered to the motherboard, ugh), but aside from that minor gripe it has been a good experience.
Having an Nvidia card, should I be worried about this? So far I’ve read so many “Nvidia bad, Wayland no work” posts that I have just steered clear, waiting for a final confirmation that everything is smooth sailing.
I’ve been using Wayland on Nvidia with Plasma for about a year, and it’s been mostly fine. Only a few minor issues, like Night Color not working or some Xwayland apps flickering, but the system feels far more responsive on Wayland, so it’s well worth it to me.
Wayland support has been further improved in much more recent driver versions. I suggest going with Fedora Silverblue, since RPM Fusion is pretty quick to roll out new driver versions.
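For reference, the usual route on Silverblue is to layer the driver from RPM Fusion, roughly like this (double-check the current RPM Fusion docs before running, as the exact steps can change):

```
# Layer the RPM Fusion repos, then the Nvidia driver packages.
# Each rpm-ostree change takes effect after a reboot.
rpm-ostree install \
  https://mirrors.rpmfusion.org/free/fedora/rpmfusion-free-release-$(rpm -E %fedora).noarch.rpm \
  https://mirrors.rpmfusion.org/nonfree/fedora/rpmfusion-nonfree-release-$(rpm -E %fedora).noarch.rpm
systemctl reboot
rpm-ostree install akmod-nvidia xorg-x11-drv-nvidia-cuda
systemctl reboot
```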
Having recently switched to Linux on Pop!_OS and later moved to Nobara, I strongly disagree.
In my personal experience on drivers 525, 535, and even the 545 beta with a 3080, merely switching to a Wayland session meant lag, screen tearing, and stability issues/crashes on both KDE and GNOME, to the point that I ended up selling the 3080 for a 7900 XTX because of how everyone said the AMD experience is so much better, and it is.
Granted, I haven’t tested it on a laptop, so maybe Nvidia’s Optimus support or the latest drivers have added stability overall, but on desktop this was definitely a problem for me over the last few months.
I routinely fill my root volume to 100% by accident (thanks, Docker), but my machine has never crashed; it does tend to cause other issues, though. Does having a full /usr, /var, or /tmp not cause other issues, if not full crashes?
Of course it does; it’s actually filling those that crashes the machine, not /.
When space runs out, it runs out; there’s no magical solution. Separating partitions like that is done for other reasons, not to prevent a runaway fill: filesystems with special properties, mounting network filesystems, etc.
It depends; if your Docker installation uses /var, keeping it on a separate partition will surely help.
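Along the same lines, you can point Docker’s storage at a dedicated mount so a runaway image cache can’t fill /; the /mnt/docker path below is just an example, and existing images stay under the old path unless you copy them over:

```
# Stop Docker, set its data-root to a dedicated mount, restart.
sudo systemctl stop docker
sudo mkdir -p /mnt/docker
printf '{\n  "data-root": "/mnt/docker"\n}\n' | sudo tee /etc/docker/daemon.json
sudo systemctl start docker
docker info --format '{{ .DockerRootDir }}'   # should now print /mnt/docker
```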
For my home systems, I have: the EFI system partition, /boot, /, /home, and swap.
For my work systems, we additionally have separate /opt, /var, /tmp and /usr.
/usr will only grow when you add more software to your system. /var and /tmp are where applications and services store temporary files, log files, and caches, so they can vary wildly depending on what is running. /opt is for third-party stuff, so it depends on whether you use it.
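As a rough illustration, the work layout above would look something like this in /etc/fstab (the UUIDs and filesystem types are placeholders, not from a real system):

```
# Hypothetical /etc/fstab for the layout described above.
UUID=AAAA-0001  /boot/efi  vfat  umask=0077  0 2
UUID=AAAA-0002  /boot      ext4  defaults    0 2
UUID=AAAA-0003  /          ext4  defaults    0 1
UUID=AAAA-0004  /home      ext4  defaults    0 2
UUID=AAAA-0005  /opt       ext4  defaults    0 2
UUID=AAAA-0006  /var       ext4  defaults    0 2
UUID=AAAA-0007  /tmp       ext4  defaults    0 2
UUID=AAAA-0008  /usr       ext4  defaults    0 2
UUID=AAAA-0009  none       swap  sw          0 0
```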
Managing all that seems like a lot of effort, and given that my disk issues haven’t yet been fatal, I’ll probably not worry about going that far. Thanks for the info, though.
The last time I used LVM was way back in the Fedora 8 days, when it was the default partitioning scheme. It was super annoying to use, as GParted didn’t support it, and live CDs often had trouble with it. Having to read documentation just to resize a volume was pretty rough for a newbie to Linux. Has it improved since?
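For what it’s worth, growing a volume is a couple of commands these days; this sketch assumes an ext4 filesystem on a volume group named vg0 (the names are examples):

```
# Grow the logical volume by 10 GiB, then grow the filesystem to match.
sudo lvextend -L +10G /dev/vg0/root
sudo resize2fs /dev/vg0/root

# Or do both in one step with -r (--resizefs):
sudo lvextend -r -L +10G /dev/vg0/root
```

Shrinking is still fiddlier and usually needs the filesystem unmounted first.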