I have no experience with this specific thing, and not a lot with Linux either, but there seem to be some interesting choices here (there's no best or worst, it's just a list, and the most suitable option is whatever fits what you need).
Not sure I’d recommend getting anything resembling a computer with 4 GB RAM and 64 GB storage nowadays, but it’ll certainly still work.
I’d probably start with a minimal Debian installation (or Arch if you prefer being on the bleeding edge I guess) and then add GNOME desktop and whatever else I need afterwards. I don’t recommend checking the box that says “GNOME” in the Debian installer, as that installs a whole bunch of packages you’ll probably never use, and disk space is at a premium here.
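For reference, a rough sketch of what that could look like after a netinst install with no desktop task selected (gnome-core is Debian's smaller GNOME metapackage; adjust to taste):

```
sudo apt update
# gnome-core pulls in a basic GNOME session without everything in the full "gnome" task
sudo apt install gnome-core
# then install just the individual apps you actually want on top
```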
Performance should be doable as long as you don’t multitask a lot, but don’t expect any wonders as 2 physical cores really isn’t a lot these days.
I'd say 4 GB of RAM is barely enough. It'll probably do for the things you mentioned, but opening a browser and surfing the web, or using modern Electron apps, will quickly get you to the limit.
Another idea would be buying something second-hand / refurbished. It’ll get you better specs for roughly the same money. But probably not a Surface or a tablet, so YMMV with that approach.
Thanks for the hint. I guess I was a bit over-eager since I’ve been thinking about getting one for quite some time and now this “bargain” appeared out of nowhere. :/
I used to use Ubuntu before Unity and switched to Debian 👑 in 2012. I still have to use Ubuntu for work and I just get on with it. It could be worse… I could have to use Windows.
Anyway, my main gripes with Ubuntu are snaps and how they keep swapping out apt packages so that they're installed as snaps instead.
I don't hate it; it's a tool, and in most cases I can use it without a problem. If not, there are other options.
The downside of NixOS is bad documentation, which makes it take quite a while to get your config set up the way you want. It's so worth it though. I used Arch for 5+ years and have been on NixOS for about 6 weeks now; I'm definitely never going back. My config is done, and I barely have to change anything now. It's all saved in a git repo, so I never have to make it again. I've already switched all of my machines over, and even a few of my friends', which has been super easy to do because I just give them my config and then remove everything they don't need. I've only been using it for a little while, but it feels so reliable and unbreakable, even though I'm running unstable packages, because if anything breaks you just go back to the last generation that worked. That made me willing to just try anything when I was setting it up.
Also, you could just run the Nix package manager on Arch for this, but either way the Nix package repo is amazing. It has everything I've needed or even thought about installing, and in my opinion it's way better than using AUR packages. Most of the time you just download them and don't have to build them, which is so much faster and more reliable than using Paru or Yay. Plus there is the NUR (Nix User Repository), but tbh I've never even looked at it.
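If anyone wants to try that route, setting up the standalone Nix package manager on Arch is roughly this (from memory, so check the ArchWiki's Nix page for the current steps; "hello" is just a test package):

```
# Arch packages the standalone Nix package manager
sudo pacman -S nix
sudo systemctl enable --now nix-daemon.service
sudo gpasswd -a "$USER" nix-users     # log out/in so the group change applies
# point your user at the nixpkgs channel and install something from it
nix-channel --add https://nixos.org/channels/nixpkgs-unstable nixpkgs
nix-channel --update
nix-env -iA nixpkgs.hello && hello
```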
The other con I know of is issues running binaries and AppImages, but there are workarounds for them. I use a few AppImages by just running 'appimage-run <appimage filename>', and so far it's worked perfectly. As for binaries, you can use steam-run, or I think using distrobox would work, but I haven't had to do anything like that yet.
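In case it helps, this is roughly what those look like in practice (the AppImage and binary names below are placeholders, not from my setup; appimage-run and steam-run both come from nixpkgs):

```
# run an AppImage without patching it by hand
nix-shell -p appimage-run --run "appimage-run ./SomeApp.AppImage"

# run a generic prebuilt binary inside an FHS-style environment
nix-shell -p steam-run --run "steam-run ./some-prebuilt-binary"
```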
I found this YouTube channel quite useful when I was setting mine up: Vimjoyer.
I found it fairly difficult to set up nixos on one of my machines, because it simply didn’t ship with a certain, relatively common kernel module/user space app. I also couldn’t find a usable workaround (only compiling my own kernel on every update, which is not exactly my kind of fun).
You can specify custom parts of the config that enable that module and/or add extra module packages.
If you specify a custom part of the kernel config, then yes, sure, you'll be compiling the kernel on each kernel update, but you don't need to configure it manually.
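For anyone hitting the same wall, here's a rough sketch of the out-of-tree module route (v4l2loopback is just a stand-in for whatever module is missing; the option names are standard NixOS options, not taken from anyone's actual config):

```
# in /etc/nixos/configuration.nix, an out-of-tree module can be added with
# something along these lines:
#   boot.extraModulePackages = [ config.boot.kernelPackages.v4l2loopback ];
#   boot.kernelModules = [ "v4l2loopback" ];
sudo nixos-rebuild switch              # builds just the module against the stock kernel
sudo nixos-rebuild switch --rollback   # go back to the previous generation if it misbehaves
# enabling an *in-tree* module that isn't built by default means overriding the
# kernel config instead, which is the case where the kernel recompiles on every update
```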
These are some good tweaks! Personally, I think especially the smaller changes, like uniform sizes for controls, would be worth taking up with the KDE community, because they might consider merging them into Breeze proper. Unless it doesn't work like that for theming, of course!
Ubuntu attracted a lot of control freaks because Shuttleworth was originally splashing some money around when it started and a bunch of nerds saw dollar signs. As a result they have a culture of "not invented here" syndrome, where someone just has to reinvent the wheel in only the way they see it, and they don't work well with others or accept their input because they want all the credit.
Personally, I got sick of it having been pretty involved early on in the project. It’s easier and saner to just use a distro based on what everyone else is doing.
Not x86_64 based, but the PineTab2 and PineTab-V are 2 alternatives. The PineTab2 is aarch64 (ARM) based while the PineTab-V is, you guessed it, RISC-V based.
Both 8 GB RAM versions go for about $210 on their website.
Unfortunately, those don't support a stylus. I do love seeing a RISC-V tablet, though (even if I wouldn't be able to use it, since I'm not a kernel developer ;))
No worries. Just wanted to throw some alternatives your way, since I think €300 is a steep price for a 4 GB RAM tablet with no upgrade option. :) PS: Didn’t know stylus support was a thing. TIL about EMR.
“Stability” is probably the most mis-used word in the Linux world.
It means that how your system looks and behaves doesn’t change, which is really important for servers, especially in business, where you want to plan any change in advance before you commit to it.
Arch is not stable in this sense. It constantly changes, and those changes can come up on short notice with any upgrade.
But when people read that Arch isn’t stable, they think the system can break at any time.
I’d say this hasn’t been the case for at least 10 years now. If you RTFN (read the fucking news) and use the AUR sensibly, Arch has become a really boring system, regarding breakage.
Arch breaks all the time. It has to, because upstream is always changing, so breakage is inevitable.
Though a person's mileage on this may vary (lower update frequency, fewer installed programs, etc.), the constant with a rolling release is that breakage across software releases is to be expected.
My experience with Arch is that it has been very solid and stable. It just "makes sense" for the most part, and so issues are very resolvable.
If you use the AUR, you can run into times when packages need to be excluded (held back) in order for the overall system to update. I do not see that as an Arch problem, and it is easy to handle.
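For what it's worth, holding one back for a single upgrade is usually just this (the package name is made up):

```
# skip a problematic package for this one upgrade
sudo pacman -Syu --ignore some-broken-package
# or make it persistent by adding it to IgnorePkg in /etc/pacman.conf
```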
One thing that is an Arch problem is that, if you do not update often enough, you can end up with outdated keys that prevent you from installing newer packages. The solution is just to update the keyring before updating everything else, but this is confusing for a new user and kind of dumb in my opinion. I feel like the system should do this for me.
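The usual fix when that happens looks something like this (from memory, so double-check against the wiki):

```
# refresh the keyring first so newer package signatures can be verified
sudo pacman -Sy archlinux-keyring
# then run the full upgrade
sudo pacman -Su
```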
Ironically, I find Arch is most stable if you update very frequently (which makes the updates smaller and more incremental). I do a quick update almost every day without any fear of breaking my system. Any "problems" I have had with Arch updates have come from trying to update a system that has not been updated forever. Even then, it is just a bit more work.
Another thing that can happen if you leave it too long is that packages will have been replaced by newer ones. Keeping up to date means there are only going to be a small number of those. An update after a year can run into a surprising number of them.
I dug out an old laptop that had Arch on it from 3 years before. Updating it was annoying but in the end it was totally up to date and stable.
Arch is not stable, but it's easy to fix issues arising from its rolling-release nature. One of the ways is using the AUR package 'downgrade' for easy package version rollbacks. I should also note that the most common reason for Arch breaking is rarely the distro itself; it's usually that upstream has introduced breaking changes. You can see this when an upstream feature breaks in Arch, then Fedora picks up the same bug a few weeks/months later.
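Usage is about as simple as it gets; a quick sketch (firefox here is just an example target):

```
# downgrade lists versions from the pacman cache and the Arch archive,
# then asks which one to roll back to (and whether to add it to IgnorePkg)
sudo downgrade firefox
```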
Arch is however the most solid distro I’ve ever used since I began using Linux many many moons ago.
One thing that is an Arch problem is that, if you do not update often enough, you can end up with outdated keys that prevent you from installing newer packages. The solution is just to update the keyring before updating everything else, but this is confusing for a new user and kind of dumb in my opinion. I feel like the system should do this for me.
Arch already does this. Could be that your install has the keyring refresh service disabled but I’ve had it enabled for a good while now and I’ve never encountered that outdated pacman keyring issue.
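If anyone wants to check their own machine, I believe the unit ships with the archlinux-keyring package; the name below is from memory, so verify it exists on your install:

```
# check whether the periodic keyring refresh timer is running
systemctl status archlinux-keyring-wkd-sync.timer
# enable it if it isn't
sudo systemctl enable --now archlinux-keyring-wkd-sync.timer
```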
Ofc, Arch users should learn how to resolve a package conflict, or how to downgrade packages, or generally how to debug the system. Sometimes you also have to migrate config files.
On the other hand, as an Arch user, I can tell you that it mostly just works. If you heavily customize an Ubuntu install, it is more likely to break. And while you can fix an Arch system, you will probably have to reinstall Ubuntu.
Moreover, Arch has a testing repository which is not the default.
Been a while since I had a VM, but IIRC it was pretty easy to set up a shared directory with the VM. That is very useful to (obviously) share files, but it also means that, since the files aren't actually on the VM itself, they'll still be there even if you remove the VM, because they're not part of the image.
How I learned my lesson to have a shared directory was this: I had been having audio issues on the VM and at one point just decided to start over with a new VM, completely forgetting that the files I had been working on for a project were part of the VM and would be gone.
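Purely as an illustration (I'm not saying which hypervisor was involved above; the share tag and mount point below are made up), with QEMU/KVM a 9p share added in virt-manager can be mounted inside the guest like this:

```
# inside the guest: mount the 9p share that was given the tag "hostshare"
sudo mkdir -p /mnt/hostshare
sudo mount -t 9p -o trans=virtio,version=9p2000.L hostshare /mnt/hostshare
```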