If you’re looking for something that won’t break, Debian and openSUSE Tumbleweed are two good options. Both offer the Plasma desktop, though openSUSE may have an easier installer for some. Note that on openSUSE some video codecs aren’t (officially) available, so if playing “differently acquired” media is a concern, Debian would probably be easier.
If you wanna try Arch, consider EndeavourOS. It simplifies the installation process significantly, though it doesn’t do much to help maintain the system; that’s on you. Avoid Manjaro like the plague.
The Arch Wiki is universally considered the best source. 99% of what you’ll see on there will work on any Linux distro, so don’t worry about the name. Aside from that, your favourite search engine is your best bet.
No clue what FALGCS means, but wallhaven.cc is a great place for wallpapers.
Edit: seems the Manjarno site is down. Shame; it was a genuinely useful site for times like this.
I’ve thought about it, but right now everything works exactly the way I need it, and the only complaint I have is the occasional pop-up from MS trying to get me to upgrade to Win11 or switch my browser. My main uses for my devices are games, and I just started back to school, so MS Office is nice to have. It’s hard to justify putting in the effort to change things now, especially when I know MS products very well, particularly for modding games.
Yeah, I feel ya. I still have Windows on dual boot for certain things, and it’s been a struggle at times, but I gotta say I dread the times I need to boot Windows! It’s so much slower and more annoying.
I would propose you split your data manually into logically separate parts, so that you could fit, say, 0.8 TB on one drive, 0.4 TB on another, and maybe two 0.2 TB sets on a third. Then you’d have a script that uses traditional backup approaches with modern backup apps to back up the particular data set for whichever disk is attached to the system. This approach lets you painlessly use modern “infinite increments” backups, where older versions of the data are persisted without maintaining separate full and incremental backups. You should also write a script that checks that no important data is forgotten and that no backup sets overlap (except for data you deliberately back up twice); see the sketch at the end of this comment.
For example, you could have a physical drive with sticker “photos and music” on it to back up your ~/Photos and ~/Music.
At some point one of those splits may grow too large for its allocated drive, which means additional manual maintenance. Apply foresight to avoid these situations :).
If that kind of separation is not possible, then I guess tar with multi-volume splitting is one option, as suggested elsewhere.
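To be concrete about the “nothing forgotten, nothing overlapping” script: this is just a rough sketch of the idea, not any existing tool, and the set definitions are made up.

```python
#!/usr/bin/env python3
"""Sanity-check a manual backup split: every important directory should be
assigned to exactly one drive (unless you deliberately list it twice)."""
from pathlib import Path

HOME = Path.home()

# Hypothetical split: sticker on the drive -> directories it backs up.
DRIVES = {
    "photos-and-music": [HOME / "Photos", HOME / "Music"],
    "documents":        [HOME / "Documents", HOME / "Projects"],
}

# Everything that must be covered by some drive.
IMPORTANT = [HOME / "Photos", HOME / "Music", HOME / "Documents",
             HOME / "Projects", HOME / "Videos"]

assigned = [d for dirs in DRIVES.values() for d in dirs]

for d in IMPORTANT:
    if d not in assigned:
        print(f"NOT BACKED UP ANYWHERE: {d}")
for d in set(assigned):
    if assigned.count(d) > 1:
        print(f"backed up more than once (intentional?): {d}")
```

Run it whenever you add a new top-level directory, and it will nag you until the directory is assigned to a drive.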
That is actually what I’m currently doing; in fact my file server is already organized this way. But I personally don’t like it for offline backups, because it still forces me to play digital tetris and work out which directories will fit on which drive. There’s also the issue that some of my directories are getting close to growing larger than 1 TB, particularly the one containing all the lossless files from my (hobby) photography work. I do a ton of urban and industrial photography, and I honestly might have most of the interesting parts of my city documented at this point, plus different versions of the same scene with different settings, which is how I ended up with so much data. Though I suppose I could split it into separate years instead of one huge directory.

I’m hoping for something that can automate this process so I don’t have to consciously keep track of it as much (I don’t trust my brain sometimes). I’m currently experimenting with some of the suggested solutions; maybe I’ll find one that works better, and if not I’ll stick to the method you mentioned. Thank you for the suggestion though!
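For what it’s worth, the digital-tetris part is a bin-packing problem, which is easy to automate approximately. A minimal first-fit-decreasing sketch, where the directory sizes and drive capacity are placeholders you’d replace with real numbers from du:

```python
#!/usr/bin/env python3
"""Assign directories to drives with first-fit-decreasing bin packing."""

DRIVE_CAPACITY = 1000  # GB, e.g. 1 TB drives

# Hypothetical per-directory sizes in GB, e.g. one entry per year of photos.
dirs = {"photos/2020": 420, "photos/2021": 310, "photos/2022": 500,
        "music": 180, "documents": 40}

drives = []  # each drive is [remaining_capacity_gb, [directory names]]
for name, size in sorted(dirs.items(), key=lambda kv: -kv[1]):
    # Put each directory on the first drive it fits on...
    for drive in drives:
        if drive[0] >= size:
            drive[0] -= size
            drive[1].append(name)
            break
    else:
        # ...or start a new drive if none has room.
        drives.append([DRIVE_CAPACITY - size, [name]])

for i, (free, contents) in enumerate(drives, 1):
    print(f"drive {i}: {contents} ({free} GB free)")
```

It won’t always find the absolute optimal packing, but it takes the “consciously keeping track” part off your brain.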
AppImages suck because I can’t pin them to my dashboard, can’t set them to open at startup and can’t set them as default apps for the appropriate filetypes.
You realize your computer won’t work without scripts, don’t you? And if you want your computer to do something it doesn’t do on its own, a simple script will make it do what you want. If that is your definition of sucking, then you need to go back to Windows, which is also loaded with scripts, by the way, so that sucks too.
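Case in point for the AppImage complaints above: a desktop entry is all it takes to make one show up in menus, get pinned, and be selectable as a default app. A throwaway sketch; the AppImage path, name, and MIME types are hypothetical:

```python
#!/usr/bin/env python3
"""Write a freedesktop .desktop entry for an AppImage so it shows up in
menus, can be pinned, and can be chosen as a default application."""
from pathlib import Path

appimage = Path.home() / "Apps" / "MyPaint.AppImage"  # hypothetical path
# (the AppImage itself must be executable: chmod +x)

entry = f"""[Desktop Entry]
Type=Application
Name=MyPaint
Exec={appimage} %f
MimeType=image/png;image/openraster;
Categories=Graphics;
"""

dest = Path.home() / ".local/share/applications/mypaint-appimage.desktop"
dest.parent.mkdir(parents=True, exist_ok=True)
dest.write_text(entry)
print(f"wrote {dest}")
```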
You realize your computer won’t work without scripts, don’t you?
No, I don’t. In fact every Windows, Android or Mac computer I’ve used in my entire life works perfectly fine without manually running any scripts at all.
I would do it by manually splitting it up into sets and writing scripts to back up each of those sets. Then you only have to figure out the split once.
I wonder if rsync has an option to do what you are asking for?
It also sounds like the kind of thing the old tape backup software would do. Maybe look into something that can pretend the drives are tapes.
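As far as I know rsync won’t split a backup across drives by itself, but the per-set scripts are trivial. A sketch assuming one mounted drive per set; the mount points and set definitions are invented:

```python
#!/usr/bin/env python3
"""Run rsync for whichever backup set's drive is currently mounted."""
import subprocess
from pathlib import Path

HOME = Path.home()

# Hypothetical sets: drive mount point -> source directories.
SETS = {
    Path("/mnt/backup-photos"): [HOME / "Photos", HOME / "Music"],
    Path("/mnt/backup-docs"):   [HOME / "Documents"],
}

for mountpoint, sources in SETS.items():
    if not mountpoint.is_mount():
        continue  # that drive isn't plugged in right now
    for src in sources:
        # -a preserves metadata, --delete mirrors removals to the backup
        subprocess.run(
            ["rsync", "-a", "--delete", f"{src}/", str(mountpoint / src.name)],
            check=True,
        )
    print(f"backed up to {mountpoint}")
```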
Usually projects on GitHub. Personally I use AppImages for things like MyPaint (a digital drawing application), Krita, and most other KDE applications, so as to avoid all the dependencies KDE has in its ecosystem, or at least to put them somewhere easier to manage.
Since you’re interested in KDE, why not try Fedora Kinoite?
It’s an immutable distribution in much the same way as SteamOS 3. For individual pieces of software, you just install Flatpak versions. It’s deeply convenient if you don’t want to perform maintenance on your PC and want it to “just work”.
If you want immutability, Vanilla OS is the only good option right now. Services can be a nightmare on Fedora’s immutable systems, and some applications (qBittorrent, in my experience, though I haven’t seen anyone else have issues with it specifically) sometimes just outright decide to off themselves. I wouldn’t say it’s a bad distro, but recommending it to someone who states they don’t know much about computers could cause them trouble in the long run.
I still prefer to run everything built directly from reliable deb sources.
As an end user… sure, Flatpaks and AppImages and Snaps are, I guess, neat if you’re constantly distro hopping or something, at least in theory.
But uh, I’ve already found that I can play games, develop games and other software, use basic daily software for everyday needs, and have a stable and predictable OS that doesn’t crash or have insane misconfigurations caused by some esoteric conflict, just by basing everything directly off of deb sources.
Every once in a while I’ll have to compile my own build, but this is rare and usually only occurs when trying out something experimental or, also rare, something that doesn’t have an actively and well-maintained deb source. In that case it’s just a matter of doing a build from GitHub when a new version comes out.
And I can do builds from GitHub because I’ve saved a lot of storage space by not using bundled installers for all my software, allowing me to store the sources. This is also neat because it lets me quickly /use/ one of those sources in a project, after I’ve already seen that it’s stable via the software I use that’s built on it.
Finally there is the security angle. Using a myriad of different containerized installers for everything is convenient in that you don’t have to directly worry about source management… until you do, when a source lib is discovered to have a critical flaw.
When a serious flaw is found in a source library, what’s gonna get updated faster? A containerized installer, where you have to wait for devs who are busy managing tons of cross-platform dependency issues and have to do a new safe, stable build every time any of their many dependencies for their many supported platforms changes? Or an app built specifically from source libs, which either doesn’t focus on cross-platform or has separate teams maintaining its different supported flavors?
In my experience, literally all of the time, the ‘direct from source’ software gets updated more quickly than the cross platform bundled installer.
Further, this whole approach gives you experience with software built on source packages; as you become more familiar with them and tinker with them yourself, you gain insight into which source libs are well coded in terms of CPU/GPU/RAM optimization, and which are resource hogs to avoid if you’re interested in promoting and using software built on efficient code. I enjoy learning from the good coding techniques of stable, lean, and fast programs, and avoiding code that is comparatively unstable, bloated, or slow.
Use anything you want, because all of them are safe and speedy.
Flatpaks let you package all dependencies, pinned to specific versions, together with the application. Snaps take it to the next level by letting you run system-integrated sandboxed programs, which Flatpaks cannot do. AppImages are simply the equivalent of portable USB software on Windows.
That is not how it works though, because I have been a part of these religious cults for basically forever. The hobbyist enthusiasm has a threshold; the cultism does not. It is in our animal nature to form and live as tribes, and that does not change just because the congregation tool is virtual instead of physical.
Capitalism is not human nature; it is built around the abuse of human psychology. You’ll love the documentary The Century of the Self by Adam Curtis.
Except Western imperialist countries have extracted hundreds of trillions of dollars from the rest of the world and kept it subjugated for centuries, which is why these luxurious software-development cultures never formed there. You are falsely equating software and politics as being affected similarly and to a similar degree.