Tuxedo computers could be a good fit, I think? It’s like System76, but from Germany. You can pick from a few OSes, including an Ubuntu fork they made (TUXEDO OS). You can configure the laptop yourself (different CPUs, disk sizes, …) to fit your use case.
An overarching principle of security is that of least privilege: everything (every process, every person) should have the minimum privileges it needs to do its job, and where possible, that privilege should be explicitly granted temporarily and then dropped.
This means that any issue, whether a security breach or a simple mistake, can only access or break what the affected component or person could already access or break, and that access is kept minimal.
Suppose that you hit a page which exploits the https://www.hkcert.org/security-bulletin/mozilla-firefox-remote-code-execution-vulnerability_20230913 vulnerability in Firefox, or one like it, allowing remote code execution. If Firefox is running as root, the remote attacker now completely controls that machine. If you have SSH keys to other servers on there, they are all compromised. Your personal data could be encrypted for ransom. Anything that server manages, such as a TV or smart home equipment, could be manipulated arbitrarily, and possibly destroyed.
The same is true for any piece of software you use, because this is a general principle. Most distributions, I believe, don’t let you SSH in as root for that reason.
In short: don’t log in to anything as root; log in as a regular user and use sudo to temporarily perform administrator actions.
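As a concrete sketch (assuming OpenSSH and an existing user with sudo rights; the server name and the example admin command are just placeholders):

```bash
# 1. In /etc/ssh/sshd_config, forbid direct root logins:
#      PermitRootLogin no
# 2. Reload the SSH daemon ("sshd" on most distros, "ssh" on Debian/Ubuntu):
sudo systemctl reload sshd
# 3. Log in as a regular user and elevate only for the commands that need it:
ssh youruser@server
sudo systemctl restart nginx   # example admin action; root privileges apply to this one command only
```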
P.S. Your description of the situation suggests you don’t yet know much about the nature of vulnerabilities and security; if you’re running servers, this is something you should learn more about in short order.
Uncharitability to those you disagree with, style without substance, and all built upon thought-terminating cliches.
This isn’t helpful, enlightening, or informative; it’s entertaining, but not in an interesting or original way. It reminds me of 2010s Reddit memes where everything was about adding as many “fucks” as possible because our moms aren’t supervising our internet time anymore. It espouses a consoomer mindset of “gotta have bigger numbers and shinier visuals because all that matters is appealing to lizard-brain.”
And it’s all couched in the obvious mindset that any criticism will be met with “ok boomer” (I’ll almost be insulted if I don’t get one) because being superior is more important than being right. Y’know… like a boomer?
You’ve got a point, focus on that: you can make the case that Linux fits your use case, or that certain mindsets within the Linux community are hindering progress. But please do so in a way that doesn’t just lend itself to more infighting and drama. That shit is for shallow people who have nothing to contribute and only serve as the cultural detritus that destroys communities and community-driven projects.
KDE Plasma has a desktop effect called “Track Mouse”. After you activate it, you can trigger it by pressing Ctrl+Meta. It doesn’t look like the macOS variant, but it does the job.
Thanks for pointing that out, I found the setting on my laptop and tried it out. I do like the jiggle approach better, though, simply because that is something many people (myself included) instinctively do when losing track of the mouse cursor.
Would be interesting to know whether this is more on Firefox’s side or on the compositor’s side. I’ve been running Firefox on Wayland for about 9 months now without any issues.
This is a Wayland issue. Due to how Wayland works, it cannot drop messages; this means that if messages stop being accepted (i.e. the program becomes very slow and unresponsive), the application will wind up dying. EEVDF helped resolve a lot of these issues, but they aren’t gone yet.
A fairly easy way to reproduce it is to start a large Rust project compile, since cargo will thread to oblivion if it gets the chance, then keep using the PC on Wayland. Applications can frequently die: Firefox, MPV, Kate, GNOME Web, Chromium, games, etc. It also doesn’t matter which compositor you use right now, as GNOME, KDE, and Sway all share the issue.
EEVDF really does help stop a lot of these crashes, though.
You’re describing Wayland running into issues due to overall high system load, and not being given enough scheduler time to accept messages?
edit: This issue? gitlab.freedesktop.org/wayland/wayland/-/…/159 - I didn’t find anything else matching the description, and I’ve personally never seen that, either on my low-spec notebook or on my workstation, which probably counts as higher spec.
Correct, this is the same issue. It generally only happens with a sustained all-core workload that consistently leaves your CPU at 100%; if it’s not sustained, the kernel will allot some time to the programs and the crash won’t happen.
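For what it’s worth, a rough way to reproduce (and partially work around) this might look like the sketch below; the project path is only a placeholder, and lowering the build’s priority or job count is a mitigation, not a fix for the underlying Wayland behaviour.

```bash
# Reproduce: saturate every core for a sustained stretch, then keep using
# Wayland apps (Firefox, Kate, mpv, ...) and watch for crashes.
cd ~/src/some-big-rust-project          # placeholder path
cargo build --release -j"$(nproc)"

# Partial mitigation (assumes more than two cores): run the compile at the
# lowest priority and/or leave a core or two free so the compositor and its
# clients still get CPU time.
nice -n 19 cargo build --release -j"$(($(nproc) - 2))"
```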
I agree. The proxy solution they’re proposing seems like a band-aid on a fundamental design issue to me. It’s easier to just tack yet another library onto a big project than to refactor large amounts of code. This is exactly why a lot of software is getting more and more shit.
Also, these are the kinds of issues Wayland will be facing now that it’s starting to see widespread adoption: issues that arise from more and more complex situations created by interconnecting more apps with it in more ways.
How the devs handle this will be crucial and imo it can make or break the project in the long run. It’s one thing to successfully run a hobby project at a small scale, it’s another to shoulder the entire Linux desktop for the foreseeable future. That’s the bar that X had to meet; if Wayland intends to be the Linux desktop it has to step up. “Not our problem, deal with it outside Wayland” will not do.
While this is good in theory, when your CPU is being absolutely hammered you need to re-adjust priorities to make the system responsive again, and that’s actually not a simple thing to do without a context-aware scheduler. Even though EEVDF is pretty good, it still struggles sometimes.
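One crude, manual version of that re-prioritising (assuming KDE’s kwin_wayland; substitute the process name of your compositor, e.g. sway or gnome-shell) would be something like:

```bash
# Raise the compositor's scheduling priority so it keeps getting CPU time
# under an all-core load. Negative nice values need root.
sudo renice -n -5 -p $(pidof kwin_wayland)

# And/or push the offending build processes to the lowest priority instead:
renice -n 19 -p $(pgrep -x rustc)
```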
It’s not in any stable release of SDDM, but it’s one of the exceptions Fedora makes for git releases in its stable branch. The KDE SIG devs were desperate to get an end-to-end Wayland experience happening for the KDE spin.
In addition to all of the open source options that have been offered, Davinci Resolve runs well on Linux and has all of the above features (and many, many more). It’s also a buy once keep forever situation rather than a subscription since they make their real money on hardware. OSS it isn’t, but it’s incredibly powerful, has an extensive free (as in beer) edition and beats the hell out of paying a monthly fee.
As for DaVinci Resolve, installation can be a bit weird if you don’t happen to run one of the officially supported distros. Because of that, the easiest way to run it is probably via Distrobox; Michael Horn made a great tutorial about that: youtu.be/wmRiZQ9IZfc
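A rough sketch of the Distrobox route (the container name and base image are only examples; the tutorial covers the details):

```bash
# Create a container based on a distro Resolve officially supports
# (Rocky Linux 8 here is just an example choice) and enter it.
distrobox create --name davinci --image rockylinux:8
distrobox enter davinci

# Inside the container: run Blackmagic's installer as usual, then export the
# launcher to the host so it shows up in your application menu.
#   sudo ./DaVinci_Resolve_XX.X_Linux.run   # filename depends on the version you downloaded
#   distrobox-export --app resolve          # app name may differ; check the installed .desktop file
```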
Personal example: Fedora (38 - 39). Resolve uses libs which depend on some older versions of a library, which they don’t ship in the installer.
So I had to replace the depending libs so that Resolve could run with Fedora’s more recent libs.
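For context, the workaround that usually gets passed around for this is to move Resolve’s bundled copies out of the way so it falls back to the system versions; the exact library names vary by Resolve and Fedora version, so treat this as a sketch rather than a recipe:

```bash
# Move the bundled glib-family libraries aside so Resolve picks up Fedora's
# newer system copies instead. Adjust the names to whatever actually conflicts
# on your Resolve/Fedora combination.
sudo mkdir -p /opt/resolve/libs/disabled
sudo mv /opt/resolve/libs/libglib-2.0.so* \
        /opt/resolve/libs/libgio-2.0.so* \
        /opt/resolve/libs/libgmodule-2.0.so* \
        /opt/resolve/libs/disabled/
```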
It wouldn’t be trivial to package such a big app as a flatpak (or snap for that matter) and also maintain it properly, so as long as the original developers don’t do the work I think it is unlikely to happen. But for a tool that I’m going to be using a lot in the future I think it makes sense to invest the time once to install it, even if it’s a bit more complicated.
Pacman’s cache isn’t in ~/.cache though, it’s in /var/cache. So whatever is taking up this much space isn’t the package manager.
That being said, I think the arch devs should add a config option to automatically delete old packages without having to run paccache manually and have it default to the last 2 versions of a package or so. It can grow quite big over time.
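In the meantime you can get pretty close to that with what’s already in the repos; a sketch, assuming the pacman-contrib package:

```bash
# paccache ships in pacman-contrib
sudo pacman -S --needed pacman-contrib

# One-off: keep only the two most recent versions of each cached package
sudo paccache -rk2

# Automate it: enable the weekly timer the package provides
# (its default is to keep the last three versions)
sudo systemctl enable --now paccache.timer
```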
It doesn’t matter if you use paru, yay, or heck, plain makepkg, if you are compiling packages with hilariously large sources, like for example web browsers (librewolf, brave, ungoogled-chromium, and firedragon each take like ~30 GB), without pruning the build cache afterwards.
Something I noticed was that in this case it was mostly binary AUR programs taking up the space.
I think it’s maybe because yay/the AUR use cloned git repos, and old versions of the binaries get stored in the git history and add up, since different versions of a binary are basically full copies instead of just the changes to the source code.
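If you want to see where the space is going and claw it back, something along these lines works (paths assume yay’s defaults; the librewolf directory is just an example):

```bash
# See which cloned AUR build directories are the worst offenders
du -sh ~/.cache/yay/* | sort -h | tail -n 20

# Let yay clean its own cache (it will offer to clean pacman's cache too)
yay -Sc

# Or remove individual build directories you no longer need, e.g.:
rm -rf ~/.cache/yay/librewolf
```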
My update script handles mirrors, updates, and cleaning the cache automatically. I’d definitely recommend creating one. It’s aliased to sysupdate for me, and I also check whether it’s a Debian- or Arch-based distro so the command works on my servers and my desktop.
I don’t think I’ve posted it before, but here it is. If you use different utilities you’d have to swap those out. Also, excuse the comments; I had GH Copilot generate this script.
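For anyone curious what a script along those lines can look like, here is a minimal hypothetical sketch (this is not the poster’s actual script; it assumes pacman or apt, and only uses reflector and paccache if they happen to be installed):

```bash
#!/usr/bin/env bash
# Hypothetical "sysupdate" sketch: update an Arch- or Debian-based system
# and clean up afterwards. Alias it, e.g.: alias sysupdate=~/bin/sysupdate.sh
set -euo pipefail

if command -v pacman >/dev/null 2>&1; then
    # Arch-based: optionally refresh mirrors, update, then trim the package cache
    if command -v reflector >/dev/null 2>&1; then
        sudo reflector --latest 20 --sort rate --save /etc/pacman.d/mirrorlist
    fi
    sudo pacman -Syu
    if command -v paccache >/dev/null 2>&1; then
        sudo paccache -rk2
    fi
elif command -v apt >/dev/null 2>&1; then
    # Debian-based: update, upgrade, and clean out what is no longer needed
    sudo apt update && sudo apt full-upgrade -y
    sudo apt autoremove --purge -y && sudo apt clean
fi
```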
I’ve heard of tools like that, but this works fine for me. This way I’m not dependent on it being packaged for my distro and having to install it through other means. I’m fine running things manually, this is just for convenience
You can use yay -Sc to clean the cache. It’ll also ask you if you want to clean the pacman cache, which I’m assuming you also haven’t cleaned (check the size of /var/cache/pacman).
I was on Windows and was forced to update, and then it bricked my computer and I had to reinstall Windows, except when I did, it asked me for a Windows license key. I tried everything to recover my license key but wasn’t able to.
This was around the time Linus Tech Tips was teasing his upcoming month-on-Linux series, so I was like, fuck it, I’ll give it a go. Spent a week on Mint with broken wifi, then tried Endeavour, Garuda, and Fedora, and settled on Manjaro. Manjaro was amazing to me. Everything worked out of the box, KDE Plasma looked so clean, and I could set it up exactly how I wanted.
Then I watched the Linus Tech Tips video on Linux and I was like, wtf, how did he have such a bad experience? Is he dumb?
He’s pretty much the quintessential QA tester. He wants to do things his way, regardless of whether or not the OS wants him to do that. He’s usually skilled enough to fix anything he messes up, but he doesn’t know enough about Linux to do that, so he ends up breaking things. I feel like most people have a better experience than he did, but his approach uncovered a ton of bugs and usability issues whose fixes significantly improved the Linux desktop.
Love those videos, mostly because they are my perfect argument for why the Linux desktop isn’t ready yet.
Was Linus an idiot in those videos? Yes. Luke even said so, stating he installed it and then chose not to use the machine during the month (recent WAN Show).
However, it shows just how easy it is for a novice to break a distro, and how much work is needed to get it to the point of Windows for general-population usability. Granted, the issues Linus had with Pop!_OS were dumb and shouldn’t have happened. But it showed me that Manjaro existed, which I am using to this day.
I think the Linux desktop is ready for open-minded people who are interested in a new way of doing things. I don’t think it’s ready for people who can’t use a computer or troubleshoot. Windows breaks often too, so I’m not as harsh when I see Linux break.
Agreed. Since moving to Linux, I am surprised how often File Explorer crashes on Win 10, or how often I need to restart Windows for random reasons. It’s to the point where I want to gut my desktop and put Manjaro on it too.
Compared to when I started using Linux in the late 2000s, Linux has matured to an unbelievable point. To someone who is even slightly interested in learning, it’s perfectly usable as a Windows replacement… depending on your distro, desktop environment, etc.
It’s this “depends” which makes recommending Linux hard for me, since when a problem occurs, I find it’s not as easy to troubleshoot, especially with how many flavours of Linux exist.
Idk about the how, but AirVPN does for comparable prices. This coming from a fellow multi-TB Linux ISO torrenter. Also, I assume you mean VPN, unless Mullvad does VPS stuff I don’t know of.
Does your network not support UPnP? You shouldn’t normally need to port forward in order to seed a torrent, unless your network prevents NAT traversal.
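If you want to check whether UPnP is actually working on your router, the miniupnpc client is a quick way to poke at it (package and client names can vary slightly by distro, and the port below is just an example):

```bash
# List the port mappings the router currently has via UPnP
upnpc -l

# Ask it to forward TCP port 51413 (use whatever your torrent client listens on)
# to this machine's first local IP address
upnpc -a "$(hostname -I | awk '{print $1}')" 51413 51413 TCP
```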
I use Ubuntu. It generally tends to be boring stable, which is kinda what I want out of my OS these days. I can still customize it, and even break it if I really get bored, but it’s nice to have things just work for the most part.
I switched to Debian Stable after using Ubuntu LTS for 6 years, and recommend Ubuntu for beginners. It is stable, has the best community support, and is boring, good ol’ reliable, which is perfect for learning Linux and getting accustomed to it. Even corporate support and game developers target Ubuntu first. Considering it runs smoothly on a 6-year-old midrange Intel laptop chip, nobody is getting some 200% performance boost with other obscure fancy distros.
Yep, games being designed to support Ubuntu first is a big reason why I’m so far into Ubuntu. I could easily switch if I needed to since I’m both a programmer and very comfortable with Linux but for me, it does everything I need an OS to do.
Debian Stable is really, really close for gaming, since Ubuntu LTS itself is based on Debian’s unstable branch, if you choose to switch once you have more Linux knowledge in the future. Nobara is dedicated to gaming.
Honestly speaking, I keep W10 on an SSD for games, in case any work in a wonky manner on Linux. It takes like 30 seconds to log off Debian, boot into Windows, fire up a game, and get back to Linux when not playing.
I had a friend who wanted to try Linux but insisted on Arch because it’s what I used at the time, even though I said they shouldn’t and gave many suggestions for better distros. They gave up after about a day and went back to Windows. I don’t know what they expected; multiple people warned them not to use Arch.
I’m switching from Manjaro to EndeavourOS atm, and I am liking Endeavour a lot. I kept having issues with Manjaro’s boot after every kernel update, but otherwise didn’t mind it. Probably whatever Manjaro’s build chain for boot is just wasn’t working with my hardware, but the attitude on the forum is also that you are stupid if you have to roll the kernel back.
Endeavour really just provides you arch with some maintenance utilities and otherwise lets you do your thing.
No more Firefox home page getting constantly reset to the Manjaro home page so they can market their laptop partnerships to you, either 😉
I’ve been off windows for a long time, and when I was forced to use it, it was enterprise, locked down and stripped by knowledgeable IT teams.
Yesterday, I had my first exposure to Win 11 S mode. What a piece of crap. Not just the way it’s locked down, but the incessant OneDrive ads, the broken Settings app with missing features, AI buzzword add-ons, sloppy UI, and the general lack of control over your own computer.
I’m recommending my friend install Linux ASAP with my support. Nobody should have to endure that much cruft and garbage on a computer they own. They can’t even install software outside of the Microsoft Store? Gross.
Oh yeah, no, I was not at all saying Windows was better; I was just saying Arch was definitely not a good distribution for beginners, and it was weird how they just insisted on using it. I use Arch on my laptop and openSUSE Tumbleweed on my desktop and have not used Windows for anything serious in years because it is so unbearable.
I understood you weren’t advocating for Windows (as an Arch user? The very idea!), but your mention of your friend returning to Windows got me thinking about my friend’s laptop and how icky it felt.
Glad there are fewer and fewer barriers to using Linux full time these days.
I love Arch but I wouldn’t recommend it to anyone. In my eyes, the only way one should choose Arch is despite all warnings against it, because they feel confident enough to deal with all the problems they encounter.
Honestly, I’ve had so little trouble with Arch compared to other things that I would definitely recommend it to experienced Linux users, just definitely not to inexperienced users. The AUR is amazing, and rolling release means you don’t have to deal with the horrors of major updates breaking packages. openSUSE Tumbleweed is also a great candidate, though, for people who don’t want to set as many things up themselves. I’m currently using both Arch and Tumbleweed on different computers.
Yup! Same here. Once I’ve got everything set up, it has been running smoothly and without any issues for more than 5 years in my case. It’s literally the most reliable system I’ve ever set up, but I understand that the entry hurdle is pretty high.
My IT bros said the same back when I had to choose between W10 and Linux; they hadn’t used Arch and I had zero Linux experience. I messed up every single step of the installation, to the point where I could tell from the problems I created what I had done wrong. After many tries, and a week later, I had a working dual-boot installation. I never used Windows and removed it a year later. It was rough, but I learned how to recover from most errors a user can create.
If learning is the goal, Arch and the Arch Wiki are great.
I find that really cool, BUT, you should delete that link.
First, installing a tweaked Windows version from somebody else is risky. It’s hard to check whether you included malware, for example. I mean, I trust that you didn’t do that, but it’s still risky. That alone isn’t the reason you should delete it, though. If I install a malware version, it’s my fault; who cares.
The real reason you should delete that immediately is that it’s illegal! The licence doesn’t allow you to share Windows. With scripts on your own install it’s a grey area, but sharing installs or ISOs is definitely not allowed, and everyone here could report you for that to MS, the police, the admins, whoever.