And that definition depends on how you define "benefitting the user". If someone has an online match ruined by a hacker, I'd argue that they would have benefitted from the game running some kind of anticheat.
Do we define user as the singular individual person? Or do we consider the user as a collective, and factor in the larger benefit to the masses? It could even be argued that the people running cheats are the ones running malware (specifically, malware that targets the other users in the match) and should therefore be treated the same way we treat people who use more traditional viruses and trojans to the detriment of others. The same way you wouldn't want some virus-ridden machine connecting to your home network (you'd probably want everyone to at least be running a basic virus scanner and have common sense when browsing), you would want everyone in the game running anticheat to ensure there is no malware.
Very few people would say that it's okay to waste others' time and computer resources on a bitcoin miner trojan... Most people would (correctly) determine that it is theft. But then when it comes to online games, the same people feel entitled to waste other people's time and computer resources by ruining their matches.
That's largely a corporate decision that is out of the hands of the programmers. Generally speaking, security specialists would agree with you. But running anticheat on the server costs server resources, which means you need more servers to accommodate the same number of players. Running it client-side is a cost-cutting measure mandated by the corporate bean counters who did the math and concluded it'd be cheaper for the company to spend the users' computer resources instead.
While I agree that client-side security isn't the best solution, it's certainly better than no solution. It's the same argument people have against self-driving cars. The self-driving cars don't need to be perfect; they just need to be better than the average driver. If they can reduce the number and severity of accidents that are currently happening without them, then they should be implemented, even if the solution isn't perfect. An imperfect solution is better than doing nothing at all.
You're right, and it's a pragmatic approach to the problem. They only need broad technical effectiveness to change user behaviour.
I'd argue that it's not strictly cost cutting but cost transferring. The total client resources most likely exceed what would be needed on the servers.
I don't think that is a widely accepted holistic definition of malware. But even if it were, AC is not wasting resources. It's taking the resources it needs to perform its job.
There are several forms of anticheat. The ones that only run while the game is running are usually fine. However, there is the Riot anticheat, which runs all the time and isn't uninstalled when Valorant is uninstalled. That is malware.
You've linked to their anticheat, which they also offer, but it's not their main product. Funny that you missed that, given that you were already on their website and irdeto.com/denuvo/ spells out "Anti-Piracy technology" in huge font.
There are games with single-player and multiplayer modes that come with anticheat. I had some game a few months ago that was a Steam freebie (can't remember the name) whose anticheat didn't install properly on Windows, and it didn't allow me to launch regular single-player, only mod mode.
According to them, ~58% of anti-cheat games work. There's been a large uptick in anti-cheat support since the Steam Deck.
According to ProtonDB, 86% of the top 1000 games on Steam function (Silver+ rating). It's a pretty safe bet that most of the missing 14% is due to anti-cheat.
The horrible part is that it was. Your other choice was ext2, which wasted so many lifetimes with its hours-long fsck times. ReiserFS was a cut above the rest; we would all be using it today if it weren't for that one teensy-weensy legal issue.
I have an old drive with it on there. The drive is going bad, so I haven't messed with it too much. I never knew at the time why the development and shine faded so quickly.
I wish I was that lucky. The final straw for me was the grub-customizer shenanigans: Manjaro pushed an update that broke grub-customizer boot entries, then, when users were trying to figure it out, they removed grub-customizer, and then they even went so far as to make grub conflict with grub-customizer, which was really asinine. IIRC they even wound up locking the forum thread on it.
I know it's beside your point, but I want to chime in...
My understanding of the history of fashion is that back in the 1950s America... they were trying to nudge culture into accepting their worldview.
On the contrary, I don't think that's how the mentality came about, or was held at that time at all. If you go back to the 1850s or 1750s, suits and dresses (or some older variant of them) were a sign of wealth, intelligence, high-class living, etc. They had to be hand-tailored by experts using rare fabrics and dyes that had to be shipped all around the world. Then the industrial revolution came, and clothing could be mass-produced (usually at the cost of quality). Suddenly the middle class had access to suits and dresses, but the perception that they were something for the wealthy was still there. For many businesses targeting the middle class, the suit and dress WERE the uniform, as a means of displaying how regal their brand was.
And it's not like we've gotten past this. If you go on any of the social media sites with ads, take a look at what you see: some knock-off piece of trendy clothing that's made to look like a high-end fashion brand, but targeting the lower/middle class.
All that said, I'm all for the "punk rock" mentality. Don't do what your parents did just because society told them to tell you it was important. Stick it to the man, yadda yadda. But I think it's a trap to assume that the 1950s proletariat felt any differently than the same class of people do today.
As for Windows vs. Linux, of the people who are aware of both yet continue using Windows, I think most would say that they use it specifically because they have a "preference for something that i can just set up and not have to tinker with" and because they also aren't making their choice based on "the trackers in win11 or because [they] care that Microsoft is an evil megacorp".
I'm not sure why Docker would be a particularly good (or particularly bad) fit for the scenario you're referring to.
If you're suggesting that Docker could make it easy to transfer a system onto a new SD card if one fails, then yes, that's true ... to a degree. You'd still need to have taken a backup of the system BEFORE the card failed, and if you're making regular backups then, to be honest, it will make little difference whether you've containerised the system or not: you'll still need to restore it onto a new SD card / clean OS. That might be a simpler process with a Docker app, but it very much depends on which app and how it's been set up.
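For what it's worth, here's a minimal sketch of the kind of pre-failure backup I mean, assuming a compose-based app whose config and data live under one project directory (the project name and paths are made up):

```python
#!/usr/bin/env python3
"""Rough sketch: snapshot a compose app's config and data so it can be
restored onto a fresh SD card later. Project name and paths are hypothetical."""
import subprocess
import tarfile
from datetime import date
from pathlib import Path

PROJECT_DIR = Path("/opt/myapp")      # assumed: compose.yaml + bind-mounted data live here
BACKUP_DIR = Path("/mnt/usb-backup")  # assumed: somewhere NOT on the SD card


def main() -> None:
    BACKUP_DIR.mkdir(parents=True, exist_ok=True)
    # Stop the containers so the data on disk is in a consistent state.
    subprocess.run(["docker", "compose", "down"], cwd=PROJECT_DIR, check=True)
    # Archive the whole project directory (compose file + mounted data).
    archive = BACKUP_DIR / f"myapp-{date.today().isoformat()}.tar.gz"
    with tarfile.open(archive, "w:gz") as tar:
        tar.add(str(PROJECT_DIR), arcname=PROJECT_DIR.name)
    # Bring the app back up.
    subprocess.run(["docker", "compose", "up", "-d"], cwd=PROJECT_DIR, check=True)
    print(f"Wrote {archive}")


if __name__ == "__main__":
    main()
```

Restoring onto a new card is then the mirror image: flash the OS, install Docker, untar the archive, docker compose up -d. Which is roughly the same amount of work as restoring a non-containerised app from an equivalent backup.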
"Stability" is probably the most misused word in the Linux world.
It means that how your system looks and behaves doesn't change, which is really important for servers, especially in business, where you want to plan any change in advance before you commit to it.
Arch is not stable in this sense. It constantly changes, and those changes can come up on short notice with any upgrade.
But when people read that Arch isn't stable, they think the system can break at any time.
I'd say this hasn't been the case for at least 10 years now. If you RTFN (read the fucking news) and use the AUR sensibly, Arch has become a really boring system as far as breakage goes.
Arch breaks all the time. It has to, because upstream is always changing, so breakage is inevitable.
Though a person's mileage on this may vary (lower update frequency, fewer installed programs, etc.), the constant thing about rolling release is that breakages within software releases are to be expected.
My experience with Arch is that it has been very solid and stable. It just "makes sense" for the most part, and so issues are very resolvable.
If you use the AUR, you can get times when packages need to be excluded (held back) in order for the overall system to update. I do not see that as an Arch problem, and it is easy to handle.
One thing that is an Arch problem is that, if you do not update often enough, you can end up with outdated keys that prevent you from installing packages. The solution is just to update the keyring before updating everything else, but this is confusing for a new user and kind of dumb in my opinion. I feel like the system should do this for me.
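Concretely, the keyring-first dance looks something like this; a rough sketch of what I do on a long-neglected install, not an official procedure:

```python
#!/usr/bin/env python3
"""Sketch: refresh the pacman keyring before the full upgrade on an Arch
install that has not been updated in a long time."""
import subprocess


def run(cmd):
    print("+", " ".join(cmd))
    subprocess.run(cmd, check=True)


# Sync the package databases and pull in the current keyring first, so the
# signatures on the rest of the upgrade can actually be verified.
run(["sudo", "pacman", "-Sy", "--needed", "archlinux-keyring"])

# Now the normal full system upgrade.
run(["sudo", "pacman", "-Su"])
```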
Ironically, I find Arch is most stable if you update very frequently (which makes the updates smaller and more incremental). I do a quick update almost every day without any fear of breaking my system. Any "problems" I have had with Arch updates have come from trying to update a system that has not been updated in forever. Even then, it is just a bit more work.
Another thing that can happen if you leave it too long is that packages will have been replaced by newer ones. Keeping up to date means there are only going to be a small number of those. An update after a year can run into a surprising number of them.
I dug out an old laptop that had Arch on it from 3 years before. Updating it was annoying but in the end it was totally up to date and stable.
Arch is not stable, but it's easy to fix issues arising from its rolling-release nature. One of the ways is using the AUR package "downgrade" for easy package version rollbacks. I should also note that the most common reason for Arch breaking is rarely ever the distro itself but rather upstream introducing breaking changes. You can see this when an upstream feature breaks in Arch, then Fedora picks up the same bug a few weeks or a month later.
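For anyone who has not used it: downgrade searches the local pacman cache and the Arch Linux Archive for older builds of a package and reinstalls the one you pick. A minimal sketch of how I use it (the package name is only an example):

```python
#!/usr/bin/env python3
"""Sketch: roll back a single package with the AUR downgrade helper after an
update breaks something. The package name below is only an example."""
import subprocess

broken_package = "mesa"  # hypothetical: whatever the last update broke

# downgrade lists older versions from the local cache and the Arch Linux
# Archive, prompts you to pick one, and reinstalls it with pacman -U.
subprocess.run(["sudo", "downgrade", broken_package], check=True)
```

IIRC it also offers to add the package to IgnorePkg afterwards, so the next pacman -Su doesn't immediately re-upgrade it.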
Arch is, however, the most solid distro I've ever used since I began using Linux many, many moons ago.
One thing that is an Arch problem is that, if you do not update often enough, you can end up with outdated keys that prevent you from installing packages. The solution is just to update the keyring before updating everything else, but this is confusing for a new user and kind of dumb in my opinion. I feel like the system should do this for me.
Arch already does this. Could be that your install has the keyring refresh service disabled, but I've had it enabled for a good while now and I've never encountered that outdated pacman keyring issue.
Ofc, Arch users should learn how to resolve a package conflict, or how to downgrade packages, or generally how to debug the system. Sometimes you also have to migrate config files.
On the other hand, as an Arch user, I can tell you that it mostly just works. If you heavily customize an Ubuntu install, it is more likely to break. And while you can fix an Arch install, you would probably have to reinstall Ubuntu.
Moreover, Arch has a testing repository which is not the default.
If I had to guess when GIMP will be native GTK 4, I'd say no sooner than five years; more likely it'll be 10. By the time GIMP runs on GTK 4, we'll be working with GTK 5 or 6.
I don't know how long it will take, but it should be much, much less work now that gnarly UI elements as old as GTK have been replaced with modern toolkit ones.
I'm in the opposite situation. I started on KDE but moved to GNOME. I sometimes think about moving back to KDE, but I do love the design consistency of GNOME. KDE's endless theming is great, but I only ever used the default theme because I'd notice little inconsistencies otherwise. I'll probably be on KDE Plasma 6 though, because I tend to jump ship to the shiny new thing that will solve all my problems.
I always use Breeze lol. Breeze cursor is a true gem. Icons not so much, the big ones are okay, the file icons are sometimes very okay and the small b/w ones are pretty horrible.
I love Adapta Qt theme, but only for the small icons.
I've also been a GNOME user for a while, but I am looking forward to Plasma 6 as well. I highly doubt I'll make any sort of switch, but I've never had a good time running Plasma 5, so I would love to like KDE more. Wayland by default is going to benefit GNOME too, since it'll put more priority on bugs and the lack of support that is still somewhat common among the less desktop-tied apps.
(My Plasma 5 woes have been on multiple devices, multiple times over multiple years, with and without basic customization. I was basically never able to go a day without some sort of major shell crash. I got way too familiar with the command sequence to restart the desktop UI.)
I do find KDE to be a bit info-dense, and it doesn't look like 6 is changing that aspect of things (at least by default), but it does look a bit less busy at least. I also never liked basically anything about the classic Windows UI, layout, or task flows, so KDE leaning into those just doesn't work well for me. That said, while I like GNOME being more minimal, I do wish it had a bit more capability to expose hidden/nested options more easily than requiring extension installs.
I'm similarly excited about Cinnamon 6. It's a bit unfortunate (and understandable given its goals and usage share) that it is still X11, but there's a lot about it that demonstrates a solid middle ground between GNOME and KDE.
I am usually on the pro-Wayland side, but with GNOME and KDE the Wayland implementations are fairly independent. That means that your statement that KDE going "Wayland by default is going to benefit gnome too since it'll put more priority on bugs" is watered down somewhat.
Fixing bugs in the KDE compositor / display server (KWin) will not necessarily address bugs or missing functionality in GNOME (Mutter). A lot of what they share is also shared with Xorg (libinput, libdrm, KMS, Mesa).
On the application side, apps lean heavily on the toolkit libraries. KDE apps are built with Qt and GNOME apps are built with GTK. Fixing Qt bugs may not improve the quality of GTK and vice versa.
Smaller projects will share more infrastructure. Many other environments are using Wlroots as a compositor library for example. Fixing bugs there will benefit them all but again is independent of KDE and GNOME.
Your point is still valid though. For one thing, the larger the Wayland user base, the greater the number of use cases the Wayland protocol itself will be adapted to address and the more testing and development everything in the Wayland ecosystem will get.
Over time, one benefit of multiple implementations will probably be code quality. Apps that run well in multiple environments are well implemented, and the same is true of environments that provide the necessary features to a large body of apps. In that way, more bugs will be found and fixed in all environments.
I avoid it at all costs as no solution is really seamless, but NoMachine gave me the best (perceived) latency out of VNC, TeamViewer, and a couple of others I tried a couple of years ago. It's also cross-platform, but if the machines are on different networks (behind a NAT), you'll likely need to configure port forwarding manually or via their GUI.
edit: I just remembered I even played YouTube videos and the game Transport Fever 2 via NX (NoMachine) for a few hours and it worked well, while other protocols had either too much quality degradation or too much latency.
How intensive is NoMachine? I've used it on decent hardware and its performance was pretty good. But I'm thinking of setting it up on the Raspberry Pis at work since VNC is painful to use.
I've used it on my Pi before I disabled the display manager (because I barely used it), but performance was fine. I could log in from my desktop, phone, laptop, another Pi, anything really, which was nice to have.
I use NoMachine as well, as it has been the most responsive solution for me.
My biggest problem, which I finally figured out, was that NoMachine was attaching to a VNC console instead of creating its own display when I was using it with Unraid VMs (KVM).
but other distributions are complex to install and besides, Ubuntu works out of the box on my laptop!! But thank you so much, I once tried KDE but Plasma felt very hard to understand.
I disagree with the other poster. Please use whatever distribution you feel most comfortable with!
With KDE Plasma you might want to wait for the upcoming 6 release, since they simplified a lot of stuff (and it also defaults to Wayland, iirc?). Kubuntu will take longer than Fedora to ship it, though.
I personally used Plasma a lot, and I understand the feeling of being overwhelmed. What I did was just work with it and figure stuff out along the way. I think KDE Plasma is awesome, especially for customization!
Because snaps are terrible. They constantly break parts of apps for no reason. If you have container issues with a flatpak, just use Flatseal to punch a hole through the container. With snaps, people will tell you to install the non-snap version because that's easier than beating snap into submission. I learned that the hard way when I had a university project with Kubernetes and Docker was installed as a snap. I spent way too much time trying to make it work at all before giving up and switching to a VM on my work laptop, where it went surprisingly smoothly without snaps.
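To be concrete about the "punch a hole through the container" part: Flatseal is essentially a GUI over flatpak override, so the same fix can be scripted. A minimal sketch, with a made-up app ID and path:

```python
#!/usr/bin/env python3
"""Sketch: grant a Flatpak app access to an extra directory, which is the
kind of override Flatseal applies. App ID and path are only examples."""
import subprocess

app_id = "org.example.App"  # hypothetical app ID
extra_path = "/mnt/data"    # hypothetical directory the app needs to see

# Per-user override; --filesystem widens the sandbox for just this app.
subprocess.run(
    ["flatpak", "override", "--user", f"--filesystem={extra_path}", app_id],
    check=True,
)

# Show which overrides are now in effect for the app.
subprocess.run(["flatpak", "override", "--user", "--show", app_id], check=True)
```

In my experience, that kind of one-liner escape hatch is exactly what makes flatpak sandbox issues so much easier to work around than snap ones.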
Flatpaks are better in every way, and since this isn't about money, we should all just move on and use the best tool for the job.
But what does Canonical think should happen when you run sudo apt install firefox and press Y? That's right, you now have firefox as a snap. Have fun waiting for 5 seconds every time you start it.
Shit like that scares new users away from Linux as a whole.
Maybe they fixed that part, but that isn't a good thing. Now you can't tell whether something is installed as a snap, and you will probably run into snap issues without a clue what could be causing them.
Use what distro you like, but most distros are very easy to install (some even easier than Ubuntu, I would argue). KDE Neon would be a good starting point in that regard. What exactly is hard to understand about Plasma? I have heard this a few times now, but I really don't get it; I find it to be very easy to understand, as it integrates, for example, theming.
I've been using GNOME for the past year on my laptop, and on my desktop I've been using KDE. I haven't used my desktop in a few months and I missed KDE. I moved from Silverblue to Fedora Kinoite on my laptop, and I don't think it's been two weeks, but today I went back to GNOME because the overview is much more polished than KDE's. It just works. GNOME always breaks extensions when they update a major version, but I've seen so many "extensions" on KDE now which are all not updated anymore and break stuff, that I might actually think GNOME's way is kind of good. Maybe it was just the Fedora version which led to so many bugs, but the experience I had in the past week wasn't so good.
I also use KDE because I like customizing my DE, but I'm not sure I agree that it's hard to break. When I had just switched from Xfce to KDE, I downloaded several global themes using the built-in theme browser, and a few of those definitely messed things up. It's also happened more than once that I boot my computer and end up with only the desktop background (i.e. no panels or context menu) because KDE thought there was something wrong with the theme, which can be difficult to recover from for someone who doesn't know how to Ctrl-Alt-F3 and edit settings manually. Though it's ofc more stable when not testing global themes and only changing other appearance settings.