Gnome is the desktop environment. You may have Gnome already installed if that's what your OS uses; otherwise you probably have KDE. There are others, but those are the two big ones.
The desktop environment is what gives the OS its look, and it typically determines which GUI programs are installed by default.
That's just scratching the surface. You can go even deeper and "make your own" desktop environment. A desktop environment is kind of like a pack of software that has a cohesive look and feel. You can leave that and pick and choose programs on your own and configure them to your liking. You can configure KDE or Gnome, but this gives you even more control :)
In my experience, every computer is faster with Linux than with Windows. But if this measures just processor performance on similar tasks, I guess it's news.
Consequently, battery life tends to suffer on Linux vs Windows. Especially on newer hardware, before people figure out how to manage performance and battery life.
Usually, if you apply the same power-management tricks that Windows does, that's not true.
But by default, most Linux distros don't do anything special to manage performance and power.
But actually, Windows doesn't either, at least in its pure vanilla form. There's a huge difference between using my Lenovo IdeaPad with the preinstalled Windows and a vanilla Windows reinstall without the vendor drivers. In that case, Linux is more plug and play and better at this job than Windows.
Maybe they do it differently on IdeaPads. But all of the modern ThinkPads I own install the same power profiles and dynamic tuning at setup that the factory image does. Factory install vs fresh install performance is the same on these machines once Windows Update has done its thing. Even the random POS HPs will do the same thing.
I think it comes down to the culture. A minuscule improvement to a file system is big news in the Linux community. There’s also lots of academic interest in the performance critical parts of the kernel that you just can’t emulate with a closed source model. Is anyone writing papers on how to obtain a 2% improvement in the task scheduler on Windows?
Linux dominates the server market, so even small improvements matter when you’re talking about a server farm with thousands of machines or the latest supercomputer. Many, many people care about the scalability of Linux. On Windows, we say: NTFS? It’s good enough. The user won’t notice on modern SSDs.
A lot of the software components under the hood in Linux are replaceable.
So you have a bunch of different CPU and disk I/O schedulers to suit different workloads, the networking stack and memory management can be tweaked to hell and back, and so on.
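As a tiny, concrete example of that swappability: the active I/O scheduler for each block device is exposed through sysfs, so you can inspect or switch it at runtime. A minimal sketch in Go (the device name "sda" is just an assumption; writing a new value needs root):

```go
// Print the I/O schedulers available for one block device; the bracketed
// entry in the output is the one currently in use, e.g. "none [mq-deadline] bfq".
package main

import (
	"fmt"
	"os"
)

func main() {
	data, err := os.ReadFile("/sys/block/sda/queue/scheduler")
	if err != nil {
		fmt.Fprintln(os.Stderr, "could not read scheduler:", err)
		os.Exit(1)
	}
	fmt.Print(string(data))

	// Switching is just writing a scheduler name back (as root), e.g.:
	//   os.WriteFile("/sys/block/sda/queue/scheduler", []byte("bfq"), 0644)
}
```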
TLDR: Ubuntu Pro offers additional security patches to packages found in the universe repo. Universe is community maintained so Ubuntu is essentially stepping in to provide critical CVE patches to some popular software in this repo that the community has not addressed.
I suppose it depends on how you look at it but I don’t really see this as withholding patches. Software in this repo would otherwise be missing these patches and it’s a ton of work for Ubuntu to provide these patches themselves.
Now if they move glibc to universe and tell me to subscribe to get updates, I'll feel differently.
Debian includes ffmpeg, for example, in the main stable repo. Given Debian’s reputation, I would think they are including these security patches in a timely manner, though I’m not entirely sure how to compare specific patches to verify this.
Of course, everything changes when you are selling support contracts. Canonical and Red Hat are the big two for enterprise because they provide support.
When I was last running Ubuntu on desktop, I signed up for an account and enabled these extra security updates. Yeah, it’s “free”, but it requires jumping through hoops. Requiring an account to get patches is the kind of user-hostile design pattern I expect from Apple or Google, but not in the desktop Linux world.
Debian's contrib repo, which is the equivalent of Ubuntu's universe repo, doesn't get security updates from the Debian security team, as it's not considered an official part of Debian. Package maintainers have to provide security updates. www.debian.org/security/faq#contrib
The difference is that Ubuntu provides paid support for those packages, including patches. Debian doesn't have any official paid support options.
Nobody else has this hybrid model. RHEL is a paid distro in general. Most others are just free entirely. They all patch CVEs when they can. Ubuntu doesn’t write all of their patches or anything.
I really do wish governments invested more in open source. If it’s a generic thing like an operating system that the public could benefit from at large, they would be doing the public a service.
Government-run distros in public schools and government offices wouldn't be any more invasive than Windows working with the government. Better yet, there'd actually be some sort of education on using the OS, and exponential growth of the Linux desktop as a whole.
I just wish KDE would get some love too. They work their asses off to make a desktop suit as many use cases and workflows as possible while maintaining a mostly polished experience. They're not afraid to implement stuff knowing it's just a temporary solution until other projects catch up. They're actually willing to work with other projects on implementing standards, are developing standards like HDR on Wayland for professional artists and gamers, and are the first to jump on major features as soon as they're solid.
Gnome is just an annoying mess, great for smartphone users unwilling to learn anything new who have never touched a PC or Mac in their life. What's the appeal of using something with half its features gutted for the sake of looks, just to have everyone add them back in anyway? It's an annoying Apple-like philosophy of implementing counter-intuitive interfaces to preserve a look and never changing it back because they're always right. You'd think they'd have improved the window snap feature since 3.0.
Ffs I knew this submission would turn into a minority of Plasma users trying to piss on Gnome. Can you not just be happy that an open source project is receiving help and that this will be a big improvement for accessibility features?
I never hear Gnome users crying about Valve heavily supporting KDE, so why are you angry about this?
> I never hear Gnome users crying about Valve heavily supporting KDE, so why are you angry about this?
That doesn't happen because Gnome is the most supported desktop environment out there: they have Red Hat, Google, Canonical, openSUSE; even Microsoft donated to Gnome. Don't get me wrong, some of these companies do support KDE too, but Gnome gets treated in a different way because it's the default DE for most of the distros out there.
Like you said, these companies help KDE too. KDE also has more hardware partners, and more contributors.
Even ignoring all that though, it still doesn’t answer the question: why cry over Gnome getting money to aid in accessibility improvements?
I have never once heard anybody cry about the companies that support KDE, yet some people here go on like Gnome fucked their girlfriend. It’s pathetic.
Nobody's forcing anybody to use Gnome or any other DE. Just be happy when nice things happen in the FOSS world.
I'm not complaining about gnome getting support, I'm complaining about kde being overlooked because gnome is the default desktop for Ubuntu. Kde is just a better tool for people wanting to just get things done. Gnome is pretty, I'll give you that, but ask anyone: they are very hard to work with and stubbornly refuse compromise when working with others on creating useful tools and standards.
Just think how many times they broke extensions without any regard for the individuals using it. Or their efforts to make other projects wait while they decide what's best for Gnome, like they are the only desktop that matters. Projects like the portals usually say they're going to implement the standard despite what Gnome wants, and KDE often helps with the brunt of that work.
> Just think how many times they broke extensions without any regard for the individuals using it.
You have no idea what you’re talking about.
It’s the job of the Gnome developers to update and improve Gnome.
It’s the job of the extension developers to update their extensions when there’s a new Gnome version.
And it’s the job of your distro’s maintainers to keep the versions of Gnome and the extensions in the repo compatible.
If you install Gnome from your distro’s repo and extensions from Gnome’s website, YOU take on this job.
Just install your extensions from your distro’s repo and you won’t have any issues.
KDE isn’t overlooked. KDE gets funding too. Valve and others have put so much into KDE. KDE has the most hardware partnerships. KDE has more contributors.
> Kde is just a better tool for people wanting to just get things done
In your opinion…
I do all my work on Gnome because it’s got an amazing and highly productive workflow, minimal distractions, and it’s extremely stable.
I like Plasma, I like the options it has, I have it on one of my laptops, but it’s not what I’d use for work. The last thing I need is for kwin to crash and take all the programs I had open with it, losing hours of work. Yes, I’m aware this should be fixed in Plasma 6, but as of right now it’s a massive showstopper.
> stubbornly refuse compromise when working with others on creating useful tools and standards
Gnome has championed a lot of open standards, and worked with others. You’re just repeating a Reddit meme. They’ve done so much flatpak, portals, open-desktop stuff in collaboration with KDE and others.
> Just think how many times they broke extensions without any regard for the individuals using it.
You’re showing a complete lack of understanding about what extensions are.
Extensions are impossible not to break from time to time. Extensions don’t use some unchanging API to work - they’re modifications on the DE itself. That’s why they’re so powerful.
There’s no way around DE mods sometimes becoming borked when the DE gets a big update.
Why are you acting like Gnome is against portals lmao, they've been massively pushing portals and open desktop standards, even going as far as refusing to implement features unless there's a cross-desktop standard way of handling it (e.g. accent colours, which they are only putting in place now that they and KDE have hammered out a sensible standard for it, or a better system tray, which they've been trying to spearhead an open, cross-desktop solution for, for years now, although little progress has been made by anyone). Of the DEs, Gnome has pushed for things like portals and flatpaks the most lol
We get it. In your mind, Gnome = bad and evil and nasty, KDE = good quirky and kool.
They are being downvoted because it is utter nonsense, spouted as authoritative fact.
Anyone who has ever used Gnome seriously knows that although it can be used for touch, it is heavily keyboard oriented.
While not undermining the work of KDE devs who I have great admiration for, GNOME devs also work heavily on standards that benefit all of linux, and arguably do just as much if not more, as they are a very well resourced project.
that would be a sound investment and we can’t have that, the government must focus on actively detrimental infrastructure projects to put money in the pockets of rich people.
Well, when you get from 3 to 2000 in only a few years, the vast majority of these versions will be unusable. No wonder they had to drop everything after 11…
Quite the statement that Gentoo has survived for so long compiling from source but, even with ever-advancing processor speeds, they've finally gone "Nah… takes too long."
I mean, I don’t blame them. Yesterday I left my machine building a PyTorch package for 4 hours on a 12 core processor.
As a long-time Gentoo user the only packages where compile times (and RAM usage) really bother me are all the myriad of forks of that shitty Chrome browser engine (webkit-gtk, QtWebEngine, chromium,…) and LLVM and clang.
My beef tends to be with software out of FANGs. Big teams and a huge codebase to match. Completely impenetrable for the rest of us and, I suspect, far more code than there should be.
Chrome takes so much longer than the kernel somehow. There’s also the occasional package that makes you build single-threaded because nobody has fixed some race condition in the build process.
More importantly Chrome takes so much longer than Firefox even though they essentially do the same things (or 95% the same things if you are nitpicky).
Yes, but Chromium is very easy to embed in applications. Mozilla has a history of creating and then abandoning embedding APIs every few years or so (and right now I think they have none).
It seems very hard to embed it anywhere considering everyone doing so forks the whole codebase. Besides, my point was about compile times, embedding APIs shouldn’t take significantly longer to compile.
To be fair USB sticks and SD cards seem to fail when you stare at them a bit too intensely. I think it has been at least a decade since I bought a USB stick for OS installations that lasted for more than three installs (each a few months apart at least since the need does not arise that often).
Could be familiarity? I saw an article go by recently about how projects that aren’t on GitHub suffer from lack of contributions. Although that matters more for smaller projects, Mozilla is a beast and could probably pull people off GitHub if it wanted to.
Also if anyone should be trying to build up an alternative to GitHub, it should be Mozilla
If you are at a skill level where you can meaningfully contribute to a project like this, registering for an alternative git provider should not be an obstacle.
It’s super cool that it supports this, heck I’ve used it when no other options were there (and thank git I could! It made a nightmare into just a little more work instead).
I will say though, it's most of the other software forge features that people normally talk about adding ActivityPub support for (issue tracking, merge requests, tracking forks, CI tooling, handling documentation, etc.).
Pull people off GitHub? I get the impression from others that contributing to Mozilla projects, particularly Firefox, is a painful experience. But afaik one former Mozilla project uses GitHub for everything: Rust, the programming language.
It’s the most widely used platform that the most people are familiar with that they get to use likely for free. Newer projects of theirs are also hosted there. Why would you say it makes no sense?
[Mouseover text] Thomas Jefferson thought that every law and every constitution should be torn down and rewritten from scratch every nineteen years–which means X is overdue.
As somebody that first configured X back in 1991, I agree with this message.
To be fair though, with KMS, libdrm, and libinput, setting up X is 1000 times easier than it used to be. I suspect most users never even need to open Xorg.conf or even know it exists.
Ironically, all these technologies are also used by Wayland. A lot of what Wayland does not do, Xorg basically does not do either.
Office: I use LibreOffice as much as possible. At work, I use the Web version of MS Office; it doesn’t have all features of the desktop version but it’s good enough for my use case.
Media editing (music, image, video): GIMP, Krita, Kdenlive and Ardour are more than enough for my personal use.
In general, I would recommend trying the Linux alternative, and if it’s not good enough, use a Windows VM or dual-boot. If you spend 90% of your time in Photoshop or any other professional software without a Linux version or feature-complete alternative, you should stay on Windows, and maybe use Linux only when you’re not working.
Have you compared Kdenlive to Shotcut? Wondering how they compare, as I've been working with SC for a few months and finally getting used to it, but the lack of a titler feature is a glaring omission.
I second this, OP, this is pretty much the state of it, but I do recommend trying out a Linux program called Wine; it can run some Windows programs in your Linux environment. It's not always the best, but I run a circuit-making program there and I only had a bit of an issue once. I just wanted to mention Wine since some stuff works well with it, but now I'm realizing a VM might be better if it's multiple programs lol. Oh well.
en.wikipedia.org/wiki/Bus_factor is a thing that any GOOD project or IT department considers. How many of your staff can you afford to lose if they all happen to be travelling in the same bus, on their way to eat at the same place for lunch, when an asteroid inevitably punches through said bus and/or diner?
‘Hit by an asteroid’ is a little unrealistic. Sentenced to prison for 15 to Life has happened in the Open Source community at least once before. The project I linked to had a Bus Factor of about one. It’s now ‘old code using outdated APIs’ and is considered obsolete.
I've personally seen legal and criminal issues for a single individual cripple IT departments before, meaning their bus factor was also way too low. I've been on trips that have been rudely interrupted by screaming executives when I came down out of the mountains into cell range because I was the only bus factor left on certain systems. Natural disasters, such as hurricanes, wildfires, and floods, are very serious existential threats to even the largest of organizations.
Since Linux seems to be a good project, I can’t imagine that the discussion hasn’t been had, in public or in private. Millions of individuals and dozens upon dozens of big corporations depend on Linux, Open source and otherwise. If the bus comes for core maintainers or project leaders we have at least SOME backup.
> I've been on trips that have been rudely interrupted by screaming executives when I came down out of the mountains into cell range because I was the only bus factor left on certain systems.
Wow, incredible management skills, genius move to treat your one critical employee like a piece of shit.
Yeah, that was close to the end of that job. I didn’t want to be there, and that particular manager was really upset that they couldn’t just eliminate those servers. He wanted his folks trained on them, but then refused to actually let them spend any time training on them. I was a scapegoat and took the severance deal ASAP.
> when an asteroid inevitably punches through said bus and/or diner.
Or, you know, there's a crash? Lol
I’ve never heard it with the asteroid explanation. But thousands of people die every year in car crashes. Most in single occupant vehicles, but a bus can be involved too.
“Brave Hero from Finland, you’ve been struck by a bus and are going to reincarnate into–”
“No I wasn’t. That bus CHASED ME DOWN two alleys, over a fire hydrant, into, and out of a Starbucks. It did NOT hit me. You just summoned me here.”
“Err… anyway, this world needs a hero to–”
“Write hardware drivers? A kernel module? Some inline assembly?”
"Err… the demon lord… er… "
“DID YOU EVEN MAIL THE LIST? Hah… Okay. Does this world have logic gates of any kind? I need to get this knocked out as soon as possible. I’ve got the entirety of the bcachefs patchset to review before 6.7 is in release.”
I’m just eager to know how much laptops will cost with the new Qualcomm chip. I don’t want to pop champagne too early only to realize that new ARM laptops cost $2000.
New tech always comes at a cost; hopefully, with the many manufacturers partnering with Qualcomm on this, we'll see pricing more competitive than the current offering from Apple silicon.
That’s not happening anymore due to real world constraints, though. Dennard scaling combined with Moore’s Law allowed us to get more performance per watt until around 2006-2010, when Dennard scaling stopped applying - transistors had gotten small enough that thermal issues and other current leakage related challenges meant that chip manufacturers were no longer able to increase clock frequencies each generation.
Even before 2006 there was still a cost to new development, though, us consumers just got more of an improvement per dollar a year later than we do now.
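For anyone who wants the back-of-the-envelope version of why clocks stalled (a rough sketch, not a rigorous derivation): dynamic power in CMOS goes roughly as

$$P_{\text{dyn}} \approx \alpha\, C\, V^2 f$$

Under Dennard scaling, shrinking feature size by a factor $k$ also let capacitance $C$ and voltage $V$ drop by roughly $1/k$, so frequency $f$ could rise by about $k$ while power density stayed roughly constant. Once $V$ could no longer be lowered much (leakage current), any further increase in $f$ raises power roughly linearly, so frequencies flattened out and the gains moved to more cores instead.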
You're right, just like the first RISC-V laptop, which was more than 1k with awful performance. This will probably follow the M-series trend at about 1.5k, but ARM has a lot of competitors…
Lots of tech companies might be interested. For example, at my work we are now stuck halfway between x64 and ARM, both on the server side and on the developer side (Linux users are on x64 and Mac users are on ARM). While multi-arch OCI/Docker containers minimize the pains caused by this, it would still be easier to go back to a single architecture.
If you build a Docker image on an ARM Mac with default settings, it will happily run on Linux on ARM; the same goes for a Go app compiled with GOOS="linux", for example. Of course you can always fix the issues that pop up by also specifying the target architecture, but people often forget, and in the case of Docker, building for a foreign architecture (via emulation) comes with significant performance penalties.
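To make the Go half of that concrete, here's a minimal sketch (the binary name and build flags shown in the comments are just an example): the program reports which platform it was built for, and the comments show the cross-compile invocation and the easy-to-forget mistake.

```go
// Report the OS/architecture this binary was built for.
//
// Cross-compiling for a Linux x86-64 server from an ARM Mac:
//   GOOS=linux GOARCH=amd64 go build -o app .
// Setting only GOOS=linux on an ARM Mac produces a linux/arm64 binary,
// which won't run on an x86-64 host (the case described above).
package main

import (
	"fmt"
	"runtime"
)

func main() {
	fmt.Printf("built for %s/%s\n", runtime.GOOS, runtime.GOARCH)
}
```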