“We are delighted to welcome Holly to the GNOME Foundation. With her experience managing nonprofits, and passion for working with diverse communities of creators and technologists, she can strengthen the Foundation’s unique position as a partner and collaborator at the heart of the GNOME community. And, as an experienced communicator and fundraiser, she can tell our story to the outside world and position the Foundation in the wider ecosystem of nonprofits to raise the profile and impact of our incredible work.”
Robert McQueen
GNOME Foundation Board President

Edit: my below comment was actually wrong. They actually do use git.
Thanks for sharing. What I find most interesting is that Linus is still using the same email-based software development methods for the kernel while the rest of the software engineering world has evolved to use his other invention, git, for that. I’m kind of second-hand embarrassed for those geniuses who have yet to adopt proper version control for (what I’d argue is) the most important project in the computing world.
Here’s a far more nuanced explanation from Spore’s reply to this comment:
Git and email are not mutually exclusive. To collaborate with git, all you need is a way to send your commits to others. Commits can be formatted as plain-text files and sent through email. That is how git has been used by its author from literally the first release.
Git was originally authored by Linus Torvalds in 2005 for development of the Linux kernel, with other kernel developers contributing to its initial development. Since 2005, Junio Hamano has been the core maintainer.
A git server doesn’t need to know about email to work, and a git server isn’t even required. Email in this workflow is an alternative to a PR: a contributor submits a set of commits to the maintainer (or anyone interested), and the maintainer is then free to apply or merge them. After that, the code can be pushed to any server.
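Concretely, that email round trip looks roughly like this (a sketch; the addresses and filenames are placeholders, and `git send-email` needs an SMTP setup configured first):

```
# Contributor: export the last three commits as plain-text patch files
git format-patch -3 --cover-letter -o outgoing/

# Contributor: mail them to the maintainer or a mailing list
git send-email --to=maintainer@example.org outgoing/*.patch

# Maintainer: apply the emailed patches onto the local tree
git am saved/0001-*.patch

# Maintainer: optionally publish the result to any server
git push origin master
```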
Honestly, I’m surprised that so many people don’t know git can be used without those repository hosting sites. That’s one way to use it, not the only way. And it’s not even what it was originally designed for.
I’m not so surprised anymore. I’m self-taught, using open-source software projects for guidance. But not everyone learns like that. For example, in the commercial software dev world, having patches that are easy to apply with minimal tooling isn’t usually a priority (for better or worse).
This is actually a little story I had half written down; your comment prompted me to finish it. Thanks! www.srcbeat.com/2023/11/git-email/
Yeah, that’s not quite right. You need a means to discuss things and review code. You can do this via a website or a mailing list. The Linux kernel uses the latter; lots of other devs use the former, like GitHub. And GitHub and git aren’t the same thing: the issue tracking, the discussion platform, etc. are things GitHub provides on top of git. You could just as well use email or a different service/online platform for the communication. The actual program code is stored in git in both cases.
She sounds very experienced in managing larger projects, and even some open source ones. Reading articles is not a hard endeavour. Perhaps you should try it. GNOME is the largest desktop environment on Linux, and it isn’t there because of bad decisions.
Great CEO choice: she has experience in communication, which is the main thing a CEO has to do for GNOME. She doesn’t need to do or participate deeply in development.
Great to see this perspective from a developer and it totally makes sense. I think the Firefox browser has encountered essentially the exact same thing. Linux support may be a strategic advantage for devs that embrace it.
That does not mean that every developer will find the same thing, though. Proton and Unity have many, many Linux-specific (or at least non-Windows) bugs, I am sure. It would be easy to bemoan these. It takes a different kind of mindset to see working around these kinds of issues as valuable. Even rarer are devs who take the opportunity to address bugs in the underlying tech (outside the game, e.g. in Proton).
I suspect, though, that many non-Windows bugs are actually due to defects in the game itself; they just haven’t manifested yet, or not in the same way, on Windows. The fact that Linux exposes these is again an opportunity, in the way the author of this post points out.
In other words, cross-platform deployment is an opportunity for a stronger product. Access to an engaged community with strong communication skills and technical chops is a bonus.
Hopefully more devs start to see the world this way. Great article.
They take a lot of space, but the advantages you get are amazing. VSCodium broke again this week, and I could just roll back to the commit that worked with no issues. I can install apps I don’t trust and not give them any permission over my filesystem. And best of all: it works on any distro, so I know my setup is easily reproducible.
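For anyone curious how that rollback works with Flatpak, a minimal sketch (assuming the Flathub app ID `com.vscodium.codium`; check yours with `flatpak list`, and the commit hash is a placeholder):

```
# Show the history of commits for the app on the remote
flatpak remote-info --log flathub com.vscodium.codium

# Downgrade to a known-good commit from that log
# (may need sudo for a system-wide install)
flatpak update --commit=<KNOWN_GOOD_COMMIT> com.vscodium.codium
```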
I’ve tried a few on my 2008ish macbook pro and they all work. Antix and MX work well as do the others. I know MX gets some hate on here, but it works. I did cheat and shoved an old SSD in there because it really sped things up.
When you make a project with git, what you’re doing is essentially building a database that records a sequence of changes (a history) which builds up your codebase. You can send this database to someone else (in other words, they can clone it), and they can make their own changes on top. If they want to send you changes back, they can send you “patches” to apply to your own database (or rather, your own history).
Note: everything here is decentralized. Everyone has the entire history, and they send the history they want others to have. Now, this can be a hassle with many developers involved. You can imagine sending everyone patches, and them applying those to their own trees, and vice versa. It’s a pain to coordinate. So in practice what ends up happening is that we have a few repos (or often just one) that work as a source of truth. Everyone sends patches to that repo, and pulls patches down from it. That’s where code forges like GitHub come in: their job is to host this source-of-truth repo and essentially coordinate which patches are “officially” in the code.
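In day-to-day terms, that coordination is just a handful of commands against the designated repo (the URL and branch name here are placeholders):

```
# Clone the "source of truth" repo: you get the full history locally
git clone https://git.example.org/project.git
cd project

# Pull in changes others have landed in the shared repo
git pull origin main

# Send your own commits back to it
git push origin main
```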
In practice, even things like the Linux kernel have sources of truth. Linus’s tree is the “true” Linux; all the maintainers have their own trees that work as the sources of truth for their own versions of Linux (from which they send changes back to Linus when ready), and so on. Your company might have its own repo for an internal project, sending changes to the maintainers as well.
In practice that means everyone has a copy of the entire repo, but we designate one repo as the real one for the project at hand. This entire (somewhat convoluted) mess is just a way to decide “where do I get my changes from”. Sending your changes to everyone doesn’t scale, so in practice we just choose whom everyone coordinates with.
Git is completely decentralized (it’s just a database - and everyone has their own copy), but project development isn’t. Code forges like GitHub just represent that.
Well, the bug tracker and additional features are not inside the git repository, so they’d get lost. But each `git clone` is a complete clone of the (source code) repository, including the entire history of changes, the commit messages, dates, and individual diffs. That’s stored on every single computer that cloned the repository, and you have a copy of everything locally, though it might be out of date if you didn’t pull the latest changes. Apart from that, it’s the same data that GitHub stores. You could just make it available somewhere else and continue.
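As a rough sketch of what “make it available somewhere else and continue” means (the new host URL is made up):

```
# Your existing clone already holds the full history
git log --oneline | head

# Point the clone at a new home and push everything there
git remote set-url origin https://newhost.example.org/project.git
git push origin --all
git push origin --tags
```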
Download all the builds from git and manually back them up, or there are programs to do it for you. I usually used an old laptop connected to multiple HDDs to back up onto (I haven’t done this for a few years now).
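The DIY version of that is just a bare mirror clone per repo (the URL and backup path here are placeholders):

```
# Grab every branch and tag in a bare mirror clone
git clone --mirror https://github.com/example/project.git /mnt/backup/project.git

# Re-run periodically to refresh the backup
cd /mnt/backup/project.git && git remote update
```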
Generally speaking, it has been a great experience for most apps I use. The only exception is Steam: it runs well, but sometimes I run into a few issues.
This might be due to me using an NVIDIA GPU, but after a graphics driver update, my game (Team Fortress 2) doesn’t launch until I restart Steam.
I like joining third-party MvM servers through the website (potato.tf); sometimes joining a game causes a second instance of Steam to launch for some reason…