
sentient_loom, to linux in openSUSE Logo Contest Concludes With Winners Selected
@sentient_loom@sh.itjust.works avatar

I know this is dumb, but cute animal logos are the reason I refuse to learn Go.

janAkali,

IMO, Go’s gopher is ugly, not cute. But anyway, there are better reasons not to learn Go.

sentient_loom,
@sentient_loom@sh.itjust.works avatar

I’m curious to know those reasons. I’d like to pretend that I have a valid argument against Go.

janAkali, (edited )

For one, the error handling. Every codebase is filled with messy, hard-to-type:


if err != nil {
    ...
}

And it doesn’t even give you a stack trace to debug the problem when an error happens, apparently.
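
To illustrate: since a plain Go error carries no stack trace, the usual workaround is wrapping it with context at every return site, which means yet more if err != nil blocks. A minimal sketch (the loadConfig helper and the config path are made up for the example):

package main

import (
	"errors"
	"fmt"
	"os"
)

// loadConfig is a hypothetical helper showing the ubiquitous pattern.
func loadConfig(path string) ([]byte, error) {
	data, err := os.ReadFile(path)
	if err != nil {
		// No stack trace is recorded here; %w merely wraps the error
		// so callers can still inspect it with errors.Is / errors.As.
		return nil, fmt.Errorf("loading config %q: %w", path, err)
	}
	return data, nil
}

func main() {
	if _, err := loadConfig("/nonexistent/app.conf"); err != nil {
		fmt.Println(err) // prints the wrapped message chain, not a trace
		if errors.Is(err, os.ErrNotExist) {
			fmt.Println("the config file is missing")
		}
	}
}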

Second reason: it lacks many features that are generally available in most other languages. Generics are the big one, though thankfully they were added in the last half-year or so. In general, Golang’s design principle is to implement only the required minimum.

And probably most important: Go is owned by Google, aka the “all-seeing eye of Sauron”. There was recently a big controversy over a proposal to add on-by-default telemetry to the compiler. And with the recent trend of enshittification, I wouldn’t trust Google or any other mega-corporation.

sentient_loom,
@sentient_loom@sh.itjust.works avatar

Yeah, the “owned by Google” thing is a big turn-off. And telemetry… hell no. Also, it’s weird that Go doesn’t have a ternary operator. It’s a small thing, but it’s a thing.
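
For reference, Go indeed has no ternary operator; the idiomatic substitute is an if/else around a pre-declared variable. A tiny sketch of what cond ? a : b becomes:

package main

import "fmt"

func main() {
	ok := true
	// Elsewhere this would be: status = ok ? "ready" : "failed"
	status := "failed"
	if ok {
		status = "ready"
	}
	fmt.Println(status) // "ready"
}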

BarrierWithAshes, (edited )
@BarrierWithAshes@kbin.social avatar

That gopher is literally the reason I have been considering learning Go. Same with Plan 9.

DannyBoy,

Guess you’re stuck with C++

sentient_loom,
@sentient_loom@sh.itjust.works avatar

Right, the only other language.

DannyBoy,

Because it has an animal mascot that’s not cute.

savvywolf,
@savvywolf@pawb.social avatar

Poor Keith. ;_;

sentient_loom,
@sentient_loom@sh.itjust.works avatar

I didn’t even know they had a mascot. And now my idiot-brain wants to learn C++ for a bad reason (on top of some good reasons).

danielfgom, to linux in KDE's Nate Graham On X11 Being A Bad Platform & The Wayland Future
@danielfgom@lemmy.world avatar

Undoubtedly Wayland is the way forward, and I think it’s a good thing. However, I wouldn’t piss all over X, because it served us well for many years. My LMDE 6 still runs X and probably will for the next 2 years at least, because neither the Mint team nor the Debian team rushes into things. They are taking it slow, testing Wayland to make sure no one’s system breaks when they switch.

This is the best approach. Eventually it will all be Wayland, but I never understood why this is such an issue. Like any tech, it’s progress; no need for heated debates. It’s just a windowing system, after all.

Juujian, to linux in GNOME's Dynamic Triple Buffering "Ready To Merge"

That sounds cool… Wish the article said what it does.

AlmightySnoo, (edited )
@AlmightySnoo@lemmy.world avatar

Double and triple buffering are techniques in GPU rendering (also used in GPU computing, though only up to double buffering there, as triple buffering is pointless when headless).

Without them, if you want to do some number crunching on your GPU and have your data on the host (“CPU”) memory, then you’d basically transfer a chunk of that data from the host to a buffer on the device (GPU) memory and then run your GPU algorithm on it. There’s one big issue here: during the memory transfer, your GPU is idle because you’re waiting for the copy to finish, so you’re wasting precious GPU compute.

So GPU programmers came up with a trick to reduce or even hide that latency: double buffering. As the name suggests, the idea is to have not just one but two buffers of the same size allocated on your GPU. Let’s call them buffer_0 and buffer_1. If your algorithm is iterative, and you have a bunch of chunks in host memory on which you want to apply the same GPU code, then at the first iteration you take a chunk from host memory and send it to buffer_0, then run your GPU code asynchronously on that buffer. While it’s running, your CPU has control back and can do something else, so you immediately prepare the next iteration: you pick another chunk and send it asynchronously to buffer_1. When the previous asynchronous kernel run is finished, you rerun the same kernel, but this time on buffer_1, again asynchronously. Then you copy, asynchronously again, another chunk from the host to buffer_0, and you keep swapping the buffers like this for the rest of your loop.
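
A rough sketch of that ping-pong loop, written here in Go to match the snippet earlier in the thread, with a goroutine standing in for the asynchronous kernel launch (all names and data are made up for illustration; a real implementation would use something like CUDA streams and device memory):

package main

import "fmt"

const chunkSize = 4

// process stands in for an asynchronous GPU kernel launch: it runs on
// one buffer in the background and signals on done when finished.
func process(buf []float32, done chan<- struct{}) {
	go func() {
		for i := range buf {
			buf[i] *= 2 // pretend this is the kernel doing real work
		}
		done <- struct{}{}
	}()
}

func main() {
	// Host data split into chunks, as in the explanation above.
	chunks := [][]float32{
		{1, 2, 3, 4}, {5, 6, 7, 8}, {9, 10, 11, 12},
	}

	// Two "device" buffers: buffer_0 and buffer_1.
	buffers := [2][]float32{
		make([]float32, chunkSize),
		make([]float32, chunkSize),
	}

	done := make(chan struct{}, 1)

	// Prime the pipeline: fill buffer_0 and start the first kernel.
	copy(buffers[0], chunks[0])
	process(buffers[0], done)

	for i := 1; i < len(chunks); i++ {
		// While the previous buffer is being processed, fill the other
		// one (this is the transfer/compute overlap).
		copy(buffers[i%2], chunks[i])
		<-done // wait for the previous kernel to finish
		fmt.Println("result:", buffers[(i-1)%2])
		process(buffers[i%2], done) // swap roles and continue
	}
	<-done
	fmt.Println("result:", buffers[(len(chunks)-1)%2])
}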

Now some GPU programmers don’t want to just compute stuff; they might also want to render stuff on the screen. So what happens when they try to copy from one of those buffers to the screen? It depends: if they copy synchronously, we get the initial latency problem back. If they copy asynchronously, the host->GPU copy and/or the GPU kernel will keep overwriting buffers before they finish rendering to the screen, which will cause tearing.

So those programmers pushed the double buffering idea a bit further: just add an additional buffer to hide the latency from sending stuff to the screen, and that gives us triple buffering. You can guess how this one will work because it’s exactly the same principle.

QuazarOmega,

I love this explanation, I thought I’d never understand

MonkderZweite, (edited )

And why does a desktop environment need to do that?

jmcs,

To reduce input lag and provide smoother visuals.

MonkderZweite,

You say the animations are too much?

Moltz, (edited )

Lol, why own up to adding animations the system can’t handle when you can blame app and web devs? Gnome users always know where the blame should be laid, and it’s never Gnome.

jmcs,

If by animations you mean smoothly moving the mouse and windows while badly optimized apps and websites are rendering, yes.

Chewy7324,

If the system can’t keep up with an animation, e.g. Gnome’s overview, the fps momentarily halves because of double-buffered vsync: a frame that misses the vblank deadline has to wait a full refresh interval for the next one, so 60 fps drops straight to 30. This is perceived as stutter.

With triple-buffered vsync the fps only drops a little (e.g. 60 fps -> 55 fps), which isn’t as big a drop, so the stutter isn’t as big (if it’s even noticeable).

MonkderZweite, (edited )

Maybe make the animations a bit simpler…?

Less animation is usually better UX in something often used, unless it’s there to hide the slowness of something else.

AlmightySnoo, (edited )
@AlmightySnoo@lemmy.world avatar

Biased opinion here, as I haven’t used GNOME since they made the switch to version 3 and I dislike it a lot: the animations are so slow that they demand a good GPU with fast VRAM to hide it, and thus they need to borrow techniques from game/GPU programming to make GNOME more fluid for users with less beefy cards.

Moltz, (edited )

Not only slow, it drops frames constantly. Doesn’t matter how good your hardware is.

There’s always the Android route: why fix the animations when you can just put high-framerate screens on all the hardware to hide the jank? Ah, who am I kidding, Gnome wouldn’t know how to properly support high framerates across multiple monitors either. How many years did fractional scaling take?

const_void, to linux in Steam Linux Marketshare Surges To Nearly 2% In November

Tell me why “market share” of commercial, proprietary games is important to Linux again?

Secret300,

Potentially more support for things other than gaming, maybe… hopefully.

Vilian,

Nvidia opened their drivers not long after they announced they were “working with Valve to give a better gaming experience on Linux”.

ShittyBeatlesFCPres,

Because it’ll be funny if Microsoft just gives up and makes “Windows” a desktop environment for Linux.

rasensprenger,

That would be extremely funny

CeeBee,

What would be great is that they’d likely need to open-source certain stuff, like DirectX, for it to play nicely with the kernel. And if that happens, it’ll be a singularity moment for Linux compatibility and adoption.

zingo, (edited )

Starter edition: no option to change the wallpaper, and a 3-app multitasking limit.

Proprietary telemetry built into the kernel.

…Microsoft will die on that hill.

;)

possiblylinux127,

That’s what many people miss. I know Valve is doing a lot, but I was hoping for some other large companies to get into the space.

eager_eagle,
@eager_eagle@lemmy.world avatar

this is measuring market share of Linux in the gaming scene, not the other way around.

sep,

Now I wonder what the gaming share of Linux use would be. Probably a very, very small percentage, since the vast majority of Linux installs are servers.

AtmaJnana, (edited )

I have at least 20 different devices that run some flavor of Linux. Servers, a laptop, TVs, AP/routers, probably more, if my other “smart” appliances run Linux also.

Do Android phones and tablets count towards Linux gaming?

PlayStations run a derivative of BSD; maybe those should get an honorable mention. ;)

sep,

Including phones would be significant. But in my (probably deranged) head, Android/Linux is a different OS from GNU/Linux; sharing the kernel alone isn’t enough of an overlap. By that measure, all the switches, routers, storage appliances, toasters, washing machines, fridges, and IoT sensors that often run Linux would count too.

itsPina,
@itsPina@hexbear.net avatar

Steam market share is honestly probably a decent metric for adoption rate of Linux as a whole.

const_void,

And that’s important because?

UprisingVoltage,

Linux can be used to play commercial games > more people daily drive to linux > more companies port their software to linux > even more people switch to linux > Windows/macOS duopoly breaks, losing to open source alternatives

I’m not saying playing call of duty on the deck will make windows fall, but it’s a start

const_void,

daily drive to Linux

Since when have you needed to commute to use Linux? 🤣

UprisingVoltage,

Daily drive linux *

Lmao my bad

0xtero,

Because more market share leads to better hardware and driver support.

Tak,
@Tak@lemmy.ml avatar

The fluctuation in Simplified Chinese use makes me pretty suspect here. It was nearly cut in half in one month and suddenly 20% of Steam’s users that used Simplified Chinese just didn’t exist.

Honestly that big of a fluctuation in regional selection tells me none of the other data means anything.

LeFantome,

If you are a Linux user and like commercial games, you probably would prefer them to work on Linux.

“Market share” on Linux aligns the vested interest of game makers and Linux game players. If the company thinks it can make money, it will do more to allow games to run, or at least do less to stop them.

Mereo,

Because of Valve, Linux is finally my main OS. I’m a PC gamer and it was a pain in the ass to dual-boot between Windows and Linux.

GravitySpoiled,

A lot of people only play games on their computer, so running Linux doesn’t make sense for them if they can’t play games on it.

nous,

Yup, a big excuse I used to see a lot was

I would like to run Linux, but I want to game more so will stick to Windows

And this has changed a lot with what Valve has done, which opens Linux up to a much larger market of people who can now use it for their use cases.

lemmyvore, (edited )

There’s high potential overlap between the profile of a PC gamer (who is often also a PC builder and general computing DIY hobbyist) and an OS like Linux that extends your tinkering ability massively on the software side.

PC/laptop users are a shrinking demographic nowadays thanks to the advent of mobile devices, but they’re a high quality demographic made up of professionals and hobbyists with above average computer savvy. So lots of companies are trying to appeal to them because the choices they make in software and hardware can translate into many other IT fields.

andrew_bidlaw,
@andrew_bidlaw@sh.itjust.works avatar

These commercial, proprietary games are one of the things that push forward the capabilities of personal computers. They are unreasonable, unoptimized resource hogs. If a Linux system is as capable of running them as a proprietary OS (which has the deck stacked in its favor), the proprietary OS loses yet another advantage over Linux. It also means your hardware is now more productive at less-BS tasks, especially consumer-grade Nvidia cards, which are better supported now than they were years ago.

FIST_FILLET,

market share leads to demand, demand leads to supply

this benefits you

chitak166, to linux in KDE's Nate Graham On X11 Being A Bad Platform & The Wayland Future

Eh, I always discredit people when they say X is bad.

It’s been around for over 20 years. That kind of longevity should be praised.

Omega_Jimes, to linux in KDE's Nate Graham On X11 Being A Bad Platform & The Wayland Future

I love Wayland until I don’t. I honestly don’t think about it, it gets out of my way and my system is stable, until I go to use something like scrcpy that just doesn’t work at all. Luckily, the amount of things that straight up don’t work is shrinking.

sabreW4K3, to linux in Fedora 40 Eyes The Ability To Boot Unified Kernel Images Directly
@sabreW4K3@lemmy.tf avatar

Is this good?

vanderbilt,
@vanderbilt@beehaw.org avatar

Yes, in my opinion. The configuration of GRUB (the boot loader) is just another step that can go wrong, and this will eliminate that possibility. Additionally, it will prevent stupider operating systems (cough, Windows) from accidentally overwriting the boot loader during an update.

sabreW4K3,
@sabreW4K3@lemmy.tf avatar

Does that mean that the OS would have to handle version booting?

vanderbilt,
@vanderbilt@beehaw.org avatar

My understanding is that’s a yes.

sabreW4K3,
@sabreW4K3@lemmy.tf avatar

Thank you

Flaky, (edited )
@Flaky@iusearchlinux.fyi avatar

It basically means that instead of relying on a bootloader (e.g. GRUB or systemd-boot), the computer boots the kernel directly. Generally there should be no change, besides having to use the BIOS menu to manually select a kernel.

sabreW4K3,
@sabreW4K3@lemmy.tf avatar

Thank you, you’re awesome!

Flaky,
@Flaky@iusearchlinux.fyi avatar

No problem! :)

FWIW, a lot of the DIY distros (Arch and Gentoo being the ones on most minds) allow this already, so it’s nothing new. It’s just Fedora implementing it that’s new, I guess. If you’re curious, the term to search for is “EFISTUB”.

Blisterexe,

Is the benefit making secure boot work better?

duncesplayed,

I think most people won’t care either way.

Some people do legitimately need to poke around in GRUB occasionally before loading the kernel: setting up certain kernel parameters, looking for something on the filesystem, that sort of thing. For those people, booting directly into the kernel means your ability to “poke around” is now limited by how nice your motherboard’s firmware is. But even they should always at least have the option of setting up a 2-stage boot.

Dio9sys, to linux in KDE's Nate Graham On X11 Being A Bad Platform & The Wayland Future

It’s super impressive to see Wayland having its big breakthrough moment. I remember reading about Wayland 10 years ago and worrying it was going to end up as a dead project.

1984, (edited ) to linux in GNOME's Dynamic Triple Buffering "Ready To Merge"
@1984@lemmy.today avatar

There is already a package for this in the Arch AUR that you can install:

aur.archlinux.org/…/mutter-dynamic-buffering

I used to install this (it replaces mutter) but didn’t notice any difference on my system.

I think it makes a big difference on some systems though, since I saw other people absolutely love it.

Chewy7324,

There’s a Fedora copr with the triple buffering patches and it did improve the perceived smoothness of Gnome’s animations on my 8th gen Intel CPU.

It was especially noticeable if the system was limited in power because of running on battery.

dario,

I use this package. It makes a difference in games.

Patch,

Canonical have had it in Ubuntu for years, but it’s taken them a while to get it to a point where it could be upstreamed. That’s what this news is: that Canonical’s patch is finally all clear to be merged.

taanegl, to linux in KDE's Nate Graham On X11 Being A Bad Platform & The Wayland Future

Wayland on an Intel iGPU runs flawlessly and has for several years. However, that’s a matter of drivers: AMD is at the forefront of dGPU support, while NVIDIA is playing catch-up.

In any case, the future is bright.

DumbAceDragon, to linux in KDE's Nate Graham On X11 Being A Bad Platform & The Wayland Future
@DumbAceDragon@sh.itjust.works avatar

Really looking forward to the day Nvidia drivers properly support Wayland. I’m getting tons of bugs, stutters, and general usability issues with Plasma Wayland on my 3060. X11, on the other hand, just works, even with multiple monitors running at different refresh rates (something a friend of mine said X11 doesn’t handle well). But I want all the nice benefits Wayland offers.

MonkderZweite, (edited ) to linux in KDE's Nate Graham On X11 Being A Bad Platform & The Wayland Future

Because Wayland is only a protocol, and you write the platform yourself (whether well or badly).

Would be cool if the reference implementation (Weston) were not an unusable monolith but a small plugin-based thing.

kawa, to linux in GNOME's Dynamic Triple Buffering "Ready To Merge"
@kawa@reddeet.com avatar

I genuinely tried Gnome and started to like it, but a very minor update broke all of my QoL extensions, and only 1/8th of them were ever updated. It’s lacking so many features that it’s just a bad DE all around: snapping windows into quarters, anyone? Why isn’t that already an option? GNOME devs need to touch grass and listen to their actual users.

Fredol,

Gnome devs will never listen to criticism. Even if you do an MR, it might get denied because it contradicts the “Gnome way”. Just use KDE and live a happy life. KDE can easily be modified to look like Gnome and have all the QoL features you need.

kawa,
@kawa@reddeet.com avatar

Oh yeah, I’m 100% on KDE now. I switched to Gnome for a little while because it had fewer bugs on Wayland with Nvidia cards.

AnUnusualRelic,
@AnUnusualRelic@lemmy.world avatar

That wouldn’t be the true gnome experience.

chitak166,

GNOME devs need to touch grass and listen to the actual users.

I totally agree. However, interacting with any Gnome dev is like pulling teeth. They keep making bad decisions to be ‘different’ and to make their jobs easier; then, when those decisions turn out to be bad, they have to walk them back, but never admit fault.

Being able to move the dock is a fine example of this.

It’s like they want Apple’s lack of customization but can’t provide a competitive default (because they suck at their jobs).

isVeryLoud,

You know these are volunteers that work for free, right?

Moltz, (edited )

Lol, how does this change the fact their work stinks? Maybe if they didn’t suck at designing the hate would stop? Nah, guilt trip the users instead, that’ll fix it. Free crap is still crap, and pointing it out isn’t a sin. If the devs can’t deal with that, maybe they should go home and cry about it instead of further shitting up the code.

Devs don’t owe users anything? Guess what, users don’t owe devs shit either. If they don’t like criticism, tough tittys, cause shit code will be criticized, which is why Gnome is still considered a joke.

unionagainstdhmo,
@unionagainstdhmo@aussie.zone avatar

To be fair, the extension developers were given quite a while to update their extensions to use JavaScript modules instead of the custom GNOME solution. This was actually a change for the better, and one unlikely to happen again, which should make extension development easier. As for better tiling, look up their mosaic feature, which was announced a while ago, though I’m unsure how soon it will come out.

Also, try to remember that GNOME is developed mostly by volunteers, who frankly owe you nothing.

KseniyaK, (edited ) to linux in systemd 255 Released With A "Blue Screen of Death" For Linux Systems

I hope this isn’t going to be the default. I know the average granny might prefer a BSOD with a QR code, but I think a lot of more tech-savvy people, like me, would prefer to see log messages when booting, because then you can see which service failed and why, or why booting is suddenly taking so long. That’s also why I choose not to have a splash screen when booting.

Anyways, this BSOD thing doesn’t apply to me because I use Gentoo with OpenRC.

itslilith,
@itslilith@lemmy.blahaj.zone avatar

I’m honestly fine if this is the default for beginner distros, as long as it’s easy to disable and there is still a way to get to the logs

SeeJayEmm,
@SeeJayEmm@lemmy.procrastinati.org avatar

Just let me hit ESC and see the panic.

pl_woah,

Came here to say this. Let them toggle the logs or the QR code.

Holzkohlen, to linux in systemd 255 Released With A "Blue Screen of Death" For Linux Systems

At least make it pink or smth

ikidd, (edited )
@ikidd@lemmy.world avatar

Maybe it can be the “brown screen of death”. To indicate that it shit itself.

wmassingham,

PSoD is already used by VMware ESXi. And Windows Insider builds, I think.

Maybe green?

ABeeinSpace,

Green is Windows Insider builds

uriel238,
@uriel238@lemmy.blahaj.zone avatar

Maybe a customizable setting? Black screen with red border and a looping kittens video?

mateomaui, (edited )

PSoD

Piss Screen of Death?

edit: oh nvm, I mistakenly thought this was in reply to the suggestion for dark yellow.

AgnosticMammal,

Dark yellow?
