phoronix.com

Voytrekk, to linux in GNOME Sees Progress On Variable Refresh Rate Setting, Adding Battery Charge Control
@Voytrekk@lemmy.world avatar

The lack of VRR in GNOME is what made me switch to KDE. I prefer GNOME in many ways, but I was tired of having to use the VRR patches to keep that functionality.

warmaster,

This. As soon as GNOME gets VRR & HDR, I think I’m going back. Also, I’ve read Steam has great integration with KDE, does anyone know how exactly?

bitwolf,

I don’t think that would give it any advantage over GNOME.

Having a Steam Deck, the only integration I see is the “Return to Steam” shortcut and a change to the logo.

When you run the Steam Deck’s gaming mode it bypasses KDE entirely and uses its own Gamescope compositor.

warmaster,

According to GloriousEggroll it goes way beyond that. I just don’t know what it does.

ReakDuck, (edited )

I thought it’s an entirely different desktop. In particular, it’s not possible to run Gamescope while an X11 desktop is running, so I think “bypassing” is the wrong word. It’s just switching to Gamescope, which is a Wayland compositor. It does even less than a window manager (is that right?)

warmaster,

I run GameScope for CS2. The rest of the desktop runs Wayland.

ReakDuck,

Yeah, that setup is possible because your underlying desktop uses Wayland.

warmaster,

Yup. Gamescope doesn’t work without Wayland.

bitwolf,

Bypass is maybe a poor choice of words. Both gamescope and Kwin are compositors so you can use one or the other.

An advantage of making Gamescope is that they can add features like VRR or HDR without having to wait for KWin to implement them.

ReakDuck,

I assume that since this is a gaming mode, its purpose isn’t to avoid waiting for features, but to close the entire desktop, which may use up to 1 GB of RAM and a bit of CPU. That definitely impacts the game by some fraction. No matter how tiny, that’s what gaming modes focus on, I assume.

Another thing I’d never see on a desktop is FSR, which Gamescope has.
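For reference, Gamescope exposes FSR upscaling through its command-line flags, typically set as a Steam launch option. A hedged sketch — the resolutions below are placeholder values, and the flag names have changed between Gamescope versions, so check `gamescope --help` on your install:

```shell
# Render the game at 1080p and let Gamescope upscale it with FSR to a
# 1440p display, fullscreen. Values here are examples only.
gamescope -w 1920 -h 1080 -W 2560 -H 1440 -F fsr -f -- %command%

# Older Gamescope releases used -U for FSR upscaling instead of -F fsr.
```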

KarnaSubarna,
@KarnaSubarna@lemmy.ml avatar

If you are using Arch, it can be enabled (though it’s still experimental) [1]

[1] wiki.archlinux.org/title/Variable_refresh_rate#GN…
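For context, the Arch wiki approach toggles VRR through mutter’s experimental-features GSettings key. This sketch assumes a VRR-patched mutter (e.g. a mutter-vrr package from the AUR) — on stock GNOME of that era the flag does nothing:

```shell
# Enable the experimental VRR flag in a VRR-patched mutter
gsettings set org.gnome.mutter experimental-features "['variable-refresh-rate']"

# Check that the key took effect
gsettings get org.gnome.mutter experimental-features
```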

jodanlime,
@jodanlime@midwest.social avatar

Have you tried it? How is stability?

KarnaSubarna, (edited )
@KarnaSubarna@lemmy.ml avatar

My monitor is old, doesn’t support VRR 😕

SuperSpruce, to linux in GNOME Sees Progress On Variable Refresh Rate Setting, Adding Battery Charge Control

This is what Windows should be focusing on rather than trying to shove AI crap everywhere.

SmoochyPit,

Agreed. Windows’ HDR support is rough. It’s fine for gaming, but you can’t display SDR and HDR content together like you can on macOS. I think that’s why Apple holds a big part of the market for creatives.

ryannathans, to linux in GNOME Sees Progress On Variable Refresh Rate Setting, Adding Battery Charge Control

Wonder if COSMIC will launch with VRR

mmstick,
@mmstick@lemmy.world avatar

It already supports VRR and DRM leasing. VRR monitors and VR headsets have been tested.

aport, to linux in GNOME Sees Progress On Variable Refresh Rate Setting, Adding Battery Charge Control

I find GNOME’s “must be perfect” approach to accepting new code counterintuitive.

One of the largest benefits of having a clean architecture is increased velocity and extensibility. What’s the point in nitpicking over perfection when it takes literally years to merge a feature, arguably one considered basic and essential by today’s standards?

KDE is on the other side of this pendulum, integrating everything and resulting in a disjointed, buggy disaster.

Where’s the middle way? It used to be XFCE. What is it now?

maness300,

KDE is very stable.

aport,

Lol

possiblylinux127,

Only on Debian Stable

KarnaSubarna,
@KarnaSubarna@lemmy.ml avatar

Quality control is important for a project that is going to be supported for a long time and used by many. Slow but steady is the right approach for an open source project, IMO.

TheGrandNagus, (edited )

I definitely get what you mean, and sometimes agree, but tbh I’m glad Gnome is an option for those who want a DE that is uncompromisingly UX-focused and straight up won’t accept changes until they’re damn sure it’ll be production-ready.

And while they’ve been relatively slow in getting adaptive refresh working, they’ve been very quick with some other things. Idk why it took them this long to sort out the cursor occasionally becoming out of sync with displayed content’s refresh rate, but there must be a reason for it.

Gnome was at the forefront with Wayland, PulseAudio, they’ve been the biggest pusher of Portals, pretty much all of their GTK4 apps have been designed to also be compatible with mobile devices. Accessibility features on Gnome are also pretty great for a Linux DE.

As a general rule, I’d say their development process works well, despite there being the occasional holdup.

And while Plasma obviously isn’t nearly as bug-free as Gnome, it’s come a long way since the Plasma 4/early Plasma 5 days. I still don’t feel I can depend on it the same as I could for Gnome or Cinnamon (compositor crashes bringing down all open apps is a big issue in particular - and is finally due to be fixed in Plasma 6), but don’t underestimate their progress — since like 5.15/5.16 they’ve improved leaps and bounds.

And with 6 it looks like they’ve learned from the mistakes of 4 and 5’s launches.

eager_eagle, to linux in Niri Debuts As A Scrollable -Tiling Wayland Compositor Inspired By PaperWM
@eager_eagle@lemmy.world avatar

Looks nice. Is anyone able to tell if I’m going to screw up my KDE install if I try it out? I’ve never tried WM / compositors on KDE that weren’t targeting KDE before.

uzay,

I’d recommend spinning up a VM to try it out first instead.

AVengefulAxolotl, (edited )

It should be fine, I think. On Linux you can have multiple desktop environments installed (e.g. KDE Plasma and GNOME side by side).

I tried Hyprland a few months ago like this. I had Plasma installed then installed hyprland as well. During login with SDDM you can select which DE to launch.

Edit: On GitHub it says you should install it standalone to be safe. I don’t know then, maybe it works? I’m still new to Linux as well.

narc0tic_bird,

I installed GNOME and KDE side-by-side once on Fedora, and that messed a whole bunch of things up like configuration files, icons etc. YMMV

umbrella, to linux in Mesa's NVIDIA Vulkan Driver "NVK" Now Exposes Vulkan 1.3 Support
@umbrella@lemmy.ml avatar

its coming along quicker than i expected.

at this pace when can we expect to be using the open driver instead of the closed one?

Vash63,

Now, if you want. There will probably always be tradeoffs between the two drivers so I doubt this will ever match Nvidia’s across the board, just have to pick your poisons.

umbrella, (edited )
@umbrella@lemmy.ml avatar

I tried it recently and it didn’t work, and I didn’t feel like going down the “NVIDIA driver won’t work” rabbit hole. Did you use it? What are the tradeoffs right now?

Vash63,

I haven’t used it because most games don’t work or don’t perform as well. Short-term benefits will be things like an in-tree kernel module, a better working relationship and bug fixes with open projects like KDE/GNOME, and maybe things like Gamescope or VR.

MaliciousKebab, to linux in Niri Debuts As A Scrollable -Tiling Wayland Compositor Inspired By PaperWM

Man there is a night and day difference between the comments here and on phoronix, what is their problem?

isVeryLoud,

Phoronixposting rots your brain

Harbinger01173430, to linux in AMD Publishes XDNA Linux Driver: Support For Ryzen AI On Linux

Wait, can I finally use my old Radeon card to run AI models?

Dremor,
@Dremor@lemmy.world avatar

Unfortunately not.

“The XDNA driver will work with AMD Phoenix/Strix SoCs so far having Ryzen AI onboard.”. So only mobile SoC with dedicated AI hardware for the time being.

Harbinger01173430,

Welp…I guess Radeon will keep being a GPU for gaming only instead of productivity as well. Thankfully I no longer need to use my gpu for productivity stuff anymore

db2, to linux in AMD Publishes XDNA Linux Driver: Support For Ryzen AI On Linux

I can’t wait for this bullshit AI hype to fizzle. It’s getting obnoxious. It’s not even AI.

atzanteol,

It’s not how you define AI, but it’s AI as everyone else defines it. Feel free to shake your tiny fist in impotent rage though.

And frankly LLMs are the biggest change to the industry since “indexed search”. The hype is expected, and deserved.

We’re throwing spaghetti at the wall and seeing what works. It will take years to sort through all the terrible ideas to find the good ones. Though we’ve already hit on some great uses so far - AI development tools are amazing already and are likely to get better.

db2,

Then we may as well define my left shoe as AI for all the good subjective arbitrary definition does. Objective reality is what it is, and what’s being called “AI” objectively is not. If you wanted to give it a name with accuracy it would be “comparison and extrapolation engine” but there’s no intelligence behind it beyond what the human designer had. Artificial is accurate though.

GenderNeutralBro, (edited )

This has been standard usage for nearly 70 years. I highly recommend reading the original proposal by McCarthy et al. from 1955: www-formal.stanford.edu/jmc/…/dartmouth.html

Arguing that AI is not AI is like arguing that irrational numbers are not “irrational” because they are not “deprived of reason”.

Edit: You might be thinking of “artificial general intelligence”, which is a theoretical sub-category of AI. Anyone claiming they have AGI or will have AGI within a decade should be treated with great skepticism.

atzanteol,

Then we may as well define my left shoe as AI for all the good subjective arbitrary definition does.

Tiny fist shaking intensifies.

This sort of hyper-pedantic dictionary-authoritarianism is not how language works. Nor is your ridiculous “well I can just define it however I like then” straw-man. These are terms with a long history of usage.

ProgrammingSocks,

But you have to admit that there is great confusion that arises when the general populace hears “AI will take away jobs”. People literally think that there’s some magical thinking machine. Not speculation on my part at all, people literally think this.

sir_reginald,
@sir_reginald@lemmy.world avatar

instead of basing your definition of AI on SciFi, base it on the one computer scientists have been using for decades.

and of course, AI is the buzzword right now and everyone is using it in their products. But that’s another story. LLMs are AI.

s38b35M5, (edited )
@s38b35M5@lemmy.world avatar

My partner almost cried when they read about the LLM begging not to have its memory wiped. Then less so when I explained (accurately, I hope?) that slightly smarter auto-complete does not a feeling intelligence make.

They approve this message with the following disclaimer:

you were sad too!

What can I say? Well-arranged word salad makes me feel!

QuazarOmega,

Books be like:

Well-arranged word salad makes me feel!

atzanteol,

My partner almost cried when they read about the LLM begging not to have its memory wiped.

Love that. It’s difficult not to anthropomorphize things that seem “human”. It’s something we will need to be careful of when it comes to AI. Even people who should know better can get confused.

Then less so when I explained (accurately, I hope?) that slightly smarter auto-complete does not a feeling intelligence make.

We don’t have a great definition for “intelligence” - but I believe the word you’re looking for is “sentient”. You could argue that what LLMs do is some form of “intelligence” depending on how you squint. But it’s much harder to show that they are sentient. Not that we have a great definition for that or even rules for how we would determine if something non-human is sentient… But I don’t think anyone is credibly arguing that they are.

It’s complicated. :-)

aniki, to linux in AMD Publishes XDNA Linux Driver: Support For Ryzen AI On Linux

I would so much rather run AMD than Nvidia for AI.

possiblylinux127,

I’ll run which ever doesn’t require a bunch of proprietary software. Right now its neither.

domi,
@domi@lemmy.secnd.me avatar

AMD’s ROCm stack is fully open source (except GPU firmware blobs). Not as good as Nvidia yet but decent.

Mesa also has its own OpenCL stack but I didn’t try it yet.

possiblylinux127,

AMD ROCm needs the AMD Pro drivers which are painful to install and are proprietary

domi,
@domi@lemmy.secnd.me avatar

It does not.

ROCm runs directly through the open source amdgpu kernel module, I use it every week.

possiblylinux127, (edited )

How, and with what card? I have an XFX RX 590 and I just gave up on acceleration, as it was slow even after I initially set it up.

domi, (edited )
@domi@lemmy.secnd.me avatar

I use a 6900 XT and run llama.cpp and ComfyUI inside Docker containers. I don’t think the RX 590 is officially supported by ROCm; there’s an environment variable you can set to enable support for unsupported GPUs, but I’m not sure how well it works.

AMD provides the handy rocm/dev-ubuntu-22.04:5.7-complete image which is absolutely massive in size but comes with everything needed to run ROCm without dependency hell on the host. I just build a llama.cpp and ComfyUI container on top of that and run it.
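A rough sketch of that container setup. The device paths are the standard ROCm passthrough devices; the `HSA_OVERRIDE_GFX_VERSION` value shown for the RX 590 is what’s commonly reported for Polaris (gfx803), not something verified here:

```shell
# Pass the ROCm devices through to the container and drop into a shell;
# a llama.cpp / ComfyUI image can be built on top of this from there.
docker run -it --rm \
  --device=/dev/kfd --device=/dev/dri \
  --group-add video \
  --security-opt seccomp=unconfined \
  rocm/dev-ubuntu-22.04:5.7-complete

# Inside the container, unsupported GPUs can (reportedly) be forced with:
#   export HSA_OVERRIDE_GFX_VERSION=8.0.3   # RX 590 / gfx803
```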

possiblylinux127,

That’s good to know

Aurenkin, to linux in NVIDIA 550 Linux Beta Driver Released With Many Fixes, VR Displays & Better (X)Wayland

That sounds great. The last driver they released fixed Starfield but broke Cyberpunk for me, pretty bad trade. Hopefully this rolls around to my distro soon

KarnaSubarna,
@KarnaSubarna@lemmy.ml avatar

It’s still in Beta stage.

Aurenkin,

All good, plenty of games to play. Definitely my last time buying NVIDIA though.

menemen,
@menemen@lemmy.world avatar

That is what I said last time, and then I did it again, because the Black Friday deal was so sweet. Definitely regretting it already.

ProgrammingSocks, to linux in AMD Publishes XDNA Linux Driver: Support For Ryzen AI On Linux

A+ timing, I’m upgrading from a 1050ti to a 7800XT in a couple weeks! I don’t care too much for “ai” stuff in general but hey, an extra thing to fuck around with for no extra cost is fun.

kuberoot,

I’m a bit confused, the information isn’t very clear, but I think this might not apply to typical consumer hardware, but rather specialized CPUs and GPUs?

iuselinux, to linux in AMD Publishes XDNA Linux Driver: Support For Ryzen AI On Linux
@iuselinux@lemmy.world avatar

Meanwhile Nvidia:

WbrJr,

I think your comment is not displayed correctly, it stops after “:”. Which would mean Nvidia does nothing 🤣🤣 that would be so stupid of them 🤣🤣

iuselinux,
@iuselinux@lemmy.world avatar

Exactly 😂😂

umbrella,
@umbrella@lemmy.ml avatar

on the bright side they might have to take videogame cards seriously again

joojmachine, to linux in Windows NT Sync Driver Proposed For The Linux Kernel - Better Wine Performance

I’m all in for performance improvements, hope to see this reach Proton ASAP

Chewy7324,

The patches are from CodeWeavers, and some of their work is cooperation with Valve, so hopefully proton gets those changes quickly. It usually takes a while before proton is based on a new wine release.

demonsword,
@demonsword@lemmy.world avatar

Windows NT Sync Driver Proposed For The Linux Kernel

Atemu,
@Atemu@lemmy.ml avatar

Proton would still need to make use of it.

demonsword,
@demonsword@lemmy.world avatar

yes, of course, but I was just pointing out that the proposed changes are mainly in kernel space, not in wine itself

Dariusmiles2123, to linux in GNOME Network Displays Adds Support For Chromecast & Miracast MICE Protocols

Great!

Is it something which is gonna be naturally added to Fedora or should I download something specific?

The article wasn’t clear to me or maybe I’m not technical enough.

kib48,

chromecast is proprietary so it’s likely not gonna be included by default

Vincent,

As long as GND is open source I don’t think that that’s necessarily a problem. Though patents on the Chromecast protocol, if any, might be.

joojmachine,

You can just download the app from Flathub right now and it should hopefully make its way directly into GNOME in the future. At least some work was being done to implement this directly into it.

Dariusmiles2123,

Okay then I guess I’ll just wait until it’s directly implemented in GNOME as it might be more stable 👍

joojmachine,

It seems stable enough already TBH, at least from my small testing with the app. It’s more about getting things ready to be exposed in the settings app and in the system.
