AlmightySnoo

@AlmightySnoo@lemmy.world

Yoko, Shinobu ni, eto… 🤔

Am Yisrael Chai (the people of Israel live) Slava Ukraini (glory to Ukraine) 🇺🇦 ❤️ 🇮🇱


AlmightySnoo (edited)

Congrats! Your laptop will be even happier with a lighter but still nice-looking desktop environment like Xfce, and there's even an Ubuntu flavor built around it: Xubuntu.

AlmightySnoo

Me when someone’s Ubuntu install reaches EOL: just install Arch

AlmightySnoo

They’re worse than us Arch users (btw)

AlmightySnoo

It depends. I work in the quant department of a bank, on pricing libraries that the traders then use. Since traders often use Excel and expect add-ins, we have a mostly Windows environment. Our head of CI, a huge Windows and PowerShell fan, then decided to add a few servers running Linux (RHEL) to host automated Valgrind checks and gcc/clang builds, so we could continuously test our code for warnings, undefined behavior (gcc with -O3 does catch a few cases), and so on.

I thought cool, at least Linux is making it into this department. Then I logged into one of those servers.

The fucker didn't like the default filesystem hierarchy, did stuff like /Applications and /Temp, and installed programs by manually downloading binaries and extracting them there.

AlmightySnoo (edited)

reminds me of #ifndef instead of #if !defined(…)
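
A minimal sketch of the difference, with FOO and BAR as placeholder macro names: #ifndef can only ever test a single macro, while #if !defined(...) composes with other conditions in one expression.

```cpp
// #ifndef tests exactly one macro and nothing else:
#ifndef FOO
#define FOO 1
#endif

// #if !defined(...) can be combined with further conditions on one line:
#if !defined(FOO) && defined(BAR) && (BAR >= 2)
#define USE_FALLBACK 1
#endif
```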

AlmightySnoo

Bad track record: the privacy invasion via their Amazon shenanigans (which Richard Stallman called "the Ubuntu Spyware"), the shilling of the Ubuntu One cloud and now of Ubuntu Pro subscriptions, reminiscent of Microsoft's shilling of Microsoft accounts and OneDrive, Snap telemetry…

AlmightySnoo

Ubuntu is just Windows in Tux's clothing

AlmightySnoo

That repo is just pure trolling; read the "Improved performance" section and open some of the source files and you'll understand why.

AlmightySnoo

they obviously have upscalers in their brains

AlmightySnoo

It’s as if you are in an isekai

AlmightySnoo

You just need more EXP to unlock the Appraisal skill

AlmightySnoo

I thought this was the name of an isekai for a second

AlmightySnoo (edited)

It’s a lifelong learning nerding process

AlmightySnoo

They won’t be spared 😔

AlmightySnoo

Plot twist: he gave them protein bread, that’s why they’re all buff

AlmightySnoo (edited)

Biased opinion here, as I haven't used GNOME since the switch to version 3 and I dislike it a lot: the animations are so slow that they demand a good GPU with fast VRAM to feel smooth, so the developers have to borrow techniques from game/GPU programming to make GNOME fluid for users with less beefy cards.

AlmightySnoo (edited)

Hard to tell, as it really depends on your use. I mostly write my own kernels (so, basically as if you were doing CUDA) and do "scientific ML" (SciML) stuff that doesn't need anything beyond backprop through matrix multiplications, elementwise nonlinearities, and some convolutions, and so far everything works. If you want specific simple examples from computer vision: ResNet18 and VGG19 work fine.
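
For a flavor of the "writing my own kernels" part, here's a hypothetical minimal HIP kernel for an elementwise nonlinearity (ReLU); the names and sizes are made up for illustration:

```cpp
#include <hip/hip_runtime.h>

// Elementwise ReLU: the kind of simple nonlinearity kernel meant above.
__global__ void relu(float* x, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) x[i] = fmaxf(x[i], 0.0f);
}

int main() {
    const int n = 1 << 20;
    float* x;
    hipMalloc(reinterpret_cast<void**>(&x), n * sizeof(float));
    hipMemset(x, 0, n * sizeof(float));    // placeholder input
    relu<<<(n + 255) / 256, 256>>>(x, n);  // hipcc accepts the CUDA-style launch syntax
    hipDeviceSynchronize();
    hipFree(x);
    return 0;
}
```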

AlmightySnoo

Yup, it's definitely about the "open-source" part. That's in contrast with Nvidia's ecosystem: CUDA and the drivers are proprietary, and the drivers' EULA prohibits you from using your gaming GPU for datacenter purposes.

AlmightySnoo (edited)

Works out of the box on my laptop (the export shown below is to force ROCm to accept my APU, since it's not officially supported yet; the 7900XTX should have official support):

https://lemmy.world/pictrs/image/18fc2c67-2486-4205-bfa1-bcc3df638bfd.png

Last year, only compiling and running your own kernels with hipcc worked on this same laptop; the AMD devs are really doing god's work here.

AlmightySnoo

> ROCm is that it's very unstable

That's true, but ROCm does get better very quickly. Before last summer it was impossible for me to compile and run HIP code on my laptop, and then after one magic update everything worked. I can't speak for rendering, as that's not my field, but I've done plenty of computational work with HIP and the performance was really good.

But my point was more about coding in HIP, not about using stuff other people made with HIP. If you write your code with HIP in mind from the start, the results are usually good and you build good intuition about the hardware differences (warps, for instance, are of size 32 on NVidia but can be 32 or 64 on AMD, and that makes a difference if your code uses warp intrinsics; see the sketch below). If, however, you just use AMD's CUDA-to-HIP porting tool, then yeah, chances are things won't work on the first run and you'll need to refine by hand, starting with all the implicit assumptions you made about how the NVidia hardware works.
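
To illustrate (hypothetical snippet, warp_sum is a made-up helper name): instead of hard-coding 32, portable HIP code reads the warpSize built-in, so the same reduction works on both 32-wide and 64-wide hardware.

```cpp
#include <hip/hip_runtime.h>

// Warp-level reduction that works on both vendors: warpSize is 32 on
// NVidia, and 32 (RDNA) or 64 (CDNA/GCN) on AMD. Hard-coding 32 here
// would silently ignore half of each 64-wide AMD wavefront.
__device__ float warp_sum(float v) {
    for (int offset = warpSize / 2; offset > 0; offset /= 2)
        v += __shfl_down(v, offset);  // HIP's warp shuffle intrinsic
    return v;
}
```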

AlmightySnoo (edited)

HIP is amazing. For everyone saying "nah, it can't be the same, CUDA rulez": just try it, it works on NVidia GPUs too (there are basically macros and stuff that remap everything to CUDA API calls), so if you code for HIP you're basically targeting at least two GPU vendors. ROCm is the only framework that lets me do GPGPU programming in CUDA style on a thin laptop sporting an AMD APU while still enjoying 6 to 8 hours of battery life when I'm not doing GPU stuff. With CUDA, in terms of mobility, the only choices you get are a beefy and expensive gaming laptop with a pathetic battery life and heating issues, or a light laptop plus SSHing into a server with an NVidia GPU.
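
A minimal hypothetical sketch of that portability: the very same hip* calls compile on both vendors, and on NVidia hardware the HIP headers remap them to the corresponding cuda* calls.

```cpp
#include <hip/hip_runtime.h>
#include <cstdio>

int main() {
    hipDeviceProp_t prop;
    hipGetDeviceProperties(&prop, 0);
    // On AMD this goes through ROCm; on NVidia the same call is remapped
    // to cudaGetDeviceProperties by the HIP headers.
    std::printf("device: %s, warp/wavefront size: %d\n", prop.name, prop.warpSize);
    return 0;
}
```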
