NixDev, to linux in sigoden/argc-completions: Autocompletion for any shell and any command.

Have you used this? Are there any benefits over bash-completion?

Chewy7324,

I’ve not used it yet, but I found bash-completion to be lacking quite often. Completion is one reason I’m using fish atm.

But from the looks of it, it’s exactly what bash/zsh is missing for me.

cashews_best_nut,

Have you tried ZShell? I’m consistently amazed at the number of great plugins that get made for it.

Chewy7324,

I’ve used zsh for its support for POSIX sh and have my config. But I find fish to be faster with the features I want, and it has those features ootb.

Maybe I’ll give zsh another try.

maegul, to lemmy_support in Please reconsider removing user aggregate scores from the API
@maegul@lemmy.ml avatar

As someone who is against user aggregate scores and pleased to see them removed, I can understand the desire to make them available to admins/moderators to assist in their actions.

I think making the numbers available only to admins/mods would make sense, though I also feel it starts to become an arbitrary divide.

I also have to wonder if an admin/mod couldn’t simply use the view of a user’s posts/comments that we all have access to, along with the various sorts available. Want to know if a user generally posts well-received stuff? Look at their posts and sort by “Top all time”. Want to know if they’re regularly posting stuff that is poorly received? Sort by “Controversial” (which is new) or just “New”. I’d suspect that in the end, integrating this sort of lookup into the moderation tooling so that it’s easier/quicker to do would be more worthwhile than persisting with user aggregates.
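
A minimal sketch of that kind of lookup in Python, assuming the public Lemmy HTTP API’s GetPersonDetails endpoint (/api/v3/user); the instance URL, parameter names, and response fields below are assumptions to verify against the actual API docs:

import requests

INSTANCE = "https://lemmy.ml"  # hypothetical instance, for illustration only

def user_overview(username: str, sort: str = "TopAll", limit: int = 20):
    # Pull a user's posts/comments sorted by e.g. TopAll, New or Controversial.
    resp = requests.get(
        f"{INSTANCE}/api/v3/user",
        params={"username": username, "sort": sort, "limit": limit},
        timeout=10,
    )
    resp.raise_for_status()
    data = resp.json()
    return data.get("posts", []), data.get("comments", [])

posts, comments = user_overview("example_user", sort="Controversial")
print(len(posts), "posts,", len(comments), "comments")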

WheelcharArtist, (edited ) to linux in LACT: Linux AMDGPU Controller for overclocking and fan curve control

Is there any advantage over CoreCtrl, or is it simply another tool?

Chewy7324, (edited )

The big advantage for me is that LACT runs as a (systemd) daemon. This is more convenient for me than having to autostart CoreCtrl.

A disadvantage of the daemon is that it can’t be packaged on Flathub.

Enable and start the service (otherwise you won’t be able to change any settings):
sudo systemctl enable --now lactd
You can now use the GUI to change settings and view information.

LACT has an API over a Unix socket.

github.com/ilya-zlobintsev/LACT/blob/…/API.md
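
A minimal sketch of talking to that API from Python, assuming the default socket path, a list_devices command, and newline-delimited JSON framing; the linked API.md has the actual protocol:

import json
import socket

SOCKET_PATH = "/run/lactd.sock"  # assumed default; check API.md

with socket.socket(socket.AF_UNIX, socket.SOCK_STREAM) as sock:
    sock.connect(SOCKET_PATH)
    # Send one JSON command and read back one JSON response line.
    sock.sendall((json.dumps({"command": "list_devices"}) + "\n").encode())
    reply = sock.makefile().readline()
    print(json.loads(reply))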

TheButtonJustSpins, to linux in kando: 🥧 The Cross-Platform Pie Menu.

I’m currently eating pie for breakfast, so this confused me at first.

SpaceNoodle,

Is cheesecake a pie?

NixDev,

I vote yes

TheButtonJustSpins,

Agreed

FaeDrifter, to linux in GitHub - Acly/krita-ai-diffusion: Streamlined interface for generating images with AI in Krita. Inpaint and outpaint with optional text prompt, no tweaking required.

This is suuuper cool, but looks like having linux+amdgpu limits me to the cloud option.

I suppose this is because we don’t have a DirectML equivalent yet.

pantherfarber,

I got it to work yesterday. You have to go into the Python venv it installs, remove torch, and reinstall it the way the ComfyUI GitHub describes.

wim,

There have been some efforts to run pytorch and StableDiffusion on ROCm. Not sure if that could be combined with this.

warmaster,

Crap, I was hoping to try it. I wonder if AMD will announce something in their FOSS / AI event.

wewbull, (edited )

It works today. The only problem I have is that the memory management is pretty poor, and it’s pretty easy to run out of VRAM.

RX 7600 8GB + 5900X, ROCm 5.7.1, PyTorch 2.1

wim, (edited )

Interesting! Got any links that explain how to set it up?

I just got a laptop with an RX 6700M 10GB and am eager to try it :)

wewbull,

Not really. I’ve had to do quite a bit of experimentation.

My setup that I’ve settled on:

  • ROCm system libraries from Arch Linux
  • PyTorch nightly for ROCm, pip installed into a venv (see the instructions on the PyTorch homepage)
  • Set HSA_OVERRIDE_GFX_VERSION to 11.0.0. This is just for the RX 7600; it tells ROCm to use the RX 7900 code path, as the PyTorch build hasn’t been compiled with RX 7600 support.
  • Start the software (a quick sanity check is sketched after this list).
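
A minimal sanity check, assuming the venv above is active, to confirm the ROCm build of PyTorch actually sees the GPU before starting the software:

import os

# Set the override before importing torch so ROCm picks it up; 11.0.0 makes
# the RX 7600 reuse the RX 7900 (gfx1100) kernels, as described above.
os.environ.setdefault("HSA_OVERRIDE_GFX_VERSION", "11.0.0")

import torch

print(torch.__version__)          # should report a ROCm build
print(torch.cuda.is_available())  # ROCm devices show up through the CUDA API
if torch.cuda.is_available():
    print(torch.cuda.get_device_name(0))
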
wim,

Thanks!

byteseb,

Hmm, that’s weird. I was able to run Stable Diffusion locally with Linux + an RX 6600.

Probably because I used Easy Diffusion. At first, I couldn’t get GPU acceleration to work, and I was constantly running out of RAM (not VRAM), so my system always froze and crashed.

Turns out it was a ROCm bug. I don’t know if it’s fixed by now, but I remember “fixing it” by setting an environment variable to a previous version.

Then it all worked really well. It took between 30 seconds and 2 minutes to make an image.

radioactiveradio, to linux in Shadow Cast v0.6.1: GPU Accelerated Screen Recorder - Now with Wayland Support

Wow you’re fast, beautiful goofy picture guy.

sonymegadrive,

<3

Aatube, to programmer_humor in Someone has started answering to the github stalebot with memes
@Aatube@kbin.social avatar

They shouldn’t even be using the probot; it’s deprecated, unmaintained, and thus potentially vulnerable.

Deebster,
@Deebster@programming.dev avatar

Also, the whole concept is wrong: it encourages “me too” spam just to keep the issue from timing out while it still isn’t fixed.

Aatube, (edited )
@Aatube@kbin.social avatar

I actually see a legitimate use case for it and helped add the actions version in a project where I'm a collaborator.

Quite often, certain bugs disappear after an update without us targeting them (partially because the logs get fudged a bit after going through dependencies, so sometimes multiple bugs have the same cause, or it's actually a dependency issue that got fixed), and sometimes we forget about old feature requests.

The stale reminder doubles as a reminder for us to (re)consider working on the issue. When we know something probably isn't gonna get fixed suddenly, we apply a label to the issue. For enhancements that we'll definitely work on soon™, we apply help wanted. We've configured the action to ignore both. We also patrol notifications from stale to see if something shouldn't go stale. This is a medium-sized project so we can handle patrolling and IMO this helps us quite a bit.

Deebster, (edited )
@Deebster@programming.dev avatar

Fair enough; I didn’t consider artifacts like logs and traces. I suppose a stale marker might prompt the original reporter to retest and supply fresh ones (or confirm it’s fixed in the dependency case).

In an ideal world I suppose we’d have automated tests for all bug reports but that’s obviously never going to happen!

Devjavu, to privacy in Google seems to be blocking API access for Piped video servers

If you are using LibreTube, this is fixable by disabling Piped proxies in the settings. HOWEVER, do be warned that YouTube will then know your IP, so you should only really do this while using a VPN service.

xuxebiko,

Real heroes don't wear capes, they give solid advice.

ps: thanks

Potatos_are_not_friends, to programmer_humor in Someone has started answering to the github stalebot with memes

After an extremely long week, I sometimes participate in open source. I have to deal with malicious commits. I have to follow up on issues from misguided individuals who are actually looking for tech support. I have to guide new contributors through how this massive repo works and how to submit tests. I have to negotiate with the core team, and these convos can often last months/years.

And contributing to open source is one of the few things that gives me pleasure, even if it’s an extremely thankless job.

But I’m tired man.

I’m not dealing with low-quality memers who are providing zero value. Nor should we encourage it.

Anders429,

I would argue that in this case the maintainers are in the wrong for not even responding to the issue, not the reporter responding with memes.

db0, (edited )
@db0@lemmy.dbzer0.com avatar

I do FOSS as well, but I’d rather people have fun punting the stalebot than just keep repeating “this issue still exists”. I will probably get a chuckle out of it.

Recollectr, to privacyguides in Logseq: A privacy-first, open-source platform for knowledge management.

Another alternative, admittedly not open-source, is Recollectr (disclaimer: built by me.)

Recollectr was inspired by prior projects like Notational Velocity but aims to be a lot more - omnibox, markdown support, reminders; and for paid users: revisions, note-linking, and sync. I built it because I felt like other note-taking apps just weren't fast enough and they broke my concentration.

It's quite late here but I'd be happy to answer any questions tomorrow!

Helvedeshunden,

I was just checking out the site on my iPad. Only the top image loads and the rest are white boxes. I disabled all content blockers and reloaded but the problem persisted. It might still be a local problem, but now you have a heads-up that something MIGHT be wrong.

Recollectr,

Thanks very much for letting me know; I'll look into this! They're all videos so perhaps there's some encoding issue with Safari on iOS.

isotope, to privacyguides in Logseq: A privacy-first, open-source platform for knowledge management.

I remember this being marketed as the Emacs Org mode + Org Roam combo for the masses, which is totally fine. However, if you want true control over your data and you’re willing to step out of your comfort zone, consider using Emacs + Syncthing.

bad3r,
@bad3r@lemmy.one avatar

You have complete control over your data with Logseq.

xcxcb, to privacyguides in Logseq: A privacy-first, open-source platform for knowledge management.

Doesn’t like Firefox mobile, apparently. For a privacy-focused platform you’d think it would support that over Chrome.

bad3r,
@bad3r@lemmy.one avatar

It’s a lack of protocol support on Firefox’s end: Firefox doesn’t support the FS API. The Logseq team plans to migrate to a different protocol that Firefox does support, OPFS.

nick, to linux in ripgrep 14 released with hyperlink support

Oh hell yes, hyperlinks! No more weird kitty alias to inject hyperlinks

elfio, to privacy in SimpleX Chat v5.4 is released

I gave it a try but I think I’ll wait until I can use the same ID from both phone and PC.

ithilelda, to privacy in SimpleX Chat v5.4 is released

I’ve been self-hosting the SMP relay and using the app for quite a while. If you use it as a private chat for sensitive content, it is PERFECT. Really looking forward to its future development in group chats.
