selfhosted


Yoddel_Hickory, in How well does the raspberry pi handle being a moonlight client

I use Moonlight Qt on a Raspberry Pi 5, and used it on a Raspberry Pi 4 before that. Both connected via Ethernet, streaming at 150 Mbps. It works very well, feels like being at the computer. There is next to no perceptible delay, and Moonlight reports around 5 ms.

Somewhere else I use a Raspberry Pi 3 A+ with Moonlight Embedded, connected via Wi-Fi. It works pretty well, though I notice the delay a bit more. It's still able to stream at 40 Mbps.

MeatsOfRage,

Good to hear. I’ll give it a shot, thanks!

Yoddel_Hickory,

To add more details: I use Sunshine as the server and stream 1080p, in HEVC for the Pi 4 and 5, and H.264 for the 3 A+.
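
For reference, the Moonlight Embedded invocation is roughly something like this (the hostname is illustrative, and flag spellings can vary between builds, so check moonlight --help first):

    # Pair with the Sunshine host once (hostname is illustrative)
    moonlight pair gaming-pc.local
    # Stream 1080p60; -bitrate is in Kbps, so 40000 = 40 Mbps
    moonlight stream -1080 -fps 60 -bitrate 40000 gaming-pc.local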

beeng, (edited )

I have a 3B+ I want to try this with; it has double the RAM and an Ethernet connection vs the 3A+. Do you see yours hit the RAM limit, or do you think the delay could be Wi-Fi related?

Keen for any tips, thanks!

hperrin, (edited ) in This Week in Self-Hosted (26 January 2024)

I released a relevant thing this week:

hub.docker.com/r/sciactive/nephele

It works well with nginx-proxy-manager.

savedbythezsh,

You should reach out to the creators and let them know! selfh.st/contact/

hperrin,

Good idea. Thanks. :)

sturlabragason, in PlanarAlly 2024.1 Release!

Now this is fucking awesome! Trying this tonight!

Berinkton, in File server with on-demand sync, preserve the filesystem, and runs without external DB?

I use Syncthing for this type of task on my PC and phone, and it stores a copy of the shared folder on the server with the option for file versioning. Having a server is optional, by the way.

rearview,

AFAIK, Syncthing clones the entire folder across peers (the server is just another peer, it seems), which isn’t ideal for my use case. Do you know any current way to configure it for selective syncing?

Jeief73,

I don’t think it can do selective syncing. I’ve also been searching for a similar solution but didn’t find one. I finally opted for Syncthing with my most important files. Other files I can get via the web using Filestash.
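
The closest thing I know of is Syncthing's ignore patterns, which stop files syncing to a device rather than fetching them on demand, so it's only a partial workaround. A sketch of a .stignore in the folder root on the limited device (folder names are illustrative):

    // .stignore: Syncthing ignore patterns; these folders are
    // skipped on this device but still sync between other peers
    /Videos
    /Archives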

MangoPenguin, (edited )

Owncloud supports selective sync, and seems a lot better for performance compared to Nextcloud.

Alternatively, you could roll your own with rclone, which is essentially an open-source alternative to Mountain Duck. Then you can just use a simple connection via SFTP, FTP, WebDAV, etc.
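
A minimal mount along those lines, assuming a remote named "mycloud" is already configured (cache sizes are illustrative):

    # Mount the remote with full VFS caching: files are fetched
    # on first open and then served from the local cache
    rclone mount mycloud: ~/cloud \
      --vfs-cache-mode full \
      --vfs-cache-max-size 20G \
      --vfs-cache-max-age 72h \
      --daemon

Note it's a one-way view of the remote rather than a true two-way sync, so it won't do the conflict handling the desktop sync clients do.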

rearview,

Non-OCIS ownCloud still needs a dedicated database, and they recommend against SQLite in prod.

I’ve looked at rclone mounting with the --vfs-cache-* flags, but I’m not sure how it can smart-sync like Mountain Duck or handle conflicts as elegantly as the Nextcloud/ownCloud clients do. Let me know how to set it up that way if possible.

recursivesive,

I vouch for Syncthing as well. I enabled storing on my own remote hosting provider, marking it as untrusted, so my files are encrypted there.

folkrav, in Proxmox Ubuntu VM has "graphical" console

I have a feeling you’re talking about the TTY. You can’t use the mouse because there’s no graphical interface to begin with; you’re in “pure” console mode. That’s probably why the fonts look weird too: it’s likely not running at your monitor’s native resolution.

As other people said, though, it’s pretty much expected. Servers are more or less expected to run “headless”; you’d typically SSH in rather than plug a monitor directly into the machine.

NeoNachtwaechter,

I have a feeling you’re talking about the TTY.

I got the same creepy feeling.

Like someone trying to talk about a fork and knife who has never seen a fork and knife before :-)

possiblylinux127, in Haier hits Home Assistant plugin dev with takedown notice

I don’t know if there is a non-profit to help devs with legal issues, but there should be.

Maybe the Software Freedom Conservancy?

sep, in How often do you back up?

How often depends on how much work it is to recreate the data, or the consequences of losing it.

Some systems do not have real data locally and get a backup every week. Most get a nightly backup. Some with a high rate of change get a lunchtime/middle-of-the-workday run.
Some have hourly backups/snapshots where recreating the data would be impossible. Critical databases get hourly backups plus transaction log streaming offsite.

How long to keep a history depends on how likely it is that an error goes unnoticed, but a minimum of 14 days. Most have 10 dailies + 5 weeklies + 6 monthlies + 1 yearly.

If you have paper records and can easily recreate lost data, daily seems fine.
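
A tiered schedule like that maps straight onto cron; a sketch (paths and script names are illustrative):

    # /etc/crontab: tiered backup schedule (scripts are illustrative)
    0 *  * * *    root  /usr/local/bin/backup-hourly.sh   # high rate of change
    30 1 * * *    root  /usr/local/bin/backup-nightly.sh  # most systems
    0 12 * * 1-5  root  /usr/local/bin/backup-midday.sh   # midday workday run
    0 4  * * 0    root  /usr/local/bin/backup-weekly.sh   # low-value systems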

fuckwit_mcbumcrumble, in Hosting websites over 4g

If you’re buying a VPS why not host the website there?

justawittyusername,

Good question. I will want to host more in the future, and I’m trying to keep costs as low as possible.

taladar,

Depending on what you are trying to host and where you live, power usage and your own hardware might be more expensive than the VPS you’d require to host it.

thirdBreakfast,

This. Hosting at home might be cheaper if you are serving a lot of data, but in that case, the speed’s going to kill you.

I’m a keen self-hoster, but my public facing websites are on a $4 VPS (Binary Lane - which I recommend since you’re in Aus). In addition to less hassle, you get faster speeds and (probably) better uptime.

AustralianSimon, in Immich release v1.92.0 (edit: v1.92.1 hotfix released)

Ty

Samsy, (edited ) in what if your cloud=provider gets hacked ?

Easy, I always mirror my cloud. My setup is: the cloud is external, and on my network there is always an identical copy of everything on a simple SMB NAS.

  1. My house burns to the ground (or more simply, the NAS breaks) = online backup.
  2. The online provider got hacked = no problem, I have a backup at home.
  3. The hackers burned my house down at the same time they killed my cloud = well, fuck.

PS: Since most syncs go directly to the cloud, it’s just a nightly rclone cron job that backs everything up to the NAS.
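
That cron job looks roughly like this (remote name and NAS paths are illustrative):

    # crontab: mirror the cloud remote to the SMB NAS at 02:00; files that
    # get changed or deleted are kept in a dated folder for soft versioning
    0 2 * * * rclone sync cloud: /mnt/nas/cloud-mirror --backup-dir /mnt/nas/cloud-versions/$(date +\%F)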

sturlabragason, (edited ) in Any good RSS Feed service for self-hosting?

For a self-hosted RSS feed service, there are several options:

  1. Tiny Tiny RSS: It’s an open-source web-based news feed reader and aggregator for RSS and Atom feeds, praised for its Android client availability.
  2. FreshRSS: A free, self-hosted RSS and Atom feed aggregator that is known for being lightweight, powerful, and customizable. It also supports multi-user access, custom tags, has an API for mobile clients, supports WebSub for instant push notifications, and offers web scraping capabilities.
  3. Miniflux: A minimalist and opinionated feed reader that is straightforward and efficient for reading RSS feeds without unnecessary extras. It’s written in Go, making it simple, fast, lightweight, and easy to install.

Not self hosted but I did it this way:

sturlabragason.github.io/…/Curated-News.html
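
If you want to kick the tires on one of the self-hosted options above, FreshRSS runs as a single container; a sketch assuming the official freshrss/freshrss image and its default data path:

    # Run FreshRSS on port 8080 with persistent data (volume name illustrative)
    docker run -d --name freshrss \
      -p 8080:80 \
      -v freshrss_data:/var/www/FreshRSS/data \
      freshrss/freshrss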

six_arm_spider_man,

I’ve been running Miniflux on a free tier GCP instance for a few months now. Then I use RSS Guard on my desktop and FeedMe on my phone to read stuff.

I’d like to try FreshRSS, but I just cannot get my URLs to resolve correctly with it. After a few hours of trying, I reverted to “if it ain’t broke, don’t fix it”. Miniflux all the way for me (for now).

TCB13, in what if your cloud=provider gets hacked ?

I’m more worried about what’s going to happen to all the self-hosters out there whenever Cloudflare changes their policy on DNS or their beloved free tunnels. People trust those companies too much. I also did at some point, until I got burned by DynDNS.

Dave,

We start paying for static IPs. If Cloudflare shuts down overnight, a lot of stuff stops working, but no data is lost, so we can get it back up with some work.

TCB13,

They’re just creating a situation where people forget how to do things without a magic tunnel or whatever. We’ve seen this with other things, and proof of this is the fact that you’re suggesting you’ll require a static IP when in fact you won’t.
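
A dynamic DNS update from cron covers most home setups without a static IP. A generic sketch (the update endpoint is hypothetical; it depends entirely on your DNS provider):

    #!/bin/sh
    # ddns-update.sh: run from cron every few minutes (illustrative)
    IP=$(curl -s https://ifconfig.me)    # discover the current public IP
    # push it to the DNS provider's update API (hypothetical endpoint)
    curl -s "https://dns.example.com/update?host=home.example.net&ip=${IP}"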

Dave,

Where I live, many ISPs that use CG-NAT only hand out a public IP if you pay for a static one. But of course there are other options as well. My point was that the other options don’t disappear.

Though I do get the point that Cloudflare isn’t giving away something for nothing. The main reason, to me, is to get hobbyists using it so they start using it (on paid plans) at work, or otherwise get people to upgrade to paid plans. However, the “give something away for free until they can’t live without it, then force them to pay” model is pretty classic in tech by now.

TCB13,

However, the “give something away for free until they can’t live without it then force them to pay” model is pretty classic in tech by now.

Yes, this is a problem, and a growing one, like a cancer. These new self-hosting and software development trends are essentially someone reconfiguring and mangling the development and sysadmin learning curve, tools, and experience to the point where people are required to spend more than ever for no reason other than profit.

originalucifer, in what if your cloud=provider gets hacked ?

haha

"the cloud" does not change the fact that if you data does not reside in 2 physical locations you do not have a backup.

so yes, standard practices that have existed... well, since the beginning, still apply.

kristoff,

Well, the issue here is that your backup may be physically in a different location (you can ask to host your S3 backup storage in a different datacenter than the VMs), but if the servers on which the services (VMs or S3) are hosted are managed by the same technical entity, then a ransomware attack on that company can affect both.

So, get S3 storage for your backups from a completely different company?

I just wonder to what degree this will impact the bandwidth usage of your VM if, say, you do a complete backup of your data every day to a host that would be considered “off-premises”.

ErwinLottemann,

if you back up your VM data to the same provider you run your VM on, you don’t have an ‘off-site’ backup, which is one criterion of the 3-2-1 backup rule.
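
For example, pointing restic at object storage from a completely different company keeps the off-site copy out of the first provider’s blast radius (endpoint and bucket are illustrative; credentials go in environment variables):

    # one-time: create the encrypted repository on the second provider
    restic -r s3:https://s3.other-provider.example/backups init
    # nightly: back up the VM's data to it
    restic -r s3:https://s3.other-provider.example/backups backup /srv/data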

oranki, in Jellyfin on a vps

Most likely, a Hetzner storage box is going to be so slow you will regret it. I would just bite the bullet and upgrade the storage on Contabo.

Storage in the cloud is expensive, there’s just no way around it.

crony,

I will most likely just do that in the end.

Really hope god will have mercy on me and allow me to move out soon to a bigger place.

electric_nan,

Why do you say that? I use it for my 12+ TB library and it works fine. I’m on the west coast USA, and my vps and storage box are on the east coast.

ikidd, (edited ) in PSA: The Docker Snap package on Ubuntu sucks.

Yeah, it’s been trash from the start. I tried it 2 years ago and the unpredictable weird shit it did was impossible to troubleshoot. It was worse than trying to run Docker on Windows, if that can be believed.

Debian with the Docker convenience script is the way to run Docker.
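
For reference, that’s the two-liner from Docker’s own docs:

    # Download and run Docker's convenience script
    curl -fsSL https://get.docker.com -o get-docker.sh
    sudo sh get-docker.sh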

lemmyvore, (edited )

Docker has an apt repo. You can add it to your Debian/Ubuntu system and then install and update the packages normally. No need to use the script install.

docs.docker.com/engine/install/ubuntu/
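
Condensed from that page, the setup is roughly the following (check the docs for the current keyring steps before copy-pasting):

    # Add Docker's official GPG key and apt repository (Ubuntu)
    sudo install -m 0755 -d /etc/apt/keyrings
    curl -fsSL https://download.docker.com/linux/ubuntu/gpg | \
      sudo gpg --dearmor -o /etc/apt/keyrings/docker.gpg
    echo "deb [arch=$(dpkg --print-architecture) signed-by=/etc/apt/keyrings/docker.gpg] \
      https://download.docker.com/linux/ubuntu $(. /etc/os-release && echo $VERSION_CODENAME) stable" | \
      sudo tee /etc/apt/sources.list.d/docker.list > /dev/null
    # Then install and update like any other apt package
    sudo apt-get update
    sudo apt-get install docker-ce docker-ce-cli containerd.io docker-compose-plugin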

NotATurtle,

Is there a difference between the apt and the install script version?

aniki,

It all depends on what your aptitude is configured to look for.

ikidd,

That’s essentially what the script does; it adds the repo, then installs all the deps and Docker, and sets up the service.
