selfhosted


Buffalobuffalo, in Have you tried LocalGPT, PrivateGPT, or other similar alternatives to ChatGPT?

Dbzero Lemmy has a relationship with the Horde AI shared LLM group. My primary use is for chat roleplay, but they have streamlined guides to hosting your own models for personal or Horde use. One of the primary interfaces is SillyTavern, but they integrate numerous models.

LaterRedditor, in Any good RSS Feed service for self-hosting?

Been hosting Miniflux on the OCI free tier for a few months. No complaints.

damnthefilibuster,

OCI free tier as in Oracle Cloud? How’s that working out for you? Not miniflux… the cloud…

LaterRedditor,

Solid so far. Running two instances for some services I used to host at home.

redcalcium, (edited ) in Streaming local Webcam in a Linux machine, and accessing it when on vacations - which protocol to choose?

If you have a Home Assistant instance, adding a webcam and accessing it from outside of your home network is quite easy: home-assistant.io/…/usb-webcams-and-home-assistan…

Home Assistant is a very useful platform to have around if you have a handful of IoT devices at home.

shadowintheday2,

Thanks, I will look into setting up Home Assistant.

vegetaaaaaaa, (edited ) in Streaming local Webcam in a Linux machine, and accessing it when on vacations - which protocol to choose?
@vegetaaaaaaa@lemmy.world avatar

I recently set up a personal Owncast instance on my home server; it should do what you’re looking for. I use OBS Studio to stream random stuff to friends. If your webcam can send RTMP streams, it should be able to stream to Owncast without OBS in the middle - else, you just need to set up OBS to capture from the camera and stream to Owncast over RTMP.

the communication itself should be encrypted

I suggest having the camera/OBS and Owncast on the same local network, as RTMP is unencrypted and could possibly be intercepted between the source and the Owncast server, so make sure it happens over a reasonably “trusted” network. From there, my reverse proxy (Apache) serves the Owncast instance to the Internet over HTTPS (using Let’s Encrypt or self-signed certs), so traffic is encrypted between the server and clients. You can watch the stream from any web browser, or use another player such as VLC pointed at the correct stream address [1]
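For reference, a minimal Apache vhost for this looks roughly like the following; a sketch assuming Owncast listens on its default HTTP port 8080 on the same host, with an example hostname and certificate paths (mod_ssl, mod_proxy and mod_proxy_http need to be enabled):

<VirtualHost *:443>
    ServerName owncast.example.org
    SSLEngine on
    SSLCertificateFile    /etc/letsencrypt/live/owncast.example.org/fullchain.pem
    SSLCertificateKeyFile /etc/letsencrypt/live/owncast.example.org/privkey.pem
    ProxyPreserveHost On
    ProxyPass        / http://127.0.0.1:8080/
    ProxyPassReverse / http://127.0.0.1:8080/
    # Owncast's chat uses a websocket, so mod_proxy_wstunnel may also be required
</VirtualHost>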

it seems that I might need to self-host a VPN to achieve this

Owncast itself offers no authentication mechanism for watching the stream, so if you expose it to the internet directly and don’t want it public, you’d have to implement authentication at the reverse proxy level (HTTP Basic auth), or, as you said, set up a VPN server (I use WireGuard) on the same machine as the Owncast instance and only expose the instance to the VPN network range (with the VPN providing the authentication layer). If you go for a VPN between your phone and the Owncast server, there’s also no real need to set up HTTPS at the reverse proxy level, as the VPN already provides encryption.
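If you go the Basic auth route rather than a VPN, the Apache side is roughly this (a sketch; the credentials file path and user name are examples, created beforehand with htpasswd -c /etc/apache2/.owncast-htpasswd viewer):

<Location "/">
    AuthType Basic
    AuthName "Private stream"
    AuthUserFile /etc/apache2/.owncast-htpasswd
    Require valid-user
</Location>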

Of course you should also forward the correct ports (VPN or HTTPS) from your home/ISP router to the server on your LAN.

There are also dedicated video surveillance solutions.

aniki,

I second RTMP. I used to use it to send video all over the internet back in the covid days.

sailingbythelee, in Am I in over my head? Need some encouragement!

I’m a Linux newb and I managed to set this up a couple months ago. Despite being new to servers and containers, I did not find it too difficult.

Here is the guide I used: zerodya.net/self-host-jellyfin-media-streaming-st…

The guide above doesn’t include Audiobookshelf installation, but you will quickly see that adding Audiobookshelf to the basic setup is very easy. There are two things I’ve learned since the initial setup that are worth deviating from the guide for.

First, the recommendation in the guide to use a separate userid and groupid (1001) for the docker containers vs. your own userid/groupid (1000) is a royal PITA and not necessary for most basic use cases.

Second, and much more important, you MUST set up your VPN in a Gluetun container and then make your torrent client container a “service” of the Gluetun container. Yes, I know, that sounds like some advanced-level abstraction, but it is actually extremely easy to do and it will save you from getting a nastygram from your ISP when your VPN loses connection. The MPAA is extremely active with automated detection and processing of torrenting data, but if you set up your VPN with Gluetun, you have a perfectly effective kill switch for when your VPN connection drops. And, no, the built-in kill switch on your VPN client won’t work with containers.

Here is the guide I used to make that modification to the initial setup: www.smarthomebeginner.com/gluetun-docker-guide/
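For reference, the core of that Gluetun pattern is roughly the compose snippet below. This is only a sketch: the provider, credentials and torrent client image are placeholders, and the exact environment variables depend on your VPN provider (check the Gluetun wiki).

services:
  gluetun:
    image: qmcgaw/gluetun
    cap_add:
      - NET_ADMIN
    environment:
      - VPN_SERVICE_PROVIDER=mullvad      # example provider
      - VPN_TYPE=wireguard
      - WIREGUARD_PRIVATE_KEY=changeme
    ports:
      - "8080:8080"                       # torrent client web UI, published via Gluetun
  qbittorrent:
    image: lscr.io/linuxserver/qbittorrent
    network_mode: "service:gluetun"       # all traffic routes through the VPN container
    depends_on:
      - gluetun

Because the torrent container has no network stack of its own, it simply loses all connectivity if Gluetun (and therefore the VPN) goes down, which is the kill-switch behaviour described above.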

Good luck! It was fun to set up, and even more fun to use.

Vendetta9076, (edited ) in pooling media libraries - like distributed storage
@Vendetta9076@sh.itjust.works avatar

I use Plex instead of Jellyfin, but there’s the ability to just add a friend’s library and it pulls in without mounting anything. I thought Jellyfin had that as well?

originalucifer,
@originalucifer@moist.catsweat.com avatar

plex uses a centralized service for this kind of nonsense. most of us are using standalone server products.

this use case calls for either centralized storage (s3 bucket) or an access mechanism (all them vpns) to distributed channels (a la plex)... but friends don’t let friends use plex.

im curious about ipfs as distributed file systems sound like a new kink i should have

cm0002,

but friends dont let friends use plex.

I would love to get rid of Plex, but jellyfin failed the spouse test last summer and it never really liked my GDrive mount

Plus, Plex clients are everywhere, so it’s all but guaranteed that whoever I decide to onboard is going to have something compatible. I’ve even had early smart TVs from like 2013 with that weird Yahoo app store thing that had a Plex app that still worked even when the Netflix app didn’t lolol

originalucifer,
@originalucifer@moist.catsweat.com avatar

ha, i feel ya on the spouse. in the house i use local kodi on pis with a shared backend. that same source runs jellyfin for the kids/outside the house

ive had the same interface for the wife on kodi/xbmc for probably 10 years

ive found kodi+jellyfin fits all my use cases

Vendetta9076,
@Vendetta9076@sh.itjust.works avatar

Funnily enough, my wife is the only person who likes jellyfin. It works perfectly for her. Everyone else? I’ve never had it work even once. And I have no damn idea why.

density,
@density@kbin.social avatar

tell me why i shouldn't use plex as I'm always tempted by it whenever these threads come up and everyone who uses it is so happy.

But free/libre is so much more delicious.

But don't let the perfect be the enemy of the good.

Inui, (edited )

You have to pay for Plex to access features you just have on Jellyfin. Like being able to stream to a mobile device.

I don’t know how so many people seem to have issues with it when it’s always been as easy as installing it directly on my computer and booting up the web interface, or now running it in Docker with a simple compose file.

There are alternatives for most features people think are missing. There are several apps that work on mobile if you want to stream music and alternate clients for video playback as well.
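For what it’s worth, that compose file really can be tiny; a minimal sketch (paths are examples):

services:
  jellyfin:
    image: jellyfin/jellyfin
    ports:
      - "8096:8096"            # web UI
    volumes:
      - ./config:/config
      - ./cache:/cache
      - /srv/media:/media:ro   # your media library
    restart: unless-stopped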

originalucifer,
@originalucifer@moist.catsweat.com avatar

last time i checked plex required an account on their service. thats a big red flag for people who host their own shit.

suntzu,

This

AtariDump, (edited )

Does Plex work for you? Keep using it.

Jellyfin is nice but has a long way to go to replicate the features of Plex [like PlexAmp and Sonic Analysis] and features that are “Plex adjacent” [like Tautulli].

Inui, (edited )

K

AtariDump, (edited )

Does Jellyfin have:

A dedicated music app?
Music filtering/smart playlists?
Sonic analysis?
Good 4K/x265 performance?
A third-party (or built-in) utility that shows me streaming usage per person?
A way to limit remote users to streaming from a single IP address at a time?
A way to watch something together with another remote user?
An app for most any device (like Plex or Emby) that does NOT require sideloading?
Built-in native DVR streaming/recording support?
Two-factor authentication?

When it does, I’ll switch.

Vendetta9076,
@Vendetta9076@sh.itjust.works avatar

Can you not just use a reverse proxy for your Jellyfin server and add multiple servers to the same client?

originalucifer, (edited )
@originalucifer@moist.catsweat.com avatar

jellyfin addresses files locally. i dont know how you could stitch together remote machines

Vendetta9076,
@Vendetta9076@sh.itjust.works avatar

I’m surprised the client doesn’t support switching between servers. When I had jellyfin running I exposed it through traefik to allow external playback. Figure it would make sense that you could just show multiple servers in the UI. Add several reverse proxied addresses and boom.
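For anyone curious, exposing Jellyfin through Traefik is mostly a matter of container labels; a sketch for a typical Traefik v2 setup (the router name, hostname and the “websecure” entrypoint are assumptions):

services:
  jellyfin:
    image: jellyfin/jellyfin
    labels:
      - traefik.enable=true
      - traefik.http.routers.jellyfin.rule=Host(`jellyfin.example.org`)
      - traefik.http.routers.jellyfin.entrypoints=websecure
      - traefik.http.routers.jellyfin.tls=true
      - traefik.http.services.jellyfin.loadbalancer.server.port=8096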

theRealBassist,

You definitely can. Idk why the commenter above you thinks it’s local only?

I have two servers I swap between exactly like you describe.

Vendetta9076,
@Vendetta9076@sh.itjust.works avatar

That’s what I thought.

originalucifer,
@originalucifer@moist.catsweat.com avatar

yeah, that might work for what op is tryin to do, maybe, assuming jellyfin fits his client needs

suntzu,

Then I have multiple jellyfin servers in the app… That’s not what I want, I want a single mount where all the media of all nodes is accessible

lemann, (edited )

I run Plex too, and indeed library sharing is built right in and ridiculously easy to set up.

I think OP is already doing things the best way possible in Jellyfin by mounting others’ servers remotely over VPN

Dirk, in Uid/gid in docker containers don't match the uid/gid on the server?
@Dirk@lemmy.ml avatar

It’s actually a suggested configuration / best practice to NOT have container user IDs matching the host user IDs.

Ditch the idea of root and user in a docker container. For your containerized application use 10000:10001. You’ll have only one application and one “user” in the container anyways when doing it right.

To be even more on the secure side use a different random user ID and group ID for every container.

thesmokingman,

This is really dependent on whether or not you want to interact with mounted volumes. In a production setting, containers are ephemeral and should essentially never be touched. Data is abstracted into stores like a database or object storage. If you’re interacting with mounted volumes, it’s usually through a different layer of abstraction like Kibana reading Elastic indices. In a self-hosted setting, you might be sidestepping dependency hell on a local system by containerizing. Data is often tightly coupled to the local filesystem. It is much easier to match the container user to the desired local user to avoid constant sudo calls.

I had to check the community before responding. Since we’re talking self-hosted, your advice is largely overkill.

Dirk,
@Dirk@lemmy.ml avatar

This is really dependent on […]

… basically anything. Yes. You will always find yourself in situations where the best practice isn’t the best solution.

In your described use case, an option would be having the application inside the container run with 10000:10001 but write the data into another directory that is configured for 1000:1001 (or whatever user you want to access the data with from your host) and just mount the volume there. This takes a bit more configuration effort than just running the application with 1000:1001 … but still :)
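As a rough sketch of that layout in a compose file (the image name and paths are made up, and on the host you would still need to grant your own user access to the data directory, e.g. via a shared group or setfacl):

services:
  app:
    image: example/app          # hypothetical image
    user: "10000:10001"
    volumes:
      - /srv/appdata:/data      # host directory whose ownership/ACLs you control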

Appoxo,
@Appoxo@lemmy.dbzer0.com avatar

Do I need to actually create the user in advance or can I just choose a string as I see fit?

Dirk,
@Dirk@lemmy.ml avatar

You don’t need to create the user first. Here’s the simplest I can come up with:


FROM alpine:latest
COPY myscript.sh /app/myscript.sh
USER 10000:10001
CMD ["sh", "/app/myscript.sh"]

This simply runs /app/myscript.sh with UID 10000 and GID 10001.

Appoxo,
@Appoxo@lemmy.dbzer0.com avatar

Wasn’t aware that you can just pull IDs out of thin air.
I thought you had to create the user and ID manually and then be able to use it.

Dirk,
@Dirk@lemmy.ml avatar

Yep! The names are basically just a convenient way for referencing a user or group ID.

Under normal circumstances you should let the system decide what IDs to use, but in the confined environment of a docker container you can do pretty much what you want.

If you really, really, really want to create a user and group just set the IDs manually:


FROM alpine:latest
COPY myscript.sh /app/myscript.sh
RUN addgroup -g 10001 mycoolgroup && adduser -D -u 10000 -G mycoolgroup mycooluser
USER mycooluser:mycoolgroup
CMD ["sh", "/app/myscript.sh"]

Just make sure to stay at or above 10000 so you won’t accidentally re-use IDs that are already defined on the host.

scottmeme, (edited )

My go-to for user and group IDs is 1234:1234

onlinepersona, in Do any of you have that one service that just breaks constantly? I'd love to love Nextcloud, but it sure makes that difficult at times

I wish there were an alternative in a sane programming language that I could actually contribute to. For some reason PHP is extremely sparse in its logging and errors mostly only pop up on the frontend. Having to debug errors after an update and following some guide to edit a file in the live env that sets a debugging variable, puts the system in maintenance mode and stores additional state in the DB is scary.

Plus PHP is so friggin slow. Nextcloud takes noticeable time to load nearly anything. Even instances hosted by pros that only host nextcloud are just slow.

CC BY-NC-SA 4.0 🎖

PHLAK, in Linode Alternative Suggestions for Small Projects
@PHLAK@lemmy.world avatar

I’ve been using Linode for a decade (or more) now without any issues. I’d encourage you to contact their support about this issue. Assuming you’re on the up-and-up, this sounds like a bug and I’m sure they’d be happy to help.

If you decide not to go with Linode though I think Digital Ocean is a good alternative.

xia,

IIRC, they have been bought out in the last decade.

AnxiousOtter,

By Akamai, ya. I was devastated when I heard.

EncryptKeeper,

You were using Linode. Now you’re using “Akamai Connected Cloud”. Linode was acquired in 2022 and the brand retired in 2023.

PHLAK,
@PHLAK@lemmy.world avatar

Yes, I’m aware. Still as good as it’s always been.

EncryptKeeper,

Yes, just as good but at 20% higher cost.

PHLAK,
@PHLAK@lemmy.world avatar

My bill hasn’t changed but maybe I was grandfathered in.

EncryptKeeper, (edited )

No they didn’t grandfather anybody in, they made the price changes to compute universally back in April of last year. The only plan not changed was the $5 nanode so if that’s all you’re running then that’s probably why your bill didn’t change.

grue, in Hardware question

This might be an X/Y problem. Why do you think you need HDMI output on a server?

AimlessNameless,

Because installing an OS without iLO, serial, or video output would be a bit of a hassle.

key, in No posts when surfing through my i stance
@key@lemmy.keychat.org avatar

0.19 has federation bugs. Mainly outgoing, but I’ve also seen incoming federation gradually fail. Restart the Docker container routinely (cron job) until fixes come out.
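A crontab entry for that can be as simple as the line below; the container name “lemmy” is just an example, so use whatever docker ps shows for your instance:

# restart the Lemmy container every night at 04:00
0 4 * * * docker restart lemmy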

Valmond, (edited )

Ouch, thank you 🥲!

How often do you restart it / what’s it doing / any idea what’s no longer working or why?

Good luck to the developers!

And thank you obviously!

BCsven, in Can I build a NAS out of a desktop? [Request]

You totally can, but since it will be on all day with 4 HDDs, look into what wattage you want to live with. There are some small NUC- or Pi-based NAS builds with low wattage. For software, there is OpenMediaVault or FreeNAS/TrueNAS to install.

comfydecal,

Nice, good things to balance. Thanks for the info!

comfydecal,

Hey sorry, thinking on this more, could I just turn on the NAS when desired? What is the benefit of running it constantly?

Cyber, (edited )

Yep, look into Wake On LAN if you just want to power the NAS on remotely.

My NAS also powers on at certain times of day and off again after a while - IF - no-one’s connected / no network traffic / etc.

I do NOT need my NAS on at 3am…

Edit: forgot to say, check out OpenMediaVault
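For the Wake-on-LAN part, the rough idea looks like this (interface name and MAC address are examples, and WOL usually also needs to be enabled in the BIOS/UEFI):

# on the NAS, enable WOL on the network interface
ethtool -s enp3s0 wol g
# from another machine on the LAN, wake the NAS by its MAC address
wakeonlan aa:bb:cc:dd:ee:ff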

comfydecal,

Stellar! Thanks for the info!

lemmyvore,

You can also configure the HDDs to power down when they’re not in use. HDDs are the biggest power consumer anyway.
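On Linux that can be as simple as setting an hdparm spindown timeout, for example (the drive path is an example; -S values from 1 to 240 mean multiples of 5 seconds, so 240 is about 20 minutes of idle time):

hdparm -S 240 /dev/sda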

rentar42, (edited )

Note that there is some reliability drawback to spinning hard disks up and down repeatedly. Perhaps unintuitively, HDDs that spin constantly can live much longer than those that spend 90% of their time spun down.

This might not be relevant if you use only SSDs, and might never affect you, but it should be mentioned.

BCsven,

You could totally turn it on as needed; Wake-on-LAN is good for that. But typically when people run a NAS it is for streaming audio and video, file sync and backups, and maybe Docker running other services, so the NAS is usually on 24/7 so it is available on demand. But it doesn’t have to be 100% uptime if you don’t want it to be.

For example, I have two OpenMediaVaults, one on a Pi and one on an old Iomega NAS. The Pi is always on with an attached drive, and serves Samba shares and DLNA/DAAP shares. It has Docker running Syncthing, a CUPS print server, Trilium Notes, and Home Assistant, so it makes sense for it to be on all day, especially because my wife’s system backs up to it daily and automatically. The converted Iomega NAS is mainly a backup machine since it is old and not as performant (it only has 100 Mbit networking), so it gets turned on to do a bulk backup and not much else.

AngryCommieKender, in Does anyone else harvest the magnets and platters from old drives as a monument to selfhosting history?

I thought you made a custom thermos bottle at first

kamen,

Same.

flop_leash_973, in Does anyone else harvest the magnets and platters from old drives as a monument to selfhosting history?

And here I thought I had a lot of HDD platter coasters.

Potatos_are_not_friends,

I have like 15 over the past decade and now I realize I am an ant to OP

ragica, in Self hosted photo library with S3
@ragica@lemmy.ml avatar

I haven’t tried it, but I’ve been thinking about it… Since Nextcloud supports S3 storage, it would seem its photo apps, such as Memories, should work that way?

EuridiceSequens,

Yep, that’s pretty much it. I have it working with iDrive this way. Install Nextcloud and the Memories app. Add S3 as external storage. Point Memories to external storage. Done.
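Roughly the same setup can be scripted with occ; a sketch below, assuming the mount ID returned by files_external:create is 1, with placeholder bucket, endpoint and credentials (option names can vary by Nextcloud version, so check occ files_external:backends for what your install supports):

php occ app:install memories
php occ files_external:create /Photos amazons3 amazons3::accesskey
php occ files_external:config 1 bucket my-photo-bucket
php occ files_external:config 1 hostname s3.example.com
php occ files_external:config 1 key ACCESS_KEY
php occ files_external:config 1 secret SECRET_KEY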
