I’ve set up Nextcloud but have done next to nothing with it.
My Lemmy instance gives me the most problems, but it’s also the only publicly available service I run. The main issue is that it seems to have a memory leak, which forces me to restart it every few days.
Everything else has been completely rock solid for me, running on a mini pc (formerly a pi4 until I wanted to start doing stuff with Jellyfin and needed more power for transcoding) on OpenSUSE Leap all in docker containers. Makes it insanely easy to move stuff. I had no issues basically just copying the docker-compose files and data and bringing them up even when switching architectures.
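Roughly, a move looks like this (hostnames and paths here are made up, and this assumes your data lives in bind mounts next to the compose file rather than in named volumes):

```
# On the old box: stop the stack so the data is quiescent
cd ~/stacks/jellyfin
docker compose down

# Copy the compose file plus bind-mounted data to the new machine
rsync -a ~/stacks/jellyfin/ newbox:~/stacks/jellyfin/

# On the new box: pull images for the new CPU architecture and start
cd ~/stacks/jellyfin
docker compose pull
docker compose up -d
```

Multi-arch images are what make the architecture switch painless: docker compose pull just grabs the right variant for the new CPU.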
I dunno what you guys are doing that makes your Nextcloud die without touching it. Mine runs happily until I decide to update it, and that usually goes fine, too. I don’t use Docker for it, though.
I’ve been reading Nextcloud forums/reddit/lemmy/etc. for years now, and I feel like 90% of the problems are from people using Docker or whatever easy one-click solution is out there.
I’ve been running NC the old-fashioned way for years now and I’ve never had problems with NC dying for no reason.
Have I had issues? Of course… but not like the ones people keep coming here and shitting on NC about.
The only times I’ve had major issues that were actually a problem with Nextcloud were with buggy major version releases… so these days I never install a new major release until X.0.1. Haven’t really had problems since.
Try MySQL instead of MariaDB. MySQL 8 has some performance tweaks that aren’t present in MariaDB.
Also, tune your MySQL (or MariaDB) server. Make sure all tables use InnoDB. Enable the slow query log and analyze slow queries (there may be missing indices). If the same queries run repeatedly, increasing the query cache size can help (note that the query cache only exists in MariaDB and pre-8.0 MySQL; it was removed in MySQL 8).
The easy approach is to run MySQLTuner after the MySQL or MariaDB server has been up for at least a week, and go through its suggestions.
There shouldn’t be a significant difference in performance between PostgreSQL and MySQL/MariaDB if both have been optimized. Out-of-the-box config isn’t ideal for a production system.
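If it helps, here’s a rough sketch of that workflow (connection flags and the password are placeholders for whatever your setup uses):

```
# Find tables that aren't on InnoDB yet
mysql -u root -p -e "
  SELECT table_schema, table_name, engine
  FROM information_schema.tables
  WHERE engine <> 'InnoDB'
    AND table_schema NOT IN ('mysql','sys','information_schema','performance_schema');"

# Turn on the slow query log at runtime (persist it in my.cnf as well)
mysql -u root -p -e "SET GLOBAL slow_query_log = 'ON'; SET GLOBAL long_query_time = 1;"

# After a week of normal traffic, let MySQLTuner read the stats
perl mysqltuner.pl --user root --pass <password>
```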
For years, I had an unstable unraid server. I was fixing it every couple of days after a lockup. I had decided that unraid sucked. When it was up for a week I celebrated. Every one of my dockers was a suspect. I learned to hate all of them.
I wonder what the performance impact would be if you were to move pgsql onto bare metal with enough RAM dedicated to caching all of the DB data (think: i5 or i7 NUC). That’s going to be my next step with my homelab; I want to migrate everything to a single DB host with a lot of RAM and M.2 storage and get rid of the duplicated DB processes I’m running now. I have no performance complaints with NC currently; I’m running PHP caching and Redis, as well as image previews and Imaginary.
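Rough sketch of what I have in mind for the Postgres side, with sizes assuming a hypothetical 32 GB box (ALTER SYSTEM writes these into postgresql.auto.conf):

```
psql -U postgres -c "ALTER SYSTEM SET shared_buffers = '8GB';"         # ~25% of RAM
psql -U postgres -c "ALTER SYSTEM SET effective_cache_size = '24GB';"  # hint: OS cache + shared_buffers
psql -U postgres -c "ALTER SYSTEM SET work_mem = '64MB';"              # per-sort/per-hash memory
sudo systemctl restart postgresql   # shared_buffers only takes effect after a restart
```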
I didn’t realize that Nextcloud was so bad. Might I recommend that people having issues try Seafile? It’s also open source, and I’ve been using it for many years without issues. It doesn’t have as many features and it doesn’t look as shiny, but it’s rock solid.
I’m having a hard time believing that… There is a difference between being able to fix the update issues every time without problems and having no problems at all. But if so, neat.
Keep your Apple TV and use it as a streaming client for whatever you stand up on the backend. Personally I have a Synology NAS that I love and I use the net to get all my content. Use the net. 😉
Appreciate your comment, and that seems like a common setup. If you didn’t have the ATV, what would you front end the Plex server with? I have a Synology router and would probably buy a Synology NAS, if I went that route.
Actually, with a Synology NAS you don’t need Plex: they have a built-in equivalent called DS Video, with apps for Apple TV, iOS, Android, and more!
I’ve had an Nvidia shield in the past as well and it works reasonably well, but the video experience is definitely better on the Apple TV. The Android boxes make more sense if you want a place to install emulators that also occasionally streams.
Thank you for this! I’ll look more at the Synology NAS devices and see what that’s all about. I’m probably the other way around, stream more, and emulate once in a while.
Good call. I do some backups now but I should formalize that process. Any recommendations on self-hosted packages that can handle the append-only functionality?
I use and love Kopia for all my backups: local, LAN, and cloud.
Kopia creates snapshots of the files and directories you designate, then encrypts these snapshots before they leave your computer, and finally uploads these encrypted snapshots to cloud/network/local storage called a repository. Snapshots are maintained as a set of historical point-in-time records based on policies that you define.
Kopia uses content-addressable storage for snapshots, which has many benefits:
Each snapshot is always incremental. This means that all data is uploaded once to the repository based on file content, and a file is only re-uploaded to the repository if the file is modified. Kopia uses file splitting based on rolling hash, which allows efficient handling of changes to very large files: any file that gets modified is efficiently snapshotted by only uploading the changed parts and not the entire file.
Multiple copies of the same file will be stored once. This is known as deduplication and saves you a lot of storage space (i.e., saves you money).
After moving or renaming even large files, Kopia can recognize that they have the same content and won’t need to upload them again.
Multiple users or computers can share the same repository: if different users have the same files, the files are uploaded only once as Kopia deduplicates content across the entire repository.
There are a ton of other great features, but that’s the part most relevant to what you asked.
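If you want to kick the tires, the basic flow is only a few commands (the repo path and source dir here are just examples; Kopia prompts you for a repository password):

```
# Create a repository (filesystem here; b2/s3/sftp backends also work)
kopia repository create filesystem --path /mnt/backup/kopia-repo

# Snapshot a directory; repeat runs are incremental and deduplicated
kopia snapshot create /home/alice/documents

# List snapshots, then restore one somewhere scratch to verify it
kopia snapshot list
kopia snapshot restore <snapshot-id> /tmp/restore-test
```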
I’ve used rclone with Backblaze B2 very successfully. rclone is easy to configure and can encrypt everything locally before uploading, and B2 is dirt cheap and has retention policies, so I can easily manage (per bucket) how long deleted/changed files should be retained. Works well.
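For reference, my setup looks roughly like this (remote names, bucket, and paths are illustrative; `rclone config` walks you through generating the crypt passwords):

```
# ~/.config/rclone/rclone.conf (generated via `rclone config`)
# [b2]
# type = b2
# account = <key-id>
# key = <application-key>
#
# [b2crypt]
# type = crypt
# remote = b2:my-backup-bucket
# password = <obscured-by-rclone>

# Nightly sync; B2 lifecycle rules control how long old versions live
rclone sync /srv/data b2crypt:data --transfers 8 --fast-list
```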
Also, once you get something set up, make sure to test-run a restore! A backup solution is only good if you make sure it works :)
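With the rclone setup sketched above, a restore drill can be as simple as pulling a sample back down and letting rclone verify it (cryptcheck exists specifically for comparing plain files against a crypt remote):

```
# Pull a subset back to a scratch dir and eyeball it
rclone copy b2crypt:data/photos /tmp/restore-test

# Verify checksums of the local source against the encrypted remote
rclone cryptcheck /srv/data/photos b2crypt:data/photos
```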
As a person who used to be “the backup guy” at a company, truer words are rarely spoken. Always test the backups; otherwise it’s an exercise in futility.
I haven’t had any issues with Nextcloud yet, but every torrent client refuses to work for me. I’ve tried various qBittorrent containers, Transmission, and briefly Deluge; they all work for a while but eventually refuse to do anything.