I run it and MariaDB in Docker and they run perfectly when left alone, but everything breaks horribly if I try to do an update. I recently figured out that you need to do NC updates one major version at a time, and Docker (Unraid's, specifically) defaults to jumping straight to the latest version. I think I've figured out how to pin a version now, so fingers crossed I won't destroy it the next time I update.
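(For anyone else on Unraid: pinning just means putting an explicit tag in the template's Repository field instead of relying on :latest. The tag below is illustrative only; check which version tags the image you use actually publishes.)

# illustrative; e.g. the official image pinned to a specific release
nextcloud:28.0.1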
This is probably what I'm doing wrong. I'm using linuxserver's Docker image, which should be okay to auto-update, but it just continuously degrades over time with updates until it becomes non-functional. Random login failures, logs failing to load, file thumbnails disappearing, the goddamn Collabora office container that absolutely refuses to work for more than a week at a time, etc.
I just nuke the NC docker and database and start from scratch every year or so.
You absolutely need to move from release to release and cannot safely do a multi-version jump. You also need to validate the configs between versions, especially on major release updates, or you risk breaking things. New features and optimizations happen, and you may also need to change or update your reverse proxy configuration on update, or modify DB table configuration (just pulling this from memory, as I've had to do it before). I don't know that there's automation for each one of those steps.
Because of that, I run Nextcloud in a VM and install it from the release archive. I wrote a shell script that handles downloading, moving the files, updating permissions, copying the old config forward, symlinking, and running the upgrade. Then all I have to do is log in as administrator, check the admin dashboard, and make sure there's nothing new to address on the status page. It's a pain, but my Nextcloud uses an external DB, Redis, and PHP caching, so it's not an easy out-of-the-box setup. But it's been solid for a long time since I adopted this script.
There could probably be some additional refactoring here, but it works for my setup. I'm using default nginx paths, so they probably look different from installs that use custom locations like /var/www, etc.
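A minimal sketch of the flow the script follows (assuming the default nginx webroot under /usr/share/nginx, a "nextcloud" symlink pointing at the live install, and nginx as the web user; adjust all of those for your box):

#!/bin/bash
# Sketch of the upgrade flow: download, unpack, carry config forward,
# fix permissions, swap the symlink, run the upgrade.
set -euo pipefail

VERSION="$1"                               # e.g. 28.0.1
WEBROOT="/usr/share/nginx"                 # assumption: default nginx path
NEW_DIR="$WEBROOT/nextcloud-$VERSION"
LIVE_LINK="$WEBROOT/nextcloud"             # symlink to the live version
OLD_DIR="$(readlink -f "$LIVE_LINK")"
WEB_USER="nginx"                           # assumption: often www-data instead

# download and unpack the release
cd /tmp
wget "https://download.nextcloud.com/server/releases/nextcloud-$VERSION.tar.bz2"
tar -xjf "nextcloud-$VERSION.tar.bz2"
mv nextcloud "$NEW_DIR"

# carry the old config forward
cp "$OLD_DIR/config/config.php" "$NEW_DIR/config/config.php"

# fix ownership, then point the symlink at the new version
chown -R "$WEB_USER:$WEB_USER" "$NEW_DIR"
ln -sfn "$NEW_DIR" "$LIVE_LINK"

# run the upgrade and lift maintenance mode
sudo -u "$WEB_USER" php "$LIVE_LINK/occ" upgrade
sudo -u "$WEB_USER" php "$LIVE_LINK/occ" maintenance:mode --off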
Save it as a shell script, make it executable, then call it:
sudo ./scriptName.sh 28.0.1
Replace the version with whatever version you're upgrading to. I would highly recommend never upgrading to a .0 release; always wait for at least a .1 patch. I left some sleeps in there from when I was debugging a while back; those are safe to remove once it works in your setup. I also noticed some variables weren't quoted. I'm not a bash programmer, so there are probably consistency issues that could be cleaned up if someone's picky about that.
Debrid is basically "cloud torrenting": a server torrents the files and you download them from there. Extremely useful if you live in a jurisdiction where P2P filesharing is illegal.
Stremio is a Netflix-like video player app; Torrentio is an addon that lets you crawl torrent sites and stream stuff via debrid.
Not sure if it’s a factor for you but roku tries to phone home a lot more than anything else on my network (or perhaps my firewall just catches it more than other devices and apps). Otherwise roku is pretty good.
The Nvidia Shield TV is better though. It's the best set-top box, made even better by replacing the default launcher/home screen (the stock Android TV launcher now has 2/3 or more of the screen taken up by ads or "recommended content", which is just ads).
One of the concerns is the current state of “privacy” with these devices. Interesting that you can mod the Shield (as mentioned by a couple of others as well). Good to know, even if I get one just to tinker / experiment. Thank you so much!
I recently set up a personal Owncast instance on my home server; it should do what you're looking for. I use OBS Studio to stream random stuff to friends. If your webcam can send RTMP streams, it should be able to stream to Owncast without OBS in the middle; otherwise, you just need to set up OBS to capture from the camera and stream to Owncast over RTMP.
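(If the camera can't do RTMP itself and you'd rather not run OBS, ffmpeg can push it directly. A rough sketch for a Linux box, where the device, host, and stream key are placeholders:)

# grab a V4L2 webcam + ALSA mic, encode, and push to Owncast's RTMP ingest
# owncast.lan and STREAMKEY are placeholders; 1935 is Owncast's default port
ffmpeg -f v4l2 -i /dev/video0 -f alsa -i default \
  -c:v libx264 -preset veryfast -b:v 2500k \
  -c:a aac -b:a 128k \
  -f flv rtmp://owncast.lan:1935/live/STREAMKEY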
the communication itself should be encrypted
I suggest having the camera/OBS and Owncast on the same local network, as RTMP is unencrypted and could be intercepted between the source and the Owncast server, so make sure that leg happens over a reasonably "trusted" network. From there, my reverse proxy (Apache) serves the Owncast instance to the Internet over HTTPS (using Let's Encrypt or self-signed certs), so it is encrypted between the server and clients. You can watch the stream from any web browser, or use another player such as VLC pointed at the correct stream address [1]
it seems that I might need to self-host a VPN to achieve this
Owncast itself offers no authentication mechanism for watching the stream, so if you expose it to the internet directly and don't want it public, you'd have to implement authentication at the reverse proxy level (e.g. HTTP Basic auth), or, as you said, set up a VPN server (I use WireGuard) on the same machine as the Owncast instance and only expose the instance to the VPN network range (with the VPN providing the authentication layer). If you go for a VPN between your phone and the Owncast server, there's also no real need to set up HTTPS at the reverse proxy level (the VPN already provides encryption).
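For the Basic auth route, a minimal Apache sketch could look like the following (stream.example.com, Owncast's default port 8080, and the file paths are my assumptions; you'd create the htpasswd file first with the htpasswd tool, and need mod_proxy, mod_ssl, and mod_auth_basic enabled):

<VirtualHost *:443>
    ServerName stream.example.com
    # TLS certs from Let's Encrypt (paths are illustrative)
    SSLEngine on
    SSLCertificateFile /etc/letsencrypt/live/stream.example.com/fullchain.pem
    SSLCertificateKeyFile /etc/letsencrypt/live/stream.example.com/privkey.pem

    # proxy everything to the local Owncast instance
    ProxyPreserveHost On
    ProxyPass / http://127.0.0.1:8080/
    ProxyPassReverse / http://127.0.0.1:8080/

    # require a login before anything is served
    <Location />
        AuthType Basic
        AuthName "Private stream"
        AuthUserFile /etc/apache2/owncast.htpasswd
        Require valid-user
    </Location>
</VirtualHost>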
Of course you should also forward the correct ports (VPN or HTTPS) from your home/ISP router to the server on your LAN.
Why is your Collabora server on localhost? localhost always points to the device you're accessing from; you need a publicly accessible URL.
"Localhost" in this instance refers to my desktop computer, where all my super sweet Linux distros are saved. Nextcloud Office, while being an "app", appears to have no function without Collabora; i.e., there will be no document viewer without the Collabora "server" running next to Nextcloud.
Or maybe it's none of that. Coming from a Windows background, running Docker is completely foreign to me.
Collabora needs to be accessible at the URL you provide. As an example, one might have Nextcloud at nc.example.com and Collabora at cb.example.com; you would then enter cb.example.com as the URL.
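If you're running the standalone collabora/code image, the allowed Nextcloud hostname gets passed in as an escaped regex. A sketch (the domains are examples, and newer CODE releases have renamed some of these env vars, so check the image docs):

# allow documents to be opened from nc.example.com; TLS handled by the proxy
docker run -d --name collabora -p 9980:9980 \
  -e 'domain=nc\.example\.com' \
  -e 'extra_params=--o:ssl.enable=false --o:ssl.termination=true' \
  --restart unless-stopped collabora/code

Then in Nextcloud's Office settings you'd enter the public URL your reverse proxy serves in front of port 9980 (cb.example.com in the example above).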
The easier way is to use the Nextcloud All-in-One image or the built-in Collabora. It's not going to be as robust or fast, but it's much simpler.
Just throwing out an option if you aren't aware: gohardrives on eBay and on their own site sells used HDDs, 10 TB for $80. The catch is they've been used in data centers for 5 years. The company will guarantee the drives for an additional 5 years, and it could save you a lot of money depending on how much you want to risk it. I went with 3, one being a parity drive in case another goes bad.
I currently have 6x10TB of these drives running in a Gluster array. I've had to return 2 so far, with a 3rd waiting to be sent in for warranty as well (click of death on all three). That's a higher failure rate than I'd like, but the process has been painless outside of the inconvenience of sending them in. All my media is replaceable, but I have redundancy and haven't lost data (yet).
Depending on your supporting hardware and power costs, you may find larger drives to be a better investment in the long term. Namely, if you plan on seeing the drives through to their 5-year warranty, 18 TB drives are pretty good value.
For my hardware and power costs, this is the breakdown of cumulative $/TB over years of service: [chart: cumulative $/TB (y axis) vs. years of service (x axis)]
The first two died within 30 days; the third took about 4 months, I think. Not a huge sample size, but it roughly matches the typical hard drive bathtub curve.
I just double-checked, and mine were actually from a similar seller on Amazon. They all seem to be from the same supplier though; the warranty card and packaging are identical. So YMMV?
Warranty was easy: I emailed the address included on the warranty slip, gave my order number and drive serial number, and they sent me a mailing slip within 1 business day. Print that out, put the drive back in the box it shipped in (I always save these), tape it up, and drop it off for shipping. In my case, the purchase was refunded pretty much as soon as the drive was delivered back to the seller.
When you tried Caddy and received that error, it looks like you were using the wrong image name.
Then you mentioned deleting the Caddyfile because the configuration didn't work. But if I'm following correctly, the Caddyfile wouldn't yet be relevant if the Caddy container had never actually run.
Pulling from Caddy's docs (from memory, so verify against the current Docker Hub page), you should just need to run something like:
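# minimal Caddy container; mounts your Caddyfile and persists certs in a volume
# (flags from memory of Caddy's Docker Hub docs; check there for the current form)
docker run -d -p 80:80 -p 443:443 \
  -v $PWD/Caddyfile:/etc/caddy/Caddyfile \
  -v caddy_data:/data \
  caddy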
I have not tried Caddy through Docker yet; I've just been running it from a Windows command line with admin privileges. I'm looking into doing it with Docker, just haven't started yet.
That surely is the issue. You can convert it to MP4 with ffmpeg:

ffmpeg -i input.mkv -c copy output.mp4

If you want to keep subtitles, this will probably work:

ffmpeg -i input.mkv -map 0 -c copy -c:s mov_text output.mp4
I've been using Linode for a decade (or more) now without any issues. I'd encourage you to contact their support about this. Assuming you're on the up-and-up, this sounds like a bug, and I'm sure they'd be happy to help.
If you decide not to go with Linode though I think Digital Ocean is a good alternative.
No, they didn't grandfather anybody in; they made the price changes to compute plans universally back in April of last year. The only plan not changed was the $5 Nanode, so if that's all you're running, that's probably why your bill didn't change.
If your title is system administrator, maybe you don't get paid as much for the same responsibilities as a DevOps Engineer, Site Reliability Engineer, Cloud Engineer, etc. Don't get caught up in titles; sell the value of your skills.
Last time I checked, renaming an empty text file to "Assassin's Creed 2.zip" was legal in my jurisdiction, but now I must fear a C&D letter from Ubisoft, it seems lmao.
Dear god, I fucking hate smart-asses like you. I bet OP could use gay porn in those thumbnails to satisfy you as well, but maybe images of popular games fit better, right?
It’s not “false advertising”, idiot.
Because OP does not sell those goddamn games and only wanted to show off his UI capabilities. In fact, it looks like he doesn't even fucking sell anything, so what the hell would he be advertising?
Looks like OP has been passionately working on this project for his own purposes for almost two years, shared it in good faith, and probably doesn't give a singular fuck about whether you use it or not. Just take it or leave it.
What entitlement? I'm just saying it's weird that you would put games with DRM in a mockup for a project that's about DRM-free games. Gay porn, or any porn for that matter, would also not fit the DRM-free game category.
Entitled, because you’re blatantly calling posts about a free product “false advertisement” while OP is not advertising games.
It's a MOCKUP; the guy won't spend time researching the DRM protection of every fucking game on Steam to provide you accurate thumbnails. It's software for games, and it showcases pictures of GAMES. He isn't promoting the games, but his software.
You should definitely google the definition of the word “Mockup” before you continue being a retard in this forum. Have a nice day.
Maybe not false advertising; I guess it's true that only commercial products can be called that. Misleading, then, or bad marketing (though that's probably also only for commercial things, so not really). I don't know what to call it.
Just a weird choice, that's all. Same as if you showcased a hobby project meant to host royalty-free music with a Metallica album :)
Nextcloud for me too. It would sometimes break because updates required manual DB migrations, apps would randomly stop working after updates, and then there were the two times it caused total data loss on all my synced devices and the server itself, which required a full restore from backups.
After getting rid of it and switching to Syncthing + Filebrowser + SFTPGo for WebDAV, I haven't really had anything break since (about a year now). Everything also runs much faster; NC was extremely slow even on good hardware with all their recommended performance settings.
If Nextcloud “caused total data loss on all my synced devices and the server itself” I would probably do something unsavory to any responsible party I could locate, and take 10 TB of data out of their lousy hide.
Yeah, the first time was the time/date bug they had (still have?) where it set the timestamp on every folder and file to 00/00/0000 00:00 across all clients and the server.
The second time was when I disabled virtual file support on my laptop so it would sync everything. Instead, it went and wiped all the files from the server, because for some reason their sync client assumed the laptop that now had no files on it should be the master source or something.
Their own docs even state that’s how you’re supposed to disable VFS, with no mention that it will wipe your server clean.
I found a self-hosted service that syncs our calendars; that was the only thing missing. I can't remember the name, but it's worked flawlessly and without any problems for a number of years now. If you are interested, I'll look it up next weekend.
I want my docs and files on a self-hosted cloud (I can't seem to get SFTP, FTP, or sharing to work on Windows 11, even after adding the missing features), with the ability to at least open the contents without downloading them. I want to stop using Google for calendars and notes, and it would be handy to have a self-hosted bulletin board that I and my added users could write on.
According to the box, Nextcloud does all these things, except that it doesn't, at least not without practically rewriting the code and somehow re-engineering Linux to not be a fucking cunt.
I've tried and tried. It just won't work. Maybe I need a different firewall program. I'm on Pro, I added the features, made the firewall exceptions, and set my network to "private"; I've done everything. The host shows up on the network, but logins time out or fail altogether.
Since writing my rant, I found HFS, which, though an OLD program, was stupid stupid easy to set up.
I also found Filebrowser, and though the config was way more of a pain than it should have been, it's fucking awesome. I've even moved on to trying to get HTTPS running for external connections using win-acme, but it isn't going well.
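(In case it saves someone the config pain: Filebrowser's CLI flags can stand in for the JSON config entirely. The paths below are examples for a Windows box:)

REM serve D:\share on port 8080, keeping the database in one known place
filebrowser.exe -r D:\share -a 0.0.0.0 -p 8080 -d C:\filebrowser\filebrowser.db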
Please do! I spent a solid day researching open-source CalDAV servers/clients to replace Google Calendar for my boss. Almost no options on that front.
I used Baikal as the CalDAV server, with DAVx5 on Android; it was solid. I moved to NC for files, so I went ahead with calendar sync on NC too. NC calendar sync has always worked well for me, no hiccups.
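(If anyone wants to try Baikal quickly, there's a popular community Docker image, ckulka/baikal; the ports and volume paths below are from its README as I recall, so double-check there.)

# Baikal on port 8800, persisting the config and data directories
docker run -d --name baikal -p 8800:80 \
  -v baikal_config:/var/www/baikal/config \
  -v baikal_data:/var/www/baikal/Specific \
  ckulka/baikal:nginx

Then point DAVx5 at http://yourhost:8800/dav.php and log in with the user you create in Baikal's admin UI.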
The only issue I’ve had with NC is auto upload of photos from my phone. It constantly has conflicts. Otherwise sync of regular files works great.
Damn, FOSS Android Auto development is starting the new year off strong! First GrapheneOS successfully implementing it on a non-stock OS, and now this too. Too bad I got rid of my vehicles last year and no longer have a use for it on my ebike.