selfhosted


superpants, in Kubernetes? docker-compose? How should I organize my container services in 2024?

A plug for the pro-Kubernetes crowd:

I run microk8s on a 3-node cluster, using FluxCD to deploy and manage my services. I also work with Kubernetes at work, so I’m very familiar with the concepts, and I will never use anything else.

If you want maximum control and flexibility, learn Kubernetes. For a lot of people (myself included) it’s overkill, but IMO it’s the best.

My main gripe with docker-compose, which is what I used to use, is that service changes require access to the machine. I have to run commands on the host to alter services. With Kubernetes, and more precisely a GitOps model, you can just make a commit to a git repo and it will roll out.
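The GitOps flow described above — desired state declared in a git repo, a controller converging the cluster toward it — can be sketched as a toy reconcile loop. This is a conceptual illustration only, not how Flux is actually implemented; the service names and versions are made up:

```python
# Toy GitOps reconciler: compare the desired state (as committed to git)
# with the running state and compute the actions needed to converge.
def reconcile(desired: dict, running: dict) -> list:
    actions = []
    for name, version in desired.items():
        if name not in running:
            actions.append(("deploy", name, version))
        elif running[name] != version:
            actions.append(("update", name, version))
    for name in running:
        if name not in desired:
            actions.append(("remove", name, None))
    return actions

# A commit that bumps "blog" and deletes "wiki" rolls out with no SSH session:
desired = {"blog": "v2", "gitea": "v1"}
running = {"blog": "v1", "gitea": "v1", "wiki": "v1"}
print(reconcile(desired, running))
```

The real win is exactly what the post says: the "commands on the host" step disappears, because the controller runs this loop continuously against whatever is in the repo.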

Lem453,

For your last point, Portainer fixes that. I use Portainer to pull compose files from my Gitea instance. There is an option to auto-update on git commit, but I prefer to press the button to update.

I write the compose files in vscode and push them to my repo.

atzanteol,

FWIW I manage docker compose files with Ansible. It lets me manage them centrally without needing to log into multiple VMs. I also create a systemd service file to start/stop the containers (also managed with Ansible).

That said, I’m starting to switch over to k8s as well (also with microk8s, which has been the easiest to work with). Definitely overkill, but I want to learn it.

nopersonalspace,

Yes, very true. I really would much prefer GitOps, as I feel… uneasy about how hand-wired and ephemeral my current setup is, and would love it to be more declarative and idempotent. It does seem like Kubernetes is the way to do that.

Toribor, in Should I use Restic, Borg, or Kopia for container backups?

I really like Kopia. I back up my containers with it, plus my workstations, and replicate to S3 nightly. It’s great.

fuckwit_mcbumcrumble, in Hosting websites over 4g

If you’re buying a VPS why not host the website there?

justawittyusername,

Good question. I will want to host more in the future, and I’m trying to keep costs as low as possible.

taladar,

Depending on what you are trying to host and where you live, power usage and your own hardware might be more expensive than the VPS you’d require to host those.

thirdBreakfast,

This. Hosting at home might be cheaper if you are serving a lot of data, but in that case, the speed’s going to kill you.

I’m a keen self-hoster, but my public facing websites are on a $4 VPS (Binary Lane - which I recommend since you’re in Aus). In addition to less hassle, you get faster speeds and (probably) better uptime.

StrawberryPigtails, in Hosting websites over 4g

Sounds like that connection would work with your setup, but it would depend on what you are planning on hosting. Anything that is sensitive to latency would probably not work well. Static sites should be fine, though.

Samsy, in Kubernetes? docker-compose? How should I organize my container services in 2024?

I was used to just organising my docker-compose containers without any frontend. But then I discovered CasaOS, which makes things pretty simple. An app store and an SMB-shared file manager gave me a really good workflow. Things that aren’t in the app store can be handled outside of Casa, too.

PS: But never make the mistake of integrating the externally handled containers; this messes things up.

nopersonalspace,

Thanks, yeah, I’ve heard good things about CasaOS. I think I’m trying to move in the other direction though: fewer UIs and more CLIs + configuration files.

possiblylinux127, (edited) in Intel N100 good enough for 1Gbit internet?

Honestly I’m a big fan of OpenWrt, as it can give very good performance on cheap and used hardware.

I’ve never used it on amd64 but it may be worth a shot.

StopSpazzing,

Is there a good GUI suggestion?

possiblylinux127,

It comes with a fairly extensive GUI

StopSpazzing,

The last time I used the LuCI GUI was like 12 years ago. How has it improved since?

possiblylinux127,

I haven’t been using it for 12 years, but right now the LuCI GUI is the most extensive router GUI I’ve used.

JonnyJaap,

I used devices from GL.iNet; the devices are good, but I find the UI of OPNsense way better (compared to the advanced UI of OpenWrt), and updates come directly from OPNsense.

I still have them for smaller network tests, but for some reason I never really warmed to them. Probably another reason is that my brother uses OPNsense too; if we have any issues we can ask each other for help.

nopersonalspace, (edited) in Hosting websites over 4g

I mean I think it really depends on the type of website you’re trying to host. A static blog would use way less bandwidth than a media server for example. Traffic would have the same effect too, where 1 concurrent visitor to a blog would probably be fine but 10,000 would be a problem.
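As a back-of-the-envelope check of that claim (every number here is an assumption for illustration, not a measurement):

```python
# Rough upstream bandwidth needed to serve a static page over a 4G uplink.
def needed_mbps(page_size_mb: float, visitors_per_min: float) -> float:
    """Average upstream megabits/sec if each visitor fetches one page."""
    return visitors_per_min / 60 * page_size_mb * 8

FOUR_G_UPLINK_MBPS = 10  # assumed conservative sustained 4G uplink

for visitors in (1, 100, 10_000):
    mbps = needed_mbps(0.5, visitors)  # assumed 0.5 MB static blog page
    print(visitors, round(mbps, 2), "fits" if mbps < FOUR_G_UPLINK_MBPS else "saturated")
```

One visitor per minute is noise; 10,000 per minute needs hundreds of Mbit/s of sustained upstream, far beyond any 4G link — which matches the intuition in the post.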

sep, in How often do you back up?

How often depends on how much work it is to recreate the data, or the consequences of losing it.

Some systems have no real data locally and get a backup every week. Most get a nightly backup. Some with a high rate of change get a lunchtime/middle-of-the-workday run.
Some have hourly backups/snapshots where recreating data is impossible. Critical databases have hourly backups plus transaction log streaming off-site.

How long to keep history depends on how likely an error is to go unnoticed, but minimum 14 days. Most have 10 dailies + 5 weeklies + 6 monthlies + 1 yearly.

If you have paper records and can recreate lost data easily, daily seems fine.
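A schedule like "10 dailies + 5 weeklies + 6 monthlies + 1 yearly" can be sketched as a simple grandfather-father-son selection over snapshot dates. This is a toy illustration, not any particular backup tool's pruning algorithm:

```python
from datetime import date, timedelta

def keep_set(snapshots, dailies=10, weeklies=5, monthlies=6, yearlies=1):
    """Newest-first GFS selection: keep the newest snapshot of each
    recent week, month, and year, plus the most recent N dailies."""
    snapshots = sorted(snapshots, reverse=True)
    keep = set(snapshots[:dailies])
    weeks, months, years = set(), set(), set()
    for d in snapshots:
        w = tuple(d.isocalendar())[:2]          # (ISO year, ISO week)
        if len(weeks) < weeklies and w not in weeks:
            weeks.add(w); keep.add(d)
        m = (d.year, d.month)
        if len(months) < monthlies and m not in months:
            months.add(m); keep.add(d)
        if len(years) < yearlies and d.year not in years:
            years.add(d.year); keep.add(d)
    return keep

# 400 consecutive daily snapshots collapse to a small retained set
# (tiers overlap, so the total is less than 10 + 5 + 6 + 1):
snaps = [date(2024, 1, 1) + timedelta(days=i) for i in range(400)]
print(len(keep_set(snaps)))
```

The overlap is the point: the newest daily usually doubles as the newest weekly, monthly, and yearly, so history stays deep without the snapshot count growing with it.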

computergeek125, in How often do you back up?

I’m probably the overkill case because I have AD+vC and a ton of VMs.

RPO 24H for main desktop and critical VMs like vCenter, domain controllers, DHCP, DNS, Unifi controller, etc.

Twice a week for laptops and remote desktop target VMs

Once a week for everything else.

Backups are kept (give or take a bit):

  • Daily backups for a week
  • Weekly backups for a month
  • Monthly backups for a year
  • Yearly backups for 2-3y

The software I have (Synology Active Backup) captures data using incremental backups where possible, but if it loses its incremental marker (System Restore on Windows, changed-block tracking in VMware, rsync for file servers), it will generate a full backup and deduplicate (IIRC).

From the many times this has saved me from various bad things happening, I’d say the RTO is about 2-6 hours for a VM to restore and about 18 hours for a desktop, measured from the point at which I decide to go back to a backup.

Right now my main limitation is that my poor quad-core Synology is running a little hot on the CPU front, so some of those have longer RPOs than I’d like.

NeoNachtwaechter, (edited ) in Hosting websites over 4g

I know 4g is not fast, but I would like to use it

There was a time when people used to have 2400 bits per second from home (for the youngsters: that is 0.0024 Mbit/s). So if you are doing it for fun, why not.
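For scale (the page size here is an assumed example, not a measurement):

```python
# How slow 2400 bit/s is in today's terms.
bps = 2400
page_bytes = 500 * 1024                 # a fairly lean 500 KiB web page
seconds = page_bytes * 8 / bps          # bits to send / bits per second
print(f"{bps / 1e6} Mbit/s, about {seconds / 60:.0f} minutes per page load")
```

At that rate a single modern page takes roughly half an hour, which puts a 4G uplink in perspective.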

filcuk,

True, but everything is more bandwidth demanding these days, plus we’re used to fast loading.

henfredemars, in How often do you back up?

I still have drawings I made in MS Paint on Windows 95 when it had just come out, my first text document, and the first report I ever typed in grade school.

Btrfs snapshots of the root volume in a RAID1 configuration, with 8 hourly, 7 daily, and 3 weekly snapshots, plus automated rsync backups to a NAS. Primary and secondary off-site, physically disconnected backups are stored in sealed, airtight, and waterproof containers in prepaid storage at two different banks, with an advance directive in the event of my demise.

Bit of a hobby really. I acknowledge it’s completely unnecessary. I don’t like to lose data.

Dave,

Sealed, airtight, and waterproof, but what if both banks burn down at the same time? You didn’t mention fireproof.

henfredemars,

You got me there! Not fireproof. In that case I’m just hoping that having two off-site backups at different locations has me covered, but that’s a good idea. I should consider fireproof foil.

Appoxo,

You are backed up better than some enterprises…

Just wow.

henfredemars,

Another perspective is data hoarding.

I have system images of machines of relatives who have died. Many of the photos I have retained are the only copies. However, that was more an emergent utility than a motivating one.

bitwaba,

How often do you update your off-site backups?

henfredemars,

Monthly, alternating locations.

homegrowntechie, in What is your favourite selfhosted wiki software and why?

Trilium for personal use cases. It is very easy to host, edit, and navigate. Public sharing is easy too.

Appoxo, in How often do you back up?

My PC: every day, whenever it is online.
The drives the backups go to: once a week.

69420,

Wait, so you back up your backups? Why not just make 2 backups of the same thing?

Appoxo,

Limitation of hardware.
It is essentially just a file copy.

sgh, in Hosting websites over 4g

Have you looked into Cloudflare Tunnel? It’s a turnkey solution that does exactly what you want. No idea what the cost is though.

conorab, in How often do you back up?
  • Personal and business are extremely different. In personal, you back up to defend against your own screwups, ransomware, and hardware failure. You are much more likely to predict what is changing most and what is most important, so it’s easier to know exactly what needs hourly backups and what needs monthly backups. In business you protect against everything in personal, plus other people’s screwups and malicious users.
  • If you had to set up backups for a business without any further details: 7 daily, 4 weekly, 12 monthly (or as many as you can). You really should discuss this with the affected people, though.
  • If you had to set up backups for personal use (and not more than a few users): 7 daily, 1 monthly, 1 yearly.
  • Keep as much as you can handle if you have already paid for the capacity (on-site hardware and fixed-cost remote backups). There is no point leaving several terabytes of backup space free, though more retention does mean more wear on the hardware.
  • How much time are you willing to lose? If you lost 1 hour of game saves, or the office’s work and therefore 1 hour of labour for the whole office, would it be OK? The “whole office” part is quite unlikely, especially if you set up permissions to reduce the amount of damage people can do. It’s most likely to be 1 file or folder.
  • You generally don’t need to keep hourly snapshots for more than a couple of days, since if it’s important enough to need the last hour’s copy, it will probably be noticed within 2 days. Hourly snapshots can also be very expensive.
  • You almost always want daily snapshots for a week. If you can hold them for longer, then do it, since they are useful for restoring screwups that went unnoticed for a while and are very useful for auditing. However, keeping a lot of daily snapshots in a high-churn environment gets expensive quickly, especially when backing up Windows VMs.
  • Weekly and monthly snapshots largely cover auditing and malicious users where something was deleted or changed and nobody noticed for a long time. Prioritise keeping daily snapshots over weekly snapshots, and weekly snapshots over monthly snapshots.
  • Yearly snapshots are more for archival and for restoring that folder nobody touched for years and that was deleted to save space.
  • The numbers above assume a backup system which keeps anything older than 1 month in full, and maybe even anything older than a week in full (a total duplicate). This is generally done in case of corruption. Keeping daily snapshots for 1 year as increments is very cheap, but you risk losing everything due to bitrot. If you are depending on incrementals for long periods of time, you need regular scrubs and redundancy.
  • When referring to snapshots I am referring to snapshots stored on the backup storage, not production. Snapshots on the same storage as your production are only useful for non-hardware issues and some ransomware issues. Your snapshots must exist on a separate server and storage. Your snapshots must also be replicated off-site, minus hourly snapshots, unless you absolutely cannot afford to lose the last hour (billing/transaction details).
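The point about depending on incrementals for long periods can be illustrated with a toy model: restoring day N needs the full backup plus every increment up to N, so a single corrupted increment poisons every later restore point. The 30-day chain and the failure position here are hypothetical:

```python
# Toy incremental-chain model: each day's restore point is usable only
# if the full backup and every increment before it are intact.
full_ok = True
increments_ok = [True] * 30
increments_ok[9] = False        # assume bitrot in the day-10 increment

restorable = []
chain_good = full_ok
for ok in increments_ok:
    chain_good = chain_good and ok
    restorable.append(chain_good)

print(sum(restorable))  # only the days before the bad increment survive
```

This is why the advice above pairs long incremental chains with regular scrubs, redundancy, and periodic full copies: a periodic full resets the chain so one bad block can no longer take out months of restore points.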