linux


seaQueue, in Super weird error, what's happening?

Shit’s broke yo.

Sleep/wake issues with AMD GPU and platform drivers are super, super, super common. Fish back through your kernel journal after a reboot (journalctl -kb -1 should do it) and look for the driver errors immediately after the wake event. If this has been fixed in a later kernel release, update your kernel; if not, report it to the Ubuntu folks or on the amdgpu GitLab.
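For example, something like this should surface the relevant lines (the grep pattern is just a starting point, not an exhaustive filter):

# kernel messages from the previous boot, filtered for likely GPU/suspend errors
journalctl -kb -1 | grep -iE 'amdgpu|drm|suspend'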

iwasgodonce, in What's an elegant way of automatically backing up the contents of a large drive to multiple smaller drives that add up to the capacity of the large drive?

www.gnu.org/software/…/Using-Multiple-Tapes.html

Might do kind of what you want.
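A rough sketch of the multi-volume usage (sizes and paths are placeholders; check the manual for your tar version):

# write a multi-volume archive, switching volumes every ~3.9TB
tar --create --multi-volume --tape-length=3900G --file=/mnt/small1/backup.tar /data
# at the volume-change prompt, enter: n /mnt/small2/backup.tar
# restore later with:
tar --extract --multi-volume --file=/mnt/small1/backup.tar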

Catsrules, in Debian Likely Moving Away From i386 In The Near Future

Can someone explain like I am 5?

Is this just talking about 32-bit processor support, or are we also talking about 32-bit programs as well?

eutampieri,

The first

Catsrules,

Thanks

eutampieri,

☺️

Endorkend, in Super weird error, what's happening?

Before doing anything, if your screen allows it, swap the output from DP to HDMI or from HDMI to DP; that may fix this to the point of being able to actually boot and further fix the issue.

I've had this before with drivers where output would suddenly fail on one port but still run on one of the others.

AbidanYre, (edited ) in What's an elegant way of automatically backing up the contents of a large drive to multiple smaller drives that add up to the capacity of the large drive?

Git annex can do that and keep track of which drive the files are on.

git-annex.branchable.com
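Something along these lines (a sketch; repo paths and descriptions are made up):

cd /media/bigdrive/data
git init && git annex init "main archive"
git annex add .
git commit -m "track everything in annex"
# clone onto each smaller drive and pull whatever fits
git clone /media/bigdrive/data /media/small1/data
cd /media/small1/data && git annex init "small drive 1"
git annex get some/subset/of/folders
# later, ask which drive actually holds a file
git annex whereis some/file.iso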

restlessyet, in What's an elegant way of automatically backing up the contents of a large drive to multiple smaller drives that add up to the capacity of the large drive?

I ran into the same problem some months ago when my cloud backups stopped being financially viable and I decided to recycle my old drives. For offline backups mergerfs will not work, as far as I understand. Creating tar archives of 130TB+ also doesn't sound like a good option. Some of the tape backup solutions looked to be possible options, but they are often complex and use special archive formats…

I ended up writing my own solution in Python using JSON state files. It's complete enough to run the backup, but otherwise very much a work in progress, with no restore at all, so I don't want to publish it.

If you find a suitable solution I am also very interested 😅
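For anyone wanting to roll their own, the core idea sketches out like this (illustrative only, not the tool described above): walk the source, copy whatever still fits on the currently mounted drive, and record the file-to-drive mapping in a JSON state file.

import json, os, shutil

STATE = "backup_state.json"

def backup(src_root, dest, free_bytes):
    # load (or start) the file -> drive mapping
    state = json.load(open(STATE)) if os.path.exists(STATE) else {}
    for dirpath, _, files in os.walk(src_root):
        for name in files:
            path = os.path.join(dirpath, name)
            size = os.path.getsize(path)
            if path in state or size > free_bytes:
                continue  # already backed up, or no room on this drive
            rel = os.path.relpath(path, src_root)
            target = os.path.join(dest, rel)
            os.makedirs(os.path.dirname(target), exist_ok=True)
            shutil.copy2(path, target)  # copy2 preserves mtimes
            state[path] = dest
            free_bytes -= size
    with open(STATE, "w") as f:
        json.dump(state, f, indent=2)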

captcha, in What's an elegant way of automatically backing up the contents of a large drive to multiple smaller drives that add up to the capacity of the large drive?

I'm going to say that doesn't exist, and restoring from it would be a nightmare. You could cobble together a shell or Python script that does that, though.

You're better off just getting a drive bay, plugging all the drives in at once, and pooling them with LVM.

You could also do the opposite: split the 4TB drive into separate logical volumes, each the same size as one of the smaller drives.
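If you go the LVM route, the setup is only a few commands (device names are examples, and this wipes the drives):

# pool three small drives into one big logical volume
pvcreate /dev/sdb /dev/sdc /dev/sdd
vgcreate backupvg /dev/sdb /dev/sdc /dev/sdd
lvcreate -l 100%FREE -n backuplv backupvg
mkfs.ext4 /dev/backupvg/backuplv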

lemmyvore,

It wouldn’t be so complicated to restore as long as they keep full paths and don’t split up subdirectories. But yeah, sounds like they’d need a custom tool to examine their dirs and do a solve a series of knapsack problems.

FigMcLargeHuge, in What's an elegant way of automatically backing up the contents of a large drive to multiple smaller drives that add up to the capacity of the large drive?

It’s going to take a little work here, but I have a large drive on my plex, and a couple of smaller drives that I back everything up to. On the large drive get a list of the main folders. You can do a “du -h --max-depth=1 | sort -hk1” on the root folder to get an idea of how you should split them up. Once you have an idea, make two files, each with their own list of folders (eg: folders1.out and folders2.out) that you want to go to each separate drive. If you have both of the smaller drives mounted, just execute the rsync commands, otherwise, just do each rsync command with the corresponding drive mounted. Here’s an example of my rsync commands. Keep in mind I am going from an ext4 filesystem to a couple of ntfs drives, which is why I use the size only. Make sure and do a dry run or two, and you may or may not want the ‘–delete’ command in there. Since I don’t want to keep files I have deleted from my plex, I have it delete them on the target drive also.

sudo rsync -rhi --delete --size-only --progress --stats --files-from=/home/plex/src/folders1.out /media/plex/maindrive /media/plex/4tbbackup

sudo rsync -rhi --delete --size-only --progress --stats --files-from=/home/plex/src/folders2.out /media/plex/maindrive /media/plex/other4tbdrive
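For the dry run, the same command just takes an extra -n (--dry-run), which prints what would change without copying anything:

sudo rsync -rhin --delete --size-only --progress --stats --files-from=/home/plex/src/folders1.out /media/plex/maindrive /media/plex/4tbbackup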

Deckweiss, (edited ) in What's an elegant way of automatically backing up the contents of a large drive to multiple smaller drives that add up to the capacity of the large drive?

If you are lucky enough, borgbackup could deduplicate and compress the data enough to fit a 1TB drive. It depends on the content of course, but its deduplication and compression are really insanely efficient for certain cases. (I have 3 devices with ~900GB each, so just shy of 3TB in total, which all gets stored in a ~400GB borg repository.)
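For reference, a minimal borg workflow looks something like this (paths are placeholders):

# one-time repository setup on the backup drive
borg init --encryption=repokey /mnt/backup/repo
# each backup run: deduplicated, compressed archive named by host and date
borg create --stats --compression zstd /mnt/backup/repo::'{hostname}-{now}' /data
# keep a bounded history
borg prune --keep-daily 7 --keep-weekly 4 /mnt/backup/repo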

Psythik, (edited ) in Cool fancy programs?

Not a Linux app but I’m willing to bet that you’d love windows93.net.

GnomeComedy, in What's an elegant way of automatically backing up the contents of a large drive to multiple smaller drives that add up to the capacity of the large drive?

Don’t become so concerned with if you could, that you overlook if you should.

I would buy a larger drive.

HiddenLayer5, (edited )

That would probably be the most elegant solution overall, and I appreciate the suggestion, but a new drive costs money that I don't currently have an abundance of, and I already have empty drives that aren't being used, accumulated over time and paid for ages ago. If I'm being honest, the reason I want to do it this way is that I don't really see the value of using a brand new drive for an offline backup of personal data, where the drive will be plugged in at best once a month before being stored in a drawer. If I buy a brand new drive, I'd rather actually use it as part of the active storage in my server and keep it running to get the most utility out of it.

cadekat, in Considering Gentoo

You’ll need to be a bit more specific about the iMac. What year is it?

If it’s pre-2017, I’d expect some difficulty with the WiFi. If it’s newer, you might have luck with wiki.t2linux.org/distributions/…/installation/ . I haven’t followed that guide, so YMMV.

dylanmorgan,

It’s a 2015 Retina 27”.

I’m fine rocking Ethernet for the purposes of this experiment.

cadekat,

Go for it then! Gentoo is a blast (if you enjoy this sort of thing) and is surprisingly stable once you get it set up.

One tip, before I forget: save your firmware from macOS before wiping the drive. Unfortunately I don't remember where it's located, and I no longer have access to try and find it 😅

just_another_person,

You can easily add WiFi with a USB dongle anyway. Hardly a hurdle.

ryannathans, in Super weird error, what's happening?

Read the messages on the screen. They're telling you how to check the logs for the error.

jayrhacker, in What's an elegant way of automatically backing up the contents of a large drive to multiple smaller drives that add up to the capacity of the large drive?

ZFS will let you set up a RAID-like set of small volumes that mirror one larger volume. It takes some setup, but it's the most "elegant" solution in that, once it's configured, you only need to touch it when you add a volume to the system, and it's just a mounted filesystem that you use.

It does not solve the off-site problem, though: one fire and it's all gone.
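Since a ZFS mirror is only as big as its smallest member, one hedged way to wire this up is to join the small drives into a single linear md device first, then mirror that against the big drive (device names are examples):

# concatenate two 2TB drives into one ~4TB device
mdadm --create /dev/md0 --level=linear --raid-devices=2 /dev/sdc /dev/sdd
# mirror the big drive against the concatenated device
zpool create tank mirror /dev/sdb /dev/md0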

lemmyvore,

It would also require all the secondary drives to be connected at all times, wouldn’t it?

satanmat, in Considering Gentoo

Please log what you do? Please?

I’ve got an older iMac intel and mulling what to use now that it is no longer getting updates

dylanmorgan,

Will do!
