
Atemu

@Atemu@lemmy.ml

Interested in Linux, FOSS, data storage systems, unfucking our society and a bit of gaming.

I help maintain Nixpkgs.

github.com/Atemu
reddit.com/u/Atemu12 (Probably won’t be active much anymore.)


Atemu, (edited)

ifconfig.me. Can also be curl’d.
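For example, from a terminal (assuming curl is installed; ifconfig.me returns just the address when queried this way):

curl ifconfig.me

which should print your public IP and nothing else.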

Easier to remember: just search “what is my ip” on clearnet DuckDuckGo (or Kagi if you have it).

they all ask for a CAPTCHA, which is an obvious attempt to obtain one’s true IP.

How exactly is a CAPTCHA supposed to discover your “true IP”?

Also note that your IP address is far from the only thing used to fingerprint you. See abrahamjuliot.github.io/creepjs/ and browserleaks.com.

Use Tor Browser if you want your starting conditions to be reasonably anonymous.

Even more critical for fingerprinting is user behaviour, though.

Considering Gentoo

I have an old iMac that I am planning to install some flavor of Linux on and while I was looking at various distros it occurred to me that it might be a good exercise to install Gentoo on it. Other than a separate machine for documentation and downloading the necessary packages, what else should I have set up to try this? Has...

Atemu,

I’d also add a build machine to the setup. Building a modern desktop system on a machine that old would take days.
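One way to wire a build machine in is distcc; a minimal sketch, assuming the build machine runs distccd and is reachable at 192.168.1.10 (the address and job counts are made up):

# /etc/portage/make.conf on the iMac
FEATURES="distcc"
MAKEOPTS="-j10 -l2"   # roughly local cores + remote cores, load-limited locally

# /etc/distcc/hosts on the iMac
192.168.1.10/8 localhost/2

Alternatively, the build machine can emerge everything with --buildpkg and serve the resulting binary packages to the iMac as a binhost; either way the old machine doesn’t have to do the heavy compiling itself.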

What's an elegant way of automatically backing up the contents of a large drive to multiple smaller drives that add up to the capacity of the large drive? (on Linux)

So I have a nearly full 4 TB hard drive in my server that I want to make an offline backup of. However, the only spare hard drives I have are a few 500 GB and 1 TB ones, so the entire contents will not fit all at once, but I do have enough total space for it. I also only have one USB hard drive dock so I can only plug in one...

Atemu,

I don’t want to do any sort of RAID 0 or striping because the hard drives are old and I don’t want a single one of them failing to make the entire backup unrecoverable.

This will happen in any case unless you have enough capacity for redundancy.

What is on this 4TB drive? A Linux installation? A bunch of user data? Both? What kind of data?

The first step to this is to separate your concerns. If you had, say, a 20GiB Linux install, 10GiB of loose home files, 1TiB of movies, 500GiB of photos, 1TiB of games and 500GiB of music, you could back each of those up separately onto separate drives.
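In the simplest case that’s just one plain copy per category onto whichever drive it fits on; a rough sketch, with made-up source paths and mount points:

rsync -a /srv/photos/ /mnt/backup-500gb-a/photos/
rsync -a /srv/music/ /mnt/backup-500gb-b/music/
rsync -a /home/you/ /mnt/backup-500gb-b/home/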

Now, it’s likely that you’d still have more data of one category than what fits on your largest external drive (movies are a likely candidate).

For this purpose, I use git-annex.branchable.com. It’s a beast to get into and set up properly, with plenty of footguns attached, but it was designed to solve exactly this kind of problem elegantly.
One of the most important things it does is separate file content from file metadata, making the metadata available in all locations (“repos”) while the content can be present in only a subset of them, thereby achieving distributed storage. That is, you could have 4TiB of file contents distributed over a bunch of 500GiB drives, but each of those repos would still have the full file tree available (metadata of all files + content of the files present there), allowing you to manage your files from any location without having all the contents present (or even any of them). It’s quite magical.
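In day-to-day use that looks roughly like this (the file path is a placeholder); from any clone, even one that holds no content at all:

git annex whereis Movies/some-movie.mkv   # lists which repos currently have this file’s content
git annex get Movies/some-movie.mkv       # fetches the content from one of them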

Once configured properly, you can simply attach a drive, clone the git repo onto it and then run git annex sync --content, and it’ll fill that drive with as much content as it can, or until each file’s numcopies and other configured constraints are satisfied.
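A rough sketch of that workflow, with a made-up repo path and mount point:

git clone /srv/annex /mnt/usb-500gb-a
cd /mnt/usb-500gb-a
git annex init "offline backup drive A"
git annex numcopies 2        # optional: demand at least 2 copies of every file somewhere
git annex sync --content     # fills this drive according to your preferred-content settings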
