npastaSyn,

Outside my depth, but I'll give it a stab. Identify what data is important (is the full 42 TB needed?). Can the data be split into easier-to-handle chunks?

If it can, then I'd personally do an initial sneakernet run to get the first set of data over, then mirror the differences on a regular basis (rough sketch below).
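A minimal sketch of the "mirror the differences" step, assuming rsync is installed and wrapping it from Python so it can be dropped into a cron job or systemd timer; the source and destination paths are placeholders, not anything from your setup:

```python
import subprocess

SOURCE = "/data/archive/"                   # hypothetical local dataset root
DESTINATION = "backup-host:/mnt/archive/"   # hypothetical remote target

def mirror_differences():
    """Copy only what changed since the last run; the sneakernet copy seeds the destination."""
    subprocess.run(
        [
            "rsync",
            "-a",          # archive mode: preserve permissions, times, links
            "--delete",    # drop files at the destination that were deleted locally
            "--partial",   # keep partially transferred files so interrupted runs resume
            SOURCE,
            DESTINATION,
        ],
        check=True,
    )

if __name__ == "__main__":
    mirror_differences()
```

Scheduled regularly, only the changed portion of the 42 TB crosses the wire each run, which is the whole point of seeding with sneakernet first.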
