I have a server running Debian with 24 TB of storage. I would ideally like to back up all of it, though much of it is torrents, so only the ones with low seeders really need to be backed up. I know about the 3-2-1 rule, but it sounds like it would be expensive. What do you do for backups? Also, if anyone uses tape drives for backups, I am kinda curious about that, potentially for offsite backups in a safe deposit box or something.

TLDR: title.

Edit: You have mentioned borg and rsync, and while borg looks good, I want to go with rsync as it seems to be more actively maintained. I would also like to have my backups encrypted, but rsync doesn't seem to have that built in. Does anyone know what to do for encrypted backups?
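
One commonly suggested pattern for encrypted rsync backups is to put an encrypting overlay filesystem in front of rsync, for example gocryptfs in reverse mode: it exposes an encrypted view of the plain data, so rsync only ever sees and transfers ciphertext. A minimal sketch, assuming gocryptfs is installed; all paths and the host name below are placeholders:

    # One-time setup: create the reverse-mode config for the plain data
    gocryptfs -init -reverse /data

    # Mount a read-only encrypted view of /data at /mnt/cipher
    mkdir -p /mnt/cipher
    gocryptfs -reverse /data /mnt/cipher

    # Push only ciphertext to the backup host over SSH
    rsync -av --delete /mnt/cipher/ backuphost:/backups/data-encrypted/

The trade-off is that the gocryptfs config file and password must be kept safe separately; without them, the ciphertext on the backup host is unrecoverable.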

  • ancoraunamoka@lemmy.dbzer0.com · 7 months ago

    Going unmaintained is a non-issue, since you can still restore from your backup. It is not like a subscription or proprietary software, which becomes unusable when you stop paying for it or the company that owns it goes down.

    Until they hit a hard bug, or don't support newer transport formats or scenarios. Also, the community eventually dries up.

    • ShortN0te@lemmy.ml · 7 months ago

      Until they hit a hard bug, or don't support newer transport formats or scenarios. Also, the community eventually dries up.

      That is why you test your backup (a quick restore test is sketched at the end of this comment). It is unrealistic that, in a stable software release, a hard bug that prevents recovery suddenly appears after you have tested your backup.

      Yes, unmaintained software will not get new features.

      I think you misunderstood me. You should not adopt unmaintained software as your backup tool, but IMO it is no problem if your current tool suddenly goes unmaintained: your existing backups will most likely still work. As with any other software that goes unmaintained, look for an alternative.
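
      A restore test does not have to be elaborate. For an rsync-style backup, a dry-run checksum comparison against the live data is a reasonable smoke test; a minimal sketch, with the backup path as a placeholder:

          # -a archive mode, -n dry run, -c compare by full checksum rather
          # than size/mtime, -i itemize every difference. Empty output means
          # the backup matches the source byte for byte; --delete also flags
          # stale files present only in the backup.
          rsync -anci --delete /data/ /mnt/backup/data/

      Chunk-based tools have their own verify commands, but the principle is the same: verify against the source, not just the archive's internal metadata.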

      • ancoraunamoka@lemmy.dbzer0.com · 7 months ago

        It is unrealistic that, in a stable software release, a hard bug that prevents recovery suddenly appears after you have tested your backup.

        How is that unrealistic? Think of this:

        • day 1: you back up your files, test the backup, and everything is fine
        • day 2: you store a new file that triggers a bug in the compression/encryption algorithm of whatever software you use; now the backup is corrupted, at least for this file

        Unless you test every backup you make, and consequently can't back up fast enough, I don't see how you can predict that future files and situations won't trigger bugs in the software.
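
        One cheap way to narrow that window is an independent checksum manifest taken at backup time, so a later restore can be verified even for files added after the last full test. A sketch, with hypothetical paths:

            # Record a SHA-256 manifest of the live data alongside the backups
            cd /data && find . -type f -print0 | xargs -0 sha256sum > /var/backups/data.sha256

            # Later, restore into a scratch directory and verify it against
            # the manifest; --quiet prints only the files that fail the check
            cd /tmp/restore-test && sha256sum --quiet -c /var/backups/data.sha256

        This checks the whole pipeline (backup tool, transport, storage) rather than trusting the tool's own bookkeeping.
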
        • ShortN0te@lemmy.ml · 7 months ago

          We are talking about software that is considered stable, that has verification checks for the backup, and that is used by thousands of people. It is unrealistic.
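
          For example, borg (mentioned in the original post) has exactly this kind of built-in verification; a sketch, with a placeholder repository path:

              # Check repository and archive consistency; --verify-data
              # additionally re-reads and re-checksums every data chunk
              # (slow but thorough)
              borg check --verify-data /mnt/backup/borg-repo

          Running this on a schedule, plus an occasional test restore, is what makes "the backup worked when I tested it" a meaningful claim.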