I have a Proxmox + Debian + Docker server and I'm looking to set up my backups so that they get backed up (DUH) on my Linux PC whenever it comes online on the local network.

I'm not sure what's best: backing up locally and having something else handle the copying, how to make those backups run only if they haven't run in a while (regardless of the PC's availability), and whether the PC should run the logic or the server should keep control over it.

Mostly I don’t want to waste space on my server because it’s limited…

I don't know the what and I don't know the how currently, so any input is appreciated.

  • adr1an@programming.dev
    4 months ago

    As to how, I'd probably use zfs send | receive (or whatever built-in snapshot functionality your CoW filesystem offers), rsnapshot, rclone, or just Syncthing. As to when, I'd probably hack something together with systemd triggers (e.g. on network connection, send all remaining incremental snapshots). But that's only needed in some cases (e.g. not when using Syncthing ;p)
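    A rough sketch of the server-side "push when the PC shows up, but only if the last run is stale" logic, as a plain shell script. Everything here is an assumption for illustration: the dataset name rpool/data, the host backup-pc, the bookmark @last-sent, and the stamp-file path are all placeholders you'd replace with your own.

    ```shell
    #!/bin/sh
    # Sketch: push incremental ZFS snapshots to the PC when it is reachable,
    # skipping the run entirely if a backup already happened recently.
    # Hypothetical names: dataset rpool/data, target host backup-pc,
    # receiving dataset tank/backup/data, snapshot marker @last-sent.
    set -eu

    STAMP=/var/lib/backup/last-run   # hypothetical state file marking the last successful run
    MAX_AGE=86400                    # seconds: re-run only if the stamp is older than a day

    needs_backup() {
        # true if the stamp file is missing or older than MAX_AGE seconds
        [ ! -f "$STAMP" ] && return 0
        last=$(stat -c %Y "$STAMP")
        now=$(date +%s)
        [ $((now - last)) -ge "$MAX_AGE" ]
    }

    pc_online() {
        # cheap reachability check; a systemd unit triggered on network-online
        # could replace this polling entirely
        ping -c1 -W1 backup-pc >/dev/null 2>&1
    }

    if needs_backup && pc_online; then
        snap="rpool/data@$(date +%F-%H%M)"
        zfs snapshot "$snap"
        # -I replays every snapshot between @last-sent and the new one,
        # so the PC catches up on everything it missed while offline
        zfs send -I rpool/data@last-sent "$snap" | ssh backup-pc zfs receive -u tank/backup/data
        touch "$STAMP"
    fi
    ```

    You could run this from a systemd timer (say, every 15 minutes): the stamp check makes repeated invocations cheap no-ops, so the server keeps control and nothing is stored locally beyond the snapshots themselves.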