The original post: /r/selfhosted by /u/djbiccboii on 2024-04-16 04:44:29.
I run a small homelab/network with a few servers running Debian Linux.
I’d like to back up the hosts so that, after a catastrophic failure, I could repair or replace a machine and restore it to virtually the same state.
I was thinking about committing all non-sensitive configs/scripts/etc. (i.e. nothing of significant size) to version control via cron or something, and then tarballing any data directories to a backup host or the cloud.
The problem with this approach is that it feels kind of clunky and error-prone. We take a similar approach at my work, but it works better there because it was written and tested by many people over many years.
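For what it’s worth, the cron + version control + tarball idea can be sketched roughly like this. Everything here is a placeholder (the directory names, the commit message, the `/tmp` destination), not a tested tool — a real version would push the git repo and ship the tarball off-host:

```shell
#!/bin/sh
# Rough sketch only: all paths/names below are hypothetical placeholders.
set -eu

WORK="${TMPDIR:-/tmp}/backup-sketch-demo"
rm -rf "$WORK"
CONFIG_REPO="$WORK/config-backup"   # git repo tracking small config files/scripts
DATA_DIR="$WORK/data"               # stand-in for a real data directory
BACKUP_DEST="$WORK/backups"         # stand-in for a backup host or cloud bucket

mkdir -p "$CONFIG_REPO" "$DATA_DIR" "$BACKUP_DEST"
echo "example-host" > "$DATA_DIR/hostname.txt"   # fake "config" to back up

# 1. Commit the non-sensitive configs to version control.
cd "$CONFIG_REPO"
git init -q
cp "$DATA_DIR/hostname.txt" .
git add -A
git -c user.email=cron@localhost -c user.name=cron \
    commit -qm "automated snapshot $(date -u +%F)"

# 2. Tarball the data directory with a dated name.
STAMP=$(date -u +%Y%m%d)
tar -czf "$BACKUP_DEST/data-$STAMP.tar.gz" -C "$WORK" data
```

Run from cron (e.g. nightly), this gives you a config history plus dated data archives — basically the clunky-but-workable baseline described above.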
Ideally, I’d love something like what you can do with a DigitalOcean VPS, where you take a “snapshot” of the host (which takes almost no time) and can later click a button to restore the host to exactly that state.
What do y’all think? Any recommendations? My requirements: either simple/scriptable (like bash scripts) or open source. I’m not interested in closed-source software even if it fits my needs.
I’m aware of a few options out there like rsnapshot and Bacula, though I’ve never used either of them, so any hands-on opinions and/or experience is most welcome.
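For anyone comparing: rsnapshot is basically a wrapper around rsync with hard-linked rotating snapshots, driven by a config file. A minimal `/etc/rsnapshot.conf` looks something like this (the paths and retention counts are made up for illustration; note that rsnapshot requires tabs, not spaces, between fields):

```
config_version	1.2
snapshot_root	/srv/snapshots/

# How many snapshots to keep per level (cron fires each level).
retain	hourly	6
retain	daily	7

# What to back up: source, then destination subdirectory.
backup	/etc/		localhost/
backup	/home/		localhost/
```

You’d then schedule `rsnapshot hourly` and `rsnapshot daily` from cron. That covers file-level restores nicely, but it’s not a full-machine snapshot like the DigitalOcean case.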
Thanks!