Hi everyone,
I need some help.
I’m currently self-hosting some of my applications on DigitalOcean, and I run my containers with Portainer CE. I was wondering how you all keep backups of applications running on Docker.
I’m currently using DigitalOcean’s snapshot feature, but is there a better way? Any help on this is highly appreciated.
For databases and data I use restic-compose-backup, because it lets you configure backups with labels in your docker-compose files.
For config files I use a git repository.
When backing up Docker volumes, shouldn’t the container be stopped first? I can’t see any support for that in the backup tools mentioned.
Yes, the containers do need to be stopped. I actually built a project that does exactly that; the gist is sketched below.
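Not the project itself, just a minimal sketch of the pattern it automates: stop the container, archive the volume, start it again. Container, volume, and destination names are placeholders:

```bash
#!/usr/bin/env bash
set -euo pipefail

CONTAINER=myapp       # placeholder: the container using the volume
VOLUME=myapp_data     # placeholder: the named volume to back up
DEST=/backups         # placeholder: where archives should land

docker stop "$CONTAINER"

# Mount the volume read-only into a throwaway container and tar it out.
docker run --rm \
  -v "$VOLUME":/data:ro \
  -v "$DEST":/backup \
  alpine tar czf "/backup/${VOLUME}-$(date +%F).tar.gz" -C /data .

docker start "$CONTAINER"
```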
Thanks, I will look into this.
I just run pg_dump through kubectl exec and pipe the stdout to a file on my master node. The same script then runs restic to send encrypted backups over to S3. I use restic’s --host flag as a bit of a hack to get backups per service name, which eliminates the risk of overwriting files or directories with the same name.
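In case it helps anyone, a stripped-down sketch of that script; the pod, database, bucket, and service names are placeholders:

```bash
#!/usr/bin/env bash
set -euo pipefail

SERVICE=myservice                 # placeholder service name
POD=postgres-0                    # placeholder pod running Postgres
DUMP="/backups/${SERVICE}.sql"    # dump file on the master node

# Dump the database over kubectl exec, capturing stdout locally.
kubectl exec "$POD" -- pg_dump -U postgres mydb > "$DUMP"

# Send an encrypted backup to S3; --host separates snapshots per service.
export RESTIC_REPOSITORY=s3:s3.amazonaws.com/my-backup-bucket
export RESTIC_PASSWORD_FILE=/root/.restic-password
restic backup --host "$SERVICE" "$DUMP"
```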
I back up all the mounted Docker volumes once every hour (snapshots). Additionally, I create dumps of all databases with https://github.com/tiredofit/docker-db-backup (every hour or once a day, depending on the database).
ZFS snapshots.
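For anyone unfamiliar, it’s essentially a one-liner per dataset; the dataset name here is a placeholder:

```bash
# Snapshot the dataset holding the Docker volumes (dataset name is a placeholder).
zfs snapshot tank/docker@"$(date +%F-%H%M)"

# List that dataset's snapshots, and roll back if something goes wrong
# (only the most recent snapshot, unless you pass -r).
zfs list -t snapshot tank/docker
zfs rollback tank/docker@2024-01-01-0300
```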
Duplicati, to take live, crash-consistent backups of all my Windows servers and VMs via the Volume Shadow Copy Service (VSS).
I use Nautical. It stops your containers before performing an rsync backup.
On Proxmox, my backup target is a Hetzner Storage Box.
I use resticker to add an extra backup service to each compose stack, which lets me customize some pre/post-backup actions. Works like a charm 👍
Proxmox Backup Server (PBS) snapshots all my VMs / LXCs.
For external VPSes and anything else that can’t run the PBS client, I rsync the important data into my home network first, then do a file-based backup of that data to PBS with the proxmox-backup-client tool. All of this is automated through cron jobs.
Those backups then get synced to a second datastore for a bit of redundancy.
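The cron script boils down to two steps. A rough sketch, with the host, paths, and repository as placeholders:

```bash
#!/usr/bin/env bash
set -euo pipefail

# Step 1: pull important data from the external VPS into the home network.
rsync -az --delete vps.example.com:/srv/important/ /srv/vps-mirror/

# Step 2: file-based backup of the mirror to PBS via proxmox-backup-client.
export PBS_REPOSITORY='backup@pbs@pbs.lan:homelab'   # placeholder repository
export PBS_PASSWORD="$(cat /root/.pbs-password)"
proxmox-backup-client backup vps-mirror.pxar:/srv/vps-mirror
```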
I use Duplicati to back up to a secure off-site location. Useful for something like Vaultwarden.
Unraid with Duplicacy and the Appdata Backup plugin, doing incremental backups to Backblaze.
Most of mine are lightweight, so they live in private Git repos.
For big data I have two NASes that sync daily.
Cron jobs to back up important folders to a separate disk.
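Something along these lines; the paths are placeholders:

```bash
# /etc/cron.d/folder-backup: nightly rsync of important folders to a second disk.
# Paths are placeholders.
0 3 * * * root rsync -a --delete /srv/important/ /mnt/backupdisk/important/
```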
Git repo(s) for services & configs with weekly automated commits and pushes
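A sketch of how such a weekly job could look; the repo path and schedule are placeholders:

```bash
#!/usr/bin/env bash
# Run weekly from cron, e.g.: 0 4 * * 0 /usr/local/bin/push-configs.sh
set -euo pipefail

cd /srv/configs    # placeholder: path to the config repo
git add -A
# Commit only when something actually changed.
git diff --cached --quiet || git commit -m "automated weekly backup $(date +%F)"
git push origin main
```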
I do the reverse… all my configs are Ansible playbooks and files, and I just push them to the servers. That way I can spin up a new machine from scratch, completely automated, within minutes… just the time it takes the machine to set itself up.
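Day to day that just means pointing the playbooks at a host; the inventory and playbook names here are placeholders:

```bash
# Provision or rebuild a machine from scratch; inventory and playbook are placeholders.
ansible-playbook -i inventory/production.ini site.yml --limit newhost
```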
As others said, use volume mounts; I back those up incrementally with Borg to minimize storage requirements.
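A minimal sketch of that with Borg, assuming an already-initialized repo; the paths and retention numbers are placeholders:

```bash
#!/usr/bin/env bash
set -euo pipefail

export BORG_REPO=/mnt/backup/borg-repo    # placeholder repository path
export BORG_PASSPHRASE='change-me'        # or use BORG_PASSCOMMAND instead

# Deduplicated, incremental archive of the bind-mounted volume directories.
borg create --stats --compression zstd \
  ::'volumes-{now:%Y-%m-%d}' /srv/docker-volumes

# Keep 7 daily and 4 weekly archives to bound storage use.
borg prune --keep-daily 7 --keep-weekly 4
```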