This looks good, I've been meaning to back up my emails to the cloud for a while.

For anyone interested, or anyone with suggestions on how to improve my general backup process (DBs, pfSense router config dumps, Bitwarden passwords etc), here is the general process I use:

- Run a backup container as part of the docker-compose file, as part of the service definition
- This container always runs, and starts a cron service
- The cron schedule comes from an env var set to standard cron schedule syntax
- The cron runs an entrypoint script that calls a mounted script in a consistent location (/cron/run.sh)
- The script does some basic integrity checking, like making sure the file contains some data, is over 1MB, and a few other things. In the future, for databases, I want to actually restore the dump to a database in a container and query it for data
- Zip/compress the data and place it on a cloud service like S3
- On completion, the entrypoint script calls healthchecks.io to report that a run was done, so I will get an alert if this task does not run within a set amount of time. The healthchecks.io alerts are created with the Terraform provider, but I would love a way to integrate this with my current setup

I used Duplicati for a long time, but I had a disk failure on my home server recently, and when I went to restore from a Duplicati backup on S3, it was corrupted. In the future, I should really be doing some kind of integrity checking of these files once they are on the cloud.
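The steps above could be sketched as a single entrypoint script. This is a minimal, hypothetical sketch, not the commenter's actual setup: it assumes env vars `CRON_SCHEDULE` and `HEALTHCHECKS_URL`, a backup artifact at `/backups/latest.tar.gz`, and a BusyBox-style `crond`; all of those names are assumptions for illustration.

```shell
#!/bin/sh
# Hypothetical entrypoint for the backup container. Assumed env vars:
#   CRON_SCHEDULE    - standard cron syntax, e.g. "0 3 * * *"
#   HEALTHCHECKS_URL - the ping URL for this check on healthchecks.io
set -eu

# Basic integrity check: the file exists, contains data, and is over 1MB.
check_backup() {
    f="$1"
    [ -s "$f" ] || { echo "ERROR: $f is empty or missing"; return 1; }
    size=$(wc -c < "$f")
    if [ "$size" -le 1048576 ]; then
        echo "ERROR: $f is under 1MB ($size bytes)"
        return 1
    fi
    echo "OK: $f ($size bytes)"
}

# One backup run: call the mounted script, verify the artifact, then
# ping healthchecks.io only on success, so a failed or missed run
# leaves the check unpinged and triggers an alert.
run_job() {
    /cron/run.sh
    check_backup /backups/latest.tar.gz   # hypothetical output path
    curl -fsS --retry 3 "$HEALTHCHECKS_URL" > /dev/null
}

# Container start-up: write the crontab from the env var and run cron
# in the foreground so the container keeps running.
start_cron() {
    echo "$CRON_SCHEDULE /entrypoint.sh run" > /etc/crontabs/root
    crond -f -l 8   # BusyBox crond, foreground
}

case "${1:-}" in
    run)   run_job ;;
    start) start_cron ;;
esac
```

The compose service would then set `command: ["start"]`, mount the real backup logic at `/cron/run.sh`, and pass the two env vars; keeping the ping after the integrity check is what turns healthchecks.io from a "did cron fire" signal into a "did a valid backup land" signal.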