
This looks good; I've been meaning to back up my emails to the cloud for a while.

For anyone who is interested, or who has suggestions on how to improve my general backup process (DBs, pfSense router config dumps, Bitwarden passwords, etc.), here is the general process I use:

  - Run a backup container defined alongside the service in its docker-compose file
  - This container runs continuously and starts a cron service
  - The cron schedule comes from an env var containing standard cron syntax
  - The cron runs an entrypoint script that calls a mounted script at a consistent location (/cron/run.sh)
  - The script does some basic integrity checking, like making sure the file contains some data, is over 1MB, and a few other things. This could be greatly improved; in the future, for databases, I want to actually restore the dump into a database in a container and query it for data (sketched at the end of this comment)
  - Compress the data and place it on a cloud service like S3
  - On completion, the entrypoint script pings healthchecks.io to report that a run completed, and I get an alert if this task does not run within a set amount of time. The healthchecks.io alerts are created with the Terraform provider, but I would love a way to integrate this with my current setup. (A rough sketch of the scripts follows this list.)
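
To make the list above concrete, here is a rough sketch of the two scripts, assuming a busybox-style cron (e.g. an Alpine image). The env var names (CRON_SCHEDULE, HC_UUID, S3_BUCKET, DATABASE_URL) and the pg_dump example are illustrative stand-ins, not my exact setup:

    #!/bin/sh
    # entrypoint.sh: render the crontab from an env var, then run cron
    # in the foreground so the container stays up between runs
    echo "$CRON_SCHEDULE /cron/run.sh" > /etc/crontabs/root
    exec crond -f

    #!/bin/sh
    # /cron/run.sh: dump, sanity-check, compress, upload, ping
    set -eu
    DUMP=/tmp/backup.sql
    pg_dump "$DATABASE_URL" > "$DUMP"   # or whatever dump command fits

    # Basic integrity checks: non-empty and over 1MB
    [ -s "$DUMP" ] || { echo "dump is empty" >&2; exit 1; }
    [ "$(stat -c %s "$DUMP")" -gt 1048576 ] || { echo "dump too small" >&2; exit 1; }

    gzip -9 "$DUMP"
    aws s3 cp "$DUMP.gz" "s3://$S3_BUCKET/backups/$(date +%F).sql.gz"

    # Ping last, so a failure anywhere above means a missed ping
    curl -fsS --retry 3 "https://hc-ping.com/$HC_UUID"

The important property is that the healthchecks.io ping only fires after the checks and the upload succeed, so silent failures show up as missing pings.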
In the future, I should really be doing some kind of integrity checking of these files once they are in the cloud.
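
A cheap way to do that, assuming the AWS CLI and that a .sha256 file is uploaded next to each archive (names illustrative):

    # At backup time: record a checksum next to the archive
    sha256sum backup.sql.gz > backup.sql.gz.sha256
    aws s3 cp backup.sql.gz.sha256 "s3://$S3_BUCKET/backups/"

    # Later, from any machine: pull both files back and verify
    aws s3 cp "s3://$S3_BUCKET/backups/backup.sql.gz" .
    aws s3 cp "s3://$S3_BUCKET/backups/backup.sql.gz.sha256" .
    sha256sum -c backup.sql.gz.sha256   # non-zero exit if the bytes changed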

I used Duplicati for a long time, but I had a disk failure on my home server recently, and when I went to restore from a Duplicati backup on S3, it was corrupted. I should have had alerts set up for this, but I have now sworn off any solution where I don't fully understand what's happening.
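
For the restore test mentioned above, a minimal sketch, assuming a Postgres dump and the official postgres image (the users table is a stand-in for any table you know must exist):

    # Spin up a throwaway Postgres, load the dump, run a sanity query
    docker run -d --name restore-test -e POSTGRES_PASSWORD=test postgres:16
    sleep 10   # crude wait for the server to start accepting connections
    gunzip -c backup.sql.gz | docker exec -i restore-test psql -U postgres
    docker exec restore-test psql -U postgres -c "SELECT count(*) FROM users;"
    docker rm -f restore-test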


