This sums up the whole story that I am about to describe
I’ve been using the Maybe app on my home server and adding a lot of transactions to it, so I became worried about accidentally deleting the database or otherwise breaking something while tinkering with it. To avoid losing that data, I decided to back up the database regularly. And since everything is hosted on my personal server, I wanted the backup stored somewhere else entirely, specifically on a cloud storage service.
After some research, I found rclone, a command-line tool for managing files on cloud storage. I configured it with my Google Drive, which made it easy to sync files between my home server and Google Drive. All I had to do was run the command below.
rclone sync maybe-setup gdrive:/Backup/maybe -P
This command syncs the local maybe-setup directory to the Backup/maybe directory on Google Drive, and the -P flag shows the sync progress. With that, the first problem was solved: I could get files from my home server onto Google Drive.
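For reference, the gdrive part of that command is an rclone remote I had configured beforehand. That setup is a one-time step that looks roughly like this (gdrive is just the name I picked; on a desktop it walks you through Google's OAuth flow in the browser):

# Create a Google Drive remote named "gdrive" (opens Google's OAuth flow)
rclone config create gdrive drive
# Sanity check: list the top-level folders of the new remote
rclone lsd gdrive: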
The next problem was deciding what to back up. For the database itself, I used pg_dump, PostgreSQL's utility for dumping a database's contents to a file. Since the database runs inside a Docker container, I invoked pg_dump through docker exec:
docker exec my_db_container pg_dump -Fc -U maybe_user -d maybe_production > /root/maybe-setup/backup-sql/backup.sql
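Since -Fc produces PostgreSQL's custom archive format rather than plain SQL, a quick sanity check is to list the dump's contents with pg_restore (assuming the PostgreSQL client tools are installed on the host):

# Print the table of contents of the dump without restoring anything
pg_restore --list /root/maybe-setup/backup-sql/backup.sql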
I combined these two tools — rclone for syncing to Google Drive and pg_dump for creating the backup — into a script that:
1. Dumps the PostgreSQL database.
2. Syncs the backup file to a specific directory on Google Drive (Backup/maybe).
I started by creating a file called backup.sh and making it executable:
touch backup.sh
chmod +x backup.sh
The script is as follows:
#!/bin/bash
# Dump the database, then sync the backup directory to Google Drive
docker exec my_db_container pg_dump -Fc -U maybe_user -d maybe_production > /root/maybe-setup/backup-sql/backup.sql
rclone sync maybe-setup gdrive:/Backup/maybe -P
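If you want the script to be a bit more defensive, a variant along these lines keeps dated dumps, prunes old ones, and stops on the first error. The absolute paths and the 14-day retention are my own assumptions, so adjust them to taste:

#!/bin/bash
set -euo pipefail

BACKUP_DIR=/root/maybe-setup/backup-sql
STAMP=$(date +%F)

# Dump the database into a dated file so previous backups are kept
docker exec my_db_container pg_dump -Fc -U maybe_user -d maybe_production > "$BACKUP_DIR/backup-$STAMP.sql"

# Remove dumps older than 14 days so the directory doesn't grow forever
find "$BACKUP_DIR" -name 'backup-*.sql' -mtime +14 -delete

# Sync the whole setup directory to Google Drive
rclone sync /root/maybe-setup gdrive:/Backup/maybe -P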
So now comes the last part. To make sure I get regular backups, I added the script to my crontab so it runs automatically every day.
0 0 * * * /root/backup.sh >> /root/backup.log 2>&1
This cron job runs the script at midnight every day, creating a new backup and syncing it to Google Drive.
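For completeness, the entry goes into root's crontab; adding it and checking that it is registered looks like this:

# Open the crontab in an editor and paste in the line above
crontab -e
# Confirm the job is registered
crontab -l
# Watch the log the job writes to
tail -f /root/backup.log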
In case I ever need to restore the database, I can download the backup file from Google Drive, copy it into the container, and use pg_restore to restore it:
# Copy the backup file to the container
docker cp backup.sql my_db_container:/tmp/backup.sql
# Restore the database
docker exec my_db_container pg_restore -Fc -U maybe_user -d maybe_production /tmp/backup.sql
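One thing worth knowing: pg_restore does not drop existing objects by default, so restoring over a database that already contains data can fail with conflicts. In that case, the --clean --if-exists flags tell it to drop each object before recreating it (use with care, since it overwrites what is already there):

# Drop existing objects first, then recreate them from the dump
docker exec my_db_container pg_restore -Fc --clean --if-exists -U maybe_user -d maybe_production /tmp/backup.sql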
And that’s how I set up my backup and restore process. While there are other tools and methods available, this is how I tackled the problem of securing my database.