Backup website database to Google Drive

I needed a basic solution to back up the website database into Google Drive, just to have some backups in case something bad happened.

I ended up using this tool: https://github.com/glotlabs/gdrive. It is a command line tool for interacting with Google Drive. It was fairly easy to set up, but it needs an app created in the Google Cloud Console. The setup also creates an auth token, and that flow goes through a localhost HTTP redirect, so it was confusing at first to get it working on the server. The documentation recommends doing the authentication locally and copying the credential files to the remote server, but I found a workaround: after granting access, the browser is redirected to a localhost URL, and I simply copied that URL and opened it on the server (in a new session) with lynx.
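
A rough sketch of that workaround (the account add subcommand is gdrive 3's auth flow; the localhost URL below is only a placeholder for whatever ends up in your browser's address bar after you grant access):

# SSH session 1 on the server: start the auth flow, then open the printed
# Google URL in your local browser, sign in and grant access. The browser
# ends up on a localhost URL it cannot load by itself.
gdrive account add

# SSH session 2 on the same server: paste that localhost URL into lynx so
# the callback reaches gdrive waiting in session 1.
lynx 'http://localhost:PORT/...'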

My solution uses a cron job to create a database dump and send it to Google Drive. The cron job runs every day at 1 a.m. It dumps the database into a specific folder on my server and then uses the gdrive command to upload it into a specific Drive folder. It also deletes older backup files, if there are any.

The backup script I ended up with:

File: backup.sh
#!/bin/bash

# Directory where local backups are stored
local_backup_dir="/home/user/backups/"

# Google Drive folder ID
folder_id="folder_id"

# MySQL database information
db_user="your_username"
db_password="your_password"
db_name="your_database_name"

# Get the current date in yyyy-mm-dd format
current_date=$(date +%Y-%m-%d)

# Get the date 30 days ago in yyyy-mm-dd format
thirty_days_ago=$(date -d "30 days ago" +%Y-%m-%d)

# Execute mysqldump and save the compressed dump to a local file
backup_filename="db-$current_date.sql.gz"
backup_filepath="$local_backup_dir$backup_filename"
mysqldump --user="$db_user" --password="$db_password" "$db_name" | gzip > "$backup_filepath"

# Upload the local backup to Google Drive
gdrive files upload --parent "$folder_id" "$backup_filepath"

# List files older than 30 days in the Google Drive folder and process the output
gdrive_output=$(gdrive files list --query "createdTime < '$thirty_days_ago' and '$folder_id' in parents and trashed = false" --field-separator , --skip-header)

echo "$gdrive_output" | while IFS=',' read -r file_id file_name file_type file_size created_date; do
    # Ensure the line contains data in all expected columns
    if [ -n "$file_id" ] && [ -n "$file_name" ] && [ -n "$file_type" ] && [ -n "$file_size" ] && [ -n "$created_date" ]; then
        # Check that the filename starts with "db-"
        if [[ "$file_name" == "db-"* ]]; then
            # Delete the file from Google Drive
            gdrive files delete "$file_id"

            # Delete the corresponding local file
            local_file_path="$local_backup_dir$file_name"
            if [ -e "$local_file_path" ]; then
                rm "$local_file_path"
            fi
        fi
    fi
done
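
Before trusting the cron job, it is worth running the script once by hand and checking that the dump actually shows up in Drive. Something along these lines works (the list query just reuses the folder ID placeholder from the script and is not part of the backup itself):

# Run the script manually and print each command as it executes
bash -x /home/user/backup.sh

# Confirm that today's db-YYYY-MM-DD.sql.gz file is in the Drive folder
gdrive files list --query "'folder_id' in parents and trashed = false"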

Small things to keep in mind:

  • If the database password contains a dollar sign ($), you need to escape it with a backslash (\); see the snippet after this list.
  • The --parent option takes a folder ID, which you can find in the URL when you open the folder in Google Drive. If you want to save into the root directory instead, you can skip it.
  • It does not back up site files.
  • If you need to see what the script runs during execution, add set -x under the hashbang line (shown in the snippet below), which will print all the commands the script executes.
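
A tiny illustration of the first and last points; the password value here is made up:

#!/bin/bash
set -x    # print every command as the script runs

# A password like pa$$word needs its dollar signs escaped inside double quotes
db_password="pa\$\$word"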

To run it, you can add a line to the crontab with crontab -e

0 1 * * * /bin/bash /home/user/backup.sh
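
Optionally, you can redirect the job's output to a log file so there is something to look at if a run fails; the log path here is just an example:

0 1 * * * /bin/bash /home/user/backup.sh >> /home/user/backup.log 2>&1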

The solution is not fully tested yet, but it is currently running.

Because this tool can do anything with your Google Drive, you need to be careful that the cleanup does not delete files you still need. The script tries to guard against that by only touching files whose names start with db-.
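
For the first few runs, one cautious option (not something the script above does) is to temporarily swap the destructive commands inside the loop for an echo, so you can see what would be deleted before letting it loose:

# Temporary dry run inside the loop: print instead of deleting
echo "Would delete from Drive: $file_id ($file_name)"
# gdrive files delete "$file_id"
# rm "$local_file_path"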

Let's see how it works out.
