
Back Up and Restore a MySQL Database

Use mysqldump and automated cron jobs to back up your MySQL database to object storage.

intermediate
10 min read
LightYear Docs Team
Updated April 24, 2026
mysql, backup, mysqldump, cron, database

Regular database backups are essential for disaster recovery. This guide shows how to back up a MySQL database using mysqldump and automate the process with a cron job that uploads backups to object storage.

Manual Backup with mysqldump

Back Up a Single Database

>_BASH
$mysqldump -u root -p myapp > /tmp/myapp_backup_$(date +%Y%m%d_%H%M%S).sql

Back Up All Databases

>_BASH
$mysqldump -u root -p --all-databases > /tmp/all_databases_$(date +%Y%m%d).sql

Compressed Backup

>_BASH
$mysqldump -u root -p myapp | gzip > /tmp/myapp_$(date +%Y%m%d).sql.gz
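Before relying on a compressed dump, it is worth confirming the archive itself is intact. A minimal standalone sketch, using a small stand-in file rather than a real dump (the `/tmp/sample.sql` path is illustrative):

```shell
#!/bin/bash
set -euo pipefail

# Stand-in for a real dump: a tiny SQL file (illustrative only)
printf 'CREATE TABLE t (id INT);\nINSERT INTO t VALUES (1);\n' > /tmp/sample.sql
gzip -c /tmp/sample.sql > /tmp/sample.sql.gz

# gzip -t verifies archive integrity without extracting it
if gzip -t /tmp/sample.sql.gz; then
  echo "archive OK"
fi
```

Running `gzip -t` against each backup after the dump completes catches truncated or corrupted archives early, before they are your only copy.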

Restore from a Backup

Restore a Single Database

>_BASH
$mysql -u root -p myapp < /tmp/myapp_backup_20260424_100000.sql

Restore a Compressed Backup

>_BASH
$gunzip < /tmp/myapp_20260424.sql.gz | mysql -u root -p myapp
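On Linux, `gunzip < file` and `zcat file` produce the same decompressed stream, so either can feed `mysql`. A standalone round-trip check with a stand-in dump, no server required (paths are illustrative):

```shell
#!/bin/bash
set -euo pipefail

# Stand-in dump (illustrative only)
printf 'INSERT INTO t VALUES (42);\n' > /tmp/dump.sql
gzip -c /tmp/dump.sql > /tmp/dump.sql.gz

# Decompress both ways and confirm the streams match the original
gunzip < /tmp/dump.sql.gz > /tmp/out1.sql
zcat /tmp/dump.sql.gz > /tmp/out2.sql
cmp /tmp/dump.sql /tmp/out1.sql && cmp /tmp/out1.sql /tmp/out2.sql && echo "round-trip OK"
```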

Automated Backup Script

Create a backup script at /usr/local/bin/backup-mysql.sh:

>_BASH
#!/bin/bash
set -euo pipefail

DB_NAME="myapp"
DB_USER="backup_user"
DB_PASS="SecureBackupPassword!"
BACKUP_DIR="/var/backups/mysql"
S3_BUCKET="s3://myapp-backups"
S3_ENDPOINT="https://sgp-01.objectstorage.lightyear.host"
RETENTION_DAYS=30
LOCAL_RETENTION_DAYS=7
DATE=$(date +%Y%m%d_%H%M%S)
FILENAME="${DB_NAME}_${DATE}.sql.gz"

# Create backup directory
mkdir -p "$BACKUP_DIR"

# Dump and compress
mysqldump -u "$DB_USER" -p"$DB_PASS" "$DB_NAME" | gzip > "$BACKUP_DIR/$FILENAME"

# Upload to object storage
aws s3 cp "$BACKUP_DIR/$FILENAME" "$S3_BUCKET/daily/$FILENAME" \
    --endpoint-url "$S3_ENDPOINT"

# Remove local backups older than the local retention period
find "$BACKUP_DIR" -name "*.sql.gz" -mtime +"$LOCAL_RETENTION_DAYS" -delete

# Remove remote backups older than the retention period
aws s3 ls "$S3_BUCKET/daily/" --endpoint-url "$S3_ENDPOINT" | \
    awk '{print $4}' | \
    while read -r file; do
        # Extract the 8-digit date from the filename; skip files without one
        file_date=$(echo "$file" | grep -oP '\d{8}' | head -n1 || true)
        [[ -z "$file_date" ]] && continue
        if [[ $(date -d "$file_date" +%s) -lt $(date -d "-${RETENTION_DAYS} days" +%s) ]]; then
            aws s3 rm "$S3_BUCKET/daily/$file" --endpoint-url "$S3_ENDPOINT"
        fi
    done

echo "Backup completed: $FILENAME"
Make the script executable:

>_BASH
$chmod +x /usr/local/bin/backup-mysql.sh
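The remote-cleanup loop in the script keys off the 8-digit date embedded in each filename. A standalone sketch of that comparison, assuming GNU `date` and `grep` as on a typical Linux server (the filenames here are illustrative):

```shell
#!/bin/bash
set -euo pipefail

RETENTION_DAYS=30
cutoff=$(date -d "-${RETENTION_DAYS} days" +%s)

# Simulated listing; in the real script these names come from `aws s3 ls`
for file in "myapp_20200101_020000.sql.gz" "myapp_$(date +%Y%m%d)_020000.sql.gz"; do
  # Pull out the first 8-digit run (the YYYYMMDD date)
  file_date=$(echo "$file" | grep -oP '\d{8}' | head -n1)
  if [[ $(date -d "$file_date" +%s) -lt $cutoff ]]; then
    echo "would delete: $file"
  else
    echo "would keep:   $file"
  fi
done
```

A file dated 2020-01-01 falls before the 30-day cutoff and would be deleted; a file dated today would be kept.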

Create a Dedicated Backup User

>_BASH
$mysql -u root -p
SQL
CREATE USER 'backup_user'@'localhost' IDENTIFIED BY 'SecureBackupPassword!';
GRANT SELECT, LOCK TABLES, SHOW VIEW, EVENT, TRIGGER ON myapp.* TO 'backup_user'@'localhost';
FLUSH PRIVILEGES;

Schedule with Cron

>_BASH
$crontab -e

Add a daily backup at 2:00 AM:

CRON
0 2 * * * /usr/local/bin/backup-mysql.sh >> /var/log/mysql-backup.log 2>&1
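The five cron fields are minute, hour, day-of-month, month, and day-of-week. A few alternative schedules using the same script and log path:

```cron
# Every day at 02:00 (as above)
0 2 * * * /usr/local/bin/backup-mysql.sh >> /var/log/mysql-backup.log 2>&1

# Every 6 hours
0 */6 * * * /usr/local/bin/backup-mysql.sh >> /var/log/mysql-backup.log 2>&1

# Sundays at 03:30
30 3 * * 0 /usr/local/bin/backup-mysql.sh >> /var/log/mysql-backup.log 2>&1
```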

Verify the Backup

>_BASH
$/usr/local/bin/backup-mysql.sh
OUTPUT
Backup completed: myapp_20260424_020000.sql.gz
>_BASH
$aws s3 ls s3://myapp-backups/daily/ --endpoint-url https://sgp-01.objectstorage.lightyear.host
OUTPUT
2026-04-24 02:00:15    1245678 myapp_20260424_020000.sql.gz

[!IMPORTANT] Test your restore process regularly. A backup that cannot be restored is worthless. Schedule a monthly restore test to a staging server.
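Short of a full monthly restore to staging, a quick sanity check can confirm that an archive decompresses and contains the statements you expect. A standalone sketch with a stand-in dump; in practice, point `BACKUP` at your latest real backup file:

```shell
#!/bin/bash
set -euo pipefail

# Stand-in dump (illustrative); replace with your real backup path
BACKUP="/tmp/restore_check.sql.gz"
printf 'CREATE TABLE users (id INT);\nCREATE TABLE orders (id INT);\n' | gzip -c > "$BACKUP"

# Decompress to stdout and count table definitions
tables=$(gunzip -c "$BACKUP" | grep -c '^CREATE TABLE')
echo "tables in dump: $tables"
```

If the count does not match the number of tables in the live database, investigate before you need the backup in anger.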
