Bash + Cron = Hands-Free Database Backups
Overview
Regular database backups are essential for data safety, but performing them manually is tedious and error-prone. Thankfully, with a simple Bash script and a cron job, you can create a robust, hands-free backup system that automates the entire process—from dumping your database to compressing and rotating old backups. In this guide, we’ll walk through every step, from script creation to scheduling, with complete working examples.
1. Setting Up the Environment
Before writing any code, ensure that the following utilities are installed on your Linux or macOS environment:
- mysqldump or pg_dump (depending on your database)
- gzip for compression
- cron for job scheduling
To verify installation, run:
which mysqldump && which gzip && which crontab
If any command prints nothing, install the missing tool via your package manager, e.g., sudo apt install mysql-client gzip cron. (We check crontab rather than cron because the daemon's binary usually lives in /usr/sbin, outside a normal user's PATH.)
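If you prefer a single reusable check, the POSIX-standard command -v works in any shell and reports each tool individually. A minimal sketch (again using crontab as the stand-in for the cron daemon):

```shell
# Loop over the required tools; command -v is the POSIX way to test
# whether a program is available on the PATH.
for tool in mysqldump gzip crontab; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "$tool: found"
  else
    echo "$tool: MISSING"
  fi
done
```

This prints one line per tool, so you can see everything that is missing in a single pass instead of stopping at the first failure.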
2. Creating the Backup Script
Now let’s create a simple Bash script to dump and compress your database. We’ll store the backups in /var/backups/db.
#!/bin/bash
set -o pipefail   # without this, $? reflects gzip's exit status, not mysqldump's

# Configuration
db_user="root"
db_password="mysecret"
db_name="production_db"
backup_dir="/var/backups/db"
timestamp=$(date +"%Y%m%d_%H%M%S")
backup_file="$backup_dir/${db_name}_$timestamp.sql.gz"

# Ensure backup directory exists
mkdir -p "$backup_dir"

# Dump and compress (quote variables so spaces or shell metacharacters can't break the command)
mysqldump -u "$db_user" -p"$db_password" "$db_name" | gzip > "$backup_file"

# Verify completion
if [ $? -eq 0 ]; then
    echo "[$(date)] Backup successful: $backup_file"
else
    echo "[$(date)] Backup failed!"
fi
Place this file at /usr/local/bin/db_backup.sh and make it executable with chmod +x /usr/local/bin/db_backup.sh.
Tip: Avoid storing passwords directly in scripts by using MySQL’s ~/.my.cnf file or environment variables for credential management.
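For MySQL, the usual pattern is a ~/.my.cnf option file, which the client tools read automatically. A minimal sketch (user and password are placeholders from the script above; adjust for your setup):

```ini
# ~/.my.cnf — keep it private: chmod 600 ~/.my.cnf
[client]
user=root
password=mysecret
```

With this in place, the dump line shrinks to mysqldump "$db_name" | gzip > "$backup_file", and the password no longer appears in the script or in ps output while the dump runs.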
3. Adding Rotation Logic
Backups can accumulate quickly, consuming disk space. Let’s modify the script to delete backup files older than seven days. Add the following snippet at the end of your script:
# Delete backups older than 7 days
find "$backup_dir" -type f -name "${db_name}_*.sql.gz" -mtime +7 -exec rm {} \;
echo "[$(date)] Old backups cleaned up."
This retains roughly the last seven days of backups: -mtime +7 matches files last modified more than seven full 24-hour periods ago. Adjust -mtime as needed for your policy.
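Before trusting the deletion with real backups, it's worth rehearsing the find expression with -print, which lists matches without removing anything. A throwaway sketch (assumes GNU touch for the -d flag; on macOS use touch -t instead):

```shell
# Rehearse the retention rule in a scratch directory:
# one file stamped 10 days old, one fresh.
demo_dir=$(mktemp -d)
touch -d "10 days ago" "$demo_dir/production_db_old.sql.gz"
touch "$demo_dir/production_db_new.sql.gz"

# Same find expression with -print instead of -exec rm: a dry run.
# Only the old file should be listed.
find "$demo_dir" -type f -name "production_db_*.sql.gz" -mtime +7 -print

rm -r "$demo_dir"
```

Once the dry run lists exactly what you expect, swap -print back for -exec rm {} \; in the real script.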
4. Scheduling Backups with Cron
Now that our script is functional, we’ll automate it using cron. Edit your crontab file using:
crontab -e
Add the following entry to run the backup daily at 2:00 AM:
0 2 * * * /usr/local/bin/db_backup.sh >> /var/log/db_backup.log 2>&1
This appends both stdout and stderr to /var/log/db_backup.log for monitoring. The five fields are minute, hour, day of month, month, and day of week, so 0 2 * * * fires at 02:00 every day; cron runs the job whether or not you’re logged in, as long as the machine is up at that time.
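If daily at 2:00 AM doesn’t fit your needs, the same five-field syntax covers other schedules. A couple of examples using the same script and log paths as above:

```
# Weekly backup, Sundays at 03:30 (day-of-week 0 = Sunday)
30 3 * * 0 /usr/local/bin/db_backup.sh >> /var/log/db_backup.log 2>&1

# Every six hours, on the hour
0 */6 * * * /usr/local/bin/db_backup.sh >> /var/log/db_backup.log 2>&1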
5. Testing and Validation
After setting up cron, always validate that backups are working correctly. Run the script manually first:
/usr/local/bin/db_backup.sh
Then verify with:
ls -lh /var/backups/db
Finally, check your log file with:
tail -f /var/log/db_backup.log
A backup you have never restored is only a hope, not a backup. Restore one into a scratch database to confirm the dump is actually usable, by decompressing and importing it:
gunzip -c production_db_20240101_020000.sql.gz | mysql -u root -p new_database
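Even without a spare database, you can at least confirm the archive itself isn’t truncated or corrupted: gzip -t decompresses in memory and exits non-zero on any error. A quick check (filename taken from the restore example above):

```shell
# gzip -t reads the entire archive and verifies its checksum
# without writing anything to disk.
if gzip -t production_db_20240101_020000.sql.gz 2>/dev/null; then
  echo "archive OK"
else
  echo "archive corrupt or missing"
fi
```

This catches partial writes from a disk that filled up mid-backup, a failure mode that an ls -lh listing alone will not reveal.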
Bonus: Optimization and Security Tips
- Use gzip -9 for maximum compression, though it may slow backups slightly.
- Rotate the log file with logrotate, and forward it to a monitoring system such as Prometheus so failed backups don’t go unnoticed.
- Encrypt the resulting files with gpg if the dumps contain sensitive information.
- For larger databases, consider incremental backups, or mysqldump --single-transaction to avoid locking InnoDB tables during the dump.
Conclusion
With this approach, you’ve built a fully automated database backup system using simple yet powerful Unix tools—Bash, gzip, and cron. This setup ensures your valuable data is safely backed up every day, freeing you to focus on development rather than maintenance chores.