Automate File Backup With a Bash Script and Cron Job
Backing up important data is essential for developers, system administrators, and anyone handling critical files. In this post, we’ll explore how to automate file backups using a lightweight Bash script, scheduled with a cron job. This approach is fast, flexible, and works across virtually any Unix-based system.
1. Why Automate Backups With Bash and Cron?
Manual backups are prone to human error and often forgotten until it’s too late. Automating backups ensures your data is consistently archived without requiring daily attention. Bash provides a powerful but easily understood scripting language, while `cron` is a long-established scheduling tool available on nearly all Linux and macOS systems.
Use cases include:
- Automatically archiving project folders at midnight
- Rotating logs or reports weekly
- Maintaining versioned snapshots of configuration files
2. Writing the Bash Backup Script
Let’s create a script that compresses a source directory into a date-stamped archive and stores it in a backup directory.
```bash
#!/bin/bash

# Define source and target directories
SOURCE_DIR="$HOME/projects"
BACKUP_DIR="$HOME/backups"

# Create backup directory if it doesn't exist
mkdir -p "$BACKUP_DIR"

# Format the date string
DATE=$(date +"%Y-%m-%d_%H-%M-%S")

# Compose the target filename
BACKUP_FILE="$BACKUP_DIR/projects_backup_$DATE.tar.gz"

# Create the compressed archive
tar -czf "$BACKUP_FILE" -C "$SOURCE_DIR" .

# Print confirmation
echo "Backup completed: $BACKUP_FILE"
```
Explanation:
- `mkdir -p` ensures the backup directory exists.
- `tar -czf` compresses the entire contents of the source directory.
- `date`’s timestamp creates unique filenames to avoid overwrites.
Save this as `backup.sh` and make it executable:

```bash
chmod +x backup.sh
```
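Before scheduling it, it’s worth confirming the script’s core logic by hand. The sketch below runs the same steps against temporary directories (so nothing in `$HOME` is touched) and then lists the archive’s contents without extracting it; the `demo.txt` file is purely illustrative.

```shell
#!/bin/bash
# Dry run of the backup logic using throwaway directories.
set -euo pipefail

SOURCE_DIR=$(mktemp -d)
BACKUP_DIR=$(mktemp -d)
echo "hello" > "$SOURCE_DIR/demo.txt"

# Same archive steps as backup.sh
DATE=$(date +"%Y-%m-%d_%H-%M-%S")
BACKUP_FILE="$BACKUP_DIR/projects_backup_$DATE.tar.gz"
tar -czf "$BACKUP_FILE" -C "$SOURCE_DIR" .

# Verify the archive by listing its contents without extracting
tar -tzf "$BACKUP_FILE"
```

Listing the archive with `tar -tzf` is a cheap sanity check that the compression step actually captured your files.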
3. Scheduling With Cron
Once your backup script is ready, it’s time to schedule it with `cron`.

Run `crontab -e` to edit your personal cron jobs. To run the script every day at 2 AM, add this line:

```bash
0 2 * * * /path/to/backup.sh >> $HOME/backup.log 2>&1
```
Explanation:
- `0 2 * * *` means “at 2:00 AM every day”.
- `>> $HOME/backup.log 2>&1` redirects both stdout and stderr to `backup.log` for auditing.
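One common gotcha: cron runs jobs with a minimal environment, so a script that works in your interactive shell can fail under cron because `PATH` is sparse. Setting variables at the top of the crontab avoids this; a sketch of a fuller crontab (the script path is illustrative):

```shell
# Cron provides a minimal environment; set PATH explicitly so the
# script can find tar, date, and find.
PATH=/usr/local/bin:/usr/bin:/bin
# An empty MAILTO suppresses cron's own email; set an address to receive output instead.
MAILTO=""

# Run the backup daily at 2:00 AM, appending stdout and stderr to a log
0 2 * * * /path/to/backup.sh >> $HOME/backup.log 2>&1
```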
Ensure the script has the correct path and sufficient permissions.
4. Adding Rotation and Cleanup
Archives can quickly consume disk space. Introduce rotation by deleting backups older than a certain age. Add this cleanup step to the script:
```bash
# Delete backups older than 7 days
find "$BACKUP_DIR" -name "projects_backup_*.tar.gz" -mtime +7 -exec rm {} \;
```

This command safely deletes backup files older than 7 days using `find` and `rm`.
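Before wiring the cleanup into the script, it’s worth previewing what `find` would delete: swapping `-exec rm {} \;` for `-print` turns the same command into a dry run. The sketch below demonstrates both in a throwaway directory, backdating one file so it qualifies for deletion (`-delete` is used as a compact alternative to `-exec rm`).

```shell
#!/bin/bash
# Demonstrate the rotation logic safely in a temporary directory.
set -euo pipefail

BACKUP_DIR=$(mktemp -d)
touch "$BACKUP_DIR/projects_backup_old.tar.gz"
touch "$BACKUP_DIR/projects_backup_new.tar.gz"

# Backdate one archive so it looks far older than 7 days
touch -t 202001010000 "$BACKUP_DIR/projects_backup_old.tar.gz"

# Dry run: -print only shows what would be removed
find "$BACKUP_DIR" -name "projects_backup_*.tar.gz" -mtime +7 -print

# Actual cleanup: only the backdated file is removed
find "$BACKUP_DIR" -name "projects_backup_*.tar.gz" -mtime +7 -delete
```

Running the `-print` form first, by hand, is a cheap safeguard against a mistyped pattern deleting the wrong files.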
Tips:
- For critical systems, consider syncing backups offsite or to cloud storage using tools like `rclone` or `rsync`.
- Log script operations for easier debugging.
5. Enhancements and Best Practices
Here are some ways you can expand or improve your backup automation:
- Use Configuration Files: avoid hardcoding paths by keeping directory definitions in a `.conf` file.
- Compress Selectively: use include/exclude patterns (the `--exclude` flag in `tar`) to filter what gets backed up.
- Add Email Alerts: use `mail` or `sendmail` to notify you of success or failure.
- Use Systemd Timers: on modern Linux systems, consider `systemd`’s built-in scheduling for better logging and error tracking.
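The configuration-file idea above can be as small as a file of variable assignments that the script sources at startup. A minimal sketch (the file name and variable names are illustrative; here the config is written to a temp file so the example is self-contained):

```shell
#!/bin/bash
set -euo pipefail

# Write an example config; in practice this would live alongside the
# script as something like backup.conf (name is illustrative).
CONF=$(mktemp)
cat > "$CONF" <<'EOF'
SOURCE_DIR="$HOME/projects"
BACKUP_DIR="$HOME/backups"
RETENTION_DAYS=7
EOF

# Source the config so its assignments become shell variables
# shellcheck source=/dev/null
. "$CONF"

echo "Backing up $SOURCE_DIR to $BACKUP_DIR, keeping $RETENTION_DAYS days"
```

Keeping paths and the retention window in one file means you can reuse the same script for several backup jobs by pointing it at different configs.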
Sample enhancement to backup only certain file types:
```bash
tar --exclude="*.tmp" -czf "$BACKUP_FILE" -C "$SOURCE_DIR" .
```
This excludes all temporary files from your archive.
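The email-alert idea can hang off the archive step’s exit status. The sketch below assumes a configured `mail(1)` on the host; since that isn’t available everywhere, a `notify` function stands in so the branching logic is visible and testable, with the real `mail` call shown in a comment.

```shell
#!/bin/bash
# Sketch of a success/failure notification hook (mail address and
# notify wrapper are illustrative).
set -u

notify() {
  # In production, replace the echo with something like:
  #   echo "$1" | mail -s "$1" admin@example.com
  echo "NOTIFY: $1"
}

SOURCE_DIR=$(mktemp -d)
BACKUP_DIR=$(mktemp -d)
touch "$SOURCE_DIR/file.txt"
BACKUP_FILE="$BACKUP_DIR/projects_backup_test.tar.gz"

# Branch on tar's exit status to decide which message to send
if tar --exclude="*.tmp" -czf "$BACKUP_FILE" -C "$SOURCE_DIR" .; then
  STATUS="Backup OK: $BACKUP_FILE"
else
  STATUS="Backup FAILED for $SOURCE_DIR"
fi
notify "$STATUS"
```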
Conclusion
Automating file backups with Bash and cron is a highly effective way to safeguard important data using native tools. It’s simple to implement, easy to maintain, and easily customized. Whether you’re backing up code, configs, or documents, embedding this automation into your workflow will provide peace of mind and save future headaches.