Node.js Script to Back Up Your MongoDB Daily via Cron Job
When managing production-grade applications, database backups are your safety net. Automating those backups ensures you can recover data quickly in case of a crash or human error. This tutorial walks you through creating a Node.js script that uses `mongodump` to back up your MongoDB database daily using cron jobs. You’ll learn how to write the script, store backups locally or remotely, and schedule the task with crontab.
1. Why Automate MongoDB Backups?
Automatic backups are essential for any production database. Whether you host MongoDB on your local server or via services like MongoDB Atlas, having frequent snapshots can protect against data corruption, accidental deletions, or ransomware attacks. Manual backups are too error-prone and time-consuming for regular use.
In this guide, you’ll:
- Write a Node.js script that runs `mongodump` to export your MongoDB database
- Organize backups with timestamped folders
- Use cron for daily execution
- Optionally upload backups to remote storage (e.g., AWS S3)
2. Setting Up the Node.js Project
First, create a new Node.js project and install the necessary dependencies. We'll use Node's built-in `child_process` module to run shell commands; for the optional S3 integration we'll also need the AWS SDK and `archiver` to compress dumps before upload.
mkdir mongodb-backup-script
cd mongodb-backup-script
npm init -y
npm install aws-sdk dotenv archiver
Create a `.env` file to store your database URI and S3 credentials safely:
DB_URI=mongodb://localhost:27017/mydb
BACKUP_DIR=/path/to/backups
AWS_ACCESS_KEY_ID=your-access-key
AWS_SECRET_ACCESS_KEY=your-secret-key
S3_BUCKET_NAME=your-bucket-name
Make sure to add `.env` to your `.gitignore` to avoid leaking credentials.
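For example, a minimal `.gitignore`:
# .gitignore
node_modules/
.env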
3. Writing the Backup Script
Now let’s write the script that dumps your MongoDB database using `mongodump` and saves it to a timestamped folder.
// backup.js
// Resolve .env relative to this file so the script also works under cron,
// where the working directory is usually not the project folder.
const path = require('path');
require('dotenv').config({ path: path.join(__dirname, '.env') });
const { exec } = require('child_process');
const fs = require('fs');

const backupDir = process.env.BACKUP_DIR;
const dbUri = process.env.DB_URI;

// Colons aren't valid in filenames on some filesystems, so replace them;
// the ISO timestamp keeps backup folders sorted chronologically.
const timestamp = new Date().toISOString().replace(/:/g, '-');
const backupPath = path.join(backupDir, `backup-${timestamp}`);

if (!fs.existsSync(backupDir)) {
  fs.mkdirSync(backupDir, { recursive: true });
}

const cmd = `mongodump --uri="${dbUri}" --out="${backupPath}"`;

exec(cmd, (error, stdout, stderr) => {
  if (error) {
    console.error(`Backup failed: ${error.message}`);
    return;
  }
  // mongodump writes its progress output to stderr even on success,
  // so treat it as informational rather than as a failure.
  if (stderr) {
    console.log(`mongodump output: ${stderr}`);
  }
  console.log(`Backup successful! Saved to: ${backupPath}`);
});
This script uses `mongodump` to export your database into a timestamped subdirectory. The ISO timestamp keeps folders sorted chronologically; the colons are replaced because they aren't valid in filenames on some filesystems.
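For example, with the sample `mydb` database containing a `users` collection, the backup directory would look roughly like this (timestamp illustrative):
/path/to/backups/
  backup-2025-01-15T02-00-00.000Z/
    mydb/
      users.bson
      users.metadata.json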
4. Scheduling with Cron
Now that the script works, schedule it using crontab. Type `crontab -e` to edit your crontab file and add the following line to run the backup daily at 2 AM:
0 2 * * * /usr/bin/node /full/path/to/mongodb-backup-script/backup.js >> /var/log/mongodb-backup.log 2>&1
Make sure `/usr/bin/node` points to your Node.js binary (run `which node` to find it). Redirecting stdout and stderr to a log file keeps the cron output and errors visible. Also remember that cron jobs run with a minimal environment and a different working directory, which is why `backup.js` resolves its `.env` path relative to `__dirname`.
Tip: Set up log rotation for `/var/log/mongodb-backup.log` to avoid it growing too large.
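A minimal logrotate sketch, assuming a standard Linux setup where files dropped into `/etc/logrotate.d/` are picked up automatically:
# /etc/logrotate.d/mongodb-backup
/var/log/mongodb-backup.log {
    weekly
    rotate 8
    compress
    missingok
    notifempty
}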
5. Optional: Uploading to S3
If you want offsite storage, uploading your backup to Amazon S3 is a great idea. Here's how to zip the dump with the `archiver` package installed earlier and upload it. (The example uses the v2 `aws-sdk` package; the newer AWS SDK for JavaScript v3 also works but has a different API.)
// Append to backup.js (reuses fs, path, and backupPath from above)
const AWS = require('aws-sdk');
const archiver = require('archiver');

const s3 = new AWS.S3(); // region and credentials come from env vars or ~/.aws
const zipPath = `${backupPath}.zip`;

// Compress the dump directory into a single zip archive.
function zipBackup() {
  return new Promise((resolve, reject) => {
    const output = fs.createWriteStream(zipPath);
    const archive = archiver('zip', { zlib: { level: 9 } });
    output.on('close', () => resolve());
    archive.on('error', err => reject(err));
    archive.pipe(output);
    archive.directory(backupPath, false);
    archive.finalize();
  });
}

async function uploadToS3() {
  await zipBackup();
  const params = {
    Bucket: process.env.S3_BUCKET_NAME,
    Key: path.basename(zipPath),
    // Stream the file instead of reading it all into memory,
    // so large backups don't exhaust RAM.
    Body: fs.createReadStream(zipPath),
  };
  // .promise() makes upload errors propagate to the caller as a rejection.
  const data = await s3.upload(params).promise();
  console.log(`Uploaded to S3: ${data.Location}`);
}
Call `uploadToS3()` after a successful dump in the original `backup.js`, as in the sketch below. Ensure your IAM user has `s3:PutObject` permission on the target bucket.
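A minimal way to wire this in, assuming `uploadToS3()` lives in the same file as the dump logic:
// In backup.js, make the exec callback async and upload on success:
exec(cmd, async (error, stdout, stderr) => {
  if (error) {
    console.error(`Backup failed: ${error.message}`);
    return;
  }
  console.log(`Backup successful! Saved to: ${backupPath}`);
  try {
    await uploadToS3();
  } catch (err) {
    console.error('S3 upload failed:', err);
  }
});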
6. Final Thoughts & Best Practices
- Set up monitoring to alert you if a backup fails (e.g., send an error email).
- Use environment variables securely, preferably with a secrets manager in production.
- Regularly test your restore process (e.g., `mongorestore --uri="mongodb://localhost:27017" /path/to/backups/backup-<timestamp>`). Backups are only useful if you can restore them!
- Consider deleting backups older than a set number of days to save disk space; see the retention sketch below.
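A minimal retention sketch, assuming Node 14.14+ (for `fs.rmSync`) and the `fs`, `path`, and `backupDir` values from `backup.js`; the seven-day window is a hypothetical choice:
// Delete timestamped backup folders older than the retention window.
const RETENTION_DAYS = 7; // hypothetical retention window
const cutoffMs = Date.now() - RETENTION_DAYS * 24 * 60 * 60 * 1000;
for (const entry of fs.readdirSync(backupDir)) {
  if (!entry.startsWith('backup-')) continue; // only touch our backup folders
  const fullPath = path.join(backupDir, entry);
  if (fs.statSync(fullPath).mtimeMs < cutoffMs) {
    fs.rmSync(fullPath, { recursive: true, force: true });
    console.log(`Deleted old backup: ${fullPath}`);
  }
}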
By integrating this script and cron job into your deployment workflow, you create a more robust system ready for production. Automation is not just about convenience — it’s about protecting your data investments.