Yong Sen - Full-Stack Developer

Automated MongoDB Backup to S3 with Shell Script on Ubuntu

Learn how to create an automated MongoDB backup system using shell scripts that dump databases, compress them, and upload to AWS S3 with automatic cleanup.

January 25, 2025
6 min read

This guide walks you through setting up an automated MongoDB backup system on Ubuntu that:

  • Creates MongoDB database dumps
  • Compresses backups for efficient storage
  • Uploads backups to AWS S3 for off-site storage
  • Automatically cleans up old local backups
  • Runs daily via cron

This is essential for production databases where data loss can be catastrophic. Regular automated backups ensure you can recover quickly from any data loss scenario.

Prerequisites

On your EC2 (Ubuntu):

sudo apt update
sudo apt install -y zip unzip awscli

Make sure your AWS CLI is configured with a role or credentials that have S3 upload permissions:

aws configure

Enter your Access Key, Secret Key, region, etc. You can verify the configuration with:

aws s3 ls

If you're using IAM roles instead of access keys, ensure your EC2 instance has the appropriate IAM role attached with S3 upload permissions.
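As a rough sketch, a minimal IAM policy granting just what the script and the verification commands need might look like the following (the bucket name is a placeholder; adjust the prefix to match your S3_BUCKET path):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:PutObject"],
      "Resource": "arn:aws:s3:::<bucket name>/mongodb-backups/*"
    },
    {
      "Effect": "Allow",
      "Action": ["s3:ListBucket"],
      "Resource": "arn:aws:s3:::<bucket name>"
    }
  ]
}
```

s3:PutObject covers the upload itself; s3:ListBucket is what lets aws s3 ls succeed when you verify the configuration.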

Setting Up Permissions

Create the backup directory and set proper ownership. If you're running as the ubuntu user:

sudo mkdir -p /var/backups/mongodb
sudo chown ubuntu:ubuntu /var/backups/mongodb

This ensures your script can write to the backup directory without requiring sudo permissions.

Creating the Backup Script

Create the backup script at /usr/local/bin/mongo_backup.sh:

sudo nano /usr/local/bin/mongo_backup.sh

Copy the following script and customize the configuration section with your MongoDB and AWS details:

#!/bin/bash

# ========== CONFIGURATION ==========
MONGO_HOST="localhost"
MONGO_PORT="27017"
MONGO_DB="portalJksb"
MONGO_USER="<db user>"
MONGO_PASS="<password>"
AUTH_DB="<auth_db>"

BACKUP_DIR="/var/backups/mongodb"
DATE=$(date +'%Y-%m-%d_%H-%M-%S')
BACKUP_NAME="${MONGO_DB}_backup_${DATE}"
S3_BUCKET="s3://<bucket name>/mongodb-backups"
RETENTION_DAYS=7   # How many days to keep local backups
# ===================================

echo "[$(date)] Starting MongoDB backup for $MONGO_DB..."

# Ensure backup directory exists
mkdir -p "$BACKUP_DIR"

# Perform MongoDB dump to folder
mongodump \
  --host "$MONGO_HOST" \
  --port "$MONGO_PORT" \
  --username "$MONGO_USER" \
  --password "$MONGO_PASS" \
  --authenticationDatabase "$AUTH_DB" \
  --db "$MONGO_DB" \
  --out "$BACKUP_DIR/$BACKUP_NAME"

# Check if dump succeeded
if [ $? -ne 0 ]; then
  echo "[$(date)] ERROR: mongodump failed!"
  exit 1
fi

# Compress the backup folder
cd "$BACKUP_DIR" || exit 1
zip -r "${BACKUP_NAME}.zip" "$BACKUP_NAME" >/dev/null 2>&1
rm -rf "$BACKUP_NAME"   # remove uncompressed folder

# Verify zip file exists
if [ ! -f "${BACKUP_NAME}.zip" ]; then
  echo "[$(date)] ERROR: zip compression failed!"
  exit 1
fi

# Upload to S3
aws s3 cp "${BACKUP_NAME}.zip" "$S3_BUCKET/"

if [ $? -eq 0 ]; then
  echo "[$(date)] Backup uploaded to S3 successfully."
else
  echo "[$(date)] ERROR: Failed to upload to S3."
  exit 1
fi

# Delete old local backups
find "$BACKUP_DIR" -type f -name "*.zip" -mtime +$RETENTION_DAYS -delete

echo "[$(date)] Backup completed successfully: ${BACKUP_NAME}.zip"

exit 0

Configuration Variables Explained

  • MONGO_HOST: MongoDB host (usually localhost for local instances)
  • MONGO_PORT: MongoDB port (default is 27017)
  • MONGO_DB: Name of the database to back up
  • MONGO_USER: MongoDB username with read permissions
  • MONGO_PASS: Password for the MongoDB user
  • AUTH_DB: Authentication database (often admin or the same as MONGO_DB)
  • S3_BUCKET: Your S3 bucket path (e.g., s3://my-backup-bucket/mongodb-backups)
  • RETENTION_DAYS: Number of days to keep local backups before deletion
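The retention cleanup at the end of the script can be exercised in isolation before trusting it with real backups. This sketch simulates one stale and one fresh backup in a throwaway directory (the file names are made up for the demo):

```shell
# Demonstration of the retention cleanup in an isolated temp directory.
DEMO_DIR=$(mktemp -d)
RETENTION_DAYS=7

# Simulate one old backup and one recent backup.
touch -d "10 days ago" "$DEMO_DIR/old_backup.zip"
touch "$DEMO_DIR/new_backup.zip"

# Same cleanup command the script runs against $BACKUP_DIR.
find "$DEMO_DIR" -type f -name "*.zip" -mtime +$RETENTION_DAYS -delete

ls "$DEMO_DIR"   # only new_backup.zip remains
```

Note that -mtime +7 matches files strictly older than 7 days, so a backup created exactly a week ago survives one more day.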

Making the Script Executable

After creating the script, make it executable:

sudo chmod +x /usr/local/bin/mongo_backup.sh

Testing the Script

Before setting up the cron job, test the script manually to ensure everything works:

sudo /usr/local/bin/mongo_backup.sh

Check the output and verify:

  1. The MongoDB dump completes successfully
  2. The zip file is created
  3. The file uploads to S3 (check with aws s3 ls s3://<bucket name>/mongodb-backups/)
  4. No errors occur during execution

Setting Up Automated Daily Backups

Add a cron job to run the backup script daily at midnight:

sudo crontab -e

Add the following line:

0 0 * * * /usr/local/bin/mongo_backup.sh >> /var/log/mongo_backup.log 2>&1

This cron expression means:

  • 0 0 - Run at 00:00 (midnight)
  • * * * - Every day of the month, every month, every day of the week
  • >> /var/log/mongo_backup.log 2>&1 - Append all output (including errors) to the log file

Alternative: Running Multiple Times Per Day

If you need more frequent backups, adjust the cron schedule:

# Every 6 hours
0 */6 * * * /usr/local/bin/mongo_backup.sh >> /var/log/mongo_backup.log 2>&1

# Every 12 hours (noon and midnight)
0 0,12 * * * /usr/local/bin/mongo_backup.sh >> /var/log/mongo_backup.log 2>&1

# Every hour (for critical databases)
0 * * * * /usr/local/bin/mongo_backup.sh >> /var/log/mongo_backup.log 2>&1
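With hourly schedules, a slow dump can overlap the next cron invocation. One common guard (not part of the script above) is flock: placed at the top of the script, the sketch below takes a non-blocking exclusive lock and skips the run if another instance still holds it. The lock file path is just a suggestion; any location writable by the cron user works.

```shell
#!/bin/bash
# Hypothetical lock file path for this sketch.
LOCKFILE="/tmp/mongo_backup.lock"

# Open the lock file on descriptor 9 and try a non-blocking
# exclusive lock; the lock is released when the script exits.
exec 9>"$LOCKFILE"
if ! flock -n 9; then
  echo "[$(date)] Previous backup still running; skipping this run."
  exit 0
fi

echo "lock acquired"
# ... backup logic runs here while the lock is held ...
```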

Monitoring Backup Logs

Check the backup logs regularly to ensure backups are running successfully:

# View recent log entries
tail -f /var/log/mongo_backup.log

# View last 50 lines
tail -n 50 /var/log/mongo_backup.log

# Search for errors
grep -i error /var/log/mongo_backup.log
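The cron line above appends to /var/log/mongo_backup.log indefinitely, so the file grows without bound. A small logrotate drop-in (for example /etc/logrotate.d/mongo_backup; the weekly cadence and retention count below are just suggestions) keeps it in check:

```
/var/log/mongo_backup.log {
    weekly
    rotate 8
    compress
    missingok
    notifempty
}
```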

Advanced Configuration

Multiple Database Backups

If you need to back up multiple databases, modify the script to loop through them:

# In the CONFIGURATION section, define the databases as an array
MONGO_DBS=("db1" "db2" "db3")

# Then wrap the dump, zip, and upload steps in a loop
for MONGO_DB in "${MONGO_DBS[@]}"; do
  BACKUP_NAME="${MONGO_DB}_backup_${DATE}"
  # ... mongodump, zip, aws s3 cp steps for this database ...
done

Adding S3 Lifecycle Policies

To manage costs, set up S3 lifecycle policies to transition old backups to cheaper storage or delete them:

  1. Go to your S3 bucket in AWS Console
  2. Navigate to Management → Lifecycle rules
  3. Create a rule to transition backups to Glacier after 30 days
  4. Optionally, delete backups older than 90 days

Backup Verification

Add a verification step to ensure backups are restorable:

# After creating the zip, add a verification step
cd "$BACKUP_DIR"
unzip -t "${BACKUP_NAME}.zip" >/dev/null 2>&1

if [ $? -ne 0 ]; then
  echo "[$(date)] ERROR: Backup file verification failed!"
  exit 1
fi

Email Notifications

Send email notifications on backup failure (requires mail setup):

# After S3 upload check
if [ $? -ne 0 ]; then
  echo "[$(date)] ERROR: Failed to upload to S3." | mail -s "MongoDB Backup Failed" admin@example.com
  exit 1
fi

Security Best Practices

Protecting Credentials

For better security, consider storing MongoDB credentials in environment variables or a secure config file:

# Store in /etc/mongo_backup.env (restrict permissions)
sudo chmod 600 /etc/mongo_backup.env

# In script, source the file
source /etc/mongo_backup.env
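As a self-contained sketch of that pattern (using a temp file in place of /etc/mongo_backup.env so it can run anywhere), the flow looks like this; the variable names match the script's configuration section, and the credential values are placeholders:

```shell
# Demo: a temp file stands in for /etc/mongo_backup.env.
ENV_FILE=$(mktemp)

# Write the credentials file (in production, done once, by hand, as root).
cat > "$ENV_FILE" <<'EOF'
MONGO_USER="backup_user"
MONGO_PASS="example-password"
AUTH_DB="admin"
EOF

# Restrict it so only the owner can read the credentials.
chmod 600 "$ENV_FILE"

# In mongo_backup.sh, replace the hard-coded values with:
source "$ENV_FILE"

echo "Loaded credentials for MongoDB user: $MONGO_USER"
```

With this in place, MONGO_USER, MONGO_PASS, and AUTH_DB can be deleted from the script itself, so the script file no longer needs to be treated as a secret.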

IAM Roles vs Access Keys

If running on EC2, prefer IAM roles over access keys:

  • No long-lived credentials to manage
  • Credentials are rotated automatically by AWS
  • Nothing stored on disk that can leak
  • Skip the aws configure step and attach the IAM role to the instance instead

Troubleshooting

Common Issues

mongodump command not found:

# Install MongoDB tools (requires the official MongoDB apt repository)
sudo apt install -y mongodb-database-tools
# Or download the tools directly from the MongoDB website

Permission denied errors:

# Check script permissions
ls -l /usr/local/bin/mongo_backup.sh

# Check directory ownership
ls -ld /var/backups/mongodb

# Ensure user has write access

S3 upload fails:

# Verify AWS CLI is configured
aws s3 ls

# Test S3 access manually
aws s3 cp test.txt s3://<bucket>/test/

Cron job not running:

# Check cron service
sudo systemctl status cron

# Verify cron job exists
sudo crontab -l

# Check cron logs
sudo grep CRON /var/log/syslog

Restoring from Backup

To restore a backup:

  1. Download from S3:
aws s3 cp s3://<bucket>/mongodb-backups/<backup_name>.zip /tmp/
  2. Extract:
cd /tmp
unzip <backup_name>.zip
  3. Restore:
mongorestore \
  --host localhost \
  --port 27017 \
  --username <user> \
  --password <password> \
  --authenticationDatabase <auth_db> \
  --db <db_name> \
  /tmp/<backup_name>/<db_name>

Conclusion

You now have an automated MongoDB backup system that:

  • ✅ Runs daily backups automatically
  • ✅ Compresses backups for efficient storage
  • ✅ Uploads to S3 for off-site redundancy
  • ✅ Cleans up old local backups automatically
  • ✅ Logs all operations for monitoring

Regular backups are crucial for data protection. Test your backup restoration process regularly to ensure you can recover when needed. Consider setting up monitoring alerts for backup failures to catch issues early.

Tags
MongoDB, Ubuntu, Backup, AWS, S3, DevOps, Shell Script, Cron