A simple backup strategy is to keep an off-site copy of your important directories in free cloud storage.
There are a lot of options out there (Google Drive, Dropbox, OneDrive, etc.). This guide documents Box and MEGA. I chose these two because they give 50GB of free storage, and since I only back up my scripts and docker directories, that's plenty of space.
Here's the script I use to back up my scripts directory. It tars the directory, compresses it, and uploads the archive to MEGA:
#!/bin/bash
set -euo pipefail
# === CONFIG ===
SRC="YOUR_SCRIPTS_DIRECTORY"
EXCLUDES=(
    "venv"
    "spotify_env"
)
DEST_DIR="/backup-scripts" # The name of the destination directory on MEGA. Use whatever you want
DATE=$(date +%F)
ARCHIVE="scripts-${DATE}.tar.gz"
# === BUILD TAR COMMAND WITH EXCLUDES ===
TAR_CMD=(tar -czf "$ARCHIVE")
for EX in "${EXCLUDES[@]}"; do
    TAR_CMD+=(--exclude="$EX")
done
TAR_CMD+=("$SRC")
echo "[INFO] Creating archive: $ARCHIVE"
"${TAR_CMD[@]}"
echo "[INFO] Logging out of MEGA"
mega-logout # You must be logged out of MEGA or you'll get an error when you try to log in
echo "[INFO] Logging in to MEGA"
mega-login "YOUR_EMAIL" "YOUR_PASSWORD" # Use the email and password you used when creating the account
echo "[INFO] Uploading to MEGA: $DEST_DIR"
mega-put "$ARCHIVE" "$DEST_DIR" # Upload the tar file to MEGA
echo "[SUCCESS] Backup complete: $ARCHIVE uploaded to $DEST_DIR"
rm -f "$ARCHIVE"
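The exclude handling above builds the tar invocation piece by piece in a bash array, which keeps quoting safe no matter what the exclude names contain. Here's a minimal standalone sketch of the same technique; the archive name and exclude list are placeholder values:

```shell
#!/bin/bash
# Build a command incrementally in an array, then inspect it before running it.
# All values here are placeholders for illustration.
EXCLUDES=("venv" "spotify_env")
TAR_CMD=(tar -czf "scripts-2026-01-01.tar.gz")
for EX in "${EXCLUDES[@]}"; do
    TAR_CMD+=(--exclude="$EX")
done
TAR_CMD+=("scripts")
# Print the exact command that "${TAR_CMD[@]}" would execute:
echo "${TAR_CMD[@]}"
# -> tar -czf scripts-2026-01-01.tar.gz --exclude=venv --exclude=spotify_env scripts
```

Expanding with `"${TAR_CMD[@]}"` runs the command with each element as a separate argument, so paths with spaces don't get word-split the way they would in a plain string.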
I keep 30 days of backups. Here's a script that prunes the MEGA directory, deleting anything older than that:
#!/bin/bash
set -euo pipefail
TARGET="/backup-scripts" # Your MEGA backup directory (match DEST_DIR from the backup script)
DAYS=30 # Keep 30 days worth of backups
# Get cutoff timestamp (seconds since epoch)
CUTOFF=$(date -d "$DAYS days ago" +%s)
echo "[INFO] Logging out of MEGA"
mega-logout
echo "[INFO] Logging in to MEGA"
mega-login "YOUR_EMAIL" "YOUR_PASSWORD"
echo "[INFO] Listing MEGA folder"
mega-ls -l "$TARGET"
# Skip the first TWO lines of output (folder name + column header)
mega-ls -l "$TARGET" | tail -n +3 | while read -r LINE; do
    [[ -z "$LINE" ]] && continue
    DATE=$(echo "$LINE" | awk '{print $4}')
    TIME=$(echo "$LINE" | awk '{print $5}')
    # Filename is everything from field 6 onward; join without a trailing space
    NAME=$(echo "$LINE" | awk '{for(i=6;i<=NF;i++) printf "%s%s", $i, (i<NF ? " " : "")}')
    # Convert 09May2026 → "09 May 2026" so GNU date can parse it
    DATE_FMT=$(echo "$DATE" | sed 's/\([0-9]\{2\}\)\([A-Za-z]\{3\}\)\([0-9]\{4\}\)/\1 \2 \3/')
    FILE_TS=$(date -d "$DATE_FMT $TIME" +%s)
    if (( FILE_TS < CUTOFF )); then
        echo "[INFO] Deleting old file: $NAME"
        mega-rm "$TARGET/$NAME"
    fi
done
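The trickiest part of that loop is the date handling: `mega-ls -l` prints dates in a compact form like `09May2026`, which GNU date can't parse directly, so the sed expression inserts spaces first. A quick standalone sketch, using made-up sample values rather than real MEGAcmd output:

```shell
#!/bin/bash
# Split MEGAcmd's compact date format into something GNU date understands.
# "09May2026" and "03:15" are sample values for illustration.
DATE="09May2026"
TIME="03:15"
DATE_FMT=$(echo "$DATE" | sed 's/\([0-9]\{2\}\)\([A-Za-z]\{3\}\)\([0-9]\{4\}\)/\1 \2 \3/')
echo "$DATE_FMT"    # -> 09 May 2026
# Now GNU date can turn it into epoch seconds, comparable against CUTOFF:
FILE_TS=$(date -d "$DATE_FMT $TIME" +%s)
echo "$FILE_TS"
```

Note that `date -d` here is GNU date; on macOS / BSD you'd need `date -j -f` with an explicit format string instead.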
Now let's do the same for Box:
You don't install anything from Box itself; instead you configure rclone to use Box as a remote.
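The remote setup is a one-time interactive step. It looks something like this, where `box` is just the remote name used in these scripts and `backups-scripts` is whatever directory name you prefer (rclone opens a browser window to authorize with your Box account):

```shell
# One-time setup: create a Box remote (interactive).
# When prompted, choose "n" for new remote, name it "box",
# and pick the "box" storage type; accept the defaults otherwise.
rclone config

# Verify the remote works and create the backup directory:
rclone lsd box:
rclone mkdir box:backups-scripts
```

After this, `box:backups-scripts` can be used as a destination path in any rclone command.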
Here's a simple script to tar a directory, compress it, and send it to Box:
#!/bin/bash
set -euo pipefail
# ============================
# CONFIGURATION
# ============================
SRC_PATH="YOUR_DIRECTORY_TO_BACK_UP"
DEST_PATH="box:backups-scripts" # Your directory on Box
DATE=$(date +%F)
ARCHIVE="scripts-${DATE}.tar.gz"
EXCLUDES=(
    "venv"
    "spotify_env"
    "__pycache__"
    ".cache"
)
LOG() {
    echo "[INFO] $1"
}
# ============================
# BUILD TAR COMMAND
# ============================
TAR_CMD=(tar -czf "$ARCHIVE")
for EX in "${EXCLUDES[@]}"; do
    TAR_CMD+=(--exclude="$EX")
done
TAR_CMD+=("$SRC_PATH")
# ============================
# CREATE ARCHIVE
# ============================
LOG "Creating archive: $ARCHIVE"
"${TAR_CMD[@]}"
# ============================
# UPLOAD TO BOX
# ============================
LOG "Uploading to Box: $DEST_PATH"
rclone copy "$ARCHIVE" "$DEST_PATH" --checksum --stats=0
# ============================
# VERIFY UPLOAD
# ============================
if rclone ls "$DEST_PATH" | grep -qF "$ARCHIVE"; then # -F: match the name literally, not as a regex
    LOG "Upload verified: $ARCHIVE exists in $DEST_PATH"
else
    echo "[ERROR] Upload failed: $ARCHIVE not found in $DEST_PATH"
    exit 1
fi
# ============================
# CLEANUP
# ============================
rm -f "$ARCHIVE"
LOG "Local archive removed"
# ============================
# EXPLICIT CLOSURE
# ============================
LOG "Backup completed successfully"
exit 0
Here's the cleanup script like the one for MEGA:
#!/bin/bash
set -euo pipefail
TARGET="box:backups-scripts"
DAYS=30 # Retain 30 days of backups
# Cutoff timestamp (seconds since epoch)
CUTOFF=$(date -d "$DAYS days ago" +%s)
echo "[INFO] Listing Box folder: $TARGET"
# rclone lsjson gives deterministic, machine-readable output
rclone lsjson "$TARGET" | jq -c '.[]' | while read -r ITEM; do
    NAME=$(echo "$ITEM" | jq -r '.Name')
    MODTIME=$(echo "$ITEM" | jq -r '.ModTime')
    # Convert the RFC 3339 ModTime to epoch seconds
    FILE_TS=$(date -d "$MODTIME" +%s)
    if (( FILE_TS < CUTOFF )); then
        echo "[INFO] Deleting old file: $NAME"
        rclone deletefile "$TARGET/$NAME" # deletefile removes a single file by path
    fi
done
echo "[SUCCESS] Box prune complete for $TARGET"
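The reason `lsjson` is nicer to script against than parsing `rclone ls` output is the ModTime field: it comes back in RFC 3339 form, which GNU date parses directly, with no sed gymnastics like the MEGA script needed. A standalone sketch, using an arbitrary example timestamp rather than real rclone output:

```shell
#!/bin/bash
# Convert an rclone lsjson ModTime (RFC 3339) to epoch seconds.
# The timestamp below is an arbitrary example value.
MODTIME="2026-01-01T00:00:00Z"
FILE_TS=$(date -d "$MODTIME" +%s)
echo "$FILE_TS"   # -> 1767225600
```

Once both the cutoff and the file time are plain integers, the age comparison is a single arithmetic test.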
I run these nightly using cron. Backing up my docker and scripts directories on my three systems for 30 days consumes less than 10GB, so everything fits comfortably in the free tiers of MEGA and Box.
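The crontab entries look something like the following; the script names, paths, log locations, and run times are placeholders for whatever your setup uses:

```shell
# Edit with `crontab -e`. All paths and times below are placeholders.
# Nightly backups at 02:00, prune runs an hour later at 03:00.
0 2 * * * /home/user/scripts/mega-backup.sh >> /home/user/logs/mega-backup.log 2>&1
0 3 * * * /home/user/scripts/mega-prune.sh  >> /home/user/logs/mega-prune.log  2>&1
0 2 * * * /home/user/scripts/box-backup.sh  >> /home/user/logs/box-backup.log  2>&1
0 3 * * * /home/user/scripts/box-prune.sh   >> /home/user/logs/box-prune.log   2>&1
```

Redirecting both stdout and stderr to a log file matters here, since cron otherwise mails or silently drops the output you'd want when a backup fails.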
If you find my content useful, please consider supporting this page: