## Why I worked on this
Home Assistant’s built-in backup tool is fine for a single destination, but I wanted two things it still won’t do out of the box:
- Push the same snapshot to two different cloud buckets (Backblaze B2 and Cloudflare R2) so I’m not married to one provider.
- Rotate the remote copies automatically—keep seven daily and four weekly versions—without me logging in to either UI and clicking “delete”.
I also didn’t want to open any inbound ports or run a VPN tunnel just for backups; the box has to phone home, nothing phones in.
## My setup and context
- Home Assistant OS 12.2 running on an Intel NUC-like fanless box (Proxmox VM, 4 vCPU, 8 GB RAM, 128 GB virtual disk).
- Add-on: jcwillox/hassio-rclone-backup (v3.3.5) installed from the community repo. This gives me an rclone binary inside the HA OS container and a small wrapper that fires after every native backup.
- Two S3-compatible buckets already created:
  - `b2://ha-enc` on Backblaze B2 (us-west-002)
  - `r2://ha-enc` on Cloudflare R2 (auto-region)
- Encryption password stored in HA’s secrets.yaml; rclone uses that to encrypt file names and contents before upload.
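Getting the passphrase from `secrets.yaml` into `rclone.conf` has one gotcha: rclone stores crypt passwords in obscured form, not plain text. A sketch of the conversion, using a throwaway secrets file and an echoed command so it can be dry-run anywhere (the `backup_pass` key name is an assumption):

```shell
# Throwaway stand-in for /config/secrets.yaml, just for the sketch.
cat > /tmp/secrets.yaml <<'EOF'
backup_pass: correct-horse-battery
EOF

# Crypt passwords in rclone.conf must be obscured, not stored plain.
PASS=$(grep '^backup_pass:' /tmp/secrets.yaml | cut -d' ' -f2)
echo rclone obscure "$PASS"   # echoed sketch; drop `echo` to run for real
```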
## What worked (and why)
1. One raw remote per bucket, each wrapped in a crypt overlay sharing the same passphrase
I defined four entries in `/config/rclone.conf` (a raw S3 remote per bucket, plus a crypt overlay on top of each):
```ini
[b2-raw]
type = s3
provider = Backblaze
account = 003...
key = K003...
endpoint = s3.us-west-002.backblazeb2.com

[r2-raw]
type = s3
provider = Cloudflare
account = 6c5...
key = 3b8...
endpoint = https://489...r2.cloudflarestorage.com

[b2-crypt]
type = crypt
remote = b2-raw:ha-enc
password = **REDACTED**
filename_encryption = standard
directory_name_encryption = true

[r2-crypt]
type = crypt
remote = r2-raw:ha-enc
password = **REDACTED**  # same 128-bit key derived from the same passphrase
```
Using the same password means either bucket can restore the same file; I tested a restore from both sides to be sure.
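The restore test was nothing fancy: copy a snapshot back out of each crypt remote and confirm the tarball lists cleanly. A sketch with the commands echoed so it is safe to run without credentials (the snapshot name is a placeholder):

```shell
# Restore smoke test: pull one snapshot back from each crypt remote and
# list the tarball. `echo` makes this a dry run; remove it to execute.
SNAP="a1b2c3d4.tar"   # placeholder snapshot name
for remote in b2-crypt r2-crypt; do
    echo rclone copy "${remote}:daily/${SNAP}" /tmp/restore-test/
    echo tar -tf "/tmp/restore-test/${SNAP}"
done
```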
2. Let the native backup job run first, then hook the add-on
In the add-on's config I left the trigger on `backup_success` and pointed `rclone_copy` to a small shell script I dropped into `/config/rclone-post.sh`:
```sh
#!/command/with-contenv bashio
# /config/rclone-post.sh
set -e

LATEST_TAR=$(ls -1t /backup/*.tar | head -n1)
NAME=$(basename "${LATEST_TAR}")

# copy (throttled during the day, faster overnight; the timetable must be quoted)
rclone copy "${LATEST_TAR}" b2-crypt:daily/ --bwlimit "08:00,2M 23:00,8M"
rclone copy "${LATEST_TAR}" r2-crypt:daily/ --bwlimit "08:00,2M 23:00,8M"

# rotate daily (drop anything older than 7 days)
rclone delete b2-crypt:daily --min-age 7d --include "*.tar"
rclone delete r2-crypt:daily --min-age 7d --include "*.tar"

# on Sundays, promote today's backup from daily to weekly
if [ "$(date +%u)" -eq 7 ]; then
    rclone moveto "b2-crypt:daily/${NAME}" "b2-crypt:weekly/${NAME}"
    rclone moveto "r2-crypt:daily/${NAME}" "r2-crypt:weekly/${NAME}"
fi

# keep four weekly copies (drop anything older than 28 days)
rclone delete b2-crypt:weekly --min-age 28d
rclone delete r2-crypt:weekly --min-age 28d
```
The add-on runs the script as root inside the HA container, so no permission issues. Upload speed is throttled to stay friendly to the rest of the household.
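One nice side effect of sharing a single passphrase: crypt's filename encryption is deterministic, so the two buckets end up with identical encrypted object names. That makes a cross-bucket comparison possible on the raw remotes without decrypting anything; a sketch, size-only so it does not depend on which hash type each provider exposes:

```shell
# Compare the raw (encrypted) trees directly. The same crypt passphrase
# means the encrypted filenames match across the two buckets; ciphertext
# bytes differ per upload (random nonce), but sizes agree.
CHECK_CMD="rclone check b2-raw:ha-enc r2-raw:ha-enc --size-only"
echo "$CHECK_CMD"   # echoed sketch; `rclone check` exits non-zero on mismatch
```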
3. A cheap cron double-check
I also added a normal cron job on the Proxmox host (not inside HA) that runs `rclone sync b2-crypt: r2-crypt: --checksum --dry-run` every night. If the exit code is non-zero I get a Pushover notification. So far it has never complained, but the extra five-line script lets me sleep better.
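One caveat worth knowing: a `--dry-run` sync exits non-zero on errors (unreachable bucket, auth failure) but not on mere content drift; `rclone check` is the stricter comparison if drift should alert too. The wrapper is roughly this shape, with `rclone` and `curl` stubbed out so the sketch runs anywhere; the Pushover token and user key are placeholders, and the alert goes to Pushover's documented messages endpoint:

```shell
# Stubs so the sketch is runnable without rclone or network access; the
# real cron job calls the installed binaries instead.
rclone() { echo "would run: rclone $*"; return 0; }
curl()   { echo "would run: curl $*";   return 0; }

# Nightly comparison; alert via Pushover when it fails.
if ! rclone check b2-crypt: r2-crypt: --size-only; then
    curl -s -F "token=APP_TOKEN" -F "user=USER_KEY" \
         -F "message=HA backup buckets differ" \
         https://api.pushover.net/1/messages.json
fi
```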
## What didn’t work
- Duplicating the built-in schedule. HA still only allows one automatic backup configuration. I tried firing `hassio.backup_full` from an automation with a different location key, but the action ignores the location parameter on OS installs. The add-on route was the only reliable way.
- Using the same bucket credentials for both copies. Backblaze and R2 use different signature versions; a single remote entry with conditional endpoints failed halfway through. Separate remotes solved it.
- Trying to rotate by `--max-age` instead of `--min-age`. I mixed them up the first night and accidentally deleted the newest file. Lesson: always dry-run.
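For reference, the two flags are mirror images: `--min-age 7d` matches files *older* than seven days (the ones you want to prune), while `--max-age 7d` matches files *younger* than seven days (exactly the copies you meant to keep). The safe pattern, echoed here as a dry run:

```shell
# --min-age selects files OLDER than the threshold: correct for pruning.
SAFE="rclone delete b2-crypt:daily --min-age 7d --include *.tar --dry-run"
echo "$SAFE"
# --max-age selects files YOUNGER than the threshold: it deletes the
# backups you just made. Never prune with it.
```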
## Key takeaways
- Let Home Assistant create the tar; use rclone only for off-site push and rotation—keeps the add-on simple and avoids re-implementing HA’s exclude logic.
- Encrypt at rest with rclone’s crypt remote; both providers see only random blobs, so I don’t have to trust either TOS.
- Bandwidth limiting inside rclone is easier than trying to shape traffic at the router; one flag and the family doesn’t notice.
- Keep the rotation logic outside HA automations; a ten-line bash script is easier to debug than a YAML automation with five choose branches.
Total cost last month: 1.8 GB × 30 snapshots ≈ 55 GB, $0.27 on B2 and $0.23 on R2. The cron job plus Pushover is the only part I still need to move inside HA so everything lives in one repo—something for the next rainy weekend.