My goal is to fully ditch Google Photos for Immich. I have about 3TB of photos and videos. Looking for a super simple way of backing up the library to cloud storage in case of a drive failure without spending a ton.
Ideally, this will require nothing on my part besides copying files into a given folder. And ideally the storage will be encrypted and have basic privacy assurances.
Also if it matters my home server is running Debian. But I’d prefer something that runs in docker so I can more easily check on it remotely.
A Hetzner Storage Box costs €3.81 per month for 1TB, and you can access it via SSH or WebDAV.
Hetzner will increase their prices by April, but it’s still a good value:
https://www.hetzner.com/pressroom/statement-price-adjustment/
Another +1 for Hetzner.
I did an initial backup of my music (so I wasn’t concerned about encryption) with plain old `rsync` to get a feel for the system first, do a restore, etc., to feel comfortable with it all, and to see if there were any hidden costs. Then I wiped all that and moved over to `rclone` to encrypt my data into different chunks (photos, music, work, etc.). It all worked well, and they even skipped charging me one month because I hadn’t exceeded their minimum charge (it rolls up to the following month).
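For anyone wanting to replicate this, here’s a minimal sketch of the rclone setup. The remote names (`storagebox`, `cryptphotos`), host, paths, and passphrase are all placeholders — adjust to your own box. Storage Boxes speak SSH/SFTP on port 23.

```shell
# 1. An SFTP remote pointing at the Storage Box (assumes SSH key auth
#    is already set up for the box's user account):
rclone config create storagebox sftp \
    host uXXXXXX.your-storagebox.de user uXXXXXX port 23

# 2. A crypt remote layered on top, so only ciphertext ever leaves the
#    house. rclone obscures password fields automatically in `config create`:
rclone config create cryptphotos crypt \
    remote storagebox:photos password 'change-me-to-a-real-passphrase'

# 3. Sync the Immich library; re-runs only transfer changed files:
rclone sync /srv/immich/library cryptphotos:library --progress
```

`rclone check` against the crypt remote is a cheap way to verify the upload before trusting it as a backup.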
I’ve had proactive emails from them notifying me of maintenance which might have reduced my ability to access their system, but as it was outside the time of my backups, there was no issue.
+1 for Hetzner Storage Boxes, even though with the general increase in hardware pricing I don’t know how long they’ll be able to keep them so cheap.
To back up the data automatically, I use BackRest (it’s a nice web interface that uses restic under the hood).
I’ve been using Backblaze buckets to hold data I serve to friends with GameVault, and a few other things. Sitting at just shy of 1TB if memory serves? I’m paying about 4 bucks a month.
If you serve it through cloudflare, you aren’t charged for bandwidth. Which works out nicely because I use a cloudflare worker to redirect all download requests in GameVault. Super speedy downloads for my users, zero bandwidth coming off my home network.
I second backblaze. Switched to them about 5 years ago when AWS and GCP were charging me $20-30 for storing my backups. The same backups cost $5-7 on backblaze.
Borg Backup + MS Office Family — it comes with 6×1TB (split across 6 accounts).
(split in 6 accounts)
This makes it not really suitable for this.
Backblaze desktop is your best option.
Buy two 4TB external drives. Copy your photos onto both. Leave one at your mom’s house in a closet. Leave the other in a locker at work or a safety deposit box.
No monthly fees, no techbro cloud capitalists.
Time is money, and the time it would take to keep those backups up to date is not worth it over cloud backups.
This is the way
Media doesn’t dedupe.
And OP likely doesn’t have every picture 4 times.
Borgbase has good options for Borg and restic backups.
I highly recommend using one of these two for proper backups. Borg with borgmatic scripts is fantastic.
You could check out FolderFort. Sometimes they have lifetime pricing. I paid something like $170 for 3TB lifetime.
They have SFTP access in beta.
Hetzner Storage Box plus Borg Backup (which also saves on storage since it dedupes).
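A minimal sketch of that combo, assuming a Storage Box account `uXXXXXX` and a repo named `immich-backup` (both placeholders). Storage Boxes accept SSH on port 23:

```shell
# Point borg at the Storage Box over SSH.
export BORG_REPO="ssh://uXXXXXX@uXXXXXX.your-storagebox.de:23/./immich-backup"
export BORG_PASSPHRASE="change-me"   # or use BORG_PASSCOMMAND instead

# One-time: create an encrypted, deduplicating repository.
borg init --encryption=repokey-blake2

# Nightly: archive the library. Unchanged chunks are deduplicated,
# so repeated runs only upload what changed.
borg create --stats ::'immich-{now:%Y-%m-%d}' /srv/immich/library

# Thin out old archives to keep the repo from growing forever.
borg prune --keep-daily 7 --keep-weekly 4 --keep-monthly 6
```

With `repokey` mode the key lives inside the repo, protected by the passphrase; export a copy with `borg key export` and keep it somewhere safe.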
I also use Hetzner, but with restic (through the web interface BackRest) instead of borg.
Is restic fine? Or should I migrate to Borg?
Pcloud has lifetime deals with encryption.
I’ve had it for a very long time and paid once long ago. It works on Linux as well.
Use restic to back up to a local drive, then sync that with something like rsync to OVHcloud Cloud Archive (not Cold Archive, though that can work too). You can also skip the local copy, but it’s better to have one, and if you sync weekly it gives you opportunities to cull photos you took too many of before it slaps them all up. There are plenty of GUI-based restic interfaces now if you want a quick check or browse. Use healthchecks.io to monitor the cron jobs and alert you if they aren’t working.
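A sketch of that pipeline as a single cron-able script. The repo path, password file, remote target, and healthchecks UUID are all placeholders:

```shell
#!/bin/sh
set -eu

export RESTIC_REPOSITORY=/mnt/backup/restic-immich
export RESTIC_PASSWORD_FILE=/root/.restic-pass

# 1. Back up the library into the local restic repo (encrypted at rest).
restic backup /srv/immich/library

# 2. Apply a retention policy and reclaim space.
restic forget --keep-daily 7 --keep-weekly 4 --prune

# 3. Push the already-encrypted repo offsite.
rsync -a --delete "$RESTIC_REPOSITORY/" user@archive.example.net:immich-repo/

# 4. Ping healthchecks.io; it alerts you if this ping stops arriving.
curl -fsS -m 10 "https://hc-ping.com/YOUR-CHECK-UUID"
```

Because restic encrypts the repository itself, the rsync leg needs no extra encryption layer.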
And ideally the storage will be encrypted and have basic privacy assurances.
Do it locally with cryptomator or similar so the cloud will only see encrypted data.
Not the same, but for my Immich backup I have a raspberry pi and an HDD with family (remote).
Backup is rsync, and a simple script to make ZFS snapshots (retaining X daily, Y weekly). Connected via “raw” WireGuard.
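For anyone copying this setup, here’s a hedged sketch of such a script. The dataset name, paths, peer address, and retention count are assumptions; `head -n -N` needs GNU coreutils:

```shell
#!/bin/sh
set -eu

DATASET="backup/immich"   # hypothetical dataset backing /backup/immich
KEEP_DAILY=7

# 1. Pull the library through the WireGuard tunnel (plain private IP).
rsync -a --delete user@10.0.0.2:/srv/immich/library/ /backup/immich/

# 2. Snapshot today's state.
zfs snapshot "${DATASET}@daily-$(date +%F)"

# 3. Prune: list daily snapshots oldest-first, drop all but the
#    newest $KEEP_DAILY.
zfs list -H -t snapshot -o name -s creation -r "$DATASET" \
    | grep '@daily-' \
    | head -n "-$KEEP_DAILY" \
    | xargs -r -n1 zfs destroy
```

Weekly retention works the same way with a second snapshot prefix and its own prune pass.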
Setup works well, although it’s never been needed.
raspberry pi and an HDD with family (remote)
Is this the way to go for off-site backups w/ family? In terms of low power draw, uptime, etc.
That absolutely works, but when I built my offsite backup to Hetzner I also thought about setting up my own hardware, and came to the conclusion that for me it doesn’t really make a ton of sense. A new RPi + 4TB SSD/M.2 drive with accessories adds up to something around 400€ (if that’s even enough today), or a few years’ worth of cloud backups. With your own hardware there’s always maintenance, and hardware failures are always a possibility, so for me it makes more sense to just rely on the big players for offsite backups. Your case might be different for various reasons, but sometimes renting capacity just makes more sense in the big picture.
Why would you use SSDs for backup? I think a HDD should be fine for that.
Especially because SSDs start losing data if they’re powered off for some time.
Sound and power consumption. At least in my case those are important if I’m going to store data at my mother’s house. Power consumption might not matter that much, but HDD noise definitely does. And even with spinning rust the hardware cost would be somewhere around 250€, compared to ~20€/month of cloud storage.
YMMV, in my scenario it’s just easier to use a cloud provider.
I’ve been pleased with it. Family is very relaxed about projects like this, and yeah, it’s low power draw. I don’t think I have anything special set up, but the right thing to do for power would be to spin down the drive when not in use, as power draw is dominated by the spinning rust.
Uptime is great. Only hiccups are that it can choke when compiling the ZFS kernel modules, triggered on kernel updates. It’s an rpi 3/1GB RAM (I keep failing at forcing dkms to use only 1 thread, which would probably fix these hiccups 🤷).
That said, it is managed by me, so sometimes errors go unnoticed. I had recent issues where I missed a week of rsync because I switched from pihole to technitium on my home server and forgot to point the remote rpi there. This would all have been fixed with proper cron email setup…I’m clearly not a professional :)
If you’re already running ZFS, sanoid would be an option.
Okay, how do you get sanoid & syncoid to run, because I’ve tried, and I’m just too dummy. When it makes a backup, is it literally making a zfs data record/pool/whatever on the other machine? Or is it more like a file? I have a Proxmox running cockpit (SMB & NFS) and the machine is connected to a USB drive bay that has ZFS. My immich is saving pictures to my ZFS drive bay via SMB.
I’ve tried to do `syncoid pool_name/data/immich root@cockpit.service.IP.addr:mnt/samba/backups` but I get hit with:
Long ass error message:

```
WARNING: ZFS resume feature not available on target machine - sync will continue without resume support.
INFO: Sending oldest full snapshot Orico2tera4/data/immich@syncoid_nova_2026-01-27:13:38:44-GMT-05:00 to new target filesystem root@192.168.0.246:/mnt/samba/backups (~ 42 KB):
/dev/zfs and /proc/self/mounts are required.
Try running 'udevadm trigger' and 'mount -t proc proc /proc' as root.
44.2KiB 0:00:00 [ 694KiB/s] [===========================================] 103%
CRITICAL ERROR: zfs send 'Orico2tera4/data/immich'@'syncoid_nova_2026-01-27:13:38:44-GMT-05:00' | pv -p -t -e -r -b -s 43632 | lzop | mbuffer -q -s 128k -m 16M | ssh -S /tmp/syncoid-root1921680246-1772385641-845218-1784 root@192.168.0.246 ' mbuffer -q -s 128k -m 16M | lzop -dfc | zfs receive -F '"'"'/mnt/samba/backups'"'"' 2>&1' failed: 256
```

I’ve tried reading the GitHub docs and some forums but I’m dummy. I just want to have backups that I can encrypt and keep in a cloud for cheap somewhere. Does it literally have to be two different machines (god I’m dumb)? Can I just auto-run ZFS snapshots, encrypt them, and save those to Drive/OneDrive/whoever?
You can do a syncoid sync to another zpool or dataset on the same machine or a remote host; they behave the same. It replicates that dataset on the other machine, then sends the snapshots after that point over via `zfs send`. You can instruct sanoid to prune those snapshots after the send and start new ones for the next send, or just accumulate them so you have points in time to revert to.

IIRC you can send a ZFS snapshot to a file, but I can’t recall how to do that, so AFAIK you can’t just send it to a file-based service like OneDrive. You can use a service like zfs.rent and send them a hard drive with your base sync on it (encrypt it), and then once they’ve brought it online you can sync to that. Best to test out your methods with the drive hooked up locally.
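One concrete thing about the error above: `zfs receive` was handed a mount path (`/mnt/samba/backups`) where it needs a ZFS *dataset* name (`pool/dataset`). The `/dev/zfs ... required` line also suggests ZFS isn’t usable in the shell syncoid lands in on the target. Assuming a hypothetical dataset `backuppool/immich` on the receiving machine (check yours with `zfs list` over there), the shape of a working command would be:

```shell
# Source dataset names taken from the error output above; the target
# dataset "backuppool/immich" is an assumption - it must already be a
# real dataset (or a child path under one), not a directory.
syncoid Orico2tera4/data/immich \
    root@192.168.0.246:backuppool/immich
```

If the target really only has an SMB share and no ZFS pool of its own, syncoid isn’t the right tool for that leg; a plain file-level sync (rsync, restic) would fit better.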
I know it’s anathema to Lemmy, but the best help you’ll get is Claude where you can paste the errors in and have it sort it out for you as you troubleshoot. It’s pretty good at shit like that.
I use a Hetzner Storage Box for similar needs. It’s not encrypted, so you need to manage that yourself, but they support a ton of protocols and the pricing is decent, even if they’re increasing the price shortly.
What does the setup look like on your end? Is there like, an app? Also how would I look into managing encryption by myself?
I use Borg Backup to backup specific folders of my hard disk to my hetzner storage box.
The software is triggered by cron/systemd to start a backup.
Does borg need an entire python venv?
I was looking at “modern” backup tools a while back, and when I saw borg was Python I decided not to bother.
Instead I focused on restic for a little while and then rsync was already there and I already knew the commands so… Rsync. Though I still have restic on my list.
If you want to go on the restic route, you can try BackRest: it’s a web interface for restic with graphs and all.
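Since OP wants something docker-based with a web UI, here’s a hedged sketch of running BackRest in a container. The image name, port, and volume layout are from memory — verify them against the project’s README before relying on this:

```shell
# Mounts: BackRest's own data/config, plus the Immich library read-only
# so it can be selected as a backup source in the web UI.
docker run -d \
  --name backrest \
  -p 9898:9898 \
  -v /srv/backrest/data:/data \
  -v /srv/backrest/config:/config \
  -v /srv/immich/library:/userdata/immich:ro \
  garethgeorge/backrest:latest
```

The web UI then lets you define the restic repo (e.g. an SFTP URL to a Storage Box), schedules, and retention without touching the CLI.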
I’m using proxmox backup server to make copies of full virtual machines, it takes care of encryption and verification of the data, so it’s not exactly the same than your scenario. Borg Backup is commonly recommended, but restic and dejadup are worth checking out too.
Backblaze B2, plus backup software of your choice pointed at the Immich library. Photos get put into Immich, the backup runs, data gets encrypted and saved offsite.
Backup software of your choice pointed at the Immich library
Any recommendations? Preferably something with a Home Assistant integration or a docker container with a web UI so I can more easily access it remotely. New to all this.
I use Duplicati and I THINK it has a container option? It is a web UI though.
I have my Immich library on a network drive and I took the lazy way and have my desktop duplicati just back up the network drive instead of directly on the server 😅
Looks like it does have a container option! $100/year for Backblaze computer backup is above what I was hoping to spend but it’s unlimited and I’m looking for a set it and forget it option so I’ll probably do exactly that, thank you.
This is what I did, only I set it up such that my family’s computers are backing up to my large external drive, and this drive is connected to the computer with the unlimited BB running and backing up. Just to get a little more benefit out of the cost.
There’s a plan where you pay some tiny amount per gb. Thats the one to use.
It’s $6/TB, which isn’t bad, but for my 3TB it’s still more than $100/yr.
I’d be surprised if you find cheaper, but if you do, please report back.
FWIW, BB has been super reliable for me over the past few years I’ve used them.
Yes, you just taught me I’m paying more than I needed to by using their B2 directly lol, but I have a few different backup buckets configured and I don’t mind paying a little extra for flexibility, vs. paying for each machine I want data backed up on.
I use restic to back up Immich (and everything else) to B2.
Be sure to stop the docker container while you back up to avoid skew.
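A hedged sketch of that quiesce-then-backup step. The compose project path, service names, and database user are assumptions (Immich’s compose file has changed service names across versions — check yours), and the restic env vars are assumed to be exported already:

```shell
#!/bin/sh
set -eu
cd /srv/immich   # hypothetical compose project directory

# Stop the app container so nothing writes during the backup window.
docker compose stop immich-server

# Dump the database while its container is still running.
docker compose exec -T database pg_dumpall -U postgres > /backup/immich-db.sql

# Back up the library and the dump together, so they stay consistent.
restic backup /srv/immich/library /backup/immich-db.sql

# Bring everything back up.
docker compose start
```

Dumping Postgres rather than copying its raw data directory avoids restoring a mid-write database state.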
Backblaze saved my ass at the end of last year when I had a hardware failure.
Acronyms, initialisms, abbreviations, contractions, and other phrases which expand to something larger, that I’ve seen in this thread:
| Fewer Letters | More Letters |
|---|---|
| IP | Internet Protocol |
| NFS | Network File System, a Unix-based file-sharing protocol known for performance and efficiency |
| RPi | Raspberry Pi brand of SBC |
| SBC | Single-Board Computer |
| SMB | Server Message Block protocol for file and printer sharing; Windows-native |
| SSD | Solid State Drive mass storage |
| SSH | Secure Shell for remote terminal access |
| ZFS | Solaris/Linux filesystem focusing on data integrity |
[Thread #123 for this comm, first seen 1st Mar 2026, 17:30]
I’ve had a good experience with pCloud. One-time lifetime fee. Just set the Immich directory in its entirety as a backup folder.
3TB is a weird place to be with their pricing, though. You can buy 2 TB twice, iirc.
How the fuck does that business model work? 10TB is cheaper than Backblaze B2 in 20 months.
It’s not unlimited transfer like Backblaze. Also not as fast.