After receiving multiple notification emails from Google about my Google Photos storage running out—and getting tired of paying while handing over all my data—I finally decided to take matters into my own hands and manage my photos locally, right on my home server.
Since I’ve been familiar with Immich for a while, and it has now reached an impressive level of maturity, I figured it was the perfect time to install it on a Raspberry Pi.
So here's the post, with a Table of Contents covering the steps to fully switch from Google Photos to Immich:
- Hardware list
- Software
- Migrating from Google Photos
- Organize in Immich
- Backup
- Performance and monitoring
- Conclusion and considerations
Hardware list
Here are the components I chose:
- Raspberry Pi 5 with 8GB of RAM (97.84€)
- 1TB SSD ⚠️ I initially bought an Orico NVMe PCIe 3.0 (J10) (link) because it came with a nice heatsink. But then I realized it wasn’t compatible with the M.2 PCIe adapter I had ordered. So, I sent it back and went for a more standard option: the Crucial P3 CT1000P3SSD8 (link), which works flawlessly. (64.20€)
- Heatsink with integrated M.2 slot and small fan (for emergencies). I went with the GeeekPi Armor Case with CN01 M.2 M-Key PCIe 3.0 x1 (link) – a good option that keeps the SBC cool, with a PWM fan that kicks in if the Pi is running heavy workloads (e.g. machine learning tasks). (29.99€)
- Official Raspberry Pi power supply (17.68€ on sale). Don’t cheap out on this! The power supply is crucial for the Pi 5, so avoid weird off-brand alternatives just to save a few euros.
Here it is in its final state:
Software
As always, I prefer the one SBC, one purpose approach rather than cramming everything into a single big server with tons of VMs. This way, I avoid software conflicts and other potential headaches if anything in the hardware fails. So, for the base layer of my Immich instance, I went with my beloved lightweight OS: DietPi.
After the first launch, remember to enable PCIe Gen 3 in /boot/config.txt. I also added a bit of safe overclocking, and after some testing, these values have been totally stable for me:
#pci gen 3
dtparam=pciex1_gen=3
#overclock
over_voltage=1
arm_freq=2600
core_freq=930
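After a reboot, it's worth verifying that the settings actually took effect. A quick check with the standard Raspberry Pi tools (vcgencmd is preinstalled; lspci needs the pciutils package):
vcgencmd measure_clock arm      # current ARM clock in Hz, ~2600 MHz under load
vcgencmd measure_temp           # SoC temperature, to keep an eye on the overclock
lspci -vv | grep -i 'LnkSta:'   # "Speed 8GT/s" means PCIe Gen 3 is active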
The setup to run Immich is super easy. First, install Docker via dietpi-software:
[*] 134 Docker Compose: Manage multi-container Docker applications
[*] 162 Docker: Build, ship, and run distributed applications
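If you prefer the command line over the menu, dietpi-software should also be able to install items by their ID (the same IDs shown above):
# install Docker (162) and Docker Compose (134) non-interactively
dietpi-software install 162 134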
Then create the user and run Immich via Docker:
root@Immich:~# useradd -m -s /bin/bash immich
root@Immich:~# passwd immich
root@Immich:~# usermod -aG sudo immich
root@Immich:~# su immich
immich@Immich:/root$ cd
immich@Immich:~$ mkdir immich-app
immich@Immich:~$ cd immich-app/
And to retrieve and install Immich, use the official guide: Docker Compose [Recommended] | Immich
immich@Immich:~/immich-app$ sudo wget -O .env https://github.com/immich-app/immich/releases/latest/download/example.env
.....
immich@Immich:~/immich-app$ sudo docker compose up -d
.....
And that’s it! Now the Immich instance is running at http://ip_address:2283 (in my case, http://192.168.1.15:2283).
You’ll be prompted with some configuration options on first setup; I just kept the defaults.
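Before moving on, a quick sanity check that all the containers came up healthy (plain Docker Compose commands; immich-server is the service name from the official compose file, so adjust if yours differs):
immich@Immich:~/immich-app$ sudo docker compose ps
immich@Immich:~/immich-app$ sudo docker compose logs -f immich-server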
Migrating from Google Photos
Now the veeeeery difficult stuff… I have 105 albums and about 95GB of photos. My idea was to download one album at a time, then download all the photos that aren’t inside any album, so I could reorder and reorganize everything into albums.
But Google Photos doesn’t allow you to create an album with all the photos that aren’t in an album! 🤬
After some research, I found this great and useful tool on GitHub to create an album with all the photos not in an album: Google-Photos-Toolkit.
You can simply install it via a userscript manager and set it up to create a new album with all the photos outside an album, making it much easier to organize everything.
Okay, after a few minutes, it created the album “Not in Album” with 9,923 items.
Now, with a lot of patience, you have to download each album from Google Photos using the three-dot menu.

…yes, for every album, obviously!
Prepare the photos for Immich
Now, with all the albums and photos on my Mac, I organized the overall album structure like this:
kind - event name @ location [date]
A few examples:
Home@Mac-Mini Archivio Foto % du -sh * | sort -k2
610M Auto - Volkswagen Golf Highline DSG 160hp [2016]
1.1G Esposizione FriuliDOC 'Frasche e Osterie Friulane' @ Udine [2019]
983M Eventi - BorderWine @ Cividale [2018]
337M Lavoro - Cena gala Palazzo Kecher [2016]
208M Presentazione 'Cittadinanza Digitale II' @ Salerno [2016]
1.1G Varie - Micio 🐈⬛
6.9G _Home Networking 📡
692M _Lavoro 👨🏼💻
242M _Me 🧑🏼🦱
...
The mess with the metadata
The bad surprise: Google Photos makes a huge mess with metadata!
Any photo without the IPTC Captured Time data (basically all screenshots or images downloaded from WhatsApp, Reddit, and so on) will have its Captured Time set to the date you downloaded them. Terrible!
Luckily, almost all of my photos had the correct metadata, but for the ones that didn’t (around 150-200 photos), I had to use Lightroom Classic to fix the mess, and unfortunately it took a lot of time.
Here’s an example of the workflow in Lightroom. I had to search for the file name in Google Photos to find the “real time” and then edit it in Lightroom. Fortunately, I was able to adjust multiple images at once.
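If you’d rather script this than click through Lightroom, exiftool can do the same inspection and bulk fixes from the terminal (a sketch; file names and dates are placeholders):
# show every date/time tag a file actually carries, grouped by source
exiftool -time:all -G1 -s IMG_1234.jpg
# set a fixed capture time on a batch of files (placeholder date)
exiftool -overwrite_original '-DateTimeOriginal=2016:07:21 18:30:00' '-CreateDate=2016:07:21 18:30:00' *.png
# or shift all date tags by +2 hours to fix a time zone mistake
exiftool -overwrite_original '-AllDates+=0:0:0 2:0:0' *.jpg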
Tip:
If you need to set a fixed time using the creation date of multiple photos, adjust timestamps, correct time zones, or shift photo times in bulk, I suggest this useful (and free) app: Photo Date Adjustator
Another very useful (and free, again) app is PicArrange. It lets you search for similar photos based on colors and composition, which is really great for spotting duplicate or nearly identical photos with different timestamps or EXIF data, something most duplicate finder apps can’t detect.
All in all, it took me about a week of evening and late-night work to remove duplicates, delete useless photos and, worst of all, manually set the correct timestamps for tons of pictures and screenshots.
But after all these steps, it’s finally time to upload everything to Immich!
Organize in Immich
Time to finally upload everything!
Before starting, I tweaked a few settings in Immich:
- Corrected the Default Locale to Italian (yes, here in the EU we use dd/mm/yyyy)
- Enabled the sidebar tags
- Disabled Memories
It might also be useful to change the CLIP model. I switched to ViT-B-16-SigLIP-384__webli for better accuracy, and the Raspberry Pi 5 (8GB RAM) seems to handle it just fine.
I also adjusted the job concurrency settings:
- Generate Thumbnails concurrency: 3
- Smart Search and Face Detection concurrency: 6
I uploaded all the photos via the browser by creating a “Safari web app” (using File > Add to Dock…). Take your time with this! Don’t open 3-4 tabs and start uploading from everywhere; just upload one album at a time, wait for it to finish, then move on to the next. It’s better to be patient than to create a mess and have to redo everything.
After uploading a lot of photos, scanning and analyzing can take hours (you can check progress under Administration > Jobs). Let Immich do its thing and don’t rush it! I recommend starting uploads in the evening and checking the jobs the next day.
Backup
After a few days, once everything is set up, it’s time to think about backups of course! As the Immich website clearly warns:
⚠️ The project is under very active development. Expect bugs and changes. Do not use it as the only way to store your photos and videos!
So, my first backup was a simple copy-paste of the old albums to my NAS, since those photos will never be modified. For the database, you can use Immich’s built-in backup tool.
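If you also want a manual dump of the Postgres database from time to time, something along these lines works (the immich_postgres container name and the postgres user come from the official compose file, so adjust if yours differ):
# one-off dump of the Immich database, compressed and dated
sudo docker exec -t immich_postgres pg_dumpall --clean --if-exists --username=postgres | gzip > immich-db-$(date +%Y%m%d).sql.gz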
But for a full system backup, meaning a complete clone that can easily be swapped from one drive to another if needed, I use the built-in dietpi-backup tool.
I back up from the Raspberry Pi running Immich to my Synology NAS. This is super convenient because it’s based on rsync, which means it compares and copies only the modified items. In addition, you can set the number of backups to keep and tweak other settings; see the official pages for details: DietPi and NFS: Basics and improving security – DietPi Blog
Here’s how it looks:
It’s easy to do, but there are a lot of steps to remember each time. Since I’m already doing the same for my Grafana and Pi-hole DietPi Raspberry Pis, I’m used to it. But for those who don’t want to search and spend time figuring it out, here’s the basic process:
On the NAS, add the necessary permissions for the Immich server to access the backup folder, and create the Immich folder:
On DietPi, install NFS client support using:
apt install nfs-common -y
Create a mount point and add the newly created folder from the NAS:
root@Immich:/mnt# mkdir -p /mnt/backup
root@Immich:/mnt# mount -t nfs 192.168.1.8:/volume1/Backup-Raspberry-PI/Immich /mnt/backup
Created symlink /run/systemd/system/remote-fs.target.wants/rpc-statd.service → /lib/systemd/system/rpc-statd.service.
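To make the mount survive reboots, you can add the share via dietpi-drive_manager, or drop a line like this into /etc/fstab (IP and paths are mine, adjust for your NAS):
# mount the Synology NFS share for dietpi-backup at boot
192.168.1.8:/volume1/Backup-Raspberry-PI/Immich  /mnt/backup  nfs  defaults,_netdev  0  0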
Then open dietpi-backup > Location > List and select the correct network drive.
Now switch back and check the other settings, such as how many backups to keep and which folders to exclude. For Immich I keep only one backup, but for Grafana, for example, I keep two, and the folders on the NAS are named data, data_2, data_3, data_...
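Under the hood, the mechanism is roughly this kind of rsync invocation (a conceptual sketch of the idea, not what dietpi-backup literally runs):
# incremental clone of the root filesystem to the NAS mount,
# skipping pseudo-filesystems and the mount point itself
rsync -aHx --delete --info=progress2 --exclude={"/proc/*","/sys/*","/dev/*","/tmp/*","/run/*","/mnt/*"} / /mnt/backup/data/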
In the end, I have my photo library stored on three different drives across three different machines.
To summarize, my backup setup is:
Immich RPi server
├── Backup via dietpi-backup to NAS
│
└── Backup the Immich backup on the NAS to a USB drive using Hyper Backup
Or visualized:
Here, the NAS handles backing up the Immich backup to a USB external drive:
Performance and monitoring
First of all, I was a bit worried about whether a Raspberry Pi had enough power to handle the computational workload of Immich. Especially since the official site states:
RAM: Minimum 4GB, recommended 6GB.
CPU: Minimum 2 cores, recommended 4 cores.
But to my surprise, I found that these specs are overkill. During the 4-5 days of importing and indexing photos, I never saw RAM usage go above 3.5GB. I’m not sure if this is thanks to the lightweight DietPi OS or if the listed specs are more relevant for x86 machines. But either way, the Raspberry Pi 5 8GB handles Immich and all its operations flawlessly, without any issues.
Of course, I monitored everything using Prometheus and Grafana, so here are the general stats during the importing and indexing process:
During the importing process, the system uses more overall resources but, surprisingly, less RAM. On the other hand, reindexing all the photos draws more power and pushes CPU usage higher. Because of this, the PWM fan kicked into higher speed a few times; however, once I turned on the server cabinet fans, the temperature stabilized. I’ve never seen the Raspberry Pi go above 65-67°C, though in the summer I expect it might reach around 72-75°C.
Here’s how the Raspberry Pi sits idle, with no active jobs, on a fresh install of Immich:
And here are the resources used while importing and processing around 2k photos:
Conclusion and considerations
Well, in the end, I’m really happy because regaining full control of my photos was something I had wanted to do for a long time. I just never had the time, and I was afraid that alternatives to Google Photos either wouldn’t be good enough or would require too much hardware power.
With this setup, the power usage is really, really low. As you can see from the graph above, the Raspberry Pi uses about 3W when idling (which is almost all the time) and never goes above 12/13W under full load, only when importing thousands of photos.
To my surprise, Immich works just as well as Google Photos in daily use and user experience, but it’s way more customizable, flexible, and powerful! Not to mention privacy—I can finally upload more private documents to my library without leaving traces for Google.
Obviously the downside is that I always need to be connected to my home network, but with WireGuard’s on-demand VPN running all the time, I don’t even notice when I’m outside.