Migrating Google Photos to my Personal NAS

Recently I’ve been working to somewhat detach myself from Google services after seeing a growing number of reports of users being suddenly locked out of their accounts without any recourse.

As part of this, I had about 5 years’ worth of photos stored solely within Google Photos that I wanted to migrate and organise within my personal NAS storage.

Initial Export

Grabbing all my Google Photos content was pretty easy. It was just a case of using Google Takeout to generate a bulk export. In my case the export was split across multiple .tgz files at my chosen 10GB size; I picked that over 50GB to keep each download chunk a little more manageable in the event of a connection failure during download.

Once my export was ready, a few hours later, I downloaded the archives, extracted them and combined them into a single folder to centralise things. One minor annoyance was the number of duplicates caused by the same file being stored across albums & other folders. I organised and de-duped everything into my own custom structure before cleaning up the files and compressing them using the scripts below.
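For reference, a minimal sketch of the extraction step, assuming the archives were downloaded into the current directory and follow Takeout’s usual takeout-*.tgz naming (the filename pattern is an assumption, so adjust it to match your export):

# Extract all downloaded Takeout archives into a single combined folder
# (takeout-*.tgz is an assumed naming pattern; adjust to match your export)
mkdir -p combined
for archive in ./takeout-*.tgz
do
  tar -xzf "$archive" -C ./combined
done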

Cleaning Up JSON Files

The export will contain a lot of JSON files for album and photo metadata. I didn’t care about these at all, so running the below from the root photo directory quickly took care of them:

find ./ -type f -name '*.json' -delete
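If you’d like to sanity-check what will be removed before deleting, a quick count and small sample listing along these lines does the job:

# Count the JSON metadata files and list a small sample before deleting
find ./ -type f -name '*.json' | wc -l
find ./ -type f -name '*.json' | head -n 10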

“Edited” Image Versions

Many images had copies with -edited appended to their name. Some of these were my own edits via Google Photos, but many others I assumed to be auto-edits by Google Photos. Again, I didn’t care about these so a quick find command from the root photo folder got rid of them:

find ./ -type f -iname '*-edited.jpg' -delete
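The command above only targets JPGs; if you want to check whether edited copies also exist under other extensions, a rough sketch like this summarises them by extension:

# List any remaining "-edited" copies grouped by file extension
find ./ -type f -iname '*-edited.*' | sed 's/.*\.//' | sort | uniq -c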

Compressing Images to WebP

The export resulted in many gigabytes of JPG images. Many of these were very high resolution and thus individually quite significant in size. I didn’t want to delete images but, at the same time, wanted to regain some space, so I decided to compress many of the older images down to reasonable-quality WebP. To be space efficient I used lossy WebP, meaning some image quality would be lost, but that felt like a fair trade-off for keeping every image. A select few important images I set aside, outside the scope of my compression commands.

These are the commands I used for this compression:

# Convert jpg images, within the current directory, over 1MB to WebP
# Requires parallel, imagemagick and find to be installed.
# Runs the compression across threads in parallel for maximum speed-up.
# WARNING: This compression is lossy (Quality will be lost).
find ./ -type f -iname '*.jpg' -size +1M -print0 | parallel -0 mogrify -format webp -quality 80 {}

# Find and delete the old jpg images we converted above
find ./ -type f -iname '*.jpg' -size +1M -delete

Just a consideration on using WebP: while generally quite well supported now, its support is still not as universal as JPEG’s. I use WebP here since I’ve found its lossy compression to produce fewer artifacts than JPEG while allowing smaller file sizes.

You can play with the -quality 80 number to get the balance of quality/filesize you desire. I’d usually spot-check a few images before running the second delete command to ensure I was getting the right output. For some specific directories, with images of very little sentimental value, I’d lower this down to 60 for even better compression ratios.
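For that spot-checking, one rough approach is to convert a single representative photo at a couple of quality levels and compare the output sizes (the filenames below are placeholders):

# Convert one sample image at two quality levels and compare sizes
# (sample.jpg is a placeholder; pick any representative photo)
convert sample.jpg -quality 80 sample-q80.webp
convert sample.jpg -quality 60 sample-q60.webp
du -h sample.jpg sample-q80.webp sample-q60.webp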

Compressing MP4 Videos

My export contained many videos I had taken with my phone. As with the images, I wanted to keep these and was happy to compress them, at some loss of quality, to retain them at a smaller filesize.

I did this with the following bash script:

# Find mp4 videos in the current directory and convert each via FFMPEG
# to lower quality mp4 videos.
# WARNING: This compression is lossy (Quality will be lost).
for filepath in ./*.mp4
do
  filename=$(basename -s ".mp4" "$filepath")
  echo "$filename"
  ffmpeg -i "$filepath" -vcodec libx264 -crf 32 "cmp-$filename.mp4"
done

There’s no deletion of the old videos in the above since I did that manually afterwards. I did play with newer video codecs (VP9, H.265) but found the vastly slower encoding speed wasn’t worth the better size/quality output for my use. I also roughly played with the -crf value in the above, landing on 32 as an output quality I was happy with.
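If you want to confirm the conversions are worthwhile before deleting the originals, a rough size comparison along these lines (assuming the cmp- prefix from the script above) works:

# Compare each compressed video against its original to confirm the space saving
for filepath in ./cmp-*.mp4
do
  du -h "${filepath#./cmp-}" "$filepath"
done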

Future Photo Handling

While I’ve migrated my existing content from Google Photos, I’ll probably still continue to use it for temporary phone photo backup, since it’s proved pretty reliable and seamless for that use. Every so often, I’ll take the content and archive it within my NAS as I’ve done above for long-term safe-keeping.

A Note on Compression

As mentioned above, I’ve made heavy use of lossy compression for my content. When using lossy compression you need to be aware you’ll lose quality/data on each compression pass, so you’d only want to compress in this way once, or ideally not compress at all.

For me, I wanted to keep storage space reasonable. I had plenty of space on my NAS to store all original content, but my main concern was long-term backup storage, since I’d be syncing these files from my NAS to Backblaze B2.
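I won’t cover the backup step in detail here, but as a rough sketch, a tool like rclone with a configured B2 remote handles this kind of sync; the remote, bucket and local path names below are placeholders:

# Sync the organised photo archive from the NAS to a Backblaze B2 bucket
# ("b2remote", "photo-archive" and /volume1/photos are placeholder names)
rclone sync /volume1/photos b2remote:photo-archive --progress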