dakial1 15 hours ago

Some time ago I configured PhotoStructure on my Synology (with the amazing help of the author, @mceachen), and the most painful part was rescuing my 1.5 TB of photos from Google Photos. Takeout was very cumbersome to use, and downloading 100+ files of 4 GB each was impractical, so I ultimately resorted to paying for a higher Google Drive tier, using Takeout's export-to-Google-Drive option, and syncing Drive to the NAS. I still don't have a good method to keep everything in sync, as Google Photos does not offer a viable option for cloud-to-premises sync.

mceachen 14 hours ago

I wrote up some tips and tricks I've found to help coax Google Takeout into working: https://photostructure.com/faq/takeout/

Also: you should try out the latest build! https://photostructure.com/about/v2026.1/

FWIW all of these projects rely on ExifTool (which people should donate to!) and my open-source Node.js wrapper (which adds concurrency, does a ton of extra parsing work, and makes things a bit more ergonomic to live with): https://github.com/photostructure/exiftool-vendored.js
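
As a quick illustration of what ExifTool itself surfaces (the file name is a placeholder), reading a few common tags from the command line looks like this:

    # photo.jpg is a placeholder; -s prints short tag names
    exiftool -s -DateTimeOriginal -Model -GPSLatitude -GPSLongitude photo.jpg

The wrapper keeps a pool of long-lived exiftool processes around instead of spawning one per file, which is where the concurrency win comes from.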

bombela 14 hours ago

For what it's worth, Google Takeout can export in 50 GB .tgz archives.
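
Each of those archives unpacks under a shared top-level Takeout/ directory, so once they're all downloaded you can merge the export back into a single tree. A minimal sketch, assuming the default archive layout (the destination path is a placeholder):

    # Each takeout-*.tgz shares a Takeout/ root, so extracting them
    # into the same directory merges the export back together.
    # /volume1/photos/staging is a placeholder destination.
    for f in takeout-*.tgz; do
      tar -xzf "$f" -C /volume1/photos/staging
    done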

Downloading the takeout files is miserable, though: the download link is only valid when the download is initiated by human interaction in a web browser.

There is a silly trick. Start the download, then pause it. Grab the cookies from the page (you only need to do that once per session) and copy the download link. Now you can curl it on your server. Once the file has downloaded, cancel the paused download in your web browser and repeat for the next file. One at a time.
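
Concretely, it looks something like this; the cookie string and URL are placeholders you'd copy out of the browser's dev tools ("Copy as cURL" on the paused request in the Network tab hands you both at once):

    # Cookie values and URL are placeholders from the browser session.
    # -C - resumes a partial file if the transfer gets interrupted.
    curl -C - -o takeout-001.tgz \
      -H 'Cookie: SID=...; HSID=...; SSID=...' \
      'https://takeout.google.com/...'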

Warning: Google will cancel downloads if you run more than one or two at a time, and after 3 downloads of a file (failed or not) Google will delete the whole takeout.

The amount of engineering they must have deployed to purposefully cripple Takeout while preserving plausible deniability has to be significant.