It appears you didn't really read the article. There's no need to pull 100gb of data.
And 100 GB is comparatively small for a photo and video library. The article mentions testing with 250k photos, which would be well over 100 GB (2-3 times that at minimum, and significantly more with RAW or high-res files).
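Quick back-of-envelope (the per-photo averages here are my own assumptions, not numbers from the article):

    # Rough size estimate for a 250k-photo library.
    # Average file sizes below are assumptions, not figures from the article.
    photos = 250_000
    avg_jpeg_mb = 3     # assumed typical phone/camera JPEG
    avg_raw_mb = 25     # assumed typical RAW file
    print(f"JPEG only: {photos * avg_jpeg_mb / 1000:.0f} GB")  # ~750 GB
    print(f"RAW:       {photos * avg_raw_mb / 1000:.0f} GB")   # ~6250 GB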
>It appears you didn't really read the article. There's no need to pull 100gb of data.
I read the article, paying special attention to section 4, aptly titled "Smoke and Mirrors".
There's always a need to pull in as much data as the user wants to actually interact with.
No matter what tricks you pull, you'd still need about 1 s to pull in a 10 MB photo over a 100 Mbps connection at full speed, and most people don't have that. Reading data off a disk is an order (or two, with an SSD) of magnitude faster.
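Rough numbers, if you want to check me (the disk throughput figures are assumed typical values, not measurements):

    # Time to fetch one 10 MB photo from different sources.
    # Throughput figures are rough assumptions for typical hardware.
    size_mb = 10
    net_mbps = 100     # 100 Mbps link at full speed
    hdd_mb_s = 150     # assumed spinning-disk sequential read
    ssd_mb_s = 2000    # assumed NVMe SSD sequential read
    print(f"network: {size_mb * 8 / net_mbps:.2f} s")        # 0.80 s
    print(f"HDD:     {size_mb / hdd_mb_s * 1000:.0f} ms")    # ~67 ms
    print(f"SSD:     {size_mb / ssd_mb_s * 1000:.0f} ms")    # 5 ms

That's roughly 0.8 s over the wire versus tens of ms from a spinning disk and single-digit ms from an SSD.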
And sometimes, I just want to flip through the library and look at a lot of photos without having to wait seconds for each one to load.
Sure, the layout algorithms are still important, and that's what a lot of the article is about. But all the king's horses and all the king's men can't make the latency of a JavaScript UI with a network backend comparable to that of a compiled program reading from a local disk.
Thank you, but smoke and mirrors aren't for everyone. Google Photos fulfills an obvious need, but it's not a Picasa replacement.
1) Folders date-labelled with the day the files were moved from camera to computer (20180712), and files date & sequence labelled (20180712-005.ARW, with variants like 20180712-005-1.tif as they are created); see the sketch after this list
2) Edit mostly via RawTherapee (with special help from MS ICE & photoFXlab masks & Silver Efex Pro)
3) Distribute & Share with Google Photos (jpgs down-sized below 16MP) [uploading via the web browser interface avoids all the confusion around auto backup and deletes]
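A minimal sketch of the naming scheme in 1), assuming the date is the import day and the sequence number is per import (the function is hypothetical, not from any of the tools above):

    from datetime import date

    def import_name(seq, ext, variant=None, day=None):
        """Build names like 20180712-005.ARW or 20180712-005-1.tif."""
        stamp = (day or date.today()).strftime("%Y%m%d")
        base = f"{stamp}-{seq:03d}"
        return f"{base}-{variant}.{ext}" if variant else f"{base}.{ext}"

    print(import_name(5, "ARW", day=date(2018, 7, 12)))             # 20180712-005.ARW
    print(import_name(5, "tif", variant=1, day=date(2018, 7, 12)))  # 20180712-005-1.tif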