Heh, yeah. The protobuf people's expectation was that if you had a really large dataset you'd wrap it in your own mini-protocol of "sequence of protobuf messages". But of course that's way more friction, so in practice it ends up not getting done when it should be (plus it requires a certain amount of ability to predict the future).
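For the curious, that "mini-protocol" is usually nothing more than a length prefix per message. A minimal sketch in Python, with the caveat that the 4-byte big-endian prefix is just one arbitrary framing choice (protobuf itself doesn't ship this for Python; the Java library's writeDelimitedTo uses a varint prefix instead):

    import struct
    from typing import BinaryIO, Iterator

    def write_delimited(out: BinaryIO, payload: bytes) -> None:
        # One record = 4-byte big-endian length prefix + serialized message bytes.
        out.write(struct.pack(">I", len(payload)))
        out.write(payload)

    def read_delimited(inp: BinaryIO) -> Iterator[bytes]:
        # Yield one serialized message at a time, so the reader never has to
        # hold the whole dataset in memory.
        while True:
            header = inp.read(4)
            if not header:
                return  # clean end of stream
            if len(header) < 4:
                raise EOFError("truncated length prefix")
            (length,) = struct.unpack(">I", header)
            body = inp.read(length)
            if len(body) < length:
                raise EOFError("truncated message body")
            yield body

Writing is write_delimited(f, record.SerializeToString()), reading is MyRecord.FromString(raw) on each yielded chunk, where MyRecord stands in for whatever your generated message class is.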
Lesson for technologists: if you want to make the world a better place, arrange your tech so that the lowest-friction path is also the correct path.
(Another example: the disastrous multi-byte UTF encodings [correct solution was more friction] vs the basically successful UTF-8 [correct solution was less friction].)
I don't know if you're still dealing w/ this particular problem for protobufs, but based on my experience with thrift, a very similar library, there are probably some not too terrible ways you can kinda hack up the client-side parsing to be more streaming-ish...
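One such hack, sketched below under the assumption that the giant message is basically "repeated Something records = 1;" at the top level: the protobuf wire format encodes every field as a varint key (field number + wire type) followed by the field data, and embedded messages use wire type 2 (varint length, then payload), so you can walk the file yourself and hand each chunk to the generated class one at a time instead of parsing the whole parent message. The field number and the MyRecord class here are hypothetical placeholders; the wire-format details (varint keys, wire types 0/1/2/5) are the real ones.

    from typing import BinaryIO, Iterator, Optional

    def _read_varint(stream: BinaryIO) -> Optional[int]:
        # Decode one base-128 varint; returns None on clean EOF at a boundary.
        result, shift = 0, 0
        while True:
            byte = stream.read(1)
            if not byte:
                if shift == 0:
                    return None
                raise EOFError("truncated varint")
            result |= (byte[0] & 0x7F) << shift
            if not (byte[0] & 0x80):
                return result
            shift += 7

    def iter_embedded_messages(stream: BinaryIO, field_number: int) -> Iterator[bytes]:
        # Walk the top-level message field by field, yielding the raw bytes of
        # each length-delimited occurrence of field_number and skipping the rest.
        while True:
            key = _read_varint(stream)
            if key is None:
                return
            number, wire_type = key >> 3, key & 0x7
            if wire_type == 0:            # varint (ints, bools, enums)
                _read_varint(stream)
            elif wire_type == 1:          # fixed 64-bit
                stream.read(8)
            elif wire_type == 5:          # fixed 32-bit
                stream.read(4)
            elif wire_type == 2:          # length-delimited (messages, strings, bytes)
                length = _read_varint(stream)
                chunk = stream.read(length)
                if number == field_number:
                    yield chunk
            else:
                raise ValueError(f"unsupported wire type {wire_type}")

Usage would be something like: for raw in iter_embedded_messages(f, 1): record = MyRecord.FromString(raw). (Deprecated group fields, wire types 3/4, aren't handled, and this assumes the interesting records really are top-level repeated submessages.)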