Have you tried to extract the vignetting pattern from the captures and use it to normalize them? My first try would be to calculate the median grayscale image of all 9000 captures and then use it to normalize the intensity.
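Roughly something like this back-of-the-envelope sketch, assuming the captures are same-sized RGB float arrays that fit in memory (all names here are just illustrative):

    import numpy as np

    def flat_field_normalize(captures):
        # captures: list of HxWx3 float arrays in [0, 1], all the same size.
        stack = np.stack([np.asarray(c, dtype=np.float64) for c in captures])
        # The median over all captures approximates the vignetting/illumination
        # pattern (for ~9000 full-resolution captures you'd likely compute it on
        # a subsample or incrementally rather than holding everything in memory).
        median_img = np.median(stack, axis=0)
        # Collapse to grayscale so only intensity is corrected, not color.
        gray = median_img.mean(axis=2, keepdims=True)
        gray /= gray.mean()  # scale so the correction factor is ~1.0 on average
        corrected = stack / np.clip(gray, 1e-6, None)
        return [np.clip(c, 0.0, 1.0) for c in corrected]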
Yes, and the vignetting pattern isn't so much the problem. It's a color distortion problem: the chromaticity varies spatially rather than just the brightness, and unfortunately it depends not only on the position within the field of view but also on the underlying material. So there isn't a nice way to correct each capture in a predictable way that ensures overlapping pixels have consistent colors.
I've worked on similar problems for agricultural mapping from drone images. You would need to build a BRDF model per colour channel for various types of materials, then assign materials based on whichever combination of models best represents each region. Then you can re-render with uniform normal lighting.
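In spirit, something like this per-pixel sketch; everything here (the material table, the angular falloff standing in for fitted BRDFs, the geometry inputs, the function names) is a made-up placeholder for models you'd actually fit from the data:

    import numpy as np

    # Toy per-material, per-channel reflectance models: a Lambertian albedo with
    # a simple angular falloff standing in for properly fitted BRDFs.
    MATERIALS = {
        "leaf": {"albedo": np.array([0.10, 0.35, 0.08]), "falloff": 1.2},
        "soil": {"albedo": np.array([0.30, 0.22, 0.15]), "falloff": 0.8},
    }

    def predicted_rgb(name, cos_in, cos_out):
        # Predicted color of a material under the capture's light/view geometry.
        m = MATERIALS[name]
        shading = np.clip(cos_in, 0.0, 1.0) ** m["falloff"] * np.clip(cos_out, 0.0, 1.0)
        return m["albedo"] * shading

    def relight_pixel(observed_rgb, cos_in, cos_out):
        # 1) Assign the material whose model best explains the observation,
        # 2) "re-render" under uniform lighting by keeping that material's albedo
        #    plus the residual detail the model didn't capture.
        best = min(
            MATERIALS,
            key=lambda n: np.sum((predicted_rgb(n, cos_in, cos_out) - observed_rgb) ** 2),
        )
        residual = observed_rgb - predicted_rgb(best, cos_in, cos_out)
        return np.clip(MATERIALS[best]["albedo"] + residual, 0.0, 1.0)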
They're talking about a bidirectional reflectance distribution function, a now-common modeling technique for practically representing how complex surfaces respond to illumination.
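For reference, the textbook definition is the ratio of reflected radiance to incident irradiance for a pair of incoming/outgoing directions:

    f_r(\omega_i, \omega_o) = \frac{\mathrm{d}L_o(\omega_o)}{L_i(\omega_i)\cos\theta_i\,\mathrm{d}\omega_i}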
That said, a full BRDF is not the only way to approach this problem, especially at reduced resolution where the artifacts are more apparent.
If you're an ACM member, there is a wealth of information in SIGGRAPH publications. Having been in graphics since the 90s, I started with Foley and van Dam (Computer Graphics: Principles and Practice) and Watt and Watt (Advanced Animation and Rendering Techniques) and have stayed on top of SIGGRAPH (attending regularly, though not annually) since then.
For a more whimsical survey from the pen of a straight-up genius, Jim Blinn's books (e.g. Dirty Pixels) are fantastic reads.