Strictly speaking ("film noise"), I suppose you're right. But we are all aware that there is such a thing as CCD noise, right? That's not being synthesized.
I wonder if one is preferable to the other: that is, is CCD noise worse than emulsion noise? My sense is that the CCD, with the Bayer filter in place, gives you wild chroma noise while film gives more of a tonal noise.
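That chroma-vs-tonal distinction is easy to demonstrate. Here's a minimal sketch (my own toy example, not anyone's actual pipeline): one patch gets independent per-channel noise, roughly what survives Bayer demosaicing, and the other gets a single perturbation shared across R, G and B, which only shifts brightness, more like film grain. The `channel_spread` metric is just a crude measure I made up for how much pixels drift away from neutral gray.

```python
import numpy as np

rng = np.random.default_rng(0)
h, w = 64, 64
gray = np.full((h, w, 3), 0.5)  # flat mid-gray test patch

# Chroma-style noise: each channel perturbed independently,
# so pixels drift in hue as well as brightness.
chroma = gray + rng.normal(0, 0.05, size=(h, w, 3))

# Grain-style noise: one perturbation shared by all three channels,
# so only brightness varies -- closer to film grain tonality.
grain = gray + rng.normal(0, 0.05, size=(h, w, 1))

def channel_spread(img):
    """Std of per-pixel deviations from the channel mean.

    Zero for a purely tonal (gray) perturbation; nonzero when
    channels disagree, i.e. when there is color noise.
    """
    return float(np.std(img - img.mean(axis=2, keepdims=True)))

print(channel_spread(chroma))  # clearly nonzero
print(channel_spread(grain))   # exactly zero
```

The grain patch measures exactly zero because identical noise on all three channels never moves a pixel off the neutral axis, which is why film grain reads as texture rather than color speckle.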
Oh, I hadn't even thought of that! Yes, digital cameras would of course not have film grain!
I was making a more stupid point: these days the image you are seeing on screen is always synthesized from 0s and 1s, no matter how that stream of data was originally produced (i.e. by scanning actual film stock).
the noise floor for cameras (when working in the correct params) is ridiculously low.
modern slow 35mm film (now there are loads of types and speeds) has an optical resolution of something like 5-12 megapixels. a full frame CCD easily has an optical resolution of 50 MP.
We are probably talking about CMOS here, right? I haven't seen a CCD camera for like 15 years, and that had a resolution of around 10 MP. I know CCD chips are still used in astrophotography because of their other qualities, but aside from that they seem to have pretty much died out.