I also think it's silly that imperfections should be artificially maintained just to keep a "vintage" look, but when you think about it, nothing about modern cameras actually reflects the real world. It's all a judgment call by the sensor/algorithm/photographer/editor, who together decide how the published photo looks.
- CCD layering & subpixel arrangements
- aperture openings & shapes
- vignetting & chromatic aberrations
- focal distance, white balance, ISO...
- shutter speed, aperture, depth of field, focus, composition...
- post-production...
The comparison to FLAC (a storage medium) is a bit flawed versus, say, the actual creative instruments (guitars, pianos). The article is more like the photography equivalent of lamenting the loss of Stradivarius's secret sauce.
Long before a photo even gets to RAW, a bunch of artistic decisions are made by the photographer and the engineers who put together the camera and lens (and their firmware). You can't turn off all the processing, and even if you could, the camera lens doesn't really work the same way the human eye does. A "lifelike" photo would have a huge field of view but be blurry anywhere except the center focal point, have a huge dynamic range but limited depth of field, etc.
And then after you go from RAW to whatever renderer makes it a JPEG, a bunch more decisions and corrections are made, with or without your input as a photographer/editor.
Even if you shoot with film, there is no "calibration to reality" per se, it's all an artistic choice in terms of body, lens, film, development techniques, etc.
Cameras are not "reality capture devices", they are advanced signal processors that turn a gazillion rays of light into images that match a certain preprogrammed look (or one of several, like "natural", "vivid", "sepia", etc. -- all of those being just different parameters that tweak the algorithms).
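To make that last point concrete, here's a toy sketch (not any vendor's actual pipeline; the style names and parameter values are made up for illustration) of how "picture styles" amount to different parameter sets fed through the same processing code:

```python
def apply_style(rgb, contrast=1.0, saturation=1.0):
    """Map a normalized (0..1) RGB triple through a contrast curve
    and a saturation scale -- the same code path for every style."""
    # simple contrast curve pivoting around mid-grey
    adj = [0.5 + (c - 0.5) * contrast for c in rgb]
    # Rec. 709 luma; scale each channel's distance from it for saturation
    luma = 0.2126 * adj[0] + 0.7152 * adj[1] + 0.0722 * adj[2]
    out = [luma + (c - luma) * saturation for c in adj]
    return [min(1.0, max(0.0, c)) for c in out]

# Different "looks" are just different parameter sets:
STYLES = {
    "natural": dict(contrast=1.0, saturation=1.0),
    "vivid":   dict(contrast=1.2, saturation=1.4),
    "sepia":   dict(contrast=1.0, saturation=0.0),  # plus a tint, omitted here
}

pixel = [0.6, 0.4, 0.3]
for name, params in STYLES.items():
    print(name, apply_style(pixel, **params))
```

Real cameras do far more (demosaicing, noise reduction, tone mapping, sharpening), but the principle is the same: one algorithm, many parameterized looks.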
> A "lifelike" photo would have a huge field of view but be blurry anywhere except the center focal point, have a huge dynamic range but limited depth of field, etc.
The film trope of two overlapping discs telling you that someone is looking through binoculars could be replaced with this whenever a director wants to emphasize some subjective aspect.