Ok, so we agree the things that were limitations could also be used for artistic effects. (I technically disagree about the avoidance: don't buy something if you're not going to bother learning how to use it.)
Then, let's look at the reproduction part only.
Technically, it might work. More choice in reproduction. But maybe in the real world, movie theaters (and certainly in the television world!) would crank up the brightness and edge sharpening and frame interpolation to make the director's artistic material look totally horrible.
If you think about the whole life cycle of any kind of art delivered to some audience, it's lined with these huge pitfalls at every point.
Theaters certainly could screw it up. Wouldn't be the first time. But maybe they wouldn't! I have no problem acknowledging and discussing potential downsides. But blanket statements that better reproduction will provide a worse experience don't seem to be supported by the facts, and it bugs me.
"Better reproduction" probably means very different things to different people.
Likely a lot would depend on good defaults and good training. If you were as skeptical about people and organizations as I am, you would assume it could on average worsen many movies. The original Murphy's Law and all that. :)
I'd accept "on average." The original comment up there didn't even say that. It just said, blanket statement, high resolution would be worse, and high framerates would be worse still.
Maybe I'm obsessive-compulsive about it.