Sure. And that is also perfectly consistent with treating pixels as dimensionless points forming an image, which in practice they are far more than they are simply squares.
Indeed, an edge that goes from 0 to 100 can be considered part of a wave with twice the frequency but the same amplitude, compared with an edge that goes from 0 to 200. That, by the way, is why increasing the contrast in an image, especially micro-contrast, in practice increases resolution.
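A minimal sketch of that frequency/amplitude relationship, reading the 0-100 / 0-200 figures as transition widths in pixels and assuming the transition is shaped like half a cosine period (both the reading and the shape are my assumptions, not the comment's):

    import numpy as np

    def cosine_edge(width):
        # A 0 -> 1 edge whose transition is the rising half of a cosine wave
        # with period 2 * width, i.e. frequency 1 / (2 * width) cycles per pixel
        # and a fixed amplitude of 0.5 about a mean level of 0.5.
        x = np.linspace(0.0, width, int(width) + 1)
        return 0.5 * (1.0 - np.cos(np.pi * x / width))

    for width in (200, 100):                 # halving the transition width...
        freq = 1.0 / (2 * width)             # ...doubles the underlying wave frequency
        profile = cosine_edge(width)
        print(f"transition over {width}px -> frequency {freq:.4f} cycles/px, "
              f"swing {profile.max() - profile.min():.1f} (amplitude unchanged)")

Steepening the edge (what micro-contrast enhancement does locally) narrows the transition, which in this picture means pushing the same amplitude to a higher frequency.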
This is supposed to be an edge with nearly perfect sharpness.
If you take a single point sample, then slowly moving objects will appear to jump an entire pixel at a time, looking awful.
If you antialias, then movement will look smooth, but you'll also notice that when the edge aligns to the pixel grid it preserves its sharpness better.
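A minimal sketch of the difference, using a 1D white-on-black edge and a simple box filter standing in for the antialiasing (the function and numbers are illustrative, not from the comments above):

    import numpy as np

    def sample_edge(edge_pos, n=8, antialias=False):
        # Render a 1D bright-on-dark edge sitting at subpixel position edge_pos.
        # Point sampling: each pixel takes the value at its centre.
        # Antialiasing: each pixel takes the average over its 1px-wide footprint
        # (a box filter), i.e. the coverage of the bright side.
        centres = np.arange(n) + 0.5
        if antialias:
            # fraction of each pixel's [i, i+1) footprint lying past the edge
            return np.clip(np.arange(1, n + 1) - edge_pos, 0.0, 1.0)
        return (centres >= edge_pos).astype(float)

    for pos in (3.0, 3.25, 3.5, 3.75, 4.0):   # edge drifting by quarter pixels
        point = sample_edge(pos)
        aa = sample_edge(pos, antialias=True)
        print(f"edge at {pos:4.2f}  point={point}  antialiased={np.round(aa, 2)}")

With point sampling the rendered edge snaps a whole pixel at a time as it drifts; with the box filter it moves smoothly, and at integer positions the antialiased edge is exactly as sharp as the point-sampled one.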
You have to be really careful when you're applying wave equations to resolution, especially when declaring that a certain number of samples fully captures an image.
If you want to display a perfect image with point samples, you may need to go as far as 10x the 'retina' density.
https://en.wikipedia.org/wiki/Hyperacuity_(scientific_term)
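For scale, a back-of-envelope version of that 10x figure, with assumed numbers (not taken from this thread) for the usual 'retina' criterion and for vernier hyperacuity:

    # Rough, assumed values: ~1 arcminute per pixel for the common 'retina'
    # criterion, and a few arcseconds for vernier hyperacuity thresholds.
    retina_pixel_pitch_arcsec = 60.0
    vernier_threshold_arcsec = 5.0
    oversampling = retina_pixel_pitch_arcsec / vernier_threshold_arcsec
    print(f"edge-position hyperacuity suggests roughly {oversampling:.0f}x denser point samples")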