I read a study that put people in a dark room with a strobing LED and told them to dart their eyes left and right. 1000 Hz was the limit: below that, people still saw glowing dashes, and above it everyone saw a solid streak of light instead.
I was researching this because I was wondering how fast you can make an LED flicker for lighting effects before it looks like constant brightness.
I found most of the information on Wikipedia[0]. The limit seems to be at about 80 Hz, but combined with movement, some people can see stroboscopic effects up to 10 kHz.
I’m reminded of an old Microsoft input research video, where 1 ms latency is what's needed for the most lifelike response when touch-drawing on a screen: https://m.youtube.com/watch?v=vOvQCPLkPt4
That's definitely not the limit. At 700 Hz, a 700 px wide screen with a 1 px line crossing it every 1 s would qualify as reaching the limit in that situation. But speed the line up so it crosses every 0.5 s, and it's no longer good enough: you've introduced an artifact. The line now moves 2 px per frame, so the object looks like multiple copies, equally spaced with 1 px gaps between them. It's not a smooth blur. The display never displayed a gap, but human eyes merge an afterimage, so we see multiple frames at once over the last ~1/30 s.

Now go to 1400 Hz, but double the speed of the line so it crosses in 1/4 s. Now you need 2800 Hz to eliminate the artifact. Or you can artificially render motion blur, but then it won't look right if your eye follows the line as it crosses. So it's also a function of how fast your eye muscles can move across the screen.

Thirdly, we can't limit ourselves to a 700 px screen: a screen filling the human field of view would need to be 20-30 million pixels wide before one could no longer resolve a 1 px gap between two vertical lines. There is eventually a completely artifact-free limit, but it's way higher than 700 Hz. Of course, 700 Hz is nice, and if you fudge the criteria (how often do you see a 1 px line moving across your field of view at high speed in real life?) you can argue it's good enough.
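The arithmetic above can be sketched as a quick back-of-envelope check. This is just a toy model of the argument as I read it (a 1 px line hops across the screen in discrete per-frame steps; a gap becomes visible whenever the step between frames exceeds 1 px), with made-up function names:

```python
def per_frame_step_px(screen_width_px: float, crossing_time_s: float, refresh_hz: float) -> float:
    """Pixels the line moves between consecutive frames."""
    speed_px_per_s = screen_width_px / crossing_time_s
    return speed_px_per_s / refresh_hz

def min_refresh_hz(screen_width_px: float, crossing_time_s: float, max_step_px: float = 1.0) -> float:
    """Lowest refresh rate keeping the per-frame step at or below max_step_px."""
    return screen_width_px / (crossing_time_s * max_step_px)

# 700 px screen, crossing in 1 s, at 700 Hz: exactly 1 px per frame, no gaps.
print(per_frame_step_px(700, 1.0, 700))   # 1.0
# Same screen, crossing in 0.5 s: 2 px per frame, i.e. 1 px gaps appear.
print(per_frame_step_px(700, 0.5, 700))   # 2.0
# Crossing in 0.25 s needs 2800 Hz to get back down to 1 px steps.
print(min_refresh_hz(700, 0.25))          # 2800.0
```

It also makes the field-of-view point concrete: plugging a wider screen into `min_refresh_hz` scales the required refresh rate linearly with pixel width.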