The well-known fact you allude to is stated incorrectly, and thus you
misinterpret the statistic.
At 1 fps, the human eye/brain perceives a series of similar images (like a
series of frames in motion picture film) as individual pictures, one per
second. At 10 fps the eye/brain still sees 10 individual images. At 24 or
25 fps the *average* human eye and brain no longer interprets this series
of images as distinct, individual pictures, but as a "motion picture". This
does *not*
mean that the *average* human eye cannot distinguish the difference between
a series of images displayed at 25 fps vs. 70 fps. As you increase the fps,
the eye/brain sees the motion picture as being more fluid (smooth).

A real-world proof that the human eye/brain can detect the difference at
higher fps is the notion of acceptable refresh rates on monitor displays.
I for one can walk through my office and point out which monitors are
running at 60 Hz
(non-interlaced). I see "flicker" on these displays (in fact, I get dizzy
and nauseous within about 30 seconds of looking at these monitors) until I
bump up the refresh to 70 Hz. My eye can distinguish the difference in fps
up to at least 70 fps (which is equivalent to 70 Hz in that the monitor is
redrawing my Windows display 70 frames or cycles per second).

You can prove it to yourself by playing a game at 30, 40, 50, etc. fps, and
tell me if you can see the difference. I bet you will agree that the motion
of moving objects in your game appears "smoother", "more realistic", "less
choppy", however you want to phrase it.
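
Since the whole argument really comes down to frame intervals, here is a
rough sketch (plain C, nothing taken from any real game or driver) that
just prints how long a single frame stays on screen at the rates mentioned
above:

    #include <stdio.h>

    int main(void)
    {
        /* Rates mentioned above: film rates, game fps, monitor Hz. */
        const double rates[] = { 1.0, 10.0, 24.0, 25.0, 30.0, 60.0, 70.0 };
        const int n = (int)(sizeof rates / sizeof rates[0]);

        for (int i = 0; i < n; i++) {
            /* 1000 ms divided by the rate = how long each frame (or
               refresh cycle) is held before the next one replaces it. */
            printf("%5.0f fps/Hz -> %6.2f ms per frame\n",
                   rates[i], 1000.0 / rates[i]);
        }
        return 0;
    }

At 24 fps each frame sits on screen for roughly 42 ms; at 70 Hz the display
is redrawn about every 14 ms. That gap is exactly the smoothness (and
flicker) difference described above.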
Hope this helps,
--
Philip D'Amato
1999 Ninja 500R