I need to save this off so I can copy and paste it :-)
Depending on the phosphor persistence, 30 fps is where high-90-something
percent of people *stop seeing flicker*. That's very different from "30
fps is all the eye can detect." And that's with an interlaced display.
With non-interlaced displays, the high-90% cutoff is 72 fps. That's
supported by studies in Switzerland, and 72 fps VESA displays were widely
advertised in the early 1990s. With film, 24 fps is where most people
stop seeing flicker.
The human eye/brain can detect faster frame rates than that. Some
filmmakers have run tests as high as 70 fps. Audiences reported that the faster frame
rates looked much more realistic. At the highest frame rates, the
incidence of motion sickness while watching roller coaster material
skyrocketed. This is caused by the eye being fooled into thinking it's
seeing real motion while the inner ear is reporting that the person is
sitting still. That discrepancy causes vertigo in many people.
Beyond what the eye sees, a faster frame rate also affects a sim's
response to controls. The higher frame rate allows more precise control.
I've seen reports of some limited tests using Quake where the test
subjects were able to discern the faster frame rate 100% of the time.
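The arithmetic behind that is simple enough to sketch: at 30 fps the sim only
samples input and redraws roughly every 33 ms, so a stick or mouse movement can
sit for most of a frame before it shows up on screen; at 72 fps that worst-case
lag drops to about 14 ms. (Rough numbers, ignoring any extra buffering in the
rendering pipeline.)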
You may be able to test this yourself. Recently, I worked on a flight
sim. In the debug build, frame rates of 30 fps were common with one plane
on the screen. In the release build we saw 70+ fps. The 30 fps looked
pretty good, but 70 fps was noticeably smoother. If you can find a sim
where you can crank up the detail and change the frame rate, you should
be able to see this for yourself.
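If you'd rather measure it than eyeball it, here's a minimal sketch in plain C
(POSIX clock_gettime; render_frame() is just a stand-in for whatever your sim
does each frame) that averages frame time over a window of frames:

#include <stdio.h>
#include <time.h>

/* Stand-in for the sim's per-frame work (rendering, physics, input). */
static void render_frame(void)
{
    volatile double x = 0.0;
    for (int i = 0; i < 100000; i++)
        x += i * 0.5;
}

int main(void)
{
    const int window = 100;          /* frames per measurement window */
    struct timespec start, end;

    clock_gettime(CLOCK_MONOTONIC, &start);
    for (int i = 0; i < window; i++)
        render_frame();
    clock_gettime(CLOCK_MONOTONIC, &end);

    double elapsed = (end.tv_sec - start.tv_sec)
                   + (end.tv_nsec - start.tv_nsec) / 1e9;
    printf("avg frame time: %.2f ms (%.1f fps)\n",
           elapsed / window * 1000.0, window / elapsed);
    return 0;
}

Averaging over a window instead of timing single frames keeps the numbers from
jumping around, which matters when you're trying to compare, say, a 30 fps
debug build against a 70 fps release build.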