Indeed, the eye sees more like an old chemical camera than a computer screen: everything that happens during each "exposure" gets integrated into a single snapshot. This is why motion blur is such a hot topic in gaming graphics.
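To make the camera analogy concrete, here's a minimal sketch (my own toy example, not anything from a real engine) of the usual temporal-supersampling trick: render several sub-frame samples within one exposure interval and average them, the way film integrates light. Positions are 1-D numbers for simplicity; the function names are made up.

```python
def blurred_position_samples(start, velocity, exposure, n_samples):
    """Sample an object's position at evenly spaced instants within
    one exposure interval (a stand-in for rendering sub-frames)."""
    dt = exposure / n_samples
    return [start + velocity * (i + 0.5) * dt for i in range(n_samples)]

def motion_blurred_value(samples):
    # Averaging the sub-frame samples integrates the motion,
    # smearing it into a single "exposed" frame.
    return sum(samples) / len(samples)

# One 60 fps frame of an object moving at 100 units/s:
samples = blurred_position_samples(start=0.0, velocity=100.0,
                                   exposure=1 / 60, n_samples=8)
center = motion_blurred_value(samples)
# The averaged result sits at the midpoint of the motion during
# the frame -- the "smear" a camera would record.
```

The same idea in a renderer just averages whole sub-frame images instead of single numbers.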

(I've played a game recently that had motion blur and looked really nice)
Anyway, in terms of framerate, the eye can actually perceive changes much faster than 26 fps, but even so, a "frame" in terms of your eye is very different from a frame drawn on the computer.
@z-man: Something that was driven home to me relatively recently that you might be interested in: what makes animation work isn't just that the frames are rapidly cycled, it's that they're cycled with a period of darkness between each frame. On a CRT, I guess that's the vertical blank period, when the screen is actually dark (an LCD is different, since it holds the image until the next refresh). So a 60 Hz CRT is effectively showing 120 "frames" per second, but 60 of those frames are pure blackness.
Something about how the eye works (persistence of vision, I believe, though I forget the exact mechanism) means it holds the afterimage of the previous light frame through the dark frame, so the next light frame is superimposed on the afterimage while it fades out.
So, I suspect the noticeable difference between, say, a 60 Hz monitor and a 120 Hz monitor isn't actually the number of visible frames being shown, but the fact that the dark frames come twice as often, each lasting only half as long.
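The timing claim above is easy to check with arithmetic. Here's a tiny sketch assuming a hypothetical 50% duty cycle (half of each refresh lit, half dark); real displays vary, so treat the numbers as illustrative only.

```python
def strobe_timing(refresh_hz, duty_cycle=0.5):
    """Split one refresh period into lit time and dark gap,
    under an assumed (hypothetical) duty cycle."""
    period = 1.0 / refresh_hz   # seconds per refresh
    lit = period * duty_cycle   # time the frame is visible
    dark = period - lit         # the blanking gap
    return lit, dark

lit60, dark60 = strobe_timing(60)
lit120, dark120 = strobe_timing(120)

# At 120 Hz there are twice as many dark gaps per second, but each
# is half as long -- total darkness per second is identical, and
# only the flicker frequency changes.
assert abs(dark60 - 2 * dark120) < 1e-12
```

So the total light reaching the eye is the same either way; what changes is how finely the darkness is chopped up, which is exactly why the higher rate flickers less visibly.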
So I guess that, theoretically, if you had a strobe light flashing 60 times per second with complete darkness in between, your eyes might not be able to tell the difference between that and animation?
Edit: I think that's also why, in a movie theater, the screen looks like a smooth image, but if you turn around and look at the projector you see the light flickering.