The human eye can technically detect changes at up to 1000 fps under the right conditions, but the commonly "accepted" figure is closer to 150 fps.
Human eyes don't see in distinct "frames", though; depending on the person and the situation, some people can detect extremely quick visual changes (something on the order of 1/1200th of a second).
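To put rough numbers on that, here's a minimal sketch comparing how long a single frame sits on screen at common framerates against the ~1/1200 s figure quoted above. The threshold is just the number from the comment, not a measured value; real perception varies by person and stimulus.

```python
# How long one frame stays on screen at common framerates, versus the
# ~1/1200 s flash-detection figure quoted above. Illustrative only:
# the threshold is taken from the comment, not from a study.

flash_threshold_s = 1 / 1200  # ~0.83 ms

for fps in (24, 30, 48, 60, 120):
    frame_time_ms = 1000 / fps
    ratio = frame_time_ms / (flash_threshold_s * 1000)
    print(f"{fps:>3} fps -> {frame_time_ms:5.1f} ms per frame "
          f"({ratio:.0f}x the flash threshold)")
```

Even at 120 fps, a frame is on screen for roughly 8 ms, ten times longer than that flash threshold, which is why "how fast can we detect a change" and "what framerate looks smooth" are not the same question.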
How many frames we can technically perceive versus how many are needed to create the illusion of motion are two different questions. In the context of games (like my comment), 30 fps is more than enough to do this. So I don't understand why people believe 60 will look better. I have seen games run at both, and it's almost impossible to tell the difference.
It's certainly not "impossible" to see the difference between 30 fps and 60 fps; the difference is huge. Look at all the complaints about The Hobbit when it went from 24 fps to 48 fps: the motion was much smoother, too smooth for some people. The only reason we consider such low framerates "ok" is that they're either blurred to hell and back, or we're just used to them, as is the case with games on consoles; we've had the last 7-8 years to get used to them struggling to hit 30 fps.
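For a sense of why 30 fps and 60 fps read so differently without motion blur, here's a back-of-the-envelope sketch: an object crossing the screen in one second jumps twice as many pixels between frames at 30 fps as at 60 fps, and it's that gap between successive positions that blur normally smears over. The screen width and speed below are hypothetical values picked purely for illustration.

```python
# Per-frame displacement of a moving object. The bigger the jump between
# successive frames, the more visible the judder unless motion blur
# fills the gap. Screen width and crossing time are made-up numbers.

screen_width_px = 1920
crossing_time_s = 1.0
speed_px_per_s = screen_width_px / crossing_time_s

for fps in (24, 30, 48, 60):
    jump_px = speed_px_per_s / fps
    print(f"{fps:>2} fps: object jumps {jump_px:5.1f} px between frames")
```

At 30 fps the object skips 64 px between frames versus 32 px at 60 fps, which is exactly the kind of difference film hides with motion blur and games without it do not.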