Your eye has a hard time perceiving much difference past 40-50fps. Most monitors (LCDs especially) don't refresh more than 60 times a second either (as I understand it). So any FPS beyond the 60 mark doesn't have any visual effect on such a display.
That said, the more FPS you can generate, the faster the game responds to your input. This is generally what people mean by the "feel" of a game. Many people can tell the difference between 45fps and 80fps, primarily in how quickly it responds to input (see the rough arithmetic below). This is why many guides will tell you to turn off V-SYNC options.
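A quick back-of-the-envelope sketch of that "feel" difference, assuming nothing fancier than frame time = 1000 / FPS (the real input-to-photon latency has more pieces, but frame time is the part FPS controls):

```python
# Frame time is the gap between frames, i.e. the shortest step at which
# your input can show up on screen. Higher FPS -> smaller gap.
for fps in (30, 45, 60, 80, 120):
    frame_time_ms = 1000.0 / fps
    print(f"{fps:>3} fps -> {frame_time_ms:5.1f} ms per frame")
```

Going from 45fps (~22ms per frame) to 80fps (~12.5ms per frame) shaves roughly 10ms off every frame, which is the kind of difference people report feeling even when they can't see it.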
V-SYNC caps your game's FPS at the refresh rate of your monitor, so for most LCDs that would be 60fps. Your computer may be capable of running a game at 300fps, but the extra frames are discarded. This ensures that each refresh tick of your monitor shows a complete, non-malformed image. If you generate more FPS than your monitor can handle, "tearing" can occur, where part of one frame is drawn on top of another frame, resulting in odd graphical artifacts. This isn't much of an issue in games where most of the screen doesn't change, but in fast-moving FPS games it can be distracting.
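To make the capping behaviour concrete, here's a minimal sketch of a software frame limiter. This is not how v-sync is actually implemented (the driver blocks the buffer swap until the monitor's next refresh, which is what prevents tearing), but it shows the same effect on FPS; the 60Hz refresh rate and the do-nothing `render_frame` are assumptions for illustration:

```python
import time

REFRESH_HZ = 60                 # assumed monitor refresh rate
FRAME_BUDGET = 1.0 / REFRESH_HZ

def run_capped(render_frame, seconds=1.0):
    """Render frames for `seconds`, sleeping out the rest of each frame budget.

    Real v-sync instead stalls the buffer swap until the display's refresh;
    this loop only imitates the resulting FPS cap.
    """
    frames = 0
    end = time.perf_counter() + seconds
    while time.perf_counter() < end:
        start = time.perf_counter()
        render_frame()                       # pretend this draws the frame
        elapsed = time.perf_counter() - start
        if elapsed < FRAME_BUDGET:
            time.sleep(FRAME_BUDGET - elapsed)
        frames += 1
    return frames

# Even if render_frame() could run at 300fps, the cap holds it near 60.
print(run_capped(lambda: None), "frames in one second")
```

With the cap removed, the GPU would keep swapping buffers mid-refresh, which is exactly where the tearing comes from.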
Personally, I usually turn on v-sync so that everything looks pretty.
But then, I don't play competitive FPS games much.
edit:
Just some additional thoughts:
TV is broadcast at a rate of 30 frames per second. Film is shown at a rate of 24 frames per second. The comparison isn't a direct linear relationship, but it does give a decent guideline as to why you might notice differences between frame rates, and why 25fps or less is irritating.
Apparently many incandescent light bulbs also run on 60Hz mains power. Interesting.