... and who can really, honestly tell the difference between 40fps, 50fps, or 60fps without the aid of software to calculate it?
Yet, there are gamers that claim they can.... and drop the cash for a graphics card that says it can do it.
A large portion of gamers buy hardware on specs alone. It makes no difference how much better the specs actually are, or whether the difference is even noticeable: if the box says the numbers are "higher", the card is "better", and they buy it.
Yes there is a difference, you can tell the difference, and there is a simple reason why.
If you play with V-sync disabled, the difference between 50 fps and 60 fps is barely noticeable, true. However, remember these are averages, so the minimum and maximum frame rates matter more: an average of 50 fps might mean dips to 25 fps in heavy scenes and peaks of 75 fps in light ones. Switching off V-sync also causes tearing and artifacting, so for the best image you leave V-sync on.
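As a rough illustration of why the average hides the swings, here is a sketch with made-up per-frame render times (all numbers hypothetical):

```python
# Hypothetical frame times in milliseconds - a few heavy frames mixed in.
frame_times_ms = [12.5, 14.0, 40.0, 13.0, 35.0, 12.0, 16.0, 38.0]

# Instantaneous fps of each frame, and the true average over the whole run.
fps = [1000.0 / t for t in frame_times_ms]
avg_fps = len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)

print(f"min {min(fps):.0f} fps, avg {avg_fps:.0f} fps, max {max(fps):.0f} fps")
# -> min 25 fps, avg 44 fps, max 83 fps
```

A "44 fps average" sounds smooth, but the run still contains 25 fps stutters.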
With V-sync on, the card tries to sync with your monitor's refresh rate (usually 60 Hz, i.e. 60 fps, on modern flat panels). The card can then only display frames at the refresh rate divided by a whole number. So let's say the card can render at 100 fps (like the really top-end ones do): your actual on-screen rate will be a constant 60 fps (the refresh rate). But as soon as the rendering speed drops below 60 fps, the output immediately drops to 30 fps, because each frame now takes longer than one refresh interval and so stays on screen for two.
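That halving can be sketched with a simplified model of double-buffered V-sync (the 60 Hz refresh rate and the frame times below are assumptions for illustration, not any specific driver's behavior): a finished frame is held until the next refresh, so a frame that takes longer than one 16.7 ms interval occupies at least two intervals on screen.

```python
import math

REFRESH_HZ = 60                      # assumed monitor refresh rate
interval_ms = 1000.0 / REFRESH_HZ    # one refresh interval, ~16.7 ms

def displayed_fps(render_ms):
    # Number of whole refresh intervals each frame occupies on screen.
    intervals = math.ceil(render_ms / interval_ms)
    return REFRESH_HZ / intervals

print(displayed_fps(10.0))  # renders faster than 60 fps -> shown at 60 fps
print(displayed_fps(17.0))  # just under 60 fps rendering -> snaps to 30 fps
```

Note the cliff: rendering at 59 fps and rendering at 31 fps both display at 30 fps.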
If your card does a max of 65 fps and a min of 25 fps (which is realistic), the on-screen rate will swing between 60 fps and 20 fps, because V-sync snaps the output down to the next lower step of the form 60/n (60, 30, 20, 15, 12, 10, ...). This is noticeable by most gamers. It even happens on consoles.
So assuming your refresh rate is 60 Hz, these are the actual frame rates you'll see:
60 fps and above = 60 fps
30 - 59 fps = 30 fps
20 - 29 fps = 20 fps
15 - 19 fps = 15 fps
12 - 14 fps = 12 fps
10 - 11 fps = 10 fps
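The whole table follows from one rule: the displayed rate is the refresh rate divided by the number of refresh intervals each frame takes to render. A quick sketch (Python, 60 Hz assumed; again a simplified model of double-buffered V-sync, not an exact description of any particular card):

```python
import math

REFRESH = 60  # Hz; assumed monitor refresh rate

def vsync_fps(render_fps):
    # Display rate snaps down to REFRESH / n for a whole number n:
    # 60, 30, 20, 15, 12, 10, ...
    if render_fps >= REFRESH:
        return REFRESH
    return REFRESH / math.ceil(REFRESH / render_fps)

for r in (120, 60, 45, 25, 14, 10):
    print(f"{r:>3} fps rendered -> {vsync_fps(r):.0f} fps displayed")
```

Running it reproduces the steps above: 45 fps rendered shows as 30, 25 fps shows as 20, and 14 fps shows as 12.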
Of course, unless you're a hardcore eye-candy lover, you might not care. But when you're talking about gaming rigs, these kinds of things are important: this is the difference between a card costing $120 and one costing $400, and yes, people pay for it. Even more so when playing online, because a sudden slowdown will cost you accuracy and possibly the match.