Gaming?
[QUOTE="Zoolook, post: 550719, member: 21101"] Yes there is a difference, you can tell the difference, and there is a simple reason why. If you play with V-sync disabled, the difference between 50fps and 60 fps is barely noticable, true. However, remember these are averages, so the min and max framerates are more important - if your avg is 50 fps, then your min and max will be 25 - 75 fps for example. However, switching off v-sync causes tearing and artifacting issues, so for the best image, you leave v-sync on. With V-sync 'on', the card will try and sync with your monitor refresh rate (usually 60hz or 60-fps on modern flat panels). So, the card must display the frame-rate at a factor of the refresh rate. So lets say the card can output at 100fps (like the really top end ones do) then your actual on screen refresh rate will be 60fps (the refresh rate) and this will be constant. As soon as the rendering speed drops below 60fps, the output drops to 30fps immediately. If your card does a max of 65 fps and a min of 25 fps (which is realistic), the onscreen fps will be between 60 fps and [B]15[/B] fps (as soon as the rendering dips below 30fps) because it goes to the next lowest factor of 60. This is noticable by most gamers. It even happens on consoles. So assuming your refresh rate is 60hz, these are the actual frame rates you'll see 120 fps + = 60fps 60fps + = 60fps 30fps - 59fps = 30fps 15fps - 29 fps = 15 fps 10 fps - 14 fps = 10fps Of course, unelss you're a hardcore eye-candy lover, you might not care. But when you're talking about gaming rigs, these kinds of things are important and this is the difference between a card costing $120 and one costing $400 and yes, people pay for this. Even more so when playing online, because a sudden slowdown will lose you accuracy and possible a match. [/QUOTE]