Your thoughts on vertical sync?



I've enabled it in every game I've played since the early 2000s, purely because screen tearing looks really awful and disabling vsync does absolutely nothing to increase my frame-rates.

 

I've been to several LAN parties recently and many of the gamers there swear by keeping vsync off - and they somehow don't notice the screen tearing even when it's very visible to me on their screens. They also suggest turning vsync off if someone gets a lower-than-60 frame-rate, which, again, makes no sense to me. Disabling it has never increased my frame-rate in such circumstances.

 

Maybe vsync mattered more in the CRT days? (I can't remember; it's been almost 10 years since I had a CRT monitor.)

 

Thoughts?


Well, IMO it varies from game to game. In Hitman: Absolution it brings my fairly old system to a crawl; in FIFA 2013 the game is unplayable from the stuttering (and mind you, my system is way above the recommended requirements). A contrary example would be DmC (Devil May Cry), where it apparently does nothing to performance, and the same applies to games using the Unreal Engine.


I keep it off because it's a double-edged sword. It can haunt you at the worst possible moment: the most intense (and important) part of the game.

 

Enabling triple-buffering solves the tearing problem much better imho.


It was definitely a CRT thing. I never turn it on anymore unless the game sets it as the default.


I agree, the performance hit seems to vary by game, and probably by hardware. In the end it's one of those matters of preference: some people are OK with low-quality stereos or cheap food as a trade-off to save money. The same applies to gaming. That being said, screen tearing really annoys me too!


If your PC consistently outputs 60+ FPS and the game isn't some online twitch fest, you could leave it on. There are also some games (or just game menus) that are not graphically demanding but lack a frame limiter, so the GPU ends up rendering hundreds of frames per second and overheating in the process (e.g. StarCraft II's main menu at launch). Outside of these cases, I keep it off because I don't really notice tearing. I probably got used to it after using weak GPUs in the past, with which vsync was not feasible.
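A frame cap for cases like that is cheap to bolt on, by the way. Here's a minimal Python sketch of the idea; the render_frame callback is just an illustrative placeholder, not any real engine's API:

```python
import time

TARGET_FPS = 60
FRAME_BUDGET = 1.0 / TARGET_FPS  # ~16.7 ms per frame at 60 Hz

def run_capped(render_frame):
    """Render loop with a crude frame limiter: sleep away whatever is left
    of the frame budget so the GPU isn't redrawing a static menu hundreds
    of times per second."""
    while True:
        start = time.perf_counter()
        render_frame()                      # placeholder for the game's draw call
        elapsed = time.perf_counter() - start
        if elapsed < FRAME_BUDGET:
            time.sleep(FRAME_BUDGET - elapsed)
```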


Every time I've turned it on, the mouse feels so sluggish (I only play FPS games), so I keep it off. I've never really had much of a problem with tearing either.


If you have an Nvidia card you can enable adaptive vsync in the driver control panel, then leave vsync off in-game. I've found that to be an acceptable compromise. More recently I usually just do whatever the Nvidia tuner tells me to do. :laugh:


It all depends on the game itself and just how bad the tearing is, so I would say it varies from game to game.

IIRC, the Unreal Engine tears like a mofo, so I tend to enable it for all Unreal-based games.

I know, for example, that on BioShock Infinite I definitely enabled it.

 

For a game like BF3 multiplayer, I disable it so I can get more than 60 FPS and have an absolutely smooth experience. With multiplayer I've always believed the more frames the merrier. I know people say your eye cannot process more, but I believe it also comes down to processing what is happening in-game, not just the visuals, so I disable it for most multiplayer games.

 

I also never enable it if I am not getting a constant 60 FPS beforehand. All you are doing then is forcing the game to sync to a frame rate you are not able to achieve, so it ends up skipping lots of frames to compensate.

 

So yeah, it all really depends on the game. I do not have a set approach; I adjust for each game. I am a bit OCD like that, though: with PC gaming I usually take a good 15-30 minutes to find my optimal graphics settings. I will literally enable each option one at a time, fire up an app like FRAPS or EVGA Precision and note the frame rate I get, and/or use the built-in benchmark if the game has one. So yeah, I tend to be a little OCD about balancing optimal frame rate against optimal visuals.


If you can get 60+ FPS constantly (on a 60Hz Monitor), turn it on.

 

Otherwise, with v-sync you will not see 55 FPS; it will dip your FPS straight to 30.

I disabled it in BioShock Infinite for that reason. The game plays fine without it; with it, a lot of the time things were laggy due to v-sync dropping to 30.
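To put rough numbers on that claim (a toy calculation, assuming a strict double-buffered swap with no render-ahead queue, which is the worst case): a frame that takes even slightly longer than one 16.7 ms refresh has to wait for the next vblank, so it sits on screen for two refresh intervals and the effective rate snaps to 30:

```python
import math

REFRESH_HZ = 60
REFRESH_MS = 1000.0 / REFRESH_HZ      # 16.67 ms between vblanks

render_ms = 1000.0 / 55               # ~18.2 ms: a "55 FPS" workload
# The finished frame can only be shown at a vblank, so it stays on screen
# for a whole number of refresh intervals.
intervals = math.ceil(render_ms / REFRESH_MS)   # -> 2
print(REFRESH_HZ / intervals)                    # -> 30.0
```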


If you can get 60+ FPS constantly (on a 60Hz Monitor), turn it on.

 

Otherwise, with v-sync you will not see 55 FPS; it will dip your FPS straight to 30.

I disabled it in BioShock Infinite for that reason. The game plays fine without it; with it, a lot of the time things were laggy due to v-sync dropping to 30.

Are you sure? I've never witnessed a game that hovers between 55-65 FPS jump between 60 FPS and 30 FPS constantly.


I've always kept it on.  Never had a reason not to.  Will have to try shutting it off and see if it changes anything.


This guy did an excellent job explaining how Vsync works.

I have never witnessed my frame rate halve when I dip below 60 FPS by even one frame with vsync enabled. It just doesn't happen. Not on my systems, at least.


I play games in borderless windowed mode whenever possible, which gives me tear-free output through DWM with very little input lag (and nice, instantaneous alt-tabbing).


I play games in borderless windowed mode whenever possible, which gives me tear-free output through DWM with very little input lag (and nice, instantaneous alt-tabbing).

Doesn't the experience and reduce performance?


If you have an Nvidia card you can enable adaptive vsync in the driver control panel, then leave vsync off in-game. I've found that to be an acceptable compromise. More recently I usually just do whatever the Nvidia tuner tells me to do. :laugh:

I just got that... and I'm kind of impressed. (I don't think leaving it off in-game has any effect if it's on in the CP, btw.)

 

Since we're quoting HardOCP anyway, here's the review of it: http://www.hardocp.com/article/2012/04/16/nvidia_adaptive_vsync_technology_review/


Doesn't the experience and reduce performance?

Not any more than vsync would; in fact, in a lot of games it seems to perform slightly better than the in-game vsync option for me.

 

And I'm not quite sure what you mean by "the experience"; perhaps you accidentally a word :). Anyway, I think I know what you meant, so let me clarify borderless windowed mode: *borderless* windowed mode looks identical to playing a game in fullscreen. It's a window that takes up the *full screen* with no visible window borders, so it combines the advantages of fullscreen and windowed mode. Unfortunately not all games support it, but many do these days (all recent Source games do, for example).
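For anyone curious what "borderless windowed" amounts to mechanically, here's a tiny illustrative sketch using pygame/SDL (none of the games mentioned here use this; it's just the simplest way I can show the idea): an ordinary window, sized to the desktop, positioned at the top-left, with the frame decoration turned off, and the desktop compositor handling presentation.

```python
import os
import pygame

# Ask SDL to place the window at the top-left corner before it is created.
os.environ["SDL_VIDEO_WINDOW_POS"] = "0,0"

pygame.init()
info = pygame.display.Info()

# An ordinary window sized to the desktop with no border or title bar:
# it covers the screen like exclusive fullscreen would, but the desktop
# compositor (DWM on Windows) keeps compositing it, which is what gives
# the tear-free output and the instant alt-tab.
screen = pygame.display.set_mode((info.current_w, info.current_h), pygame.NOFRAME)
```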


Not any more than vsync would; in fact, in a lot of games it seems to perform slightly better than the in-game vsync option for me.

 

And I'm not quite sure what you mean by "the experience"; perhaps you accidentally a word :). Anyway, I think I know what you meant, so let me clarify borderless windowed mode: *borderless* windowed mode looks identical to playing a game in fullscreen. It's a window that takes up the *full screen* with no visible window borders, so it combines the advantages of fullscreen and windowed mode. Unfortunately not all games support it, but many do these days (all recent Source games do, for example).

Well, playing Dead Space with window borders would ruin the "ambience", but you've clarified it now.


The problem is that vsync can be implemented in various ways and so when people say "vsync does this" and "vsync causes this problem" they never achieve any consensus because they're all talking about different things.

 

I only disable vsync when it's implemented using double buffering and my video card cannot keep a steady 60 FPS. This causes your framerate to jump to whole divisors of your monitor's refresh rate, so if it's 60, it'll jump between 60, 30, 20, 15, etc. Software like Fraps will still report intermediate framerates like 45 because it averages over several frames, but the game feels jerky and unresponsive. It's worse than playing at a consistently low framerate. Game developers who still use double-buffered vertical synchronisation should be fired. Nvidia's Adaptive Vsync is a hack around the issue but doesn't really solve the problem.
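If it helps, here's a toy Python model of that behaviour (my own made-up frame costs on a 60 Hz display, assuming strict double buffering): an averaged counter reads somewhere around 45 FPS even though every individual frame on screen is either a 16.7 ms or a 33.3 ms frame, which is exactly the jerkiness I'm describing.

```python
import math
import random

REFRESH_MS = 1000.0 / 60.0            # 16.67 ms per refresh on a 60 Hz panel

def intervals_on_screen(render_ms):
    # Strict double buffering: the finished frame waits for the next vblank,
    # so it is displayed for a whole number of refresh intervals.
    return max(1, math.ceil(render_ms / REFRESH_MS))

random.seed(1)
# Made-up workload hovering around one refresh (raw cost 14-18 ms per frame).
costs = [random.uniform(14.0, 18.0) for _ in range(600)]
display_ms = [intervals_on_screen(c) * REFRESH_MS for c in costs]

avg_fps = 1000.0 * len(display_ms) / sum(display_ms)
print("averaged counter: %.0f FPS" % avg_fps)                      # roughly 45
print("actual frame times:", sorted(set(round(t, 1) for t in display_ms)))
# -> only 16.7 and 33.3 ms frames: the average looks fine, the pacing doesn't.
```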

 

Tearing is not a "CRT problem". Tearing can and does happen on any monitor when your video card is outputting frames at an interval that is not your monitor's refresh rate. Tearing is completely intolerable in this day and age. It's better than enduring double-buffered vsync, granted, but that's like saying it's better to watch a stream at 240p than having it lag constantly. Both are terrible experiences.

 

There is a proper way to implement vsync and that is triple-buffered vsync. Unfortunately, most game developers use a render ahead queue instead and call that "triple-buffered vsync", which leads to gamers thinking that triple-buffered vsync sucks and introduces tons of input lag. Render ahead queues suck. From what I understand, it's what Direct3D does by default so it's pretty much what most game engines do, but I'm not an expert and it's hard to get a clear answer from game developers. I'm not sure many of them know the intricacies of this, which is a shame because it's such a fundamental issue.

 

In any case, "triple-buffered vsync" solves the jerkiness of double-buffered vsync, so it leads to smooth framerates that are synced to monitor refreshes. In the proper "ping-pong" implementation, it doesn't add any more input latency than the theoretical minimum required to sync frames with monitor refreshes; in the suckass "render ahead queue" implementation, every additional buffer adds a frame of latency. This is the setting you see in the Nvidia control panel as "maximum pre-rendered frames". Try to keep that low. I believe the "enable triple-buffering" setting enables proper ping-pong TBVsync in OpenGL games; I tried it with Doom 3 and I felt it didn't introduce any latency. I'd love confirmation on this.
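As a back-of-the-envelope illustration of that latency difference (a toy model with my own numbers, assuming the GPU is comfortably outrunning a 60 Hz display):

```python
REFRESH_MS = 1000.0 / 60.0   # one 60 Hz refresh

def render_ahead_extra_latency(queued_frames):
    """FIFO render-ahead queue: when the GPU outruns the display, finished
    frames pile up and are shown in order, so every queued frame is one
    extra refresh old by the time it reaches the screen."""
    return queued_frames * REFRESH_MS

def ping_pong_extra_latency():
    """'Ping-pong' triple buffering: the GPU keeps overwriting the older
    back buffer and the vblank flips to the newest completed frame, so
    nothing queues up; the only wait is the sync to the next vblank."""
    return (0.0, REFRESH_MS)   # between zero and one refresh

for depth in (1, 2, 3):        # think "maximum pre-rendered frames" = 1..3
    print("render-ahead depth %d: +%4.1f ms" % (depth, render_ahead_extra_latency(depth)))
print("ping-pong triple buffering: +%.1f to +%.1f ms" % ping_pong_extra_latency())
```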

 

I can see competitive gamers, especially in first-person shooters, disabling "render ahead queue" vsync because they cannot afford a few frames of latency and are ready to pay the price of tearing. This is a shame, because vsync doesn't have to introduce latency; it only does so because it's usually implemented poorly.


I can't stand screen tearing. How on earth it's still an acceptable part of display technology in this day and age, I will never know. It should have been designed into oblivion years ago.


I can't stand screen tearing. How on earth it's still an acceptable part of display technology in this day and age, I will never know. It should have been designed into oblivion years ago.

 

I think Oblivion has vsync and so does Skyrim.


The problem is that vsync can be implemented in various ways and so when people say "vsync does this" and "vsync causes this problem" they never achieve any consensus because they're all talking about different things. [...]

This should be pinned and/or frontpaged!


The problem is that vsync can be implemented in various ways and so when people say "vsync does this" and "vsync causes this problem" they never achieve any consensus because they're all talking about different things. [...]

So do you think Windows Aero/DWM may be using "proper" triple-buffered vsync? Because I've noticed that when playing in borderless windowed mode I usually get no additional input lag, but when using the in-game vsync (double or "triple") I usually get a noticeable jump in input lag.


This topic is now closed to further replies.