
Your thoughts on vertical sync?


27 replies to this topic

#16 ViperAFK

ViperAFK

    Neowinian Senior

  • Tech Issues Solved: 2
  • Joined: 07-March 06
  • Location: Vermont

Posted 04 July 2013 - 17:20

I play games in borderless windowed mode whenever possible, which gives me tear-free output through DWM with very little input lag (and nice, instantaneous alt-tabbing).




#17 Draconian Guppy

Draconian Guppy

    LippyZillaD Council

  • Tech Issues Solved: 2
  • Joined: 22-August 04
  • Location: Neowin

Posted 04 July 2013 - 18:48

I play games in borderless windowed mode whenever possible, which gives me tear-free output through DWM with very little input lag (and nice, instantaneous alt-tabbing).

Doesn't that ruin the experience and reduce performance?



#18 TheExperiment

TheExperiment

    Reality Bomb

  • Tech Issues Solved: 1
  • Joined: 11-October 03
  • Location: Everywhere
  • OS: 8.1 x64

Posted 04 July 2013 - 18:58

If you have an Nvidia card you can enable adaptive v-sync, then leave it off in game. I've found that to be an acceptable compromise. More recently I usually just do whatever the Nvidia tuner tells me to do.  :laugh:

I just got that... and I'm kind of impressed. (I don't think leaving it off in game has any effect if it's on in the control panel, btw.)

 

Since we're quoting HardOCP anyway, here's their review of it - http://www.hardocp.c...hnology_review/



#19 Riva

Riva

    Neowinian

  • Tech Issues Solved: 1
  • Joined: 11-February 07

Posted 04 July 2013 - 19:17

I always keep it on since I get a lot of tearing otherwise. Maybe it's resolution-dependent.



#20 ViperAFK

ViperAFK

    Neowinian Senior

  • Tech Issues Solved: 2
  • Joined: 07-March 06
  • Location: Vermont

Posted 04 July 2013 - 19:34

Doesn't that ruin the experience and reduce performance?

Not any more than vsync would; in fact, in a lot of games it seems to perform slightly better than the in-game vsync option for me.

 

Not quite sure exactly what you mean by "the experience" :), but I think I know what you meant, so let me clarify about borderless windowed mode: *borderless* windowed mode looks identical to playing a game in fullscreen. It's a window that takes up the *full screen* with no visible window borders, so it combines the advantages of fullscreen and windowed mode. Unfortunately not all games support it, but many do these days (all recent Source games do, for example).
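
To make that concrete, here's a minimal sketch (just an illustration, using pygame rather than any particular engine) of a borderless window sized to the desktop; the compositor (DWM) then handles presentation instead of an exclusive fullscreen mode:

```python
# Minimal borderless "windowed fullscreen" sketch using pygame (illustrative only).
import pygame

pygame.init()
info = pygame.display.Info()                      # desktop resolution, queried before set_mode
screen = pygame.display.set_mode(
    (info.current_w, info.current_h),             # window covers the whole desktop
    pygame.NOFRAME                                # no title bar or borders
)
# From here the game draws to `screen` as usual; DWM composites the window,
# so you get tear-free output without exclusive fullscreen mode switches.
```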



#21 Draconian Guppy

Draconian Guppy

    LippyZillaD Council

  • Tech Issues Solved: 2
  • Joined: 22-August 04
  • Location: Neowin

Posted 04 July 2013 - 19:41

Not any more than vsync would; in fact, in a lot of games it seems to perform slightly better than the in-game vsync option for me.

 

Not quite sure exactly what you mean by "the experience" :), but I think I know what you meant, so let me clarify about borderless windowed mode: *borderless* windowed mode looks identical to playing a game in fullscreen. It's a window that takes up the *full screen* with no visible window borders, so it combines the advantages of fullscreen and windowed mode. Unfortunately not all games support it, but many do these days (all recent Source games do, for example).

Well, playing Dead Space with borders would ruin the ambience, but you've clarified that now.



#22 Andre S.

Andre S.

    Asik

  • Tech Issues Solved: 10
  • Joined: 26-October 05

Posted 04 July 2013 - 20:02

The problem is that vsync can be implemented in various ways and so when people say "vsync does this" and "vsync causes this problem" they never achieve any consensus because they're all talking about different things.

 

I only disable vsync when it's implemented using double buffering and my video card cannot keep a steady 60 fps. This causes your framerate to jump between whole divisors of your monitor's refresh rate, so if it's 60, it'll jump between 60, 30, 20, 15, etc. Software like Fraps will still report intermediate framerates like 45 because it averages over several frames, but the game feels jerky and unresponsive. It's worse than playing at a consistently low framerate. Game developers who still use double-buffered vertical synchronisation should be fired. Nvidia's Adaptive Vsync is a hack around the issue but doesn't really solve the problem.
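
To see why the framerate snaps to those values, here's a rough back-of-the-envelope sketch (my own illustration, not from any benchmark): with double buffering, a frame that misses a refresh waits for the next one, so every frame occupies a whole number of refresh intervals.

```python
import math

def effective_fps(render_ms, refresh_hz=60):
    frame_budget_ms = 1000.0 / refresh_hz                 # ~16.7 ms per refresh at 60 Hz
    intervals = math.ceil(render_ms / frame_budget_ms)    # whole refreshes per frame
    return refresh_hz / intervals

for ms in (10, 17, 25, 40, 70):
    print(f"{ms} ms/frame -> {effective_fps(ms):.0f} fps")
# 10 -> 60, 17 -> 30, 25 -> 30, 40 -> 20, 70 -> 12: nothing in between.
```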

 

Tearing is not a "CRT problem". Tearing can and does happen on any monitor when your video card is outputting frames at an interval that is not your monitor's refresh rate. Tearing is completely intolerable in this day and age. It's better than enduring double-buffered vsync, granted, but that's like saying it's better to watch a stream at 240p than having it lag constantly. Both are terrible experiences.

 

There is a proper way to implement vsync and that is triple-buffered vsync. Unfortunately, most game developers use a render ahead queue instead and call that "triple-buffered vsync", which leads to gamers thinking that triple-buffered vsync sucks and introduces tons of input lag. Render ahead queues suck. From what I understand, it's what Direct3D does by default so it's pretty much what most game engines do, but I'm not an expert and it's hard to get a clear answer from game developers. I'm not sure many of them know the intricacies of this, which is a shame because it's such a fundamental issue.

 

In any case, "triple-buffered vsync" solves the jerkiness of double-buffered vsync, so it leads to smooth framerates that are synced to monitor refreshes. In the proper "ping-pong" implementation, it doesn't add any more input latency than the theoretical minimum required to sync frames with monitor refreshes; in the suckass "render-ahead queue" implementation, every additional buffer adds a frame of latency. This is the setting you see in the Nvidia control panel as "maximum pre-rendered frames". Try to keep that low. I believe the "enable triple buffering" setting enables proper ping-pong triple-buffered vsync in OpenGL games; I tried it with Doom 3 and I felt it didn't introduce any latency. I'd love confirmation on this.
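
To show the difference I mean, here's a conceptual sketch (my own, not how any specific driver or API implements it): a render-ahead queue presents the oldest finished frame, so every queued frame costs a refresh of latency, whereas ping-pong triple buffering always presents the newest finished frame.

```python
from collections import deque

class RenderAheadQueue:
    """FIFO of finished frames; a depth of N means up to N refreshes of extra latency."""
    def __init__(self, depth=3):
        self.frames = deque(maxlen=depth)

    def submit(self, frame):
        self.frames.append(frame)          # real APIs block the renderer here when full

    def present(self):
        # The display always shows the OLDEST queued frame.
        return self.frames.popleft() if self.frames else None


class PingPongTripleBuffer:
    """Renderer keeps overwriting the spare back buffer; scan-out takes the newest."""
    def __init__(self):
        self.newest = None

    def submit(self, frame):
        self.newest = frame                # never blocks; a stale frame is simply dropped

    def present(self):
        # The display always shows the NEWEST finished frame.
        return self.newest
```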

 

I can see competitive gamers, especially in first-person shooters, disabling "render ahead queue" vsync because they cannot afford a few frames of latency and are ready to pay the price of tearing. This is a shame because vsync doesn't have to introduce latency, it only does so because it's usually implemented poorly.



#23 +FiB3R

FiB3R

    aka DARKFiB3R

  • Tech Issues Solved: 6
  • Joined: 06-November 02
  • Location: SE London
  • OS: Windows 8.1 Enterprise
  • Phone: Lumia 930

Posted 04 July 2013 - 20:43

I can't stand screen tearing. How on earth it's still an acceptable part of display technology in this day and age, I will never know. It should have been designed into oblivion years ago.



#24 Enron

Enron

    Windows for Workgroups

  • Tech Issues Solved: 1
  • Joined: 30-May 11
  • OS: Windows 8.1 U1
  • Phone: Nokia Lumia 900

Posted 04 July 2013 - 22:06

I can't stand screen tearing. How on earth it's still an acceptable part of display technology in this day and age, I will never know. It should have been designed into oblivion years ago.

 

I think Oblivion has vsync and so does Skyrim.



#25 Draconian Guppy

Draconian Guppy

    LippyZillaD Council

  • Tech Issues Solved: 2
  • Joined: 22-August 04
  • Location: Neowin

Posted 04 July 2013 - 22:12

The problem is that vsync can be implemented in various ways and so when people say "vsync does this" and "vsync causes this problem" they never achieve any consensus because they're all talking about different things.

 

I only disable vsync when it's implemented using double buffering and my video card cannot keep a steady 60 fps. This causes your framerate to jump between whole divisors of your monitor's refresh rate, so if it's 60, it'll jump between 60, 30, 20, 15, etc. Software like Fraps will still report intermediate framerates like 45 because it averages over several frames, but the game feels jerky and unresponsive. It's worse than playing at a consistently low framerate. Game developers who still use double-buffered vertical synchronisation should be fired. Nvidia's Adaptive Vsync is a hack around the issue but doesn't really solve the problem.

 

Tearing is not a "CRT problem". Tearing can and does happen on any monitor when your video card is outputting frames at an interval that is not your monitor's refresh rate. Tearing is completely intolerable in this day and age. It's better than enduring double-buffered vsync, granted, but that's like saying it's better to watch a stream at 240p than having it lag constantly. Both are terrible experiences.

 

There is a proper way to implement vsync and that is triple-buffered vsync. Unfortunately, most game developers use a render ahead queue instead and call that "triple-buffered vsync", which leads to gamers thinking that triple-buffered vsync sucks and introduces tons of input lag. Render ahead queues suck. From what I understand, it's what Direct3D does by default so it's pretty much what most game engines do, but I'm not an expert and it's hard to get a clear answer from game developers. I'm not sure many of them know the intricacies of this, which is a shame because it's such a fundamental issue.

 

In any case, "triple-buffered vsync" solves the jerkiness of double-buffered vsync, so it leads to smooth framerates that are synced to monitor refreshes. In the proper "ping-pong" implementation, it doesn't add any more input latency than the theoretical minimum required to sync frames with monitor refreshes; in the suckass "render-ahead queue" implementation, every additional buffer adds a frame of latency. This is the setting you see in the Nvidia control panel as "maximum pre-rendered frames". Try to keep that low. I believe the "enable triple buffering" setting enables proper ping-pong triple-buffered vsync in OpenGL games; I tried it with Doom 3 and I felt it didn't introduce any latency. I'd love confirmation on this.

 

I can see competitive gamers, especially in first-person shooters, disabling "render ahead queue" vsync because they cannot afford a few frames of latency and are ready to pay the price of tearing. This is a shame because vsync doesn't have to introduce latency, it only does so because it's usually implemented poorly.

This should be pinned and/or frontpaged!



#26 Riva

Riva

    Neowinian

  • Tech Issues Solved: 1
  • Joined: 11-February 07

Posted 04 July 2013 - 22:13

Probably, as far as FPS is concerned, but it increases quality. What I've noticed with vsync is that it usually caps FPS at ~60.



#27 ViperAFK

ViperAFK

    Neowinian Senior

  • Tech Issues Solved: 2
  • Joined: 07-March 06
  • Location: Vermont

Posted 05 July 2013 - 01:08

The problem is that vsync can be implemented in various ways and so when people say "vsync does this" and "vsync causes this problem" they never achieve any consensus because they're all talking about different things.

 

I only disable vsync when it's implemented using double buffering and my video card cannot keep a steady 60 fps. This causes your framerate to jump between whole divisors of your monitor's refresh rate, so if it's 60, it'll jump between 60, 30, 20, 15, etc. Software like Fraps will still report intermediate framerates like 45 because it averages over several frames, but the game feels jerky and unresponsive. It's worse than playing at a consistently low framerate. Game developers who still use double-buffered vertical synchronisation should be fired. Nvidia's Adaptive Vsync is a hack around the issue but doesn't really solve the problem.

 

Tearing is not a "CRT problem". Tearing can and does happen on any monitor when your video card is outputting frames at an interval that is not your monitor's refresh rate. Tearing is completely intolerable in this day and age. It's better than enduring double-buffered vsync, granted, but that's like saying it's better to watch a stream at 240p than having it lag constantly. Both are terrible experiences.

 

There is a proper way to implement vsync and that is triple-buffered vsync. Unfortunately, most game developers use a render ahead queue instead and call that "triple-buffered vsync", which leads to gamers thinking that triple-buffered vsync sucks and introduces tons of input lag. Render ahead queues suck. From what I understand, it's what Direct3D does by default so it's pretty much what most game engines do, but I'm not an expert and it's hard to get a clear answer from game developers. I'm not sure many of them know the intricacies of this, which is a shame because it's such a fundamental issue.

 

In any case, "triple-buffered vsync" solves the jerkiness of double-buffered vsync, so it leads to smooth framerates that are synced to monitor refreshes. In the proper "ping-pong" implementation, it doesn't add any more input latency than the theoretical minimum required to sync frames with monitor refreshes; in the suckass "render-ahead queue" implementation, every additional buffer adds a frame of latency. This is the setting you see in the Nvidia control panel as "maximum pre-rendered frames". Try to keep that low. I believe the "enable triple buffering" setting enables proper ping-pong triple-buffered vsync in OpenGL games; I tried it with Doom 3 and I felt it didn't introduce any latency. I'd love confirmation on this.

 

I can see competitive gamers, especially in first-person shooters, disabling "render ahead queue" vsync because they cannot afford a few frames of latency and are ready to pay the price of tearing. This is a shame because vsync doesn't have to introduce latency, it only does so because it's usually implemented poorly.

So do you think Windows Aero/DWM may be using "proper" triple-buffered vsync? Because I've noticed that when playing in borderless windowed mode I usually get no additional input lag, but when using the in-game vsync (double or "triple") I usually get a noticeable jump in input lag.



#28 Andre S.

Andre S.

    Asik

  • Tech Issues Solved: 10
  • Joined: 26-October 05

Posted 05 July 2013 - 19:46

So do you think Windows Aero/DWM may be using "proper" triple-buffered vsync? Because I've noticed that when playing in borderless windowed mode I usually get no additional input lag, but when using the in-game vsync (double or "triple") I usually get a noticeable jump in input lag.

That's a good question; I don't know. I would assume DWM just uses a DXGI swap chain, which is a render-ahead queue, but it might be doing something smarter. In theory, implementing proper triple buffering on top of a double-buffered scheme shouldn't be hard; I'll try to code a demo myself, because it seems no one on the internet knows about it.




