NVIDIA announces G-SYNC to help make PC monitors better

Today, we are beginning to see 4K PC monitors with resolutions that were unthinkable a few years ago. Yet there are still issues with these monitors in high-end applications, particularly gaming, that can't be solved simply by increasing screen resolution. Today, NVIDIA announced a solution that it claims will help make PC monitors better.

That solution is called G-SYNC, and in an announcement Friday, NVIDIA offered up the first public details of the new technology. It consists of a hardware module designed to be placed inside a monitor. The module synchronizes the monitor to the output of a PC's graphics card, rather than having the GPU sync up to the monitor. NVIDIA says that with the G-SYNC module in place, issues such as screen tearing, VSync input lag, and stutter are eliminated from PC games and applications. How does the module work? NVIDIA states it does away with a monitor's fixed refresh rate, which is usually set at 60Hz. It states:

With G-SYNC, the monitor begins a refresh cycle right after each frame is completely rendered on the GPU. Since the GPU renders with variable time, the refresh of the monitor now has no fixed rate.
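The latency difference NVIDIA is describing can be illustrated with a toy simulation. This is not NVIDIA's implementation, just a sketch of the idea: under fixed-rate VSync a finished frame waits for the next refresh tick before it can be displayed, while a variable-refresh display begins its refresh cycle the moment the frame is done. The refresh rate and frame-finish times below are made up for illustration.

```python
# Toy model: how long a finished frame waits before it is displayed.
# Fixed 60 Hz refresh (VSync): the frame waits for the next refresh tick.
# Variable refresh (the G-SYNC idea): the monitor refreshes immediately.

import math

REFRESH_HZ = 60
PERIOD_MS = 1000 / REFRESH_HZ  # ~16.7 ms between fixed refresh ticks

def vsync_wait(frame_done_ms):
    """Time (ms) a frame waits for the next fixed refresh tick."""
    next_tick = math.ceil(frame_done_ms / PERIOD_MS) * PERIOD_MS
    return next_tick - frame_done_ms

def variable_refresh_wait(frame_done_ms):
    """With a variable-rate display, the refresh starts right away."""
    return 0.0

# Frames finishing at irregular times (the GPU renders with variable time):
for t in [5.0, 21.0, 30.0, 55.0]:
    print(f"frame done at {t:5.1f} ms: vsync waits {vsync_wait(t):4.1f} ms, "
          f"variable refresh waits {variable_refresh_wait(t):.1f} ms")
```

On average a frame finishing at a random time waits about half a refresh period (roughly 8 ms at 60Hz) under fixed-rate VSync, which is the added input lag the module is meant to remove.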

The company says that testing G-SYNC with pro gamers has resulted in those players dominating opponents who lack the module in their monitors, thanks to almost no visible lag when playing PC games. The technology has also generated praise from well-known game developers. It states:

Tim Sweeney, creator of Epic’s industry-dominating Unreal Engine, called G-SYNC “the biggest leap forward in gaming monitors since we went from standard definition to high-def.” He added, “If you care about gaming, G-SYNC is going to make a huge difference in the experience.” The legendary John Carmack, architect of id Software’s engine, was similarly excited, saying “Once you play on a G-SYNC capable monitor, you’ll never go back.”

Naturally, the G-SYNC module will only work with NVIDIA GPUs, specifically the GeForce GTX 650 Ti Boost and above. NVIDIA will begin selling the G-SYNC hardware later this year as an add-on that people can install themselves inside an ASUS VG248QE monitor, or they can buy that monitor with the module already installed. The do-it-yourself kit is priced at $175, and the ASUS monitor with the G-SYNC module installed is $399. NVIDIA says PC monitors with the G-SYNC module will go on sale from a variety of companies in 2014.

Source: NVIDIA | Image via NVIDIA


This is a great innovation that solves a very important technical problem. VSync is a dirty hack around the fundamental problem that monitors refresh at a fixed interval; finally we're getting monitors that are framerate-aware and can get rid of GPU-side vsync and its associated latency/judder/tearing issues. This will make games better in many more ways than simply providing smoother framerates. It basically enables dynamic framerates and gives developers much more flexibility in content creation and optimisation.

I think a lot of the negative opinions here stem from ignorance. Eventually this tech should be in most monitors and TVs, and of course AMD will follow suit with its own implementation. If you look at how graphics APIs have evolved, it has more often than not been ATI's or NVIDIA's sole initiative, where one eventually follows the other or they agree on a common standard. Stop panicking and give it some time.

So, G-SYNC is:
- Proprietary tech.
- Working ONLY with nVidia cards.
- Working ONLY with GTX 650 Ti and above.
- Will cost $175.
- Obviously AMD will come up with some same-****.

Now for all the people out there that are left with some reason and logic in them:
- Let's wait for an 'alliance-of-companies-that-will-make-the-same-thing-as-an-open-standard'.
- Open is good, open makes the world better.
And I'm a GeForce user and fan for ages now, but this thing doesn't even consider consoles, as "...it will only work with NVIDIA GPUs"...!
And I'm wondering why they don't just make this built into nVidia cards, if that's how it works!
You see where I'm going here..."don't feed the money-hungry cow-trolls".

Graphics cards are a two-way street: you need something on the other end to receive the output from the graphics card. That's why this has to be in the monitor, so it can keep up with what the graphics card is putting out.

Snake89 said,
Graphics cards are a two-way street: you need something on the other end to receive the output from the graphics card. That's why this has to be in the monitor, so it can keep up with what the graphics card is putting out.

Could easily be solved by making a standard that allows the user to enable a "variable refresh rate" mode on the monitor, so the monitor just draws at the rate it receives frames. There is no need for an NVIDIA-specific device to do this.

All these people whining because a company invests a bunch of money developing a new technology and doesn't immediately share it with their competitors. Get a clue.

'The company says that testing G-SYNC with pro gamers has resulted in those players dominating others without the module inside their monitor'
Wow, what exquisite garbage. If that's so, then that means all monitors for sale have some extreme lag that I'm not quite getting...

The product DOES eliminate lag.
The GPU creates new frames as FAST as possible, with a cap at the monitor's maximum refresh rate, such as 120Hz (120FPS). The monitor then draws every new frame as soon as it gets it, rather than buffering it (VSYNC).

It solves the following issues:
VSYNC ON - lag between button press and screen update, due to buffering to sync with the monitor refresh

VSYNC OFF - less lag, but SCREEN TEARING due to new frames not drawing in sync

It should also help avoid MICRO-STUTTER, since there are no buffer/sync issues, especially with SLI.
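The VSYNC OFF trade-off described above can be sketched in a few lines of Python. This is a toy model (the row count and timing are assumed for illustration, not measured on real hardware): the display scans out rows top to bottom over one refresh period, so if a buffer swap lands mid-scanout, rows above the current scan position show the old frame and rows below show the new one, which is the visible tear line.

```python
# Toy model of screen tearing with VSync off: a buffer swap that happens
# partway through a scanout splits the screen at the current scan row.

ROWS = 1080                 # assumed vertical resolution
PERIOD_MS = 1000 / 60       # one 60 Hz scanout takes ~16.7 ms

def tear_row(swap_time_ms):
    """Row where the tear appears if the swap happens swap_time_ms
    into the current scanout (VSync off)."""
    progress = (swap_time_ms % PERIOD_MS) / PERIOD_MS  # scanout fraction
    return int(progress * ROWS)

print(tear_row(8.3))   # swap roughly mid-refresh -> tear near mid-screen
print(tear_row(0.0))   # swap exactly on the refresh boundary -> no tear
```

With a variable refresh rate, the swap always coincides with the start of a scanout, so the tear line never appears, and no buffering (and hence no VSync lag) is needed to achieve that.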

Seems like a SOFTWARE solution on the GPU side, so it seems like an ideal thing to add to new HDTVs for game consoles (all game consoles): a dedicated HDMI input for gaming with no video processing, for minimum lag, plus G-Sync or identical technology.

n_K said,
Yes there's lag, but unless you have millisecond reflexes, it's going to make ZERO difference

Play more Quake (Live).
Gamers who run 120fps+ fast-reaction games at 120Hz DO notice input lag significantly. Most play with vsync off because it's unbearable on, but would love to eliminate tearing.

I just can't tolerate a company not being willing to share something that would truly benefit all gamers, rather than the minority who will actually use it. This is bad for consumers, and it will not be widely adopted no matter how good it eventually becomes. AMD, your move.

Being tied specifically to Nvidia cards makes this tech as worthless as PhysX and anything else similarly shackled to their cards only.

Enron said,
Stop complaining, that's one of the reasons Nvidia cards are great.

- Sent from my GTX Titan

However, they are a bit short-sighted with their 'restrictions' and technology lock-ins.

Right now the industry is moving to new form factors, and outside of Tegra, NVidia is dead in the water. AMD has low-power APUs, and Intel HD is catching up to low-end discrete video in tablets and laptops.

NVidia needs to find alliances with the new generation of devices, and not lock itself into the gaming enthusiast market only. That is what killed video card companies of the past.


LaP said,
The price is ridiculous. How much is a monitor with this going to cost???

It'll be $399 for the Asus VG248QE with G-Sync.

I paid $289 for my monitor on sale last week. So buying a monitor with it would be cheaper than buying a DIY kit.

'has resulted in those players dominating others without the module inside their monitors'
I better add this in with my Killer NIC so I can dominate all the noobs in UT 2004!

I can understand knocking the Killer NIC, but this actually seems good. I doubt it'll significantly affect my gameplay, but if it makes the game smoother and such, then that's worth something.

There shouldn't be two different closed options for something like this, I think the monitor makers should make their own version (reverse engineer it) and have it work with anything, not just NVidia GPUs.

The problem is MS is not working on PC anymore. So there won't be much new multiplatform stuff in the upcoming years, and that's why AMD and nVidia will make their own proprietary stuff more and more. I really wish MS didn't feel the need to give up on PC gaming to make the Xbox a success. If it were not for Valve, AMD and nVidia, PC gaming would be in terrible shape today.

LaP said,
The problem is MS is not working on PC anymore.

How does Microsoft fit into this? They don't make GPUs or monitors or even PCs...

LaP said,
The problem is MS is not working on PC anymore. So there won't be much new multiplatform stuff in the upcoming years, and that's why AMD and nVidia will make their own proprietary stuff more and more. I really wish MS didn't feel the need to give up on PC gaming to make the Xbox a success. If it were not for Valve, AMD and nVidia, PC gaming would be in terrible shape today.

I too fail to see how this is MS dropping the ball; they have no influence on the technology incorporated into monitors. This would be something for the monitor companies to work out between themselves and AMD/Nvidia.

I was interested up to the "Naturally, the G-SYNC module will only work with NVIDIA GPUs" part. I'm not buying a new monitor and a new graphics card.

Pupik said,
I was interested up to the "Naturally, the G-SYNC module will only work with NVIDIA GPUs" part. I'm not buying a new monitor and a new graphics card.

Don't worry, AMD will create a rip-off eventually.

Shadowzz said,
So you have to replace your monitor if you switch GPUs? No ty. Currently got NVidia, next one will be AMD again (damn you NVidia)

I doubt that a monitor would require an Nvidia graphics card. It just won't provide G-sync capabilities. There's no way that the monitor manufacturers would buy into a product that eliminates half their audience/profits.

_Alexander said,
Monitors are cheap compared to graphic cards. Besides I bet the monitor is going to be compatible with a few generations of NV GPUs.

What kind of ****ty monitor are you using? Any decent monitor is easily over $300, way more depending on what you are looking for (size, refresh rate or color accuracy).
Your monitor should be an investment that lasts you about 5 years; a video card you swap every year or two.

gonchuki said,

What kind of ****ty monitor are you using? Any decent monitor is easily over $300, way more depending on what you are looking for (size, refresh rate or color accuracy).
Your monitor should be an investment that lasts you about 5 years; a video card you swap every year or two.

Which is half the price of an NV 780.

Probably doesn't matter what type of monitor you're using; they'll be selling do-it-yourself kits, according to the article. I'm assuming it'll need to be a vaguely modern monitor to be compatible, though.

AR556 said,
This will be awesome when it filters down to mid range priced Monitors/video cards!

This will be awesome once it becomes an open standard!

Oh wait, we are talking about Nvidia... move along, nothing to see here.

gonchuki said,

This will be awesome once it becomes an open standard!

Oh wait, we are talking about Nvidia... move along, nothing to see here.


Yes, but I don't care. I'm sticking with NVidia anyways.

gonchuki said,

This will be awesome once it becomes an open standard!

Oh wait, we are talking about Nvidia... move along, nothing to see here.

Fixing this for you...
This will be awesome once it becomes a standard.

Everyone gets too caught up in the illusion of 'open' and forgets that standards drive the industry. Microsoft has given thousands of 'standards' to the hardware industry over the years that are technically not 'open', but are free and are standards.

Crimson Rain said,

Yes, but I don't care. I'm sticking with NVidia anyways.

Not to sound all preachy, but people should care. Competition is a good thing even if you do want to stick with Nvidia; it keeps them on their game, stops them hiking prices, and keeps them in check.

Things like this should be open standards, as we'll now have an additional piece of hardware (in the monitor) which is directly tied to a certain brand of graphics card in the PC. Since monitors are not something you really need to upgrade often, it will stop anyone with the tech from jumping to an AMD card if they wanted to (if, at the time of upgrade, AMD simply had a better card at a better price, for instance), because they'd be more entrenched in the Nvidia brand.

duddit2 said,

Not to sound all preachy, but people should care. Competition is a good thing even if you do want to stick with Nvidia; it keeps them on their game, stops them hiking prices, and keeps them in check.

Things like this should be open standards, as we'll now have an additional piece of hardware (in the monitor) which is directly tied to a certain brand of graphics card in the PC. Since monitors are not something you really need to upgrade often, it will stop anyone with the tech from jumping to an AMD card if they wanted to (if, at the time of upgrade, AMD simply had a better card at a better price, for instance), because they'd be more entrenched in the Nvidia brand.


Yes, that's why I said "yes" at first.