NVIDIA ends full driver support for legacy GPUs older than the GTX 400

NVIDIA has announced that it will be dropping support for a large number of legacy cards to coincide with the release of its new driver. NVIDIA cards older than the GTX 400 will no longer receive full driver support (enhancements and improvements) after Release 340. While this might seem outrageous, some of these cards have been in circulation for several years.

While NVIDIA is discontinuing full support for these graphics cards, it will not completely abandon them. NVIDIA will continue to provide limited support for driver issues and bug fixes until April 1, 2016. That means NVIDIA will have provided a full decade of support for some of these legacy cards, which seems more than reasonable. Those with cards not listed can expect to keep receiving regular updates from NVIDIA: newer cards will receive updates and enhancements in Release 343 and onward.

NVIDIA also had a hell of a time shipping a driver that worked correctly with the 400 and 500 series cards, with freezing issues persisting for almost two years until they were fixed in Release 323. It's unclear whether the decision to drop legacy support had anything to do with those issues.

Source: NVIDIA via Forum tip | Image: NVIDIA


47 Comments


So much for unified drivers, I guess. I'm not replacing my 9800 GT for what they want for newer cards.

I have an MXM II GTX280M 1GB in my Clevo M571TU gaming notebook. I wish it was as easy as upgrading! The cards for this still cost upwards of £180 USED! I'm not sure if I could even find a modern MXM II card for it as the heatblocks for ATi and nVidia cards for this machine are different!

It's no wonder nVidia supported them for so long. I'm surprised my card hasn't died from thermal stress yet, and this is a laptop, a flippin' heavy one; most of the weight is the heatblock for the VGA card! This machine was worth £1,440 new 4 years ago, and the card was supposed to have been one of the defective solder-flux-affected ones, but it has lasted. I don't game on it much for fear of killing it, as that would put this beauty out of commission; the 17" screen and big keyboard are great for everything.

I mostly use my desktop rig Lucy with her HD6870 MSi Twin Frozr II 1GB card for games now, the cards are cheaper to replace in desktops! If I had a quid for every GPU reball I've done over the years at work since RoHS came in mid 2006 I'd be richer than Bill Gates!

Fair play. You can't realistically expect them to keep providing drivers for cards older than 4 years, with all the time/expense that entails.

I would suggest 3 years support is enough and anything above that is gravy.

Currently they haven't been able to get a good driver for the 400 and 500 series since the 314.11 update, but the 200 series takes the new one just fine... weird.

Not every 500 series card, or even 400 series card, had issues with later drivers - what issues I had were entirely application-driven (Flash in particular). How many of the reported issues were also specific-application-driven?

Which made sense - HD4xxx didn't support DX11 at all. I didn't move from AMD to nVidia due to lack of GPU support (in fact, my Mom has my old HD5450 that my refurb GTX550 Ti replaced), and she has the last pre-Mantle Catalyst (only because I have not updated that driver yet due to her running Windows 7 Ultimate x64 and her NOT being a big gamer). Look at the majority of folks that are planning to stay put - they are staying put due to lack of need for DX11 or later (because their OS of choice doesn't support the feature set). I get that much - therefore, I can't exactly blame them.

I don't see a problem with dropping most of those cards; a few of the laptop ones IMO should be getting a bit longer support, but I know the 9600GT I have in one of my secondary desktop systems had a good run (not that it needs updated drivers for what it does now anyway). Have to admit the 400/500 series driver issues did sour me on Nvidia a bit, and I've been using them since Voodoo went under.

Well, apart from certain VRAM limitations, my GTX 295 is perfectly usable for most modern games in 2014. Sure, it's 5 years old, but frankly I didn't NEED to upgrade.

I don't really see this as much of a problem. OK, they say support is not going to be as in-depth for these cards; however, I've generally found that Nvidia sort of gives up on older cards after a couple of years anyway.

I have a couple second hand Geforce GTX 295 still rocking, but I understand the need for NVIDIA to move on.

ATI would have dropped driver support years ago..

Wut? My previous card was an AMD Radeon HD 3450.

The latest driver for it was released on the 21st of January 2013.
The card was released in 2006.

Yeah, sure. Talking about Nvidia supporting cards UP TO 4 years old after this moment, with AMD supporting cards for well OVER those 4 years.

Those are not legacy drivers. Those are the same Catalyst that every other card gets; get your facts right. Last year I got the same update for my 4650 that I got for my 7950.

I don't know if the same goes for mobility but amd stopped supporting the HD4200 in my laptop and it's not even two years old yet.

The 4200 series might be more than two years old and I could have possibly just got one of the last batches, dunno. I do know I would never even consider another AMD card ever again; I don't care what amount of hype they put into it or how many rabid fanboys say it's the greatest thing on earth. AMD drivers are just plain horrible; their entire development team should be fired and then physically thrown right out the front door and barred from ever working on software again for the remainder of their natural lives.

Shame; one of the things I loved about NVidia was long-term driver support.

AMD usually ditched their cards after a couple of years.

Driver support, long term or short, at Nvidia is horrible in my opinion. Saying so with a GTX760: good card, bad drivers, bad performance. I can't get more than 60% out of it. Even in BF4 with everything on ultra, it never goes above 55-60% usage while I constantly have frame drops (need to set 2 options to high instead of ultra for a constant 60fps).
Before, I was on AMD and never had such issues; I used that GPU for over 4 years and still had the latest Catalyst. My next GPU (or any future GPU for that matter) will be AMD.

And only because Nvidia is horrible with its drivers.

Shadowzz said,
Driver support, long term or short, at Nvidia is horrible in my opinion. Saying so with a GTX760: good card, bad drivers, bad performance. I can't get more than 60% out of it. Even in BF4 with everything on ultra, it never goes above 55-60% usage while I constantly have frame drops (need to set 2 options to high instead of ultra for a constant 60fps).
Before, I was on AMD and never had such issues; I used that GPU for over 4 years and still had the latest Catalyst. My next GPU (or any future GPU for that matter) will be AMD.

And only because Nvidia is horrible with its drivers.

And yet almost everyone else says the same thing about AMD drivers, which can't even stick with a final release, having to ship beta after beta after beta. And then they break support for legacy games because there's too much of a convoluted mess with codepaths for newer games.

ensiform said,

And yet almost everyone else says the same thing about AMD drivers, which can't even stick with a final release, having to ship beta after beta after beta. And then they break support for legacy games because there's too much of a convoluted mess with codepaths for newer games.

NVIDIA does beta drivers too. Not as often, I don't think, but they still do, so I don't quite understand why you're complaining about it.

AMD releases a monthly BETA of Catalyst, IIRC. Even if there are no code changes there will be a new beta. At least this is how it was up to a year ago.

So yeah, it looks like they release a sht load of them, but it's mainly to give users the latest and the greatest.
Nvidia can take months before there's a next update, skipping a bunch of version numbers at once...
Don't know what's better: a monthly release schedule or a random one every 1-6 months.

Shadowzz said,
Driver support, long term or short, at Nvidia is horrible in my opinion. Saying so with a GTX760: good card, bad drivers, bad performance. I can't get more than 60% out of it. Even in BF4 with everything on ultra, it never goes above 55-60% usage while I constantly have frame drops (need to set 2 options to high instead of ultra for a constant 60fps).
Before, I was on AMD and never had such issues; I used that GPU for over 4 years and still had the latest Catalyst. My next GPU (or any future GPU for that matter) will be AMD.

And only because Nvidia is horrible with its drivers.

I find your statement convoluted and opposite to my personal experience. I used to use ATi cards. Have the hardware, download the right driver, install it, and it says it can't find the hardware. Every single ATi card I came across.

When I switched to NVIDIA, no matter which company made the hardware, the driver installed and it works. It does not say I do not have the hardware installed. NOT ONCE. If the driver does not speed up your card, that is a different story. "Driver support" means that if you have the right driver, whatever the hardware manufacturer or brand, the software should be able to detect the card and install. That is what "driver support" should be. You probably have a different definition of "driver support".

Heh, my experiences are the reverse of yours. I've owned a few ATi/AMD cards and Nvidia ones, and am currently on Nvidia. Until last week, that is, when I had to completely reinstall my machine because buggy drivers were giving me the frowny of death whenever the resource load on my GPU went over 50% (or as I estimate, heavy usage = crash; BF3 on ultra etc. crashed it, Q3-engine games at 200 FPS are fine, at max FPS = 999 it crashed).
Yesterday I updated the drivers and guess what, the install hung at 1/3rd of the progress. Force quit it, reboot, and reinstall the drivers like a fresh install....

And without my CPU or RAM bottlenecking it, if I put BF4 everything on Ultra, I get FPS drops. Yet the usage of my GPU _never_ gets above 55%.

45% of the power in my GPU seems worthless/unused.

For a short period I used a GT520 to pass the time; I needed a little improvement over the Radeon HD 3450 I had...
Even though on every spec the GT520 is better, my HD 3450 performed A LOT better with 1/4th the RAM and 25% less speed. Even when playing Nvidia-optimized games (Unreal Engine, for example) my older, crappier Radeon was performing better.

And driver-wise... much better. Remember when Windows 7 was released and you had to update your Nvidia drivers? It required a reboot.
AMD already had non-reboot drivers when Windows 7 wasn't even RTM yet. Nvidia took months to properly stick to the Windows API.
I had driver issues once, similar to yours, not detecting my GPU properly. But it was fixed with a reboot and a fresh install.

Other than that, Catalyst gives me 100 times more control over my GPU than Nvidia's configuration. I can't really change anything on my current card beyond making simple profiles (the default config ruins Minecraft/Java GPU apps, for one; you need to disable threaded optimization or something, otherwise you get only 30-40 fps while the card is doing heavy work for those 6 individual pixels it has to generate).

For me it's the absolute last time I'm using Nvidia.

I wonder how many people answered "No" to the question "Was this answer helpful?" at the bottom of NVIDIA's table page.

I did just because I read your post!

Never been a gamer and have never really paid attention to updating my video card other than through Windows Update, which I know most of you say IS NOT the way to do it, but I've never had an issue doing it that way.

Never have really liked Nvidia either, but that's usually the only thing sold around here!

From the looks of that list, I believe all 9 of the computers I have contain one of those! The only reason any of the computers have those is from when I upgraded from XP to Windows 7; otherwise, I think every computer had Intel onboard graphics! Yep, I run old crap, but they all still run great!

Edit:
Nope,
2 computers have ATI

Interesting to see that the 6-and-a-half-year-old GeForce 8800 GT in my old Core2Duo PC has basically still been supported up until now.

Guess I'm not used to that, with mobile phone manufacturers getting bored of supporting mobile phones after a year and a half.

I have an old 8800GT card in one of my systems. Great card for the time, and I will still keep it. Of course, I still have a Voodoo card floating around somewhere....

I thought so too, until I tried to play COD: Ghosts and it didn't support DX11. It's lasted me 4 years; time for an upgrade. Pity, I could still happily play everything at 1920x1200 with decent settings.

That works on Windows, but on Linux a 400 series card (mine is a 400 series), for instance, can serve at least a decade, or until someone gets a new computer (same with people who have 200 series cards).
The 400 series introduced support for OpenGL 4, but almost all Linux applications are still OpenGL 2 and 3.

Good, those dated DX10 cards were holding back the game industry. Hopefully devs will move along and ditch them just like they did those deprecated DX9 cards. Games should be DX11-exclusive anyway.

And they largely got whacked for lack of DX10 (and even DX9c) support, by those gamers that refused to leave XP (which is still a rather shockingly LARGE amount of them). The GTX550Ti *does* support DX11 - as do the rest of the GeForce 500 series.

Another part of the reason for moving sub-GTX400 cards to legacy is the launch of the GTX750 and GTX750 Ti - crappy PSUs (unless you're talking sub-250W) are no longer a holdback from upgrading your GeForce GT and GTS series GPUs to a true GTX - most GTX750 Tis, even with aftermarket coolers and 2 GB of GDDR5, still use only the PCI Express slot for power, and will work with as low-end a chipset as Intel's G31, which itself goes back to the beginning of LGA775.

I was in the same boat as her until a few months ago. The 260 is a very versatile card, with more than enough horsepower to play most games. I could even make BF3 look decent on it.
But it lacks support for a lot of new features that are starting to become standard instead of being just for the high end, such as new AA methods, ShadowPlay (which is extremely useful: it has made my FRAPS license irrelevant and is less of a drain on my system), etc.

I'm with you. I only upgrade my graphics when I need to, and it was just getting to that point with the 260. I just switched to a 760, and couldn't be happier. Support for some of my legacy games even got a little better. On my 260, lighting would easily get messed up in Half-Life 2 (pitch black, or blinding white). I still run into some graphics issues with the 760, but now it is only something like textures not being loaded onto an NPC (you just get white portions on the model, like they got covered in paint).
Might be worth the switch, now that they are dropping support.

If I recall correctly, the 4xx series of cards was released in April 2010, so that's almost four years old. Should we emphasize "legacy"?

Although our article doesn't mention it, all cards were supported AFAIK, not just ones from ten years ago. What's being discontinued is the addition of features and enhancements for those older cards, while they will simply (still) have basic driver support. That is what is changing.

Look at the EOL list, they're ceasing support for products OLDER than GTX400, ie GTX260 / 9800GT / 8800GTS / GT 220 and such.
The four year old GTX400 series is still on the roadmap.

NinjaMonkey said,
Look at the EOL list, they're ceasing support for products OLDER than GTX400, ie GTX260 / 9800GT / 8800GTS / GT 220 and such.
The four year old GTX400 series is still on the roadmap.

I have a GeForce 8800GT in my main PC right now. I guess it's a reminder that I should buy a new PC now.

IntelliMoo said,

And with AMD

I'm sure if he'd picked AMD then he'd have had to replace the cards several times, due to them burning out after about a year when the fan just decides to give up.