DirectX 10: why it's exclusive to Vista

When Microsoft officially announced that DirectX 10 (DX10) would only be available for Windows Vista, many gaming fans yearning to be on the bleeding edge were upset. In order to get the most from their video cards, users would have to upgrade their operating systems to Vista. Some have attributed Microsoft's decision purely to marketing, but that's not entirely the case. What other factors were in play?

According to Microsoft DirectX guru Phil Taylor, development of DX10 wasn't complete until late in Windows XP's lifecycle, and during that development it became clear that DX10 simply would not fit into XP.

View: Full Article @ Ars


14 Comments


No, it does make sense. My first reaction when I heard about DX10 being Vista-only was that they were being lazy and using it as an excuse to force upgrades.
But when you think about it, XP is like 6 years old now and customers have gotten more than their money's worth from it over the years; we've had tons of free upgrades such as Media Player and various other features in that time.
Seriously, an OEM version is like £80. Six years of free updates is pretty damn good for that price.

So yeah, you need to remember they are a business. Yes, they could have done DX10 on XP, but that would only serve to keep people away from Vista for longer, which doesn't really make sense when you want to push a new version of your product. It simply didn't make economic or technical sense for them to backport it.
The amount of backend work they would have to do to XP is insane; DX10 requires a WDDM driver as well. Why do such massive low-level kernel work on such a mature OS this late in its product lifecycle? It doesn't make sense. They would then have to waste time ensuring applications work fine with the changes on XP, and so on.

It's a feature of Vista, just as Aero is. They've given us lots of other Vista features for XP when they didn't have to: Instant Search, IE7 (though they'd be hated if they didn't give us that at least...) and Media Player 11, so they can't be blamed for keeping SOME for Vista only.

Funny thing is, when Apple does stuff like this with OS X ("OMG, you need OS X 10.99994 to run our new chat client! Or our new widget system!"), no one seems to complain, and those that do get branded Mac haters and flamed out of the forums.

This REALLY SUCKS! ... Now I'm definitely gonna find a way to "get" (if you catch my drift) Windows Vista when it's necessary for games in the future... but as of now I ain't worried about it, since pretty much all games on PC still use DX9.

Yeah, it's not as if you should pay for, or as if it's even worth paying for, the one thing you use every day and that's required even to use your computer for what you want.

ThaCrip: So you want to steal software that Microsoft spent countless hours and dollars making because you feel some sort of entitlement to it?

You must be above the law. Good for you!

I guess "your drift" it total arrogance.

They probably could have done it, but it would have required so much work and so many changes in XP that it wasn't economical.

Wha? Too much coding work? Windows Vista is nothing more than an expensive service pack for Windows XP; they could've just made DX10 for XP. Just marketing, man.

People keep saying this without knowing much about how different DX10 is from DX9, or how differently it works.

Aside from the fact that it's pretty much brand-new code, DX10 also drops things that you find in DX9, like DirectSound 3D, and other pieces are combined into one; most notably, 2D and 3D are now handled together in DX10.

That aside, to get DX10 to work the way it should, you have to make kernel-level changes, driver changes, and changes to the graphics/UI system, all of which, and more, have been changed in Vista. Making those changes to XP to get DX10 to work on it would, at the end of the day, just turn it into a crippled version of Vista. Why do I say crippled? Because in MS's need to keep compatibility in XP and still get DX10 working, they'd have to cut corners and drop things. It's just not worth it.
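To make the "brand-new code" point a bit more concrete, here is a rough sketch (not from the article, just an illustration with made-up helper names and simplified error handling) of how device creation differs between the two APIs. The D3D9 path talks to the classic runtime on top of the XP-era driver model, while the D3D10 path presents through DXGI and assumes a WDDM driver underneath, which is exactly the piece XP doesn't have.

#include <d3d9.h>
#include <d3d10.h>
// Link against d3d9.lib and d3d10.lib.

// DX9-style device creation: the D3D9 runtime and the XP-era driver model do the work.
IDirect3DDevice9* CreateDx9Device(HWND hwnd)
{
    IDirect3D9* d3d9 = Direct3DCreate9(D3D_SDK_VERSION);
    if (d3d9 == NULL)
        return NULL;

    D3DPRESENT_PARAMETERS pp = {0};
    pp.Windowed   = TRUE;                      // back buffer sized to the window
    pp.SwapEffect = D3DSWAPEFFECT_DISCARD;

    IDirect3DDevice9* device = NULL;
    d3d9->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, hwnd,
                       D3DCREATE_HARDWARE_VERTEXPROCESSING, &pp, &device);
    d3d9->Release();
    return device;   // NULL on failure
}

// DX10-style device creation: the device sits on DXGI, and the driver underneath
// must be a WDDM driver -- the model Vista introduced.
ID3D10Device* CreateDx10Device()
{
    ID3D10Device* device = NULL;
    HRESULT hr = D3D10CreateDevice(NULL, D3D10_DRIVER_TYPE_HARDWARE, NULL,
                                   0, D3D10_SDK_VERSION, &device);
    return SUCCEEDED(hr) ? device : NULL;
}

The DXGI/WDDM layer underneath the second path is the low-level plumbing the posts above are talking about; it's the part that would have to be grafted onto XP's kernel and driver stack to make DX10 run there.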

You must be one of those people who confuse the "Start button" for the power button on the computer.

Clearly you don't comprehend what Vista is. Read up on it.

GP007 said,
...to get DX10 to work the way it should, you have to make kernel-level changes, driver changes, and changes to the graphics/UI system, all of which, and more, have been changed in Vista. Making those changes to XP to get DX10 to work on it would, at the end of the day, just turn it into a crippled version of Vista.
For the most part, this is correct. GDI in Vista is substantially different from GDI in XP, and yes, there would need to be significant changes to drivers, the kernel, etc.

If the DirectX team were to do DX10 compatibility on XP, it could end up producing a potentially unstable OS (as opposed to a "crippled" version of Vista). There are too many different system configurations to assure a pleasant customer experience across the board. Considerations need to be made about OS and system stability, and sometimes there is just too much risk.

Another thing to consider is the 80/20 rule: you develop for the majority. Basically, over 99% of XP machines sold did not come with a DX10 video card installed, and over 80% of those machines are not capable of being upgraded to a DX10 video card...so the majority of XP customers wouldn't experience most of the benefits of DX10 anyway.
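As an aside on that 80/20 point: this is roughly why games of that era probed for the DX10 runtime at startup and fell back to a DX9 renderer for everyone else. A minimal sketch, with a made-up helper name, might look like this:

#include <windows.h>

// Returns true only if the D3D10 runtime DLL is present, i.e. the machine is
// running Vista (or later); on XP the library simply doesn't exist.
bool HasD3D10Runtime()
{
    HMODULE lib = LoadLibraryW(L"d3d10.dll");
    if (lib == NULL)
        return false;
    FreeLibrary(lib);
    return true;
}

A game would take the DX9 path whenever this returns false, which, per the numbers above, covered the overwhelming majority of XP machines. In practice you would also attempt D3D10CreateDevice afterwards to confirm the card and driver are actually DX10-class, since the runtime being installed says nothing about the hardware.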

C_Guy said,
You must be one of those people who confuse the "Start button" for the power button on the computer.

Clearly you don't comprehend what Vista is. Read up on it.

An overpriced piece of crap? Yeah, I totally comprehend what Vista really is.

They could have done it (DX10 on XP) had they wanted to. Simply put, it's a good way to get people onto Vista, and it's cheaper and easier to do it for just one OS.