Microsoft hints DirectX 12 will also be supported by Xbox One

Earlier this week, Microsoft posted a teaser site for its upcoming reveal of DirectX 12, the next major version of the graphics API used in nearly every PC game. That site has now been updated with an Xbox One logo in addition to the Windows symbol.

The change to the teaser site strongly hints that games made with DirectX 12 will work seamlessly with Microsoft's latest home console. The new API's features could also give a graphical boost to games made for the Xbox One. Currently, the console supports DirectX 11.2, which is also used in Windows 8.1.

As we have reported before, previous rumors about DirectX 12 claim that it will give developers low-level access to PC hardware, similar to what AMD offers with Mantle, which launched a few weeks ago via a driver update for its Radeon line of GPUs.
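To illustrate what "low-level access" means in this context, here is a minimal, hypothetical C++ sketch - not real Direct3D or Mantle code - contrasting a high-level API that re-validates state on every draw call with a lower-level one that records command buffers once and replays them cheaply each frame:

```cpp
// Hypothetical sketch only -- not real Direct3D or Mantle code. It contrasts
// a "high-level" style, where the driver re-validates state on every draw
// call, with a "low-level" style, where the application records a command
// buffer once and replays it cheaply each frame.
#include <cstdio>
#include <vector>

struct DrawCall { int mesh; int material; };

// High-level style: every single call pays driver-side validation cost.
void draw_immediate(const DrawCall& dc) {
    std::printf("  [validate state] draw mesh %d, material %d\n", dc.mesh, dc.material);
}

// Low-level style: the app records once (validated at record time) and
// resubmits the prebuilt buffer every frame with minimal driver work.
struct CommandBuffer {
    std::vector<DrawCall> calls;
    void record(const DrawCall& dc) { calls.push_back(dc); }
    void submit() const {
        for (const auto& dc : calls)
            std::printf("  draw mesh %d, material %d\n", dc.mesh, dc.material);
    }
};

int main() {
    std::printf("immediate-mode frame:\n");
    for (int i = 0; i < 3; ++i)
        draw_immediate({i, 0});          // validation cost paid per call

    CommandBuffer cb;
    for (int i = 0; i < 3; ++i)
        cb.record({i, 0});               // validation cost paid once

    for (int frame = 0; frame < 2; ++frame) {
        std::printf("command-buffer frame %d:\n", frame);
        cb.submit();                     // cheap replay
    }
    return 0;
}
```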

DirectX 12 will be the subject of a presentation at the 2014 Game Developers Conference in San Francisco on March 20th. However, we may have to wait until at least spring 2015, the projected launch timeframe for Windows 9, before DirectX 12 becomes available.

Source: Microsoft | Image via Microsoft


47 Comments


Master of Earth said,
I hope it won't make Mantle irrelevant in the future.

I hope it will. There's nothing worse than a vendor-locked technology, and we already have some: PhysX, G-Sync, now Mantle. Every customer and developer will benefit if similar features become an official standard.

No, I didn't. I read that it's AMD only. Still, if the next DirectX includes similar low-level access, it would be better, simply because NVIDIA will be forced to support it.

http://support.amd.com/en-us/search/faq/184

"Mantle was designed in a way that makes it applicable to a range of modern GPU architectures. In the months ahead, we will be inviting more partners to participate in the development program, leading up to a public release of the specifications later in 2014. Our intention is for Mantle, or something that looks very much like it, to eventually become an industry standard applicable to multiple graphics architectures and platforms."

That's so vague. In any case, Mantle COULD become a standard IF Nvidia wants to support it. On the other hand, Nvidia MUST support DirectX's Mantle-like API if their hardware supports DX12.

Weissmeister said,
That's so vague. In any case, Mantle COULD become a standard IF Nvidia wants to support it. On the other hand, Nvidia MUST support DirectX's Mantle-like API if their hardware supports DX12.

Go thank Nvidia for acting like kids who don't want to play nicely with others. AMD is releasing G-Sync-like SOFTWARE in the coming months that doesn't need any extra hardware (it works for both Nvidia and AMD cards) - even AMD engineers were baffled as to why Nvidia requires hardware for G-Sync when it can easily be done at the software level.

To lock you into their cards. Monitors still need to support the VESA 1.2 v_blank feature, which most don't, and which is updated in the DisplayPort 1.3 specification, so it still requires new hardware and monitors.

Thank god Mantle got the team going again.
Very excited to see the new version.

Perhaps DirectX 12 will be exclusive to Windows 8 or Windows 9?
Windows 9 to be announced/shown off at this year's Build?
I'm calling it.

aviator189 said,
Thank god Mantle got the team going again.
Very excited to see the new version.

Mantle was announced 25th September 2013. You don't really think MS came up with DX12 in under 6 months, do you?!

Such complex APIs don't just spontaneously happen. It must've been in development for a very long time.

This is why we need AMD to stick around. Even if their technologies aren't widely successful, it's enough to spark competition in the industry.

eddman said,

Mantle was announced 25th September 2013. You don't really think MS came up with DX12 in under 6 months, do you?!

Such complex APIs don't just spontaneously happen. It must've been in development for a very long time.


Hmm, yeah, you're probably right about that.
But still, competition rocks. I'm much more excited for DirectX 12 than for Mantle or OpenGL.

aviator189 said,
Thank god Mantle got the team going again.
Very excited to see the new version.

Perhaps DirectX 12 will be exclusive to Windows 8 or Windows 9?
Windows 9 to be announced/shown off at this year's Build?
I'm calling it.

It's already been called. It's been said in a news article on here or somewhere else that Microsoft might go back to announcing the next release of Windows a year ahead of time, like they used to.

What I hope is that they announce Windows Phone 9 as well. Possibly even public betas of both Windows and Windows Phone. That would be exciting.

eddman said,

Mantle was announced 25th September 2013. You don't really think MS came up with DX12 in under 6 months, do you?!

Such complex APIs don't just spontaneously happen. It must've been in development for a very long time.

Microsoft still has a year to work on it, and that's plenty of time. They could be worried because an array of games such as BF4, Thief, PvZ: Garden Warfare, NFS, Star Wars and Star Citizen will support Mantle. It's no surprise to see them quickly announce a new version of DirectX to lure developers' attention back, because once Mantle is mature it might be hard to ignore.

So it's not wrong to point out that Mantle is the major reason OpenGL and DirectX suddenly decided to update their APIs to compete with each other.

aviator189 said,
Thank god Mantle got the team going again.
Very excited to see the new version.

Perhaps DirectX 12 will be exclusive to Windows 8 or Windows 9?
Windows 9 to be announced/shown off at this year's Build?
I'm calling it.

Really? I suppose hUMA and HSA inspired Microsoft as well?

The foundation of Mantle has several Microsoft origins.

(PS: hUMA/HSA already exists in 'software'; it is in the NT kernel. Notice that in AMD's presentations of these technologies, they make a very clear distinction that they put it into the hardware controller instead of it being 'software' based, as it already existed in NT going back to the development of Vista.)

Microsoft truly needs to be a publicity ######, because instead of people being apathetic to their involvement, they assume Microsoft is copying someone else or is lazy at developing technologies - which is just jaw-dropping when the 'technologies' came from Microsoft.

eddman said,

Mantle was announced 25th September 2013. You don't really think MS came up with DX12 in under 6 months, do you?!

Such complex APIs don't just spontaneously happen. It must've been in development for a very long time.

Mantle was being developed, and its code shared, for years with MS and EA, etc. They knew about it.

sinetheo said,
Backported to Win 7?

If not, no one will bother to use it.

Windows 7 will be entering its extended support phase by the time DX12 is out, so that seems unlikely.
Windows 8.x is on 20% of Steam PCs after just 15 months; I guess that's a good enough adoption rate for Microsoft to focus on existing and future Windows versions, not the aging Windows 7 that you people will cling to the way those before you clung to XP.

Edited by Weissmeister, Mar 9 2014, 5:57am :

Weissmeister said,

Windows 7 will be entering its extended support phase by the time DX12 is out, so that seems unlikely.
Windows 8.x is on 20% of Steam PCs after just 15 months; I guess that's a good enough adoption rate for Microsoft to focus on existing and future Windows versions, not the aging Windows 7 that you people will cling to the way those before you clung to XP.

Not even remotely close. Developers are not going to waste their time with an API that only 20% of their potential customer base can use, especially when competing APIs offer similar or better performance at far higher market penetration.

If DirectX 12 requires new hardware, as all previous versions did, the customer base will be even smaller. In fact, it will probably be limited to a few expensive high-end video cards for the first few months. And that still won't stop developers from making games for the new API. Adoption won't be quick, of course, but the situation was roughly the same even when a new DirectX was available for older Windows versions. Hey, when DX10 launched, it had virtually zero market share, being limited to Vista and those monstrous 8800 GTS and similar Radeon cards. And still we had several great games that used DX10 in 2007.
Many developers still ignore anything higher than DX9, and that has nothing to do with DX10 and higher not supporting Windows XP.
And please, stop this nonsense about 'competing' APIs. Where are they? Why are there no decent games that use them, especially if DirectX is tied to specific Windows versions?

sinetheo said,
Backported to Win 7?

If not, no one will bother to use it.

Possibly, but the Win 7 NT kernel and driver model already can't juggle the features added in Windows 8/8.1.

At this level of difference and complexity, it would be easier to just release Windows 8.2 and have it load the Win 7 WinSxS subsystem for all the people that the UI changes still confound and scare.

Mobius Enigma said,

Possibly, but the Win 7 NT kernel and driver model already can't juggle the features added in Windows 8/8.1.

At this level of difference and complexity, it would be easier to just release Windows 8.2 and have it load the Win 7 WinSxS subsystem for all the people that the UI changes still confound and scare.

DirectX 11 was ported to Vista. Yes, WDDM 1.1 is more limited, but MS still got IE 10/11 backported after they included a few features from 8.

It is not that people are confounded by and afraid of change, Mobius. It is a new era where people do not upgrade anymore unless there is a reason. The changes are no longer revolutionary. People have more files than ever, programs that are difficult to find again, printers and other hardware. It is a royal pain in the d***n ass to upgrade. The cloud will help eventually.

But at the end of the day the MBA suits call the shots. Twelve years ago these guys said to target the newest OS to show off feature sets. Today the focus is on market share. With DirectX 11, Vista users are covered all the way up to the latest and greatest Windows. So it is a Mantle patch instead for higher-end users.

It costs tens of millions to develop a game today. It makes no logical sense not to want the biggest ROI possible. No, users won't upgrade for a game. They just won't buy it.

Many gamers are college students who have papers due and can't change what they have until summer, high school students whose parents do not want them messing with things that work fine, and so on.

Mobius Enigma said,

Possibly, but the Win 7 NT kernel and driver model already can't juggle the features added in Windows 8/8.1.

At this level of difference and complexity, it would be easier to just release Windows 8.2 and have it load the Win 7 WinSxS subsystem for all the people that the UI changes still confound and scare.

So stop messing around with the kernel and driver models and start writing a graphics API that isn't needlessly dependent on either.

Mantle does it, OpenGL does it - so why can't DirectX?

It could be possible if Microsoft adopts the Mantle API itself in DirectX 12. The AMD GCN 2.0 present in the Xbox One supports DirectX 11.2, and if Microsoft adopts the Mantle API in every sense, then the Xbox One could have the newly claimed DirectX 12. This is just what comes to mind; I know it does not make sense, although it could be possible.
I always want something better for us PC gamers. If DX 12 is a step in the right direction, who cares, we PC gamers welcome it with a warm heart.

Mantle, I think, is a non-issue unless it's supported by Nvidia. If DirectX 12 does what Mantle does, then I don't see why developers would devote resources to it. With Nvidia and AMD making cards that take advantage of DirectX 12, that should be good enough.

I doubt that DX12 can do the things Mantle can, since AMD produces both the cards and the API itself, so they have more knowledge to get even closer to the metal. But that's just my opinion.

alwaysonacoffebreak said,
I doubt that DX12 can do the things Mantle can, since AMD produces both the cards and the API itself, so they have more knowledge to get even closer to the metal. But that's just my opinion.

I agree that it probably can't get as close to the metal as AMD can take it with Mantle, but if the difference between it and DX12 is a single-digit percentage of performance, then the added benefit of DX - that it's supported by other GPU makers as well - will win out. Right now Mantle is such a hot topic because, IIRC, in BF4 it brought a 30% difference in performance over DX11. If that is cut to 5%, it's less of an issue.

This is sort of interesting. DX used to have close to the metal capabilities, didn't it? And as time went by, it got farther away to improve stability and such. Now they're going closer?

Chikairo said,
This is sort of interesting. DX used to have close to the metal capabilities, didn't it? And as time went by, it got farther away to improve stability and such. Now they're going closer?

DX11 has more 'closer to the metal' features than Direct3D has ever had. Ironically, the still overly used DX9 is the version that is furthest from the metal - it was back when it was released, and it's even more so today, as a lot of DX9 is translated/emulated on most GPUs.

Not really sure how DX12 will work on the X1 unless there are hardware features in there that 11.2 isn't using, because it uses its own low-level code. Technically, the X1's GCN optimisations will be ported into the PC version of DX, then maybe some optimisations for the Hawaii core for GCN 2, then they'll probably add the Kepler and Maxwell architectures, and if MS aren't stupid they won't backport it further, because it'll just get more bloated and complex and end up slowing stuff down again.

Wonder if this will enable the Xbone to support 1080p @ 60fps properly? Maybe MS can introduce an expansion GPU add-on for the console if not? /s

Simon Fowkes said,
Wonder if this will enable the Xbone to support 1080p @ 60fps properly? Maybe MS can introduce an expansion GPU add-on for the console if not? /s

The XB1 and even the XB360 can display games at 1080p and 60fps, as long as the game devs choose the right amount of effects and model complexity so that the frame rate doesn't drop below 60fps.

It's only a matter of adjusting scene complexity.

No matter how powerful a CPU/GPU is, a developer may choose to reduce the resolution to 720p because he has determined that the rendering looks better with more complex 3D models at 720p than at 1080p with reduced model detail (1080p with the most complex models running at 30-50fps, for example, would be a terrible experience).

So it doesn't make sense to ask whether some change can make upcoming games run at 1080p/60fps.

Even on the PS4 they won't always run at this resolution. Even if you buy a very high-end Steambox, you won't be able to play games released in two years' time at 1080p/60fps.

Edited by link8506, Mar 8 2014, 10:54pm :
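To put rough numbers on the trade-off described above, here is a quick back-of-the-envelope sketch; the pixel ratio is only a crude proxy for real rendering cost:

```cpp
// Back-of-the-envelope numbers behind the 720p-vs-1080p trade-off discussed
// above. The pixel counts and frame budgets are exact; treating pixel count
// as "rendering cost" is only a rough approximation.
#include <cstdio>

int main() {
    const double px720  = 1280.0 * 720.0;   //   921,600 pixels
    const double px1080 = 1920.0 * 1080.0;  // 2,073,600 pixels

    std::printf("1080p / 720p pixel ratio: %.2fx\n", px1080 / px720);   // ~2.25x
    std::printf("frame budget at 60 fps:   %.2f ms\n", 1000.0 / 60.0);  // ~16.67 ms
    std::printf("frame budget at 30 fps:   %.2f ms\n", 1000.0 / 30.0);  // ~33.33 ms
    return 0;
}
```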

Framerate is important to gaming, but resolution is vastly overrated. A developer should hardly ever dumb down the textures, lighting, models or other aspects of the game just to boost the resolution. Focus your effort on areas that will actually make the game look better.

Yeah, no. Higher resolutions not only look better but eliminate the need for heavy anti-aliasing. The whole point of improving hardware is to make the visuals better; there is only so much you can do by trading off.

The 720p 3D argument is simply ridiculous and wrong.

link8506 said,

The XB1 and even the XB360 can display games at 1080p and 60fps, as long as the game devs choose the right amount of effects and model complexity so that the frame rate doesn't drop below 60fps.

It's only a matter of adjusting scene complexity.

No matter how powerful a CPU/GPU is, a developer may choose to reduce the resolution to 720p because he has determined that the rendering looks better with more complex 3D models at 720p than at 1080p with reduced model detail (1080p with the most complex models running at 30-50fps, for example, would be a terrible experience).

So it doesn't make sense to ask whether some change can make upcoming games run at 1080p/60fps.

Even on the PS4 they won't always run at this resolution. Even if you buy a very high-end Steambox, you won't be able to play games released in two years' time at 1080p/60fps.

Nobody still prefers to play a game at 720p on an X1 or PS4, because that's a last-generation resolution and it limits the enormous amount of detail that can be displayed on screen. It's a matter of optimization to be able to play a game that renders natively at 1080p, and GTA V (native 720p) is a fabulous example: despite the console only containing 512MB of RAM, it was still able to deliver a beautiful, huge open-world game.

The 8th-generation consoles have 8GB of RAM, and that's sufficient to move beyond 720p; only idiots will think it's too hard to accomplish that.

Edited by Master of Earth, Mar 9 2014, 5:31am :

Master of Earth said,

Nobody still prefers to play a game at 720p on an X1 or PS4, because that's a last-generation resolution and it limits the enormous amount of detail that can be displayed on screen. It's a matter of optimization to be able to play a game that renders natively at 1080p, and GTA V (native 720p) is a fabulous example: despite the console only containing 512MB of RAM, it was still able to deliver a beautiful, huge open-world game.

The 8th-generation consoles have 8GB of RAM, and that's sufficient to move beyond 720p; only idiots will think it's too hard to accomplish that.

When you're a game developer, it's not hard to make a game run at any resolution and target frame rate you want. Technically, there is no difficulty in achieving that.

HOWEVER, the higher the resolution, the more likely you'll have to drop the number of polygons in a 3D scene (and other effects) so that the game doesn't drop below the target frame rate.

If 1080p were a must-have requirement, then every game would run at that resolution and devs would just use less detailed models.

But some game developers may think that their game looks better with more polygons in the 3D scene (which is the result of using more detailed 3D models), even if that means running at a lower resolution than 1080p.

And they may be right, because most gamers don't have a huge TV screen, and when sitting a few meters from their TV, they are less likely to see the difference between 1080p and 720p than the difference between low-resolution and high-resolution models and textures.

That's why 720p is not always a bad choice.

Even on the Xbox 360, many games such as Halo 3 ran at 640p, not 720p.

link8506 said,

When you're a game developer, it's not hard to make a game run at any resolution and target frame rate you want. Technically, there is no difficulty in achieving that.

HOWEVER, the higher the resolution, the more likely you'll have to drop the number of polygons in a 3D scene (and other effects) so that the game doesn't drop below the target frame rate.

If 1080p were a must-have requirement, then every game would run at that resolution and devs would just use less detailed models.

But some game developers may think that their game looks better with more polygons in the 3D scene (which is the result of using more detailed 3D models), even if that means running at a lower resolution than 1080p.

And they may be right, because most gamers don't have a huge TV screen, and when sitting a few meters from their TV, they are less likely to see the difference between 1080p and 720p than the difference between low-resolution and high-resolution models and textures.

That's why 720p is not always a bad choice.

Even on the Xbox 360, many games such as Halo 3 ran at 640p, not 720p.

720p looks pixelated on a 1080p monitor or TV, and it's hard to enjoy and see all the detail when there are many objects and other things coming into view.

The people who can't notice the difference between 1080p and 720p are not likely to be gamers, but rather casual players who simply don't care. The same can't be said of enthusiast gamers, who think 1080p is the standard for this generation of gaming consoles, especially since they really do know what's best.

Or the Xbox One was built to support DX12, or at least just some of its features, like how the Xbox 360 supported some DirectX 10 features.

King Joffrey said,
So does this mean our DX11 cards will also be supported? (Mine is a GTX 770)

What normally happens is that most features are backwards compatible but some require hardware acceleration or run considerably more efficiently with optimised hardware. That's unlikely to change with DX12. It seems unlikely that this will be the first version of DirectX not to introduce new hardware features.

The problem is that people think if a GPU isn't fully compatible with a certain DX version, then it doesn't support any of its features. A misconception.

Each new DX version or revision brings a plethora of features, not just one.
For example, although GeForce 600 and 700 GPUs aren't DX11.1 compatible, they support all the gaming-related features, so who cares if they don't have a "DirectX 11.1 compatible" badge on the box.
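For context, this is roughly how that capability negotiation already works on the DX11 side today - a minimal sketch (Windows only, error handling kept to a minimum) that asks the runtime for the highest Direct3D feature level the installed GPU supports and accepts whatever is granted:

```cpp
// Minimal sketch of Direct3D 11 feature-level negotiation (Windows only,
// error handling kept to a minimum). The runtime picks the highest level in
// the list that the installed GPU/driver supports, which is how a newer
// runtime can still drive an older DX-class card at a reduced feature level.
#include <windows.h>
#include <d3d11.h>
#include <cstdio>
#pragma comment(lib, "d3d11.lib")

int main() {
    const D3D_FEATURE_LEVEL requested[] = {
        D3D_FEATURE_LEVEL_11_0,
        D3D_FEATURE_LEVEL_10_1,
        D3D_FEATURE_LEVEL_10_0,
        D3D_FEATURE_LEVEL_9_3,
    };
    ID3D11Device*        device  = nullptr;
    ID3D11DeviceContext* context = nullptr;
    D3D_FEATURE_LEVEL    granted = D3D_FEATURE_LEVEL_9_1;

    HRESULT hr = D3D11CreateDevice(
        nullptr,                      // default adapter
        D3D_DRIVER_TYPE_HARDWARE,     // use the real GPU
        nullptr, 0,
        requested, ARRAYSIZE(requested),
        D3D11_SDK_VERSION,
        &device, &granted, &context);

    if (SUCCEEDED(hr)) {
        std::printf("Granted feature level: 0x%04x\n", (unsigned)granted);
        context->Release();
        device->Release();
    } else {
        std::printf("No hardware device at the requested feature levels.\n");
    }
    return 0;
}
```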

theyarecomingforyou said,

What normally happens is that most features are backwards compatible but some require hardware acceleration or run considerably more efficiently with optimised hardware. That's unlikely to change with DX12. It seems unlikely that this will be the first version of DirectX not to introduce new hardware features.

It could go either way...

However, it is worth noting that the Xbox CPU/GPU does contain hardware-level technology not found anywhere else yet. There are differences in the memory and instruction controllers, as well as the better-known features like eSRAM.

Microsoft is doing a few things differently in the Xbox One SDK update that could very well be part of DX12. Since part of that update is about delivering eSRAM performance to games through the framework, instead of developers designing specifically for it, it could mean there is an equivalent PC requirement that would need to be present in hardware.

Microsoft might also take this time to let the hardware even out and get developers to focus on skipped features and performance features of DX11 that DX12 might make easier to access, or might take on that work for developers.

So there is 'new' hardware, but until they say, flip a coin.