Microsoft forming team to "win back" game developers with "modern" open source framework

Could Microsoft's new dev team allow games like Halo: Spartan Assault to work on non-Microsoft platforms?

Microsoft has been trying for some time to convince game developers to build their titles for the company's platforms. The results of those efforts have been mixed so far, but this week, a new listing on the Microsoft Careers website suggests that the company is serious about bringing game creators over to its side.

How will it do this? According to the Careers listing for a Software Development Engineer, Microsoft is forming a team called "New Devices and Gaming" that aims to build a way to create games for multiple platforms, including those not controlled by Microsoft.

The goal of Microsoft's new team is pretty straightforward; it wants to "win back our game developers from our competitors." The listing adds, "We will create a modern framework that is open source, light-weight, extensible and scalable across various platforms including Windows Store, Windows Phone, iOS and Android."

Yes, Microsoft says it wants this new way to create games to be an open source solution. This is nothing new for the company; it has supported open source software in the past and has released certain products to that community, such as the recent NewsPad blogging tool. However, this new effort seems to be a step up from those previous projects.

Exactly how this will work is still a mystery, but Microsoft might reveal more information in a few weeks as part of its activities at the 2014 Game Developers Conference, to be held in San Francisco later this month.

Source: Microsoft Careers via WinBeta | Image via Microsoft


50 Comments


Likely various posts on the old CRYDEV forums. Consider how much flak Crytek has taken over the changes to CryENGINE as of Crysis 3 - specifically, the killing off of DX9c going forward. That is despite every hardware survey (especially Valve's) showing that DX11 alone vastly outnumbers DX9c in existing PCs by a massive margin. The only fragmentation of game development (even in terms of DX, let alone any other API) is being done by game developers themselves - mostly in the name of multi-platform. As most critics of Microsoft "in general" admit, DirectX is entirely a Microsoft platform - therefore, anybody in hardware OR software that competes with (or sees themselves as competing with) Microsoft will either avoid it or try to fork it, simply to avoid being seen as kowtowing to Microsoft. (Valve itself is a major example - and that was before SteamOS.) Image is a large reason/excuse for the fragmentation - the PR speeches at any major developers' conference (such as GDC or even the Tokyo Game Show) make that clear: nobody wants to be seen as under Microsoft's thumb.

Why bother when there's stuff like Unreal or Unity that targets all that, plus Vita/PS3/360/PS4/Xbox One, as well as OSX and SteamOS?

Geezy said,
Why bother when there's stuff like Unreal or Unity that targets all that, plus Vita/PS3/360/PS4/Xbox One, as well as OSX and SteamOS?

Because what Microsoft will be offering will help a product like Unity, not compete with it.

We actually don't know that it could help a product like Unity across the platforms mentioned...

What exactly the framework is, however, is not mentioned, so things remain just a bit vague. The requirements include experience with C++, C#, Java and shipping a public API.
All we know is that it's supposed to help "win back our game developers from our competitors."

gawicks said,
Microsoft already has an open source, cross-platform gaming framework. It's called MonoGame.
Right, but they like to control things

gawicks said,
Microsoft already has an open source, cross-platform gaming framework. It's called MonoGame.

They should have kept and maintained what that came from: XNA. MonoGame is a poor replacement for it.

Eric said,

They should have kept and maintained what that came from: XNA. MonoGame is a poor replacement for it.

I remember XNA being limited in what it could do, and developers wanted the ability to do more - which is why MS dropped it and went back to a pure DX/C++/C# option for game development. Now we have support for things like UE3 on WP, iirc.

All in all, I think this sounds like XNA part 2, but not closed off - open for people to write extensions for and so on, kind of like OpenGL really. It'll still have DX/D3D at the heart of it, but now it sounds like it won't be limited to just Windows platforms and will cover iOS and Android as well. If you can get those developers working with this, then the problem of porting games to Windows Store and Windows Phone is gone.

I know MonoGame is not made by Microsoft per se, but it's Microsoft's XNA ported to other platforms. Since it's open source, it should hardly matter who wrote the code.

gawicks said,
I know MonoGame is not made by Microsoft per se, but it's Microsoft's XNA ported to other platforms. Since it's open source, it should hardly matter who wrote the code.

Well it does matter... There are patent issues and everything else that can hamper real adoption of it...

Mono for .NET is safe due to the Novell-MS patent agreement that came out of the Novell suit over WordPerfect.

Sure they'll create a new framework for game development and then completely drop support for it a few years later.

Xilo said,
Sure they'll create a new framework for game development and then completely drop support for it a few years later.

It says it'll be open source, so that's kind of hard to do if it gets used from the start. The community will keep it going if MS doesn't keep building on it.

If this is your solution Microsoft then you clearly don't understand the problem.

The needless fragmentation of your own platform is why developers are moving to OpenGL; the perf boost from lower CPU/driver overhead is only an added incentive.

You need to get an API release out that is both competitive and runs on all your current software platforms. That means no more idiotic locking to OS releases with weak excuses involving WDDM.

If OpenGL can do PRT (aka Tiled Resources) with ARB_sparse_texture on XP, Vista, 7, and 8 - then I see no reason why DirectX shouldn't be capable of doing the same.

Wishful thinking at best... Microsoft still believes that tying things to Windows releases somehow makes people want to upgrade more...

Look at IE. For some reason, Microsoft still takes a year or more to "port" it to an older Windows version and syncs it to launch with a new version of Windows. No one in their right mind is basing their decision to upgrade on IE...

I agree with you and I think they're finally seeing the light (at least, some light). The presentations we'll see at GDC about API optimization and nullifying driver/API overhead seem to be exciting reactions to Mantle/OGL. Let's hope they wake up already and do what's right for everybody.

LogicalApex said,
Wishful thinking at best... Microsoft still believes that tying things to Windows releases somehow makes people want to upgrade more...

Look at IE. For some reason, Microsoft still takes a year or more to "port" it to an older Windows version and syncs it to launch with a new version of Windows. No one in their right mind is basing their decision to upgrade on IE...

It depends on whether Microsoft sees how much they stand to lose by letting OpenGL become the de facto API.

As much as I would like for them to be ignorant and let OpenGL take over, I doubt that they're quite that blind.

Athernar said,
If this is your solution Microsoft then you clearly don't understand the problem.

The needless fragmentation of your own platform is why developers are moving to OpenGL; the perf boost from lower CPU/driver overhead is only an added incentive.

You need to get an API release out that is both competitive and runs on all your current software platforms. That means no more idiotic locking to OS releases with weak excuses involving WDDM.

If OpenGL can do PRT (aka Tiled Resources) with ARB_sparse_texture on XP, Vista, 7, and 8 - then I see no reason why DirectX shouldn't be capable of doing the same.

Just because OpenGL 'can' do some of the DirectX10/11 functionality does not mean it can do it as effectively or efficiently.

The thing people STILL DO NOT GET after 8 years of Microsoft endlessly explaining DirectX and the WDDM model is that DirectX DEPENDS on and EXPECTS the OS to have GPU thread and RAM scheduling in the KERNEL OF THE OS.

Windows XP cannot do the same things Windows 8 does as fast as Windows 8, not even close.

The other part of this equation is that by DirectX DEPENDING on and USING the OS GPU scheduling technologies, it takes more work away from the game developers.

Even many engine developers get this last part wrong, as they are doing FAR MORE work than they have to on DirectX. Even when developing multi-platform games, they base their code on OpenGL's less efficient model and then only copy this back over to DirectX. Thus DirectX is NOT BEING fully utilized.

This last part is where Microsoft needs to train/correct/help developers so that they are NOT wasting computation doing things the OS and framework are already managing for their code. Even today we have DX11-designed game engines that are built from an OpenGL standpoint and are not even close to using the performance advantages that DX10/11 offer, focusing instead only on the 'effects' features of DX10/11.

It is a lot like multi-platform code that starts on *nix and is then moved to NT, where the developers don't realize that ten of their lines of code are already being done by the OS, and they leave this redundant code in because it is necessary on *nix. Whether out of laziness or a genuine lack of understanding that NT already handles all this in one faster API call - one they are often also still making when porting to NT - the result is code that runs EVEN slower.


Additional Notes...

Microsoft didn't create the WDDM/WDM model just to screw with consumers or developers. It offers GPU features at the OS kernel level that NO OTHER OS currently even comes close to offering. It is why Windows 7/8 are still faster even at running OpenGL code than other OSes, and do a better job of juggling several OpenGL/CL/GP-GPU applications at once.

The model came from joint work with the Xbox 360 team, as it was 'faster' even for single games running on the newer GPU model with agnostic shaders. If Microsoft had kept the DX9 model on the Xbox 360, or kept the Xbox 360 on the older bi-shader GPU technology, it would never have performed even close to the level of the PS3; instead it held its own with a lot less processing power.

Game developers right now are ignoring the latest and greatest and instead trying to serve everyone, and Microsoft realizes that developers are trying to bring games to both an XB1 and a freaking iPad at the same time - devices that are insanely different in terms of features and performance.

They have a solution for this, and it will be interesting.


The other sad thing is that Microsoft was the biggest OpenGL supporter in the early 90s. They wanted to move it from CAD/engineering to gaming, and the rest of the OpenGL world gave them the finger, as did the 3D hardware makers of the time, who wanted to use their own proprietary frameworks for their gaming cards.

It was Microsoft and DirectX that unified gaming in the first place, so it is rather INSANE that we find someone complaining or blaming Microsoft for development fragmentation today.

Especially when 90% of all the technology used in gaming was created by Microsoft, from the hardware to the shader code, to even OpenGL 4.x being based directly on DirectX instead of taking its own approach.

/rant off

LogicalApex said,
Wishful thinking at best... Microsoft still believes that tying things to Windows releases somehow makes people want to upgrade more...

Look at IE. For some reason, Microsoft still takes a year or more to "port" it to an older Windows version and syncs it to launch with a new version of Windows. No one in their right mind is basing their decision to upgrade on IE...

Unless I'm really misremembering, weren't IE10 and IE11 released for Windows 7 shortly after the Win 8 versions?

Mobius Enigma said,

Just because OpenGL 'can' do some of the DirectX10/11 functionality does not mean it can do it as effectively or efficiently.

Except, it can. In fact, OpenGL is already significantly faster than DirectX. AMD have implied as much, Valve have stated as much, and so have NVidia.

Details as follows from NVidia's Steam Dev Days talk (soon to be repeated at GDC jointly with AMD and Intel):

Beyond Porting: How Modern OpenGL can Radically Reduce Driver Overhead
Slides: http://media.steampowered.com/...ys/slides/beyondporting.pdf
VoD: http://www.youtube.com/watch?v=-bCeNzgiJ8I

Mobius Enigma said,
/rant

You're throwing out a bunch of history there as if it's going to stick to what people are talking about today. Microsoft then is not the same MS we see today. Different people, different goals.

Edited by dead.cell, Mar 2 2014, 4:48am :

Mobius Enigma said,
If Microsoft had kept the DX9 model on the Xbox 360 or kept the Xbox 360 using the older bi-shader GPU technology it would never have performed even close to the level of the PS3, and instead it held its own with a lot less processing power.

You were right about everything apart from that. The 360 has the more powerful GPU - devs have often stated this, including John Carmack. It's overall faster and has unified memory (unlike the PS3), as well as more available memory to work with. Most cross-platform games run at higher frame rates on the 360. Of course, some of that also comes down to what you mentioned and superior Microsoft dev tools.

Edited by NoClipMode, Mar 2 2014, 5:00am :

Except, it can. In fact, OpenGL is already significantly faster than DirectX. AMD have implied as much, Valve have stated as much, and so have NVidia.

Details as follows from NVidia's Steam Dev Days talk (soon to be repeated at GDC jointly with AMD and Intel):

Beyond Porting: How Modern OpenGL can Radically Reduce Driver Overhead


Does using OpenGL reduce driver overheads? Sure in some cases it does which reduces CPU load (like the slides say). But here's the thing, outside of a very few games (like WoW) which are heavily CPU dependent, for most games CPUs aren't a limiting factor. It doesn't matter if you use an old i3 or a top of the line i7, in the vast majority of games you will see 0-1 fps difference. Techspot have proved this a billion times.

Then if you continue reading the very slides that you posted, they do go on to say that GPU performance can be negatively impacted - but in the cases where it is, it won't be by much.

John Carmack, the guy who always "hated" OpenGL, has agreed since DX10 that it is superior to OGL.

---

Mantle works differently from OpenGL and DirectX; it's a far lower-level API, which means software can access the hardware more directly than it can with DX and OGL. This will obviously lead to performance increases. Hopefully MS has something similar in the works for DX too.

One good thing about Mantle is that it forces devs to rewrite part of their code, which indirectly helps with performance. A lot of "DX11" games still use a metric ####ton of DX9 with some random DX11 features thrown on top, because the devs aren't willing to maintain two separate codebases.

-Razorfold said,

Does using OpenGL reduce driver overheads? Sure in some cases it does which reduces CPU load (like the slides say). But here's the thing, outside of a very few games (like WoW) which are heavily CPU dependent, for most games CPUs aren't a limiting factor. It doesn't matter if you use an old i3 or a top of the line i7, in the vast majority of games you will see 0-1 fps difference. Techspot have proved this a billion times.

Then if you continue reading the very slides that you posted, they do go on to say that GPU performance can be negatively impacted - but in the cases where it is, it won't be by much.

Um, driver overhead -IS- CPU overhead. The API driver's role is packaging and validating batches of commands to send to the GPU.

If you're being limited by the driver, you're being limited by the CPU.
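To make the two positions in this exchange concrete, here is a deliberately simplified frame-pacing model (my own illustration with made-up numbers, not anything from the linked talk or either commenter): CPU and GPU work on a frame largely in parallel, so the frame rate is bounded by whichever side takes longer - and shaving driver time off the CPU side only raises fps once the CPU is the slower side.

```python
def fps(cpu_ms, gpu_ms):
    """Frame rate under a simple model where the CPU (game logic plus
    driver overhead) and the GPU each take a fixed time per frame and
    the longer of the two bounds throughput."""
    return 1000.0 / max(cpu_ms, gpu_ms)

# GPU-bound case: 4 ms of CPU work per frame, 16 ms of GPU work.
# Halving the CPU cost (e.g. a leaner driver) changes nothing.
gpu_bound = fps(cpu_ms=4.0, gpu_ms=16.0)        # 62.5 fps
gpu_bound_lean = fps(cpu_ms=2.0, gpu_ms=16.0)   # still 62.5 fps

# CPU-bound case: 20 ms of CPU work, 16 ms of GPU work.
# Cutting 8 ms of driver overhead now raises the frame rate.
cpu_bound = fps(cpu_ms=20.0, gpu_ms=16.0)       # 50.0 fps
cpu_bound_lean = fps(cpu_ms=12.0, gpu_ms=16.0)  # 62.5 fps
```

In this toy model both commenters are right in different regimes: driver overhead is CPU time, but it only limits the frame rate once the CPU side dominates.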

-Razorfold said,

John Carmack, the guy who has always "hated" OpenGL has agreed since Dx10 that it is superior to ogl.

That quote is horrendously out of date and no longer relevant.

-Razorfold said,

Mantle works differently to OpenGL and DirectX, it's a far lower-level API which means that the software can access the hardware more directly than they can with DX and OGL. This will obviously lead to performance increases. Hopefully MS has something similar in the works for DX too.

No, Mantle doesn't really work different to existing APIs, it just exposes more functionality to the developer and does so in a manner more fitting of modern graphics hardware. (i.e. programmable vs fixed-function)

AMD have already stated they will be submitting OpenGL extensions to Khronos that will allow OpenGL to match Mantle in terms of performance. On top of that, OpenGL 4 already removed a bunch of fixed-function cruft that was flagged as deprecated in 3.

-Razorfold said,

One good thing about Mantle is it forces devs to re-write part of their code which indirectly helps with performance. A lot of "Dx11" games still use a metric ####ton of Dx9 with some random Dx11 features thrown on top because the devs aren't willing to maintain two separate codebases.

APIs don't work like that, also considering most engines are closed source, [Citation needed].

Um, driver overhead -IS- CPU overhead. The API driver's role is packaging and validating batches of commands to send to the GPU.

If you're being limited by the driver, you're being limited by the CPU.


Yes, and like I said, for most games from the past few years it doesn't matter what CPU you use. There are a couple of games that are CPU dependent, but the vast majority aren't. So reducing CPU overhead won't give you a performance increase.

I mean, go look at CPU benchmarks for BF4, BF3 and plenty of other games. An AMD Phenom II gets 95 fps. Swapping the CPU (but keeping everything else the same) to an i7-4960X gives you 98 fps. That's a 3 fps difference between an ancient CPU and a top-of-the-line current-gen CPU.

Off the top of my head, the only current game I can think of where the CPU becomes a limiting factor is WoW, because of how old its engine is.
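For what it's worth, the fps figures quoted above can be restated as per-frame times (a quick back-of-the-envelope conversion; the 95 and 98 fps numbers are the commenter's, and the arithmetic is just 1000 ms divided by fps):

```python
def frame_time_ms(fps):
    # A game running at a given fps spends 1000/fps milliseconds per frame.
    return 1000.0 / fps

phenom_ms = frame_time_ms(95)    # ~10.53 ms per frame on the Phenom II
i7_ms = frame_time_ms(98)        # ~10.20 ms per frame on the i7-4960X
delta_ms = phenom_ms - i7_ms     # ~0.32 ms saved per frame by the faster CPU
```

A gap of roughly a third of a millisecond per frame is what you would expect when the GPU, not the CPU, is the bottleneck at those settings - which is the point being argued here - though it says nothing about workloads that are genuinely CPU-heavy.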

No, Mantle doesn't really work different to existing APIs, it just exposes more functionality to the developer and does so in a manner more fitting of modern graphics hardware. (i.e. programmable vs fixed-function)

Er so exactly what I said then? Mantle allows developers to gain more access to the hardware so they can get more out of the hardware.

APIs don't work like that, also considering most engines are closed source, [Citation needed].

There are very few games that are actually, really DX11. Most games that have a DX11 mode aren't really running a pure version of DX11; they just added some DX11 features into their existing DX9 codebase.

Reason? A lot of games share the same engine between the 360 and the PC. This drives down development cost but still lets PCs use advanced features like tessellation. But it doesn't mean the game is "pure" DX11 - the base engine code hasn't been fully rewritten for DX11.

-Razorfold said,

Yes, and like I said, for most games from the past few years it doesn't matter what CPU you use. There are a couple of games that are CPU dependent, but the vast majority aren't. So reducing CPU overhead won't give you a performance increase.

The fact Mantle, the aforementioned SDD talk and Microsoft's GDC DirectX talk exist disagree with your assessment.

-Razorfold said,

I mean, go look at CPU benchmarks for BF4, BF3 and plenty of other games. An AMD Phenom II gets 95 fps. Swapping the CPU (but keeping everything else the same) to an i7-4960X gives you 98 fps. That's a 3 fps difference between an ancient CPU and a top-of-the-line current-gen CPU.

Off the top of my head, the only current game I can think of where the CPU becomes a limiting factor is WoW, because of how old its engine is.

All that proves is there is minimal disparity between CPUs, not that CPU overhead isn't an issue.

-Razorfold said,

Er so exactly what I said then? Mantle allows developers to gain more access to the hardware so they can get more out of the hardware.

No, you said Mantle worked differently. Working differently != exposing more functionality. OpenGL's extension system works the same way, it exposes additional functionality. e.g. ARB_sparse_texture.

-Razorfold said,

There are very few games that are actually, really DX11. Most games that have a DX11 mode aren't really running a pure version of DX11; they just added some DX11 features into their existing DX9 codebase.

Reason? A lot of games share the same engine between the 360 and the PC. This drives down development cost but still lets PCs use advanced features like tessellation. But it doesn't mean the game is "pure" DX11 - the base engine code hasn't been fully rewritten for DX11.

Oh? Provide an example of what makes a codebase not "pure DX11".

All that proves is there is minimal disparity between CPUs, not that CPU overhead isn't an issue.

Actually, it proves both of those.

No, you said Mantle worked differently. Working differently != exposing more functionality. OpenGL's extension system works the same way, it exposes additional functionality. e.g. ARB_sparse_texture.

Sorry bad wording on my part.

Oh? Provide an example of what makes a codebase not "pure DX11".

Because it still uses legacy Dx9 code? I don't see why this is so confusing for you lol.

Very few games are written to use only DX11 functions; the vast majority use both DX9 and DX11 - DX9 for the majority of the rendering, DX11 for features like tessellation. It would take a significant amount of work to rewrite a game to use only DX11 functions, but it would net you some nice improvements. Most devs aren't willing to do this because they make the very same game for the 360, and then you need to maintain two separate game engines.

With the Xbox One things should start to change but we'll see how long it takes for that to happen.

-Razorfold said,

Actually, it proves both of those.

No, it doesn't, you've already conceded as much with your statements regarding Mantle.

If driver overhead wasn't an issue, Mantle wouldn't exist and neither would Microsoft's GDC talk. It's really as simple as that.

-Razorfold said,

Because it still uses legacy Dx9 code? I don't see why this is so confusing for you lol.

Code such as?

-Razorfold said,

Very few games are written to use only DX11 functions; the vast majority use both DX9 and DX11 - DX9 for the majority of the rendering, DX11 for features like tessellation. It would take a significant amount of work to rewrite a game to use only DX11 functions, but it would net you some nice improvements. Most devs aren't willing to do this because they make the very same game for the 360, and then you need to maintain two separate game engines.

Which functions are these? Which ones preclude the other?

-Razorfold said,

With the Xbox One things should start to change but we'll see how long it takes for that to happen.

Given the dismal performance of the Xbox One, I hope not.

Athernar said,

No, it doesn't, you've already conceded as much with your statements regarding Mantle.

If driver overhead wasn't an issue, Mantle wouldn't exist and neither would Microsoft's GDC talk. It's really as simple as that.


Driver overhead =/= CPU overhead. There's plenty of power left in your CPU when playing games; you rarely ever run it at 100% for the entire session. Which is why I've been saying games aren't CPU dependent.

Which functions are these? Which ones preclude the other?

I don't get how this is such a hard concept, so now I'm going to assume you're just arguing for the sake of arguing.

Most games do not have engines that have been completely rewritten for DX11. Period. They still use DX9 for the majority of the rendering with some of the fancier bits of DX11 thrown in.

You don't, and never have had to, use one or the other. You can still use both; you just won't get the full benefits of either.

Given the dismal performance of the Xbox One, I hope not.

What? I don't think you understand. In the past, DX9 was used heavily because the 360 could only support that. This held PC gaming back a bit.

Now that the One supports DX11, devs can start coding their games for that spec. Though if DX12 is really around the corner, then my reaction to that is just meh.

Edit (since I can't edit my own post for w/e reason).

We're both confused as to what the other person is talking about when it comes to driver overhead.

I was talking about OpenGL vs DirectX performance in relation to the CPU and how that is pretty meaningless because games aren't limited by the CPU.

You are talking about DirectX vs Mantle and talking about driver overhead.

That's why I'm going on about how games aren't CPU dependent and you're going on about how reducing driver overhead leads to an increase in performance.

Athernar said,
That quote is horrendously out of date and no longer relevant.

I always watch Carmack's epic 3+ hour talks at QuakeCon, and in the last two talks he didn't flat out say it, but he certainly implied that DX is still better than OpenGL (or at least as good). He's also a big fan of MS's developer tools.

Athernar said,
The needless fragmentation of your own platform is why developers are moving to OpenGL
Actually, cross-platform development has a lot more to do with it, DirectX is the odd one out, not working on PS3/4, Wii U, Vita, 3DS, OS X, iOS, Android, Steam OS...

-Razorfold said,

Driver overhead =/= CPU overhead. There's plenty of power left in your CPU when playing games; you rarely ever run it at 100% for the entire session. Which is why I've been saying games aren't CPU dependent.

What do you think processes the driver thread? Driver overhead absolutely is CPU overhead.

-Razorfold said,

I don't get how this is such a hard concept, so now I'm going to assume you're just arguing for the sake of arguing.

No, I think if anyone is arguing for the sake of arguing it's you, because you quite clearly cannot provide any proof (Or provide any technical examples) to back up your assertions, and your understanding of graphics APIs seems to be spotty at best, judging by your repeated references to tessellation.

-Razorfold said,

Most games do not have engines that have been completely rewritten for DX11. Period. They still use DX9 for the majority of the rendering with some of the fancier bits of DX11 thrown in.

And what about those engines needs rewriting for them to be "true DX11-class" in your words?

-Razorfold said,

You don't, and never have had to, use one or the other. You can still use both; you just won't get the full benefits of either.

This is an incredibly vague assertion that could be either incorrect or correct depending on what you actually mean.

-Razorfold said,

What? I don't think you understand. In the past, DX9 was used heavily because the 360 could only support that. This held PC gaming back a bit.

API version on the consoles weren't the problem, it was the limited RAM pools that were the issue.

-Razorfold said,

Now that the One supports DX11, devs can start coding their games for that spec. Though if DX12 is really around the corner, then my reaction to that is just meh.

They have multiple talks at GDC about "evolving DirectX" and low level access. It might not end up being called DirectX 12, but it certainly won't be a mere patch.

-Razorfold said,
We're both confused as to what the other person is talking about when it comes to driver overhead.

I was talking about OpenGL vs DirectX performance in relation to the CPU and how that is pretty meaningless because games aren't limited by the CPU.

Except driver overhead -IS- CPU overhead, it's absurd to state otherwise.

-Razorfold said,

You are talking about DirectX vs Mantle and talking about driver overhead.

I'm not talking about Mantle specifically, no.

-Razorfold said,

That's why I'm going on about how games aren't CPU dependent and you're going on about how reducing driver overhead leads to an increase in performance.

Not being CPU-bound doesn't mean you're not still being limited by it.

No, I think if anyone is arguing for the sake of arguing it's you, because you quite clearly cannot provide any proof (or any technical examples) to back up your assertions...

Most games do not implement the real DX11 spec. Period. Sorry if this fact is too difficult for you to grasp.

Hell Nvidia GPUs don't even fully support the DX11 spec but we still call them DX11 cards don't we? AMD on the other hand, I believe, fully supports the spec to the letter.

...and your understanding of graphics APIs seems to be spotty at best, judging by your repeated references to tessellation.

Because tessellation is a DX11 only feature? DX9 has no support for it?


And what about those engines needs rewriting for them to be "true DX11-class" in your words?

By actually fully utilizing the DX11 spec? And not just random bits and pieces to get a little extra performance from one thing?

API version on the consoles weren't the problem, it was the limited RAM pools that were the issue.

Er, they both were an issue. A game fully made in DX9 will not look as nice as a game fully made in DX11. But developers aren't going to write one game engine for the 360, one for the PS3, and one for the PC. That would be absurd.

Now that the new generation of consoles are similar to PCs (but still suck) and devs can start using all the DX11 features on both the Xbox and the PC, we should start seeing more real DX11 games. Apparently the PS4 also supports DX11.1 (not sure if that's true or not), so that should help PC gamers too.

Except driver overhead -IS- CPU overhead, it's absurd to state otherwise.

Read what I wrote rather than just copy-pasting your exact words. It doesn't matter if OpenGL can process stuff on the CPU faster than DirectX, because IT WON'T LEAD TO A PERFORMANCE INCREASE. That was my point, based on the Steam slides that you posted.

My entire point is that the CPU isn't a limiting factor in modern games. I wasn't talking about driver overhead, which is why I specifically made a post saying "EDIT".

-Razorfold said,

Most games do not implement the real DX11 spec. Period. Sorry if this fact is too difficult for you to grasp.

Go ahead and prove it, I dare you.

-Razorfold said,

Hell Nvidia GPUs don't even fully support the DX11 spec but we still call them DX11 cards don't we? AMD on the other hand, I believe, fully supports the spec to the letter.

Maybe because NVidia cards do support the D3D11 spec? Feature level 11_1 on the other hand...

-Razorfold said,

Because tessellation is a DX11 only feature? DX9 has no support for it?

You can do tessellation on DirectX <11.

-Razorfold said,

By actually fully utilizing the DX11 spec? And not just random bits and pieces to get a little extra performance from one thing?

What parts aren't they utilising?

-Razorfold said,

Er, they both were an issue. A game fully made in DX9 will not look as nice as a game fully made in DX11. But developers aren't going to write one game engine for the 360, one for the PS3, and one for the PC. That would be absurd.

The graphics API has nothing to do with the end-result visually beyond restricting the available perf budget.

-Razorfold said,

Now that the new generation of consoles are similar to PCs (but still suck) and devs can start using all the DX11 features on both the Xbox and the PC, we should start seeing more real DX11 games. Apparently the PS4 also supports dx11.1 (not sure if true or not) so that should help PC gamers too.

The PS4 does not support DirectX, it uses libgcm, an in-house Sony API that is allegedly similar to Mantle in design. (Or rather, Mantle is similar to libgcm)

-Razorfold said,

Read what I wrote rather than just copy-pasting your exact words. It doesn't matter if OpenGL can process stuff on the CPU faster than DirectX, because IT WON'T LEAD TO A PERFORMANCE INCREASE. That was my point based on the Steam slide that you posted.

Try watching the actual talk rather than just thumbing through slides and you will realise how wrong you are.

-Razorfold said,

My entire point is that the CPU isn't a limiting factor in modern games. I wasn't talking about driver overhead, which is why I specifically made a post saying "EDIT".

Your point is wrong.

Go ahead and prove it, I dare you.

Go do a Google search on "how many games fully utilize the DX11 spec" and you'll find quite a lot of discussion on it, on sites ranging from Steampowered to Anandtech to ExtremeTech. This is not some brand new revelation; it's been known for a while.

Newer games may be different, but for quite a long time they weren't like that.

Maybe because NVidia cards do support the D3D11 spec? Feature level 11_1 on the other hand...

Note the word fully? I consider 11.1 and 11.2 under that. AMD supports both of those; NVidia doesn't.

You can do tessellation on DirectX <11.

With a massive performance loss, which is why nobody does it and hence why most people consider tessellation to be a Dx11 feature not a DX9 one.

The graphics API has nothing to do with the end-result visually beyond restricting the available perf budget.

And PRACTICALLY (not theoretically) if the performance is restricted then the visuals are restricted too for obvious reasons. There are simply things that DX9 cannot handle to the level that DX11 can.

Here's an example. A Dx9, Dx10, Dx11 card can all draw circles obviously. The way it does it is:

DX9 - Game wants to draw a circle, it sends some points to the CPU. CPU calculates, sends it to the GPU. Bam circle.
DX10 - Game wants to draw a circle, it sends 10 lines to the CPU. CPU calculates, sends it to the GPU. Bam circle.
DX11 - Game wants to draw a circle, it sends center point and radius to DX11 driver, which sends it to the GPU and bam circle. CPU bypassed.

Now all 3 drew a circle, great. But 9 is more demanding on the CPU than 11, so devs have to scale back some of the graphics to cope with it. On 11 they don't have to, so they can put the available power to better use, which tends to lead to better-looking games.
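To make the cost argument above concrete, here is a toy sketch (not real API behavior, and all function names are invented for illustration): the DX9-style path has the application expand geometry on the CPU, while the DX11-style path hands the GPU a compact parametric description, as with tessellation.

```python
import math

def cpu_side_circle_dx9(segments):
    """DX9-style: the app expands the circle into vertices on the CPU — O(segments) work."""
    return [(math.cos(2 * math.pi * i / segments),
             math.sin(2 * math.pi * i / segments)) for i in range(segments)]

def cpu_side_circle_dx11(center, radius):
    """DX11-style: the app submits only the parameters; expansion happens on the GPU — O(1) work."""
    return {"center": center, "radius": radius}

# The CPU-side cost difference is the point being argued:
print(len(cpu_side_circle_dx9(256)))    # 256 vertices built on the CPU
print(cpu_side_circle_dx11((0, 0), 1))  # one tiny parameter struct; the GPU does the rest
```

This is only a cost model of the argument, not how either API actually draws.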

Take WoW for example. If you switch to DX11, cosmetically the game starts to look a little bit better; if you look at smoke and fire, they look a ton better compared to DX9.

Try watching the actual talk rather than just thumbing through slides and you will realise how wrong you are.

Explain this then. Why does John Carmack suddenly prefer DirectX after hating it for so long? Why do almost no games except for the ones on Mac and Linux use OpenGL? I highly highly doubt Microsoft is paying off all these developers and going "please use DirectX please"

If OpenGL was this amazing savior of PC gaming, more devs would use it...but nobody does. For years now OpenGL has sucked compared to DirectX, both in ease of use and features.

Is Mantle a step in the right direction? Sure. Could DirectX12 or w/e they call it be a step in the right direction too? Sure it can.

-Razorfold said,

Go do a Google search on "how many games fully utilize the DX11 spec" and you'll find quite a lot of discussion on it, on sites ranging from Steampowered to Anandtech to ExtremeTech. This is not some brand new revelation; it's been known for a while.

Newer games may be different, but for quite a long time they weren't like that.

No, stop being evasive and answer the question. It's not my job to research your assertions. I want technical answers, not speculation or theorycraft by laymen.

-Razorfold said,

Note the word fully? I consider 11.1 and 11.2 under that. AMD supports both of those, nvidia doesn't.

What you consider or not is irrelevant; D3D11 is a complete spec and NVidia supports it. D3D11_1 and the later additions that constitute 11.2 (despite 11_2 not being a feature level) are minor extensions.

Flipside, NVidia have supported OpenGL 4.4 for almost a year now, and AMD are still stuck on OpenGL 4.3. (Which is somewhat embarrassing considering ARB_sparse_texture is a rename of AMD_sparse_texture)
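The feature-level distinction being argued here can be sketched as a simple capability lookup. This is a hedged simplification of my own, not Microsoft's API: real code queries this via `D3D11CreateDevice`/`CheckFeatureSupport`, and some capabilities (like tiled resources) are per-device caps rather than feature levels, as noted above.

```python
# Simplified map of which D3D capabilities arrive at which feature level.
# Illustrative only; consult the D3D11 documentation for the authoritative list.
FEATURE_LEVELS = ["9_3", "10_0", "10_1", "11_0", "11_1"]

CAPABILITIES = {
    "tessellation (hull/domain shaders)": "11_0",
    "compute shaders (full cs_5_0)": "11_0",
    "UAVs at every shader stage": "11_1",
    "logic ops in blend state": "11_1",
}

def supports(device_level, capability):
    """True if a device at `device_level` exposes `capability`."""
    required = CAPABILITIES[capability]
    return FEATURE_LEVELS.index(device_level) >= FEATURE_LEVELS.index(required)

print(supports("11_0", "tessellation (hull/domain shaders)"))  # True
print(supports("11_0", "UAVs at every shader stage"))          # False
```

This is why "supports D3D11" and "supports feature level 11_1" are different claims: a card can sit at 11_0 and still be a fully conformant D3D11 device.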

-Razorfold said,

With a massive performance loss, which is why nobody does it and hence why most people consider tessellation to be a Dx11 feature not a DX9 one.

It's still not a feature in the way you keep trying to present it. Using tessellation is an artistic choice, not a "spec conformance" one.

-Razorfold said,

And PRACTICALLY (not theoretically) if the performance is restricted then the visuals are restricted too for obvious reasons. There are simply things that DX9 cannot handle to the level that DX11 can.

Such as?

-Razorfold said,

<snip>

Spare me the invented examples. I understand how an API works, what I'm trying to do is to get you to provide some actual technical evidence for your assertions.

-Razorfold said,

Take WoW for example. If you switch to DX11, cosmetically the game starts to look a little bit better; if you look at smoke and fire, they look a ton better compared to DX9.

That has nothing to do with the API and everything to do with additional features/changes Blizzard have written into their internal D3D11 codepath. Correlation is not causation.

-Razorfold said,

Explain this then. Why does John Carmack suddenly prefer DirectX after hating it for so long? Why do almost no games except for the ones on Mac and Linux use OpenGL? I highly highly doubt Microsoft is paying off all these developers and going "please use DirectX please"

Suddenly? You sure love twisting things don't you, those comments date back to 2011 and are no longer current or relevant.

More recently: https://twitter.com/ID_AA_Carm...statuses/386899206163554304

-Razorfold said,

If OpenGL was this amazing savior of PC gaming, more devs would use it...but nobody does. For years now OpenGL has sucked compared to DirectX, both in ease of use and features.

More devs are using it in engines currently under development thanks to Microsoft's fragmentation of the DirectX platform. (Aforementioned support range of ARB_sparse_texture vs "Tiled Resources")

Source 2 is OpenGL-only, UE4 has an OpenGL 4.3 codepath, KojiPro's FOX engine is OpenGL on the PC, etc.

-Razorfold said,

Is Mantle a step in the right direction? Sure. Could DirectX12 or w/e they call it be a step in the right direction too? Sure it can.

In all likelihood it'll be another nail in DirectX's coffin due to further fragmentation of the platform.

I've already given you technical answers.

It's still not a feature in the way you keep trying to present it. Using tessellation is an artistic choice, not a "spec conformance" one.

I never said it was a spec-conformance one. I just gave it as an example of DX11 features that devs like to use.

That has nothing to do with the API and everything to do with additional features/changes Blizzard have written into their internal D3D11 codepath. Correlation is not causation.

Except that those effects wouldn't be possible on DX9 without a significant performance loss.

Suddenly? You sure love twisting things don't you, those comments date back to 2011 and are no longer current or relevant.

If you watched his most recent keynote, he pretty much implied the same thing.

More devs are using it in engines currently under development thanks to Microsoft's fragmentation of the DirectX platform. (Aforementioned support range of ARB_sparse_texture vs "Tiled Resources")

Fragmentation, hahaha, sure. Because, you know, Microsoft should have just back-ported the Win7 kernel, which DX11 REQUIRES, and made the WDDM framework available for Win XP too. Right?

Oh, but no, that was just a completely needless change, as you like to claim so much.
---

And most devs using it? You then list Valve, which is only using OpenGL to push its "Steam on Linux" ideology. That's WHY Valve is using OpenGL: so it can save money instead of having to maintain two codebases.

Unreal Engine? Its main renderer is still DX. The OpenGL path is for Mac, iOS, PS3, etc., which is required because DX doesn't exist on those platforms.

What about the tons of other devs? Where are they?

Edited by zhangm, Mar 3 2014, 5:10pm :

Athernar said,

Except, it can. In fact, OpenGL is already significantly faster than DirectX. AMD have implied as much, Valve have stated as much, and so have NVidia.

Details as follows at NVidia's SDD talk (Soon to be repeated at GDC jointly with AMD and Intel)

Beyond Porting: How Modern OpenGL can Radically Reduce Driver Overhead
Slides: http://media.steampowered.com/...ys/slides/beyondporting.pdf
VoD: http://www.youtube.com/watch?v=-bCeNzgiJ8I

The Valve stuff has been debunked so many times that the fact you offer it as an argument undermines any credibility you have on this subject.


If you disagree with what Razor said, that does not make it their responsibility to do the research or educate you on the subject matter.

A contradictory viewpoint doesn't require the OP to spell things out like a child. If Razor claimed the sky was blue and you said it was not, it is not their responsibility to go outside and shoot pictures or find links to make you understand.

I've stayed out of this, but like I just posted, if you are relying on crap benchmarks that everyone but Valve has refuted, you either don't understand the subject matter or are grasping at straws to prove something that is simply not based in reality.

DX11 is not even close to being fully utilized, as I stated, especially its performance advantages. Even DX10 development often just bolted on the API's effects instead of rebuilding around the DX10 model, which would have increased performance considerably.

(Think of the early Vista drivers: NVidia and AMD had crap drivers because, although they rebuilt them for WDDM, they kept portions of XPDM and XPDM optimizations that NO LONGER had any effect and even hurt performance. It took Microsoft's direct involvement in '07 to get them to restructure the drivers as native WDDM code from the ground up, and that is when Vista gaming benchmarks caught up to XP and started being faster.)

As for OpenGL, more work has been done on the mobile variants than on any serious technologies beyond DirectX.

OpenGL 3.x had almost completely given up on competing with DX10/11, and there was a time when many were going to let it die. Instead, people like Carmack saved it by adopting DX10/11 features directly rather than trying to advance the OpenGL framework model. OpenGL 4.x is essentially DirectX for other platforms; it is no longer an independent or self-advancing framework. 99% of the 4.x OpenGL features that catch up to DirectX 11 are directly copied from DX11. There is nothing DirectX 11.2 is lacking in new technologies or features compared to OpenGL. (This means that if Microsoft dumps future DirectX development, OpenGL also stagnates at the current generation of features.)

As for the tessellation argument: the Xbox 360 did tessellation long before DX11. These features were supposed to be included in DX10, but the NVidia 8xxx GPUs could NOT DO IT, so Microsoft pulled back the DX10 feature set to make nice with NVidia. The Xbox 360 uses a DX 'superset', meaning it is more like DX10 with some DX11 features. As Microsoft made clear when talking about DX11, it was the first PC framework to bring parity with the DX features already on the Xbox 360.

Shoving new techniques through GPUs that are not optimized to handle them is ALWAYS possible to some extent, but that doesn't mean it works well or is efficient.

This is especially true when dealing with OpenGL that has software/CPU rendering fallbacks that can do things, but VERY SLOWLY.

Even in the Windows 8 world, this is not encouraged, as it has software/CPU DirectX rendering, but that doesn't mean a developer should be shoving complex DX11 calls through a SVGA video card.

-----

If you want the answers to things you disagree with that I or Razor said, it is your responsibility to find the information. I know for a fact I am referencing things from a long list of sources, and I simply don't have the time to dig up where they all came from just because you don't think the sky is blue.

Edited by zhangm, Mar 3 2014, 5:51pm :

Hey guys, we appreciate (robust) discussions. Please try to keep it impersonal (don't go calling folks trolls). Thanks.

There is simply no reason whatsoever to spend any money supporting XP users at this point. It's foolish to expect or demand that anyone do so, and anyone wasting time doing that is... wasting time and money.

But yes, everything MS does should run on Win7 AND Win8 across the board. Vista? I can't imagine why anyone running Vista doesn't just update to Win7.

Maybe Microsoft Studios should make games for PC too. So many half-assed series given up on.

Halo 1+2 but no further
Gears of War half done for PC etc.
GFWL shut down.

They do make some. Have you seen Mark of the Ninja? It is an AWESOME game for PC, available on Steam. That said, I was happy GFWL died; it only gave me problems and no benefit at all.

McKay said,
Maybe Microsoft Studios should make games for PC too. So many half-assed series given up on.

Halo 1+2 but no further
Gears of War half done for PC etc.
GFWL shut down.

MechWarrior: they purchased the license, launched a single game (exclusive to Xbox), and nothing more in a decade. The same goes for Shadowrun and Crimson Skies (now owned by S&T).
Flight Simulator... well, there is a lot to say about it.

For MS, it is all about Halo and Gears.