AMD CES 2014 Battlefield 4 running on Mantle API...

Some leaked performance claims for Mantle and Kaveri processors, probably being readied for the launch or CES.

 

http://wccftech.com/amd-kaveri-apu-a107850k-gaming-general-performance-unveiled-mantle-45-faster-directx/

More than 3 times the speed of DirectX  :| I was expecting good results for Kaveri, but not of that order of magnitude... If this actually materializes it could be excellent news for mobile gaming. Getting decent framerates at decent settings in the latest AAA shooters without a discrete card is something we couldn't even dream of a few years ago.


Maybe it's Microsoft's lack of interest in the PC gaming market; its eyes are set on the Xbox One these days, which is competing hardware. DirectX probably needs quite a team to keep it up to date, and that would mean a lot of cooperation between Microsoft, Nvidia and AMD, teams that may not always wish to share hardware information.


Maybe it's Microsoft's lack of interest in the PC gaming market

Given how little developer interest there was in DX10/11 for a long time, I can't really blame them.  I know the reasons, but it still confuses the crap out of me.

 

Now devs are finally interested in the high end again, so there's plenty of reason to move forward.  I suspect MS will have a proper response within a few years, but it might well be too late for it to matter.


Figured I would post this here since it's along the same lines: the slides from nVidia's OpenGL talk at Steam Dev Days.

 

http://de.slideshare.net/CassEveritt/beyond-porting

 

Of note, yet more evidence for the pile that OpenGL is indeed faster than DirectX. And the MSFT fans said that Valve was lying. :rolleyes:

I believe a large part of it is talking about future extensions, not current.  And Valve has yet to show evidence that they've even experimented with DX11.

 

I wouldn't say Valve lied, just was very selective about what they showed.

 

Considering how behind the times their games are, go figure.


Note that a large part of it is talking about future extensions, not current.  And Valve has yet to show evidence that they've even experimented with DX11.

 

I wouldn't say Valve lied, just was very selective about what they showed.

 

Considering how behind the times their games are, go figure.

 

The extensions listed are part of OpenGL 4.4, which is current (at least for Nvidia; AMD is approaching full support).

 

There is also evidence Valve has experimented with newer DirectX versions, such as the existence of shaderapidx10.dll and mat_dxlevel 100 in dxsupport.cfg in certain games.

 

As the slides indicate however, D3D11 is still slower.


The extensions listed are part of OpenGL 4.4, which is current (at least for Nvidia; AMD is approaching full support).

 

There is also evidence Valve has experimented with newer DirectX versions, such as the existence of shaderapidx10.dll and mat_dxlevel 100 in dxsupport.cfg in certain games.

I did reword that before you posted again heh :)  I wasn't actually sure.

 

As for OGL 4.4, it is much newer than DX11.  It took three years for OGL to reach feature parity with DX11 IIRC.  They are currently ahead from what I know, but there's also very little in the way of DX11.1/11.2 or OpenGL 4+ games.

 

My point about Valve still stands.  It is more than a little suspicious that they chose to benchmark their OpenGL code against their DX9 code and yell LOOK HOW AMAZING THIS IS.


I did reword that before you posted again heh :)  I wasn't actually sure.

 

As for OGL 4.4, it is much newer than DX11.  It took three years for OGL to reach feature parity with DX11 IIRC.  They are currently ahead from what I know, but there's also very little in the way of DX11.1/11.2 or OpenGL 4+ games.

 

My point about Valve still stands.  It is more than a little suspicious that they chose to benchmark their OpenGL code against their DX9 code and yell LOOK HOW AMAZING THIS IS.

 

OpenGL 4.x has "feature parity" with D3D11, 3.x with 10, 2.x with 9. In this case, 4.4 is better compared to 11.2. (Both added implementations of PRT for instance)

 

It seems you're trying to imply Valve were being disingenuous by comparing an OpenGL 4 codepath vs D3D9; however, that was not the case. Given they were using L4D2 as the testbed, the likely comparison was D3D9 vs D3D9 on top of a GL 2.x layer, since Source 1 does not natively support OpenGL, unlike Source 2, which is native OpenGL.

 

That, and the notion of "fixing" the comparison doesn't help anyone - if anything it hurts Valve more than MSFT.


OpenGL 4.x has "feature parity" with D3D11, 3.x with 10, 2.x with 9. In this case, 4.4 is better compared to 11.2. (Both added implementations of PRT for instance)

 

It seems you're trying to imply Valve were being disingenuous by comparing an OpenGL 4 codepath vs D3D9; however, that was not the case. Given they were using L4D2 as the testbed, the likely comparison was D3D9 vs D3D9 on top of a GL 2.x layer, since Source 1 does not natively support OpenGL, unlike Source 2, which is native OpenGL.

 

That, and the notion of "fixing" the comparison doesn't help anyone - if anything it hurts Valve more than MSFT.

OpenGL 4.3 was the first to offer compute shaders, which were a feature in DX11 that extended to DX10 cards.

 

Really?  Making Windows look bad doesn't help SteamOS?  I'm not sure I'll ever believe it wasn't intentional.  Whatever works.


OpenGL 4.3 was the first to offer compute shaders, which were a feature in DX11 that extended to DX10 cards.

 

Really?  Making Windows look bad doesn't help SteamOS?  I'm not sure I'll ever believe it wasn't intentional.  Whatever works.

 

Linux doesn't factor in, I'm talking purely on Windows.


I hope Mantle is a success and is as good as they claim it to be, but to be honest, with all the talk of whether Mantle will be any good or not, we've only got to wait till it arrives in BF4 and look for ourselves. I know what framerates I get, so I'll be able to tell instantly if there's a boost, and roughly how big a boost it is. I'm getting Thief for free from AMD's game bundle, although that comes with Mantle out of the box, so I won't be able to tell unless there's an option to switch between DirectX and Mantle and gauge whether there's more performance or not.

 

Nvidia does too much proprietary stuff and doesn't give anything away for free... I think some OpenGL extensions are proprietary to Nvidia, so nobody could use them without a license fee, maybe?

 

With DX 11.2's PRT stuff, I think it could be used to generate highly detailed, procedurally generated game environments, like in a space game where you fly through the atmosphere, come down to ground level and fly around a planet. I think that would be good, but I can't see something like Call of Duty or Battlefield needing it.


Nvidia does too much proprietary stuff and doesn't give anything away for free... I think some OpenGL extensions are proprietary to Nvidia, so nobody could use them without a license fee, maybe?

 

It would defeat the point of OpenGL if swaths of the spec were proprietary.

 

There are vendor extensions, but those essentially exist so they can be later ratified by the OpenGL ARB.

 

OpenGL 4.4's ARB_sparse_texture extension is an example of this; it's more or less a copy of AMD_sparse_texture with some improvements/adjustments.


I've benchmarked my DX11 performance:

 

===========================================================
Oxide Games
Star Swarm Benchmark - ©2013
C:\Users\amell_000\Documents\Star Swarm\Benchmark_14_01_30_2210.txt
Version 0.95
01/30/2014 22:10
===========================================================
 
== Hardware Configuration =================================
GPU: AMD Radeon R9 200 Series
CPU: GenuineIntel
Intel® Core i7 CPU         950  @ 3.07GHz
Physical Cores: 4
Logical Cores: 8
Physical Memory: 12875575296
Allocatable Memory: 140737488224256
===========================================================
 
 
== Configuration ==========================================
API: DirectX
Scenario: ScenarioAttract.csv
User Input: Disabled
Resolution: 1920x1080
Fullscreen: True
GameCore Update: 16.6 ms
Bloom Quality: High
PointLight Quality: High
ToneCurve Quality: High
Glare Overdraw: 16
Shading Samples: 64
Shade Quality: Mid
Motion Blur Frame Time: 16
Motion Blur InterFrame Time: 2
Detailed Frame Info: Off
===========================================================
 
 
== Results ================================================
Test Duration: 120 Seconds
Total Frames: 2248
 
Average FPS: 18.73
Average Unit Count: 3624
Maximum Unit Count: 5314
Average Batches/MS: 552.50
Maximum Batches/MS: 913.88
Average Batch Count: 33284
Maximum Batch Count: 131851
===========================================================
 
Using an MSI 280X Gaming edition (pretty much bog standard). I'll have to try again when I get my Sapphire Toxic back, but at least I've got a baseline to see if Mantle improves it :D
