Witcher 3 dev: No major power diff. Hidden XBO power?


Recommended Posts

10x less overhead than Direct3D, excellent multi-thread scaling (making an FX-8350 actually compete against Haswell i7s), and much larger batches (100K vs 20K per frame, IIRC). Oxide had a presentation with concrete numbers and an actual demo running at APU13. I can't seem to find it right now, but Google it; perhaps you'll have better luck.

 

Well, I mean results in terms of actual games. You can easily make synthetic benchmarks that will show performance differences; it's just a matter of throwing larger batches at the API. But do games really issue batches large enough that you'd actually see significant gains? I'm just not sure at this point and I'm waiting to see. I was hoping you had found results saying as much (a game demo would be interesting in that respect as well).
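
Just to make the batching claim concrete, the relationship is simple: if each draw call costs the CPU some fixed overhead, a call that's 10x cheaper means roughly 10x more batches in the same CPU budget. A toy calculation (the overhead numbers here are invented for illustration, not Oxide's or AMD's measurements):

# Toy model: how many draw calls fit in a fixed CPU submission budget.
# Overhead figures are made up for illustration only.
CPU_BUDGET_US = 10_000  # say, 10 ms of one core per frame spent issuing draws

def max_draws(per_draw_overhead_us):
    return int(CPU_BUDGET_US / per_draw_overhead_us)

print("thick API (0.50 us/draw):", max_draws(0.50), "draws per frame")   # 20,000
print("thin API  (0.05 us/draw):", max_draws(0.05), "draws per frame")   # 200,000

Whether real engines actually want that many batches per frame is exactly your question, of course.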

Link to comment
Share on other sites

And? They aren't running the same code. You're assuming the free CPU resources on both are equal.

I'm not following. Based on what data are you arguing that draw calls on the PS4 will be more expensive? Simply suggesting differences may exist does nothing to reinforce your point.

 

[Xbox One SoC block diagram image]
 
Audio processor: also on PS4.
Video decode/encode hardware: also on PS4.
ESRAM: this is just a workaround for the slow DDR3 memory; it would be silly on PS4
Compute engines: the PS4 has 8 vs the Xbox One's 2
Move engines: also a workaround for the slow DDR3 memory
 
So, again, what are you referring to? What's supposed to offset 33% fewer shaders on the Xbox One?
 
You can't possibly see those 28nm transistor gates on the die and determine their function from a block diagram.

Me, no, but the guys at Chipworks.com are actual experts.

 

The ESRAM also isn't purely for the GPU. There is a total of 47 MB of memory on the die, and there is ESRAM in the CPU block and the audio block. I'm pretty sure caches and fast memory are part of increasing the efficiency of the CPU as well.

The ESRAM is just a workaround for the fact that the Xbox One uses DDR3 memory. At best it offsets the performance penalty of using slower main memory, but it cannot compensate for 33% fewer shaders. Not suffering too much from slow main memory because you have a large fancy cache doesn't offset the fact that computationally expensive jobs are going to take longer on the Xbox One.

 

There is also hardware-accelerated compression/decompression

Also on PS4...

 

as well as compressed render target support.

If you're referring to the 6e4/7e3 formats mentioned here, I'm not sure that even has anything to do with hardware; developers can use whatever texture format they want and always have been able to. Both the PS4 and Xbox One GPUs are fully programmable. In any case, that looks like a trick to get more out of the small ESRAM for textures that don't need high precision; again, perhaps that's not even useful on the PS4.
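
For what it's worth, the ESRAM angle is easy to put numbers on. Here's a rough footprint calculation for a single 1080p render target (ignoring MSAA, padding and tiling; the 32-bit figure assumes those 6e4/7e3-style formats pack four channels into 32 bits per pixel, which is my reading, not something I can confirm):

# Rough render-target sizes; ignores MSAA, padding and tiling overhead.
ESRAM_MB = 32
W, H = 1920, 1080

def target_mb(bytes_per_pixel):
    return W * H * bytes_per_pixel / (1024 * 1024)

print(f"FP16 RGBA target (8 bytes/px)    : {target_mb(8):4.1f} MB")
print(f"Packed 32-bit target (4 bytes/px): {target_mb(4):4.1f} MB")
print(f"ESRAM available                  : {ESRAM_MB} MB")

So halving the per-pixel size roughly doubles how many full-resolution targets fit in the 32 MB, which is presumably the whole point of those formats.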

 
Oh yeah, there is also the advantage of having two different memories running at the same time. For example, one thread could be working on something in DDR3 memory while another thread has a chunk of ESRAM allocated and works on that, whereas with one unified pool of GDDR5, while one thing is using the memory, everything else is waiting.

That's just not how computers work. CPUs don't even access main memory directly; each core uses its own caches in parallel with the others. Nothing waits on memory unless the channels are saturated, and with DDR3 they're going to get saturated a lot faster on the Xbox One.

 

You're missing the point. Getting the data to the GPU is one thing; waiting until you can send the data over to the GPU is another. The CPU is also responsible for many things in the game engine: physics, AI, audio, the OS, etc. A busy CPU will hurt performance, Mantle or no Mantle. Freeing up these resources makes hardware usage more efficient. Jaguar cores are pretty weak as is.

Yeah, the Xbox One has DMA channels they call "Move Engines" to free up the CPU from waiting on data transfers; a very good reason for having those would be the slow RAM. On the PS4, data transfers are just much faster, so if the engineers decided not to add them it's probably because the system doesn't need them. So perhaps there's still a slight efficiency advantage there for the CPU on Xbox One, but again, that's nothing to seriously offset 50% more shaders on PS4.
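
On the Move Engines point, the benefit is really just classic copy/compute overlap, which you can sketch in a few lines (pure toy: the sleeps stand in for a transfer and for CPU work; this is not how the actual hardware is programmed):

import time
from concurrent.futures import ThreadPoolExecutor

COPY_S, COMPUTE_S, CHUNKS = 0.02, 0.03, 10

def copy_chunk(i):
    """Stand-in for a data transfer (what a move engine would do)."""
    time.sleep(COPY_S)
    return i

def compute(i):
    """Stand-in for useful CPU work on chunk i."""
    time.sleep(COMPUTE_S)

def cpu_does_both():
    start = time.time()
    for i in range(CHUNKS):
        copy_chunk(i)   # CPU is stuck doing the transfer itself...
        compute(i)      # ...before it can get to the real work
    return time.time() - start

def copy_engine_overlaps():
    start = time.time()
    with ThreadPoolExecutor(max_workers=1) as dma:
        pending = dma.submit(copy_chunk, 0)
        for i in range(CHUNKS):
            pending.result()                             # chunk i has arrived
            if i + 1 < CHUNKS:
                pending = dma.submit(copy_chunk, i + 1)  # next transfer runs in the background
            compute(i)                                   # overlaps with that transfer
    return time.time() - start

print(f"CPU copies and computes : {cpu_does_both():.2f} s")
print(f"transfers overlapped    : {copy_engine_overlaps():.2f} s")

Whether the PS4 needs dedicated engines for that, or just eats the copies with its extra bandwidth, is exactly the disagreement here.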

  • Like 3
Link to comment
Share on other sites

Well, I mean results in terms of actual games. You can easily make synthetic benchmarks that will show performance differences; it's just a matter of throwing larger batches at the API. But do games really issue batches large enough that you'd actually see significant gains? I'm just not sure at this point and I'm waiting to see. I was hoping you had found results saying as much (a game demo would be interesting in that respect as well).

If I find the demo I'll let you know. I guess we'll all have to wait for the Battlefield 4 patch due sometime in December for real numbers. I'm quite eager and optimistic based on everything said so far.

  • Like 1
Link to comment
Share on other sites

So perhaps there's still a slight efficiency advantage there for the CPU on Xbox One, but again, that's nothing to seriously offset 50% more shaders on PS4.

I know you guys are deep into a very technical discussion, but I'm a little confused here.

Does the ps4 have 33% or 50% more shaders? You mention the X1 has 33% fewer shaders in your reply, but then here say the ps4 has 50% more. Which is it?

So how about I try to sum this up for both of you:

The PS4 has a GPU with a raw performance edge thanks to 33/50% more shaders. The two systems basically tie when it comes to the RAM question (they go about it in two different ways, and I don't think anyone really knows the small differences that might result right now). The X1 has a faster CPU thanks to MS's overclock, and possibly some advantage due to the added ESRAM cache built into the chip (I'm not talking about the ESRAM that is in use as part of the RAM system).

Both consoles offer dedicated hardware for things like audio, video decoding/scaling, networking, etc., but we don't know enough about either of them to know if one has an advantage in those areas. All we know right now is that MS custom-designed the chips covering most of those things, and Sony has opted to use at least some standard chips from the likes of AMD (we know less about Sony's hardware choices, so there is no complete picture).

We may know more specifics about the X1 hardware, but even it still has parts that are unknown as far as what difference they will make. The PS4 has less info out there on the specific bits, but the stuff we do know seems to point to an effort to avoid custom bits as much as possible.

So where does that leave us in the grand scheme of things? Hard to say at this point. It's too early to be making any definitive points. For most of us, it's easier to just point to that GPU figure and call it a day, but when it comes to real-world results (i.e. games), it becomes more of a grey area. I want to see what that shader difference means for PS4 and X1 games and how much of that 33/50 figure can be mitigated. I want to know what minimum performance can be expected from either console. Launch titles don't tell us much. You have PS4 and X1 games at 1080p or below, and you have some X1 games at 720p. It's a wait-and-see moment.

Link to comment
Share on other sites

I know you guys are deep into a very technical discussion, but I'm a little confused here.

Does the ps4 have 33% or 50% more shaders? You mention the X1 has 33% fewer shaders in your reply, but then here say the ps4 has 50% more. Which is it?

Well, it's both, really. 18 is 50% greater than 12 (12 + 50% * 12), and 12 is 33% less than 18 (18 - 33% * 18); it's just basic arithmetic. :)
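
Spelled out with the actual CU counts, in case it helps (a trivial check, nothing more):

ps4_cus, xbo_cus = 18, 12
print(f"PS4 has {(ps4_cus / xbo_cus - 1) * 100:.0f}% more CUs than the Xbox One")   # 50%
print(f"Xbox One has {(1 - xbo_cus / ps4_cus) * 100:.1f}% fewer CUs than the PS4")  # 33.3%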

Link to comment
Share on other sites

So where does that leave us in the grand scheme of things? Hard to say at this point. It's too early to be making any definitive points. For most of us, it's easier to just point to that GPU figure and call it a day, but when it comes to real-world results (i.e. games), it becomes more of a grey area. I want to see what that shader difference means for PS4 and X1 games and how much of that 33/50 figure can be mitigated. I want to know what minimum performance can be expected from either console. Launch titles don't tell us much. You have PS4 and X1 games at 1080p or below, and you have some X1 games at 720p. It's a wait-and-see moment.

IGN has a great comparison page: http://ca.ign.com/wikis/xbox-one/PS4_vs._Xbox_One_Native_Resolutions_and_Framerates

 

So far the two AAA shooters and Assassin's Creed run at higher res on PS4; all PS4 games are 1080p except BF4 which is 900p. Meanwhile the other cross-platform titles are 1080p on both consoles, but most Xbox One exclusives run under that, Forza 5 being the exception.

 

So while it's not a large sample, it does look like it's easier for the PS4 to drive higher resolutions, and the extra shaders would be a natural explanation of that.
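
For context, the raw pixel counts behind those resolutions (simple arithmetic, nothing console-specific):

full_hd = 1920 * 1080
for name, (w, h) in {"900p": (1600, 900), "720p": (1280, 720)}.items():
    px = w * h
    print(f"1080p pushes {full_hd / px:.2f}x the pixels of {name} ({full_hd:,} vs {px:,})")

So going from 720p to 1080p more than doubles the pixel work per frame.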

Link to comment
Share on other sites

I'm not following. Based on what data are you arguing that draw calls on the PS4 will be more expensive? Simply suggesting differences may exist does nothing to reinforce your point.

 

[Xbox One SoC block diagram image]

 

Audio processor: also on PS4.

Video decode/encode hardware: also on PS4.

ESRAM: this is just a workaround for the slow DDR3 memory; it would be silly on PS4

Compute engines: the PS4 has 8 vs the Xbox One's 2

Move engines: also a workaround for the slow DDR3 memory

 

So, again, what are you referring to? What's supposed to offset 33% fewer shaders on the Xbox One?

 

Me, no, but the guys at Chipworks.com are actual experts.

 

The ESRAM is just a workaround for the fact that the Xbox One uses DDR3 memory. At best it offsets the performance penalty of using slower main memory, but it cannot compensate for 33% fewer shaders. Not suffering too much from slow main memory because you have a large fancy cache doesn't offset the fact that computationally expensive jobs are going to take longer on the Xbox One.

 

Also on PS4...

 

If you're referring to the 6e4/7e3 formats mentioned here, I'm not sure that even has anything to do with hardware; developers can use whatever texture format they want and always have been able to. Both the PS4 and Xbox One GPUs are fully programmable. In any case, that looks like a trick to get more out of the small ESRAM for textures that don't need high precision; again, perhaps that's not even useful on the PS4.

 

That's just not how computers work. CPUs don't even access main memory directly; each core uses its own caches in parallel with the others. Nothing waits on memory unless the channels are saturated, and with DDR3 they're going to get saturated a lot faster on the Xbox One.

 

 

Yeah, the Xbox One has DMA channels they call "Move Engines" to free up the CPU from waiting on data transfers; a very good reason for having those would be the slow RAM. On the PS4, data transfers are just much faster, so if the engineers decided not to add them it's probably because the system doesn't need them. So perhaps there's still a slight efficiency advantage there for the CPU on Xbox One, but again, that's nothing to seriously offset 50% more shaders on PS4.

1. The XBO audio chip has dedicated hardware for audio processing, with as much power as one full CPU core. In your link, Cerny talks about using the GPU for audio processing; and if you're not using the GPU, you'll be using the CPU.

2. The CPU block has a few-megabyte chunk of ESRAM, and the audio block works with ESRAM too. This is a pretty big deal. Ever look at benchmarks of chips with more cache? Yeah, I thought so.

3. The CPU is upclocked.

4. The move engines compress/decompress data on the fly. Bandwidth saver.

5. The XBO has 200 GB/s ESRAM + 68 GB/s DDR3 RAM that can be used simultaneously, for a total theoretical throughput of 268 GB/s, much more than 176 GB/s (the arithmetic is worked out at the end of this post). Fact is, you can have a thread doing work on the DDR3 while the GPU is working on the ESRAM. No blocking, no stalling, no waiting; two completely different memories.

6. Compressed render targets again maximize bandwidth.

And I leave you with a quote from master Cerny, from the article you posted yourself and probably didn't read, which discredits the suggestion that the GPU was built with 18 CUs for better graphics. Even the Sony docs acknowledge that the system is balanced for 14 CUs. Why keep denying this?

The point is the hardware is intentionally not 100 per cent round. It has a little bit more ALU in it than it would if you were thinking strictly about graphics. As a result of that you have an opportunity, you could say an incentivisation, to use that ALU for GPGPU.

The Xbox GPU clock increase makes the 12 CUs perform better than 14 CUs at the original 800 MHz clock, according to Microsoft engineers.
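
For reference, the raw arithmetic behind point 5 and the clock claim (each GCN CU has 64 lanes doing a fused multiply-add, i.e. 2 FLOPs, per cycle; the bandwidth numbers are simply the ones quoted above, taken at face value):

# Peak-number arithmetic only; says nothing about real-world utilisation.
def gcn_tflops(cus, mhz):
    # 64 lanes per CU, 2 FLOPs (one FMA) per lane per cycle
    return cus * 64 * 2 * mhz * 1e6 / 1e12

print(f"Xbox One, 12 CU @ 853 MHz   : {gcn_tflops(12, 853):.2f} TFLOPS")
print(f"Hypothetical 14 CU @ 800 MHz: {gcn_tflops(14, 800):.2f} TFLOPS")
print(f"PS4, 18 CU @ 800 MHz        : {gcn_tflops(18, 800):.2f} TFLOPS")

# Bandwidth figures as quoted in point 5 (GB/s, theoretical peaks)
print(f"XBO ESRAM + DDR3 combined   : {200 + 68} GB/s")
print(f"PS4 GDDR5                   : 176 GB/s")

On ALU throughput alone the upclock doesn't quite close the gap to 14 CUs at 800 MHz, so the Microsoft engineers' claim presumably leans on the other parts of the pipeline that the clock bump also speeds up.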

Link to comment
Share on other sites

For all the talk about how the XB1/PS4 are virtually identical/very similar, what strikes me is that in many ways they obviously are not. Sure, they may both be Jaguar- and Radeon-based systems, but the widely divergent integration of the components is, to me, an obvious reason for the troubles devs are having with the XB1.

 

The common refrain is that the power of the XB1/PS4 is very close, and I believe the devs, so I wouldn't be at all surprised if a fancy new console with a more complex integration of reasonably similar components initially produces rather inferior results. I'll be interested to see the progression of game graphics over this coming generation...

Link to comment
Share on other sites

But Ryse happens to be the best looking so far, for now, on the next-gen consoles. At only 900p/30fps. Yep, even better looking than those 1080p games too.

I'm not as technical as some of y'all, so how is it possible?

Link to comment
Share on other sites

IGN has a great comparison page: http://ca.ign.com/wikis/xbox-one/PS4_vs._Xbox_One_Native_Resolutions_and_Framerates

 

So far the two AAA shooters and Assassin's Creed run at higher res on PS4; all PS4 games are 1080p except BF4 which is 900p. Meanwhile the other cross-platform titles are 1080p on both consoles, but most Xbox One exclusives run under that, Forza 5 being the exception.

 

So while it's not a large sample, it does look like it's easier for the PS4 to drive higher resolutions, and the extra shaders would be a natural explanation of that.

So do you think it's possible for X1 games to be on par as far as hitting that magic 1080p mark once developers have had more time to work with the system, even with the shader deficit? It's clear the X1 can hit that number at launch, but developers either ran out of time or chose not to make the effort to do that for every game.

I mean, heck, both COD and Assassin's Creed required day-one patches for the PS4 to hit 1080p. There are a lot of signs pointing to rushed launches.

Link to comment
Share on other sites

But Ryse happens to be the best looking so far, for now, on the next-gen consoles. At only 900p/30fps. Yep, even better looking than those 1080p games too.

I'm not as technical as some of y'all, so how is it possible?

Because it isn't a poorly optimized, rushed multiplatform title that runs on six different platforms. Ryse also has extremely talented devs. I believe the guys making Killzone asked the Ryse devs to open the case the Xbox One was in to make sure it was actually running on real Xbox hardware. Games looking like this very early in the cycle is very encouraging for what the future holds as more and more things become more efficient.

 

I mean, heck, both COD and Assassin's Creed required day-one patches for the PS4 to hit 1080p. There are a lot of signs pointing to rushed launches.

Fact: COD Ghosts ran worse on PS4 at 720p than the Xbox One version. I guess these games failed certification from Sony, thus requiring extra work and patches to get them to 1080p.

Link to comment
Share on other sites

So far there is no proof that the PS4 is more powerful, despite its SDK having been available much earlier and the XO SDK still missing several features. So far we're only seeing the consoles choosing different paths: stable FPS at low res vs. FPS drops at high res.

  • Like 2
Link to comment
Share on other sites

So far there is no proof that the PS4 is more powerful, despite its SDK having been available much earlier and the XO SDK still missing several features. So far we're only seeing the consoles choosing different paths: stable FPS at low res vs. FPS drops at high res.

 

There's plenty of proof when you look into the papers but for some reason you don't want to do that.

 

I have a little problem explaining it all in English but hopefully you'll understand what I mean.

 

First there's RAM. Let's leave the ESRAM aside for the moment, since that's already causing problems for devs, and let's just compare GDDR vs DDR and their use with the APU. As in a traditional PC, the more and faster memory you have for an APU, the better use you will get out of it and its on-die graphics. Jaguar is an SoC but still works on the same principle as a regular APU. Now, as we all know, GDDR5 memory has a lot more bandwidth than DDR3, so that's one plus for the PS4; GDDR also has faster access to the GPU, another plus for the PS4.

 

(The ESRAM problems I mentioned? Graphics caching issues due to moving data in and out of the ESRAM.)

 

Now to the GPU part. Do I really need to go over that?

Link to comment
Share on other sites

You guys are going around in circles, both sides of the argument. When you have developers actually working with the hardware of both systems come out and say they're close, what more do you want? Does the PS4 have a better GPU? Yes. Does it make as much of a difference as some think? No.

 

Why not look at the games? Both systems have games that run at 1080p, and both have games that don't. Others started off sub-1080p and were patched up later, which is a good indication that the developers didn't have enough time, or worked on non-final devkits, and couldn't get the best out of their game. This whole bickering is going nowhere. You can argue percentages till you're blue in the face, but when both systems have games running at 1080p, will it matter? If every multiplayer game going forward ends up with the same res and frame rate, then what exactly is the argument going to be? If the Witcher 3 developers come out and say that they expect MS to tweak things (the API, the drivers and the SDK itself) over time and allow for better-optimized games, how exactly is this wrong? You've all veered off topic yet again.

 

I don't see why it's hard to grasp: aside from both systems using x86 CPUs, they have other differences. Why are some singling out one part of the systems, and only that part, as if the others don't come into play? Any PC gamer will tell you that it's not just the video card that's important. If you want to push your card for all it's got you need a good CPU, and faster system memory helps as well. It's all one system, and to that effect you can't just call out the CU difference between the GPUs, which seems to be less than what people thought originally (the PS4 sounds like it's using some of those 18 CUs for other tasks and not all of them for graphics). MS has done something different: the ESRAM is a key part of how the system works, and so are the move engines and so on. Once developers work with it and can get more out of it, I see zero reason why any future games can't be at full HD.

 

Once this happens, maybe we can move on to something else? Somehow I doubt it, though; I'm sure some of you will keep comparing games running on both, zooming in 400%+ to count pixels and so on.

Link to comment
Share on other sites

AC was also patched after release at a special request from Sony, and most likely with extra under-the-table money for the extra work.

 

That kind of nonsense just screams insecurity. Paid off patches? Seriously...  :laugh:

Link to comment
Share on other sites

That kind of nonsense just screams insecurity. Paid off patches? Seriously...  :laugh:

just saying

 

Ubisoft and Sony announced a partnership today that will see exclusive content hit the PS3 and PS4 versions of two upcoming titles: Assassin's Creed IV Black Flag and Watch Dogs. Senior Vice President of Sales and Marketing at Ubisoft, Tony Key, was quite happy about giving extra value to consumers who opt for Sony's machines over Microsoft's, saying: "Expanding our partnership with Sony Computer Entertainment allows us to offer PlayStation gamers even more substance to games that are already rich with content. Regardless of which PlayStation console Assassin's Creed IV Black Flag and Watch Dogs are played on, PlayStation gamers will be treated to exclusives that will add to their gameplay experiences."

http://www.thatvideogameblog.com/2013/06/11/ubisoft-to-deliver-exclusive-content-to-sonys-consoles-with-watch-dogs-and-assassins-creed-iv/

Link to comment
Share on other sites

But Ryse happens to be the best looking so far, for now, on the next-gen consoles. At only 900p/30fps. Yep, even better looking than those 1080p games too.

I'm not as technical as some of y'all, so how is it possible?

 

 

That's because most people here seem concerned with how big the image is, not the quality of the image. If you stretched Super Mario Bros. from the NES to 1080p, and on the other system had Crysis 3 running at the PC's Ultra settings but in 720p, I'm convinced these people would argue that 1080p SMB is superior. (Not arguing gameplay here, just image size.)

 

The Xbox 360 and PS3 both had titles running in native 1080p; it isn't a hard thing to aim for. They've obviously made the deliberate decision to hit 720p or 900p rather than sacrifice image quality.

Link to comment
Share on other sites

 

Yeah exclusive content, not frickin patches. That's really going down to the drivel at the bottom of the fanboy barrel if you want to start saying companies are being paid off to release/not release patches.

Link to comment
Share on other sites

 

In fairness, BF4 is getting its first DLC on Xbox One months before it hits other systems, Watch Dogs is getting exclusive DLC for Xbox One too, and Call of Duty: Ghosts will have timed-exclusive content for the Xbox One.

 

Microsoft and Sony have always paid off Developers for special stuff for their systems. 

Link to comment
Share on other sites

Yeah exclusive content, not frickin patches. That's really going down to the drivel at the bottom of the fanboy barrel if you want to start saying companies are being paid off to release/not release patches.

+1

Companies are very tight on time during development of a launch title. Like Ubisoft said, they switched AC4 to 1080p during last-minute optimisations and then released a patch, just like the X1 is getting a performance patch for BF4 next week.

I love how unfinished games are becoming the norm these days.

Link to comment
Share on other sites

 

And what's wrong with that? Both Microsoft and Sony make these types of deals all the time. If Microsoft doesn't want to show an interest in AC4 because they're too busy with CoD/BF4 deals, then why shouldn't Sony take advantage of it?

Link to comment
Share on other sites

Yeah exclusive content, not frickin patches. That's really going down to the drivel at the bottom of the fanboy barrel if you want to start saying companies are being paid off to release/not release patches.

Just pointing out that there is a partnership on these titles and money has exchanged hands, not specifically Ubisoft getting paid for a patch. Perhaps Ubisoft is being generous because of this. I mean, Ubisoft even decided to write a self-congratulatory blog post about how their team kept working on the PS4 version specifically between ship and launch to raise the res.

 

In fairness, BF4 is getting its first DLC on Xbox One months before it hits other systems, Watch Dogs is getting exclusive DLC for Xbox One too, and Call of Duty: Ghosts will have timed-exclusive content for the Xbox One.

Microsoft and Sony have always paid off Developers for special stuff for their systems.

And what's wrong with that? Both Microsoft and Sony make these types of deals all the time. If Microsoft doesn't want to show an interest in AC4 because they're too busy with CoD/BF4 deals, then why shouldn't Sony take advantage of it?

Nothing wrong with it; I never said there was. It's completely fair, and yes, it does happen all the time. All I'm saying is that the game received extra development time to be optimized and bring up the res compared to the other version, and this could be a reason for the discrepancy in resolution.

Link to comment
Share on other sites

This topic is now closed to further replies.