Microsoft's Penello: No way is Xbox One giving up 30% power advantage t



Originally Posted by Albert Penello

 

I see my statements the other day caused more of a stir than I had intended. I saw threads locking down as fast as they popped up, so I apologize for the delayed response.

I was hoping my comments would lead the discussion to be more about the games (and the fact that games on both systems look great) as a sign of my point about performance, but unfortunately I saw more discussion of my credibility.

So I thought I would add more detail to what I said the other day, so that people can debate the individual merits instead of making personal attacks. This should hopefully dispel the notion that I'm simply creating FUD or spin.

I do want to be super clear: I'm not disparaging Sony. I'm not trying to diminish them, or their launch or what they have said. But I do need to draw comparisons, since I am trying to explain that the way people are calculating the differences between the two machines isn't completely accurate. I think I've been upfront that I have nothing but respect for those guys, but I'm not a fan of the misinformation about our performance.

So, here are a couple of points about some of the individual parts for people to consider:

• 18 CU's vs. 12 CU's =/= 50% more performance. Multi-core processors have inherent inefficiency with more CU's, so it's simply incorrect to say 50% more GPU.
• Adding to that, each of our CU's is running 6% faster. It's not simply a 6% clock speed increase overall.
• We have more memory bandwidth. 176gb/sec is peak on paper for GDDR5. Our peak on paper is 272gb/sec. (68gb/sec DDR3 + 204gb/sec on ESRAM). ESRAM can do read/write cycles simultaneously, so I see this number misquoted.
• We have at least 10% more CPU. Not only a faster processor, but a better audio chip also offloading CPU cycles.
• We understand GPGPU and its importance very well. Microsoft invented DirectCompute, and we have been using GPGPU in a shipping product since 2010 - it's called Kinect.
• Speaking of GPGPU - we have 3X the coherent bandwidth for GPGPU at 30gb/sec, which significantly improves the CPU's ability to efficiently read data generated by the GPU.

Hopefully with some of those more specific points people will understand where we have reduced bottlenecks in the system. I'm sure this will get debated endlessly but at least you can see I'm backing up my points.
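For what it's worth, the bandwidth arithmetic being argued over in this thread is easy to check. A minimal sketch (Python), taking the "peak on paper" numbers quoted in the post at face value; these are theoretical maxima, not measured throughput:

```python
# Peak-on-paper bandwidth figures as quoted in the post (GB/s).
DDR3_BW = 68.0    # Xbox One DDR3 main memory
ESRAM_BW = 204.0  # Xbox One ESRAM (read + write counted together)
GDDR5_BW = 176.0  # PS4 GDDR5 unified memory

# The disputed "272 GB/s" is simply the sum of the two pools...
combined = DDR3_BW + ESRAM_BW

# ...and the "55% more memory bandwidth" summary figure follows from it.
advantage = combined / GDDR5_BW - 1.0

print(f"combined: {combined:.0f} GB/s")    # combined: 272 GB/s
print(f"advantage: {advantage:.0%}")       # advantage: 55%
```

Whether the two pools should be summed at all is exactly what the rest of the thread disputes: the ESRAM is only 32 MB, so the headline sum says nothing about how often both buses are actually saturated at once.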

I still believe that we get little credit for the fact that, as a software company, the people designing our system are some of the smartest graphics engineers around - they understand how to architect and balance a system for graphics performance. Each company has its strengths, and I feel that our strength is overlooked when evaluating both boxes.

Given this continued belief in a significant gap, we're working with our most senior graphics and silicon engineers to get into more depth on this topic. They will be more credible than I am, and can talk in detail about some of the benchmarking we've done and how we balanced our system.

Thanks again for letting me participate. Hope this gives people more background on my claims.

 

 


At Microsoft, we have a position called a "Technical Fellow." These are engineers across disciplines at Microsoft who are basically at the highest stage of technical knowledge. There are very few across the company, so it's a rare and respected position.

We are lucky to have a small handful working on Xbox.

I've spent several hours over the last few weeks with the Technical Fellow working on our graphics engines. He was also one of the guys that worked most closely with the silicon team developing the actual architecture of our machine, and knows how and why it works better than anyone.

So while I appreciate the technical acumen of folks on this board - you should know that every single thing I posted, I reviewed with him for accuracy. I wanted to make sure I was stating things factually, and accurately.

So if you're saying you can't add bandwidth - you can. If you want to dispute that ESRAM has simultaneous read/write cycles - it does.

I know this forum demands accuracy, which is why I fact checked my points with a guy who helped design the machine.

This is the same guy, by the way, that jumps on a plane when developers want more detail and hands-on review of code and how to extract the maximum performance from our box. He has heard first-hand from developers exactly how our boxes compare, which has only proven our belief that they are nearly the same in real-world situations. If he wasn't coming back smiling, I certainly wouldn't be so bullish dismissing these claims.

I'm going to take his word (we just spoke this AM, so his data is about as fresh as possible) over statements by developers speaking anonymously, potentially from several months ago, before we had stable drivers and development environments.

 

To summarize

 

55% more memory bandwidth

3x the coherent gpgpu bandwidth

6.6% faster gpu clock speed

10% faster cpu clock speed

1 cpu core equivalent audio chip

15 special purpose processors to offload cpu and gpu and remove bottlenecks

 

more stuff we still don't know


I think it's best that we wait for the next-gen consoles to be released before making claims that one has better real-world performance than the other.


This is how confused people are: they claim the PS4 has 1156 stream processors or something and 18 compute units, as if those were separate advantages, but don't they realise they are pretty much one and the same? One compute unit has 64 stream processors, so for the Xbox it's 12 (compute units) x 64 (stream processors) = 768, and the PS4's 18 compute units = 1152. Basic maths. Either way, as standalone cards they are both undeniably S H I T compared to a PC; the only advantage is the architecture they employ to bring it together. Both have strengths and weaknesses, but don't say these consoles come anywhere near what a PC can do, especially with AMD's next 20nm process graphics - it'll blow the consoles out of the water without even trying, and we're talking months, not years!
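The compute-unit arithmetic in the post above does check out. A quick sketch (Python) under the standard GCN assumptions - 64 stream processors per CU, 2 FLOPs per stream processor per clock (multiply-add) - and the clocks commonly reported in this thread, 853 MHz for the Xbox One after its upclock and 800 MHz for the PS4:

```python
def stream_processors(cus: int) -> int:
    """GCN compute units each contain 64 stream processors."""
    return cus * 64

def peak_tflops(cus: int, clock_mhz: float) -> float:
    """Peak single-precision TFLOPS: SPs x 2 FLOPs/clock x clock rate."""
    return stream_processors(cus) * 2 * clock_mhz * 1e6 / 1e12

print(stream_processors(12))           # 768  (Xbox One)
print(stream_processors(18))           # 1152 (PS4)
print(round(peak_tflops(12, 853), 2))  # 1.31
print(round(peak_tflops(18, 800), 2))  # 1.84
```

Those are the 1.3 vs 1.8 TFLOPS figures the thread keeps circling; they are peak theoretical numbers and say nothing about how efficiently either machine keeps its CUs fed.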


dont even say these console come anywhere near as close as a pc can do

What a current PC is technically capable of doing and what the games will push it to do are very different.

 

The power advantages are cut short by legacy issues (such as DirectX 9), and things like the Xbox 360/Xbox One audio processor have no PC counterpart.

 

Not to mention most games on PC are still programmed to support dual core machines from 7-8 years ago, so they certainly aren't going to be using a high end current system to its full effect most of the time.

 

Now even assuming all that wasn't a problem, there's still plenty of doubt as to how much the PC graphics libraries are even capable of - http://www.bit-tech.net/hardware/graphics/2011/03/16/farewell-to-directx/2 (this refers to the old consoles being vastly more capable than PC at drawing unique objects, not the new consoles)

 

I expect the PC gaming environment to improve drastically in the next few years, but right now it's a bit of a mess.


No add'l dGPU. Working on more tech deep-dives. XBO is plenty capable now and in the future. Perf. differences are greatly overstated.

 

Excellent. Now we can all start further (conspiracy?) theories, pointing out how he said no additional dGPU, with the implication that there's only one dGPU, not two, for a grand total of one AMD-based CPU/APU and only one dGPU. Also, notice he didn't explicitly rule out further graphics processing cores, as opposed to a whole GPU  :D  :rofl:  :D

 

(Man - this is fun. I should start more, and subscribe to more conspiracy theories.. Hum, where's a good place to start? :D )


Originally Posted by Albert Penello

To summarize

 

55% more memory bandwidth

3x the coherent gpgpu bandwidth

6.6% faster gpu clock speed

10% faster cpu clock speed

1 cpu core equivalent audio chip

15 special purpose processors to offload cpu and gpu and remove bottlenecks

 

more stuff we still dont know

 

Sigh. You are bringing logic into this discussion. PS4 fanbois would not appreciate it.


Originally Posted by Albert Penello

To summarize

 

55% more memory bandwidth

3x the coherent gpgpu bandwidth

6.6% faster gpu clock speed

10% faster cpu clock speed

1 cpu core equivalent audio chip

15 special purpose processors to offload cpu and gpu and remove bottlenecks

 

more stuff we still dont know

Thanks for sharing this. I don't know much about how all this hardware works, but this looks good, and from my standpoint it looks like it closes the "50% power gap".

Shame to see him get torn to shreds on NeoGaf though.


Thanks for sharing this, I don't know much about how all this hardware works but this looks good and from my standpoint this looks like it closes the "50% power gap"

Shame to see him get torn to shreds on NeoGaf though.

Probably because knowledgeable people know better

http://m.neogaf.com/showthread.php?t=673713&page=5

Good reading starts on that page.

By that method of adding bandwidth, the Xbox 360 ends up with more than the Xbox One: 278.4 GB/s, per http://majornelson.com/2005/05/20/xbox-360-vs-ps3-part-1-of-4/.


Probably because knowledgeable people know better

http://m.neogaf.com/showthread.php?t=673713&page=5

Good reading starts on that page.

By that method of adding bandwidth, the Xbox 360 ends up with more than the Xbox One: 278.4 GB/s, per http://majornelson.com/2005/05/20/xbox-360-vs-ps3-part-1-of-4/.

So ###### the guy who designed the system, believe other random people instead? Only a Sony fan would say that.


So **** the guy who designed the system, believe other random people instead? Only a Sony fan would say that.

 

So the stuff MajorNelson posted about the Xbox 360 vs the PS3 is correct? The Xbox 360 has more bandwidth than the Xbox One?

 


 

You would rather drink the company's Kool-Aid than believe multiple skeptical posts about technical inconsistencies and multiple independent game developers' findings? Only a *loyalist fan* would say that.

 

You do not add RAM bandwidth like that. It's quite obvious that instead of spending 30 minutes reading NeoGAF, you just saw "Audioboxer" and did what half this GH usually does. Your loss.

 

Here's just one example

 

Originally Posted by Albert Penello 
 
I do want to be super clear: I'm not disparaging Sony. I'm not trying to diminish them, or their launch or what they have said. But I do need to draw comparisons since I am trying to explain that the way people are calculating the differences between the two machines isn't completely accurate. I think I've been upfront I have nothing but respect for those guys, but I'm not a fan of the mis-information about our performance.

 

 
Some of your points are misleading, or otherwise need clarifying.
 
• 18 CU's vs. 12 CU's =/= 50% more performance. Multi-core processors have inherent inefficiency with more CU's, so it's simply incorrect to say 50% more GPU.

Graphics processing is inherently parallel, so 18 vs 12 is indeed 50% more, given the same clock rate.
 
• Adding to that, each of our CU's is running 6% faster. It's not simply a 6% clock speed increase overall.

Each CU being 6% faster still means only 6% speed increase overall compared to YOUR baseline, not Sony's. You can't have it both ways, Albert. Having 50% more CU is not quite 50% more GPU, but having a 6% clock speed increase is more significant than the number implies?
 
1.8TF > 1.3TF, to the degree that is universally understood about laws of conservation.
 
• We have more memory bandwidth. 176gb/sec is peak on paper for GDDR5. Our peak on paper is 272gb/sec. (68gb/sec DDR3 + 204gb/sec on ESRAM). ESRAM can do read/write cycles simultaneously so I see this number mis-quoted.

As some others may have pointed out, you don't just add the numbers together. At no point in time can the GPU see more than the maximum ESRAM bandwidth. Cold, hard fact.
 
This leads to my question. Can ESRAM sustain simultaneous read/write cycles ALL the time? If not, then how much of the time?
 
And please allow me to help out your PR department a little. 204gb/sec, according to your understanding of the number, actually implies the old clock rate of 800mhz. The new number should be 853 * 128 * 2 (simultaneous read/write per cycle) = 218gb/sec. They can thank me later.
 
• We have at least 10% more CPU. Not only a faster processor, but a better audio chip also offloading CPU cycles.

Please do tell us more about Sony's audio chip, or do you actually not know like the rest of us?
 
• Speaking of GPGPU - we have 3X the coherent bandwidth for GPGPU at 30gb/sec which significantly improves our ability for the CPU to efficiently read data generated by the GPU.

You are combining read and write bandwidth again for your side, while using the one-way bandwidth for your competition. Not quite being honest there, are you?
 
or
 
  • What is this inherent inefficiency you speak of? Can you elaborate? It is not something I've ever heard mentioned.
  • Your second point contradicts your first. If 50% more CU performance is subject to inefficiencies, why would 6% extra performance not also be subject to the same thing?
  • How did you arrive at the 204gb/s figure for the Esram? Can you elaborate? Also, you realise this is a very disingenuous claim. YES, the bandwidth can be added together in that the DDR3 and Esram can function simultaneously, but this tells only a small part of the full story. The Esram still only accounts for a meagre 32mb of space. The DDR3 ram, which is the bulk of the memory (8GB), is still limited to only 68gb/s, whilst the PS4's GDDR5 ram has an entire 8GB with 176gb/s bandwidth. This is a misleading way to present the argument of bandwidth differences.
  • How do you know you have 10% more cpu speed? You said you are unaware of the PS4's final specs, and rumours of a similar upclock have been floating around. It could also be argued that the XO has the more capable audio chip because the system's Kinect audio features are more demanding, something the PS4 does not have to cater to. Add to that, the PS4 does also have a (less capable) audio chip, along with a secondary custom chip (supposedly used for background processing). There's that to consider too.
  • It's good that Microsoft understands GPGPU, but that does not take away from the inherent GPGPU customisations afforded to the PS4. The PS4 also has 6 additional compute units, which is a pretty hefty advantage in this field.
  • This is factually wrong. With Onion plus Onion+ the PS4 also has 30gb/s of coherent bandwidth.
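The disputed ESRAM figure is easy to reproduce. A sketch (Python) assuming a 128-byte-per-cycle interface and counting a simultaneous read and write as double bandwidth, which is exactly the assumption being challenged above:

```python
def esram_peak_gb_s(clock_mhz: float, bytes_per_cycle: int = 128,
                    rw_factor: int = 2) -> float:
    """Peak ESRAM bandwidth in GB/s if a read and a write really
    complete every cycle (rw_factor=2 doubles the one-way rate)."""
    return clock_mhz * 1e6 * bytes_per_cycle * rw_factor / 1e9

print(esram_peak_gb_s(800))  # 204.8   -> the quoted "204 GB/s" (old 800 MHz clock)
print(esram_peak_gb_s(853))  # 218.368 -> ~218 GB/s at the shipping 853 MHz clock
```

If the read/write overlap cannot be sustained every cycle, the effective figure falls back toward the one-way 102-109 GB/s, which is the crux of the dispute.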

 

 


Exactly. And the PS4 doesn't offer any of the cloud support, which will allow developers to take advantage of performance that doesn't ship in the box. Whereas the PS4 has to rely on what's in the box for 10 years, the Xbox One does not...

 

you're right, because cloud (a.k.a. "remote processing") is not something that can be added to the platform by a simple upgrade.  :rolleyes:


 

? We have at least 10% more CPU. Not only a faster processor, but a better audio chip also offloading CPU cycles.

 

Please do tell us more about Sony's audio chip, or do you actually not know like the rest of us?

 

As an aside, if anyone is interested in reading a reasonably detailed discussion of the XB1 audio chip, with input from an engineer who helped to design it (or maybe the lead engineer, not quite sure), here is a Beyond3D forum thread on the topic.


So the stuff MajorNelson posted about the Xbox 360 vs the PS3 is correct? The Xbox 360 has more bandwidth than the Xbox One?


You would rather drink the company's Kool-Aid than believe multiple skeptical posts about technical inconsistencies and multiple independent game developers' findings? Only a *loyalist fan* would say that.

You do not add RAM bandwidth like that. It's quite obvious that instead of spending 30 minutes reading NeoGAF, you just saw "Audioboxer" and did what half this GH usually does. Your loss.

It's like you're afraid the X1 is as powerful as the PS4 for some ridiculous reason. Kool-Aid has nothing to do with it and again, only die-hard fans pull that one on fans of a competing system. I'll still choose to believe the engineer who designed the system any day before I believe anyone on NeoGAF - that's just common sense. Have fun with your theories though.


It's like you're afraid the X1 is as powerful as the PS4 for some ridiculous reason. Kool-Aid has nothing to do with it and again, only die-hard fans pull that one on fans of a competing system. I'll still choose to believe the engineer who designed the system any day before I believe anyone on NeoGAF - that's just common sense. Have fun with your theories though.

 

It's actually the reverse: all the scampering is being done on the MS side to try and say there is negligible difference, the PS4 performance gap isn't that big, developers are exaggerating, simple math can be ignored, secret sauce/GPUs, etc. If anyone is afraid, as you put it, it's select individuals in the MS camp. The blind eating up of MS PR/marketing proves that day in, day out on Neowin. It started with the cloud % figures nonsense, and now we're onto RAM bandwidth nonsense.

 

Albert isn't the engineer who developed the system; he's a PR mouthpiece, and one that is actually discrediting the engineers who worked hard on the Xbox One, making them look like buffoons, by posting technical speak quite akin to Major Nelson's 2005 blog posts that simply does not add up.

 

Sony are not the ones saying anything about a performance difference, if you notice: they've given their hardware specs, and it's the DEVELOPERS who are talking about better hardware, and/or independent individuals who know hardware. Sony PR mouthpieces aren't getting into message board debates about their tech specs vs the Xbox One. Why is it MS that has to defend the performance gap and not their Xbox One developers, if Albert's post is oh so truthful? I mean, will you even admit Major Nelson's blog posts from 2005 are a crock of ****? Or does the Xbox 360 have 278.4 GB/s of bandwidth, the PS3 less than 1/5th of that, and the Xbox One less than the 360?


It basically comes down to this. While Sony obviously and without any doubt has significantly more powerful hardware, Microsoft keeps saying that they matched the performance increase through smart software.

 

What does this mean in the long run? Sony can still optimize their software and get quite serious performance gains. Microsoft on the other hand is stuck with slower hardware, and once each platform has reached equal software maturity it'll be the PS4 that ends up slightly faster or with slightly better graphics.

 

But in the end you shouldn't let performance decide what console is right for you anyway. You pick the ecosystem you like most. For me that is the PS4 without doubt, simply because it has much more of an international focus (no features that are only available in some very specific locales), better controllers (imo) and a much faster release where I live. If you want to go for the Xbox One because of the TV features, Kinect, ... be my guest.


 

So the stuff MajorNelson posted about the Xbox 360 vs the PS3 is correct? The Xbox 360 has more bandwidth than the Xbox One?

 


 

You would rather drink the company's Kool-Aid than believe multiple skeptical posts about technical inconsistencies and multiple independent game developers' findings? Only a *loyalist fan* would say that.

Company's Kool-Aid? Seriously? If you can't accept that MS has some of the best hardware engineers in the world, then you're simply being a fanboy. Albert even specified the employee status in his posts and that the company was lucky to have two of them on the X1 project.

 

You do not add RAM bandwidth like that. It's quite obvious that instead of spending 30 minutes reading NeoGAF, you just saw "Audioboxer" and did what half this GH usually does. Your loss.

Erm, yes you can. Is the GPU the only device which accesses memory? 

 

Here's just one example

 
Some of your points are misleading, or otherwise need clarifying.
 
Graphics processing is inherently parallel, so 18 vs 12 is indeed 50% more, given the same clock rate.
The efficiency of CUs dramatically decreases the more there are. That was his point, and you failed to mention it. Also, the efficiency of the CUs and the application of parallel processing comes down to the software and the graphics libraries which use the hardware, something an OpenGL wrapper doesn't include.

 

Each CU being 6% faster still means only 6% speed increase overall compared to YOUR baseline, not Sony's. You can't have it both ways, Albert. Having 50% more CU is not quite 50% more GPU, but having a 6% clock speed increase is more significant than the number implies?
You've just contradicted your first point. Yes, as he stated, this is also a 6% increase across the CU's, which makes a big difference.
 
1.8TF > 1.3TF, to the degree that is universally understood about laws of conservation.
 
As some others may have pointed out, you don't just add the numbers together. At no point in time can the GPU see more than the maximum ESRAM bandwidth. Cold, hard fact.
Did you know ESRAM doesn't share the latency issues that GDDR does? In real-time applications, the return latency delays processing on the CU. Also, you've got to consider that the DDR and ESRAM are separate, so if the GPU is frame-buffering in the ESRAM, the CPU can use the whole of the DDR bandwidth without sharing the same bus with the GPU. The CPU can't actually see the ESRAM. With the PS4, where the CPU/GPU share the same bus and the same memory architecture with the higher return latency, real-time applications really will slow down, especially since the CPU will have to waste clock cycles waiting for the RAM to return.
 
This leads to my question. Can ESRAM sustain simultaneous read/write cycles ALL the time? If not, then how much of the time?
Of course it can.
 
And please allow me to help out your PR department a little. 204gb/sec, according to your understanding of the number, actually implies the old clock rate of 800mhz. The new number should be 853 * 128 * 2 (simultaneous read/write per cycle) = 218gb/sec. They can thank me later.
 
Please do tell us more about Sony's audio chip, or do you actually not know like the rest of us?
Read up on this; even Mark Cerny has said that in the future he can imagine the GPU taking some of the audio load, due to the fact that it isn't affected by the GDDR latency in the same way the CPU is. Extra load on the GPU means less power to work with. Audio takes up a lot of resources.
 
You are combining read and write bandwidth again for your side, while using the one-way bandwidth for your competition. Not quite being honest there, are you?
Expand on this further, I don't see what you mean. It can read and write at the same time but then it halves the BW for each operation?

  • 23 May 2013 - Linus Blomberg [Just Cause Studios]

    "It's difficult to say, as it's still early days when it comes to drivers," Blomberg said. "With each new driver release, performance increases dramatically in some areas. The PlayStation 4 environment is definitely more mature currently, so Microsoft has some catching up to do. But I'm not too concerned about that as they traditionally have been very good in that area. 

    "The specs on paper would favor the PS4 over the Xbox One in terms of raw power, but there are many other factors involved so we'll just have to wait and see a bit longer before making that judgment," he added.

    • 15 June 2013 - Anonymous

      In talking to a developer who wished to remain anonymous, gamers will see a difference on Day One when they compare third party PS4 games to Xbox One head-to-head. 

      The developer told me the PS4 is 40 percent more powerful than Xbox One and games like Call of Duty Ghosts will be noticeably different out of the gate.

    • 14 July 2013 - Confirmed XBOX One Developer [on Reddit]

      The facts are on paper, the PS4 has better specs and the most you can debate is by how much. What I can tell you is I have played Forza, Killer instinct, and Ryse on the Xbox One. They look as good as the games I play on a high end PC. Ryse reminded me of darksiders II.

     

    • 02 August 2013 - John Carmack [ex id Software]

      "... it's almost amazing how close they are in capabilities, how common they are and uh that the capabilities they give are essentially the same. We can talk about differences in memory architectures, but, the bottom line being that they're a multicore AMD processor with AMD graphics, is, it's almost weird how close they are."

      "Now everyone would like me to come out with some, some A over B comparison about the 2 platforms, and to be completely honest, I haven't done really rigorous benchmarking on them, so even if I didn't have NDA protection I couldn't give you a really completely honest answer. But, they're very close. They're both very good."

    • 21 August 2013 - Marcus Nilsson [Ghost Games - Need For Speed Rivals] 

      "What we're seeing with the consoles is actually that they are a little bit more powerful than we thought for a really long time - especially one of them, but I'm not going to tell you which one," Nilsson told VideoGamer.com at Gamescom earlier today.

      "And that makes me really happy. But in reality, I think we're going to have both those consoles pretty much on parity - maybe one sticking up a little bit. And I think that one will look as good as the PC."

      Nilsson chose not to answer when asked whether he was referring to the PS4 version. 

      Pushing him on the subject further, I asked: "Does that mean that the PS4 version of Rivals will look better than the Xbox One version?"

      "I think that both consoles will look pretty much on parity," he replied, "but one of them might stick up a little bit."

    • 21 August 2013 - Anonymous

      Behind the scenes, c't could hear from developers that the 3D-performance of PlayStation 4 is very far ahead of Xbox One.

    • 23 August 2013 - Anton Yudintsev [Gaijin Entertainment - War Thunder]

      AY: Well, obviously PlayStation 4 is more powerful than Xbox One.

      How much more powerful?

      AY: It depends what you're doing. GPU, like 40 per cent more powerful. DDR5 is basically 50 per cent more powerful than DDR3, but the memory write is bigger on Xbox One so it depends on what you're doing.

      How is that going to translate to on-screen results for the kinds of games you want to make? So to optimise War Thunder on both consoles you could hypothetically make a better, prettier version on PS4?

      AY: Yep.

    • 06 September 2013 - Adrian Chmielarz [ex People Can Fly - Gears of War Judgment]

      PMW0KPs.png

      Here I am. So...

      1. I am not doing a damage control, but I do want to clarify one thing. But first, yes, devs I know -- and as someone has shown it before in this thread, some other devs already talked about it too -- claim that there's 50% speed difference WHEN DEVELOPING in cross-gen/next-gen PS4/XO games. So there we are, I said it and I stand by it. Notice: WHEN DEVELOPING. It'll become clear in a second.

      2. Will this change in the future? Will devs discover some tricks to narrow the gap? Will stuff like XO cloud computing help? Hell if I know. Uhm, maybe? I know that devs -- well, most of them -- will do whatever they can to get you the best games possible. You're going to see a lot of multiplatform games this next gen, just as you've seen them in this gen, so it's in studios' best interest that there's no clear advantage in one version over the other.

      3. Does it mean studios will cripple PS4 versions to match XO ones? Not really, do not underestimate the devs. Even if this happens, you will not know that and that's okay. You've never seen most games in their most powerful form anyway (when we work on them on our ninja dev PCs in 1080p 120fps with all the antialiasings and stuff turned on for ****s and giggles). But most of the time devs have a target and they meet this target. If it's a multiplatform game, it's designed with this in mind from the start. So maybe it's not maxing out one console while going 100% on the other. Maybe it's 100% on both, but they take extra time for super-extra optimizations on the weaker hardware to make sure things look the same as on the more powerful platform. Etc. etc.

      4. So what is that "one thing" I want to clarify, that some people may consider "damage control", but really is just an explanation. Someone mentioned Titanfall, which looks money and enjoys a great hype. Exactly. A great dev will make a great game no matter what's the hardware. Current gen CoDs looks great and it's 60 fps, on both platforms (well, and PC :). To most devs that is just impossible to achieve. And yet...

      Think about it this way. X360 is faster than PS3. Not just easier to program on, it's faster overall (although PS is faster/better in SOME areas). And yet no exclusive on X360 looks like The Last of Us. Halo 4 looks great. Gears blew my mind in 2006. And still, the best looking AAA game of this generation belongs to the supposedly weaker platform.

      So if you think that the war is over because PS4 is 50% faster TODAY, then you're delusional. This is far from over, and will probably never be over, at least not this upcoming gen.


It's actually the reverse: all the scampering is being done on the MS side to try and say there is negligible difference, the PS4 performance gap isn't that big, developers are exaggerating, simple math can be ignored, secret sauce/GPUs, etc. If anyone is afraid, as you put it, it's select individuals in the MS camp. The blind eating up of MS PR/marketing proves that day in, day out on Neowin. It started with the cloud % figures nonsense, and now we're onto RAM bandwidth nonsense.

 

Albert isn't the engineer who developed the system; he's a PR mouthpiece, and one that is actually discrediting the engineers who worked hard on the Xbox One, making them look like buffoons, by posting technical speak quite akin to Major Nelson's 2005 blog posts that simply does not add up.

 

Sony are not the ones saying anything about a performance difference, if you notice: they've given their hardware specs, and it's the DEVELOPERS who are talking about better hardware, and/or independent individuals who know hardware. Sony PR mouthpieces aren't getting into message board debates about their tech specs vs the Xbox One. Why is it MS that has to defend the performance gap and not their Xbox One developers, if Albert's post is oh so truthful? I mean, will you even admit Major Nelson's blog posts from 2005 are a crock of ****? Or does the Xbox 360 have 278.4 GB/s of bandwidth, the PS3 less than 1/5th of that, and the Xbox One less than the 360?

Albert isn't an engineer by his own admission, and he also stated he is getting these numbers from the guy who designed the silicon for the Xbox One. I would take that over random GAFers who have no idea that "Technical Fellow" is a legitimate designation. When they make fun of the designation in the same post in which they question the numbers, it is difficult to take them seriously.

As for the RAM bandwidth, Albert doubled down on that and the eSRAM bi-directional throughput. We will go with that until the promised tech deep dive from him.
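For what it's worth, the paper peaks Albert quotes add up like this. These are marketing peaks, not sustained throughput, and the 204 GB/s eSRAM number assumes simultaneous read/write:

```python
# "Paper" peak bandwidths being debated (GB/s, marketing peaks, not sustained)
ps4_gddr5 = 176   # 256-bit GDDR5 @ 5.5 GT/s: 5.5e9 transfers * 32 bytes
xb1_ddr3 = 68     # 256-bit DDR3-2133: 2.133e9 * 32 bytes (~68.3, rounded down)
xb1_esram = 204   # Microsoft's claimed peak with simultaneous read + write
xb1_total = xb1_ddr3 + xb1_esram  # the 272 GB/s Penello cites

print(xb1_total, ps4_gddr5)
```

Whether it is fair to add a 32 MB scratchpad's peak to the main memory bus is exactly what the promised deep dive needs to address.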

It is laughable for the Sony defense force to accuse anyone of eating up MS PR when they ate and drank Sony's bull**** with the PS3 - bought underpowered, expensive hardware that turned out to be no better than the Xbox 360.

 

 

It basically comes down to this: while Sony obviously and without any doubt has significantly more powerful hardware, Microsoft keeps saying that it has closed the performance gap through smart software.

 

What does this mean in the long run? Sony can still optimize their software and get quite serious performance gains. Microsoft on the other hand is stuck with slower hardware, and once each platform has reached equal software maturity it'll be the PS4 that ends up slightly faster or with slightly better graphics.

 

But in the end you shouldn't let performance decide what console is right for you anyway. You pick the ecosystem you like most. For me that is the PS4 without doubt, simply because it has much more of an international focus (no features that are only available in some very specific locales), better controllers (imo) and a much faster release where I live. If you want to go for the Xbox One because of the TV features, Kinect, ... be my guest.

Yes, Sony has significantly more powerful hardware, just like they had with the PS3 and Cell? :rolleyes:

Please don't come back with "but it's different this time and the architectures are the same," because you only get to make one of these two arguments:

- Xbox One architecture is more complicated and difficult to develop for than the PS4

OR

- Xbox One architecture is similar to PS4 and hence you can directly compare them.

Yes, Sony can optimize their software, but who says they didn't paint themselves into a corner like they did with the PS3? If you look at the PS4's additional features and then at what Xbox is offering, Sony is again left behind, just as they were with the Xbox 360's party chat, guide, XBL, achievements, etc. It took them more than half of the current gen to get everything working.

And yes, the PS4 specs are better on paper, but as of today there is no visible difference between games on either side.

I agree with the ecosystem part and as of right now, Sony doesn't have one.



 

 

Yeah, so why would most of you (you being anyone who blindly accepts MS claims, especially the 600% cloud ones...) want to go down that path now? Where are all the developers, not MS PR figures, disputing the other developers who talk about the PS4 vs. Xbox One hardware differences? Why are developers saying the PS4's memory is faster while Albert is trying to say the Xbox One's is?

 

If you believed Major Nelson in 2005, as I've posted a few times now in this topic, you'd think the Xbox 360's memory is better than the Xbox One's. Official PR can BS, hence the PS3 launch nonsense, but this time around, as I've also said in here, it is not Sony PR claiming the difference, it is the games developers. But of course everyone outside of MS HQ is on Sony's payroll, as is 90% of the 130,000 on GAF... and the Xbox One 180, while we're at it, was caused by Sony fanboys.


*Snip*

You're repeating everything I'm stating and trying to explain to you: the PS4 has more power on paper, but there are so many factors involved that they end up looking more equal.


All arguments aside, your average consumer is gonna buy whatever ecosystem they prefer. They couldn't care less which is better, unless there is a major difference, and I really doubt there is in this case.


You're repeating everything I'm stating and trying to explain to you: the PS4 has more power on paper, but there are so many factors involved that they end up looking more equal*.

 

*Fairy dust and magic not included.


All arguments aside, your average consumer is gonna buy whatever ecosystem they prefer. They couldn't care less which is better, unless there is a major difference, and I really doubt there is in this case.

 

Correct, but message boards aren't where most average consumers spend their time, so BS PR and marketing speak doesn't really survive on them the way it might in the real world. Especially in the last year or two, when so many gaming companies have been torn to shreds by simple investigation and fact-finding. EA, I'm looking at you: Sim City needs always-online for a persistent world? 10-year-old Johnny may believe that, but 25-year-old Victor cracked it and played it offline.

 

Major Nelson to Angry Joe at E3?

 


 

[Image: Major Nelson flips the switch]

 

A day-one software patch turns it off completely.


This topic is now closed to further replies.