Microsoft's Penello: No way is Xbox One giving up 30% power advantage to PS4



*Fairy dust and magic not included.

That's fine; you just ignore those points I made regarding the statements you made above. Is it possible to have a mature discussion here?

 

Just to make this even funnier, the cloud will make a big difference in years to come. Just as the technological understanding and the optimisation of the PS4 code will. Pretty similar concepts.


Correct, but message boards aren't where most average consumers spend their time, so BS PR and marketing speak doesn't survive on them the way it may in the real world. Especially in the last year or two, when so many gaming companies have been torn to shreds by simple investigation and fact-finding. EA, I'm looking at you: Sim City needs always-online for a persistent world? 10-year-old Johnny may believe that, but 25-year-old Victor cracked it and played it offline.

 

Major Nelson to Angry Joe at E3?

 


 

[Image: "Major Nelson flips the switch"]

 

Day 1 software patch, turns it off completely.

 

That meme is stupid. No, they couldn't just flip a switch and turn it off. Why do you think they delayed the launch in 8 countries? I don't know who that Angry Joe guy is, but I know enough about software development to understand that such late changes can cause major scheduling problems.


Yeah, so why would most of you (you being anyone who blindly accepts MS claims, especially the 600% cloud ones...) want to go down that path now? Where are all the developers, not MS PR figures, disputing the other developers who talk about the PS4 vs Xbox One hardware differences? Why are developers saying the PS4's memory is faster while Albert is trying to say the Xbox One's is?

 

If you believed MN in 2005 (as I've posted a few times now in this topic) you'd think the Xbox 360's memory was better than the Xbox One's. Official PR can BS, hence the PS3 launch nonsense, but this time around, as I've also said in here, it is not Sony PR claiming the difference, it is the games developers. But of course everyone outside MS HQ is on Sony's payroll, as is 90% of the 130,000 on GAF... and the Xbox One 180, while we're at it, was caused by Sony fanboys.

Where did Albert say the Xbox One is faster than the PS4? He has said this multiple times (paraphrasing): the performance difference is not as large as the raw specs indicate.

 

His tweet yesterday makes it very clear: "performance differences are overstated".


That's fine; you just ignore those points I made regarding the statements you made above. Is it possible to have a mature discussion here?

 

Just to make this even funnier, the cloud will make a big difference in years to come. Just as the technological understanding and the optimisation of the PS4 code will. Pretty similar concepts.

 

You Xbox fans really need to stop fooling yourselves into thinking the cloud will be the thing that magically turns the Xbox One into a superconsole, because you're only setting yourselves up for disappointment and a river of tears.


Where did Albert say the Xbox One is faster than the PS4? He has said this multiple times (paraphrasing): the performance difference is not as large as the raw specs indicate.

 

His tweet yesterday makes it very clear: "performance differences are overstated".

 

We have more memory bandwidth. 176gb/sec is peak on paper for GDDR5. Our peak on paper is 272gb/sec. (68gb/sec DDR3 + 204gb/sec on ESRAM). ESRAM can do read/write cycles simultaneously so I see this number mis-quoted.

 

 

Developers, however, say the PS4 memory is faster. ESRAM only accounts for 32MB; the PS4 has the full 8GB available at 176GB/s. ESRAM helps, a lot, but it's the same nonsense MN was trying to spew in 2005 by simply adding numbers together.

 

http://www.anandtech.com/show/6972/xbox-one-hardware-compared-to-playstation-4/2


You Xbox fans really need to stop fooling yourselves into thinking the cloud will be the thing that magically turns the Xbox One into a superconsole, because you're only setting yourselves up for disappointment and a river of tears.

When did I ever say that? Offloading calculations that aren't latency-sensitive to the cloud frees up local power; struggling to understand that?
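
For what it's worth, the split being described is easy to sketch: latency-tolerant work goes out asynchronously while the frame loop keeps the whole local budget. A minimal Python sketch; every function name here is an invented stand-in, not any real engine or Xbox API:

import concurrent.futures

# Hypothetical stand-ins for engine work; names are illustrative only.
def plan_ai_routes(world_state):
    # Latency-tolerant: results may arrive several frames late without hurting play.
    return {"routes": len(world_state) * 2}

def render_frame(frame_number):
    # Latency-critical: must finish inside the ~16.7ms frame budget, so it stays local.
    return "frame %d rendered" % frame_number

pool = concurrent.futures.ThreadPoolExecutor()        # stands in for a cloud endpoint
pending = pool.submit(plan_ai_routes, ["npc"] * 128)  # the "offloaded" work

for frame in range(3):
    render_frame(frame)   # local CPU/GPU spend the whole frame budget on rendering
    if pending.done():    # harvest the cloud results whenever they come back
        ai_results = pending.result()

pool.shutdown()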


When did I ever say that? Offloading calculations that aren't latency-sensitive to the cloud frees up local power; struggling to understand that?

 

My, my. Xbots have short memories. You more or less said you think that'll happen right here, 20 minutes prior to my post.

 

 

the cloud will make a big difference in years to come.

 

Of course, feel free to come up with whatever story you want to claim you weren't implying that with that line. I'm sure it too will be amusing, like all the theories you guys come up with for the Xbox One.


My, my. Xbots have short memories. You more or less said you think that'll happen right here, 20 minutes prior to my post.

 

 

 

Of course, feel free to come up with whatever story you want to claim you weren't implying that with that line. I'm sure it too will be amusing, like all the theories you guys come up with for the Xbox One.

Saying it makes a big difference doesn't mean I'm specifically saying the cloud makes the local resources of the X1 comparable to a supercomputer. I'm saying that you'll see some things powered by the cloud in the future which will make a big difference to games.


In the post on GAF, Penello got much more into the details of the claim, stating, "I was hoping my comments would lead the discussion to be more about the games (and the fact that games on both systems look great) as a sign of my point about performance, but unfortunately I saw more discussion of my credibility."


Breaking down his argument by points, Penello makes the following cases:


1. "[PS4's] 18 CUs vs. [Xbox One's] 12 CUs [does not equal] 50% more performance. Multi-core processors have inherent inefficiency with more CUs, so it's simply incorrect to say 50% more GPU." (Note: CUs are "compute units")


2. "Adding to that, each of our CUs is running 6% faster. It's not simply a 6% clock speed increase overall."


3. "We have more memory bandwidth. 176gb/sec is peak on paper for GDDR5. Our peak on paper is 272gb/sec (68gb/sec DDR3 + 204gb/sec on ESRAM). ESRAM can do read/write cycles simultaneously so I see this number mis-quoted."


4. "We have at least 10% more CPU. Not only a faster processor, but a better audio chip also offloading CPU cycles."


5. "We understand GPGPU and its importance very well. Microsoft invented Direct Compute, and have been using GPGPU in a shipping product since 2010; it's called Kinect."


6. "Speaking of GPGPU: we have 3X the coherent bandwidth for GPGPU at 30gb/sec, which significantly improves our ability for the CPU to efficiently read data generated by the GPU."


If this seems a little technical to you, it is. However, the TL;DR version is basically that the Xbox One is a much better performer than people think, and the design team needs a little more credit for being "some of the smartest graphics engineers around; they understand how to architect and balance a system for graphics performance."


Penello sums up his case by stating both machines are good performers and adds, "...each company has their strengths, and I feel that our strength is overlooked when evaluating both boxes."


Apparently there will be benchmarks published soon which should clarify the issue even more, but to the average consumer who doesn't give a ######, none of that matters: most of us just want to see dat framerate nice and solid, with beautiful lighting and textures, and full bars when playing multiplayer online.


 


 


http://pixelenemy.com/microsoft-product-planning-boss-elaborates-on-his-tweet-with-tech-specs-and-insists-the-xbox-one-has-better-performance-than-people-think/
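
To put rough numbers on points 1-3: using the widely reported GCN figures (64 ALUs per CU, two FLOPs per ALU per clock) and the publicly reported clock speeds, here is a quick back-of-envelope in Python; these are peak, on-paper numbers only:

# Peak-compute and bandwidth arithmetic from publicly reported specs.
ALUS_PER_CU = 64        # standard GCN compute unit width
FLOPS_PER_ALU_CLK = 2   # one fused multiply-add per ALU per clock

ps4_tflops = 18 * ALUS_PER_CU * FLOPS_PER_ALU_CLK * 0.800e9 / 1e12  # ~1.84
xb1_tflops = 12 * ALUS_PER_CU * FLOPS_PER_ALU_CLK * 0.853e9 / 1e12  # ~1.31
print(round(ps4_tflops / xb1_tflops, 2))  # 1.41: the clock bump trims 50% to ~41%

ps4_bw = 176       # GB/s, GDDR5 peak on paper
xb1_bw = 68 + 204  # = 272 GB/s, but only if DDR3 and ESRAM peaks may be added

Whether that last sum is legitimate is exactly what the rest of the thread argues about, since the ESRAM pool is only 32MB.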



[Image 1: GCN architecture diagram showing jobs scheduled in small blocks]

 

At an 800MHz clock, each CU has a peak bandwidth of 24GB/s. At 853MHz, each CU has a peak bandwidth of 25.4GB/s.

 

18 compute units have a peak bandwidth demand of 432GB/s; GDDR5 has a peak bandwidth of 176GB/s.

12 compute units have a peak bandwidth demand of 304.8GB/s; ESRAM has a peak bandwidth of 218GB/s.

 

In this case, the 12 compute units will see higher utilization. Simply adding more compute units doesn't give a linear increase in processing power; the units are limited by bandwidth, and the main thing extra compute units do is make the scheduler's job easier, which yields some speedup. Hence benchmarks of parts with a 50% compute difference often show only around a 20% increase in framerate.
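
Taking the per-CU figures above at face value, the utilization claim is just this arithmetic; whether per-CU demand really sums this way is the contested assumption:

# Sanity check on the numbers above (Python); per-CU peak demand is the premise here.
ps4_demand = 18 * 24.0    # = 432 GB/s requested vs. 176 GB/s of GDDR5
xb1_demand = 12 * 25.4    # = 304.8 GB/s requested vs. 218 GB/s of ESRAM

print(round(176 / ps4_demand, 2))  # 0.41: GDDR5 can feed each PS4 CU at ~41% of peak
print(round(218 / xb1_demand, 2))  # 0.72: ESRAM can feed each XB1 CU at ~72% of peak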

 

The diagram shows how jobs are loaded in small blocks; speed is essential here.

 

This is a job for... ESRAM.

 

[Image 2: DDR3/ESRAM data-flow diagram]

 

DDR3 keeps the ESRAM full of jobs and takes the exported work, while reads and writes happen simultaneously. At the same time, DDR3 is free to do some side projects, like helping out with the audio, among other things. Here we see how both bandwidths are utilized at once: 218GB/s is used to import and export work to and from the GPU, while the DDR3's 68GB/s is free for other parts of the system. In a cycle, we could be using 286GB/s of bandwidth.
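
The 286GB/s figure is just the two peaks counted in the same cycle, which only holds if both pools really are saturated at once:

# Combined-bandwidth arithmetic from the paragraph above (Python).
esram_bw = 218             # GB/s, ESRAM peak feeding the GPU
ddr3_bw = 68               # GB/s, DDR3 peak serving the rest of the system
print(esram_bw + ddr3_bw)  # 286 GB/s, assuming both buses are fully busy simultaneously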

 

I keep hearing people say, "oh, but the ESRAM is so small." If you look at the first image, it shows the GCN architecture does its work in small chunks, which is perfect for ESRAM usage. This architecture is built for using something like ESRAM.

 

Note: I drew the colored memory boxes to the right of the second image.


Developers, however, say the PS4 memory is faster. ESRAM only accounts for 32MB; the PS4 has the full 8GB available at 176GB/s. ESRAM helps, a lot, but it's the same nonsense MN was trying to spew in 2005 by simply adding numbers together.

 

http://www.anandtech.com/show/6972/xbox-one-hardware-compared-to-playstation-4/2

So he said their memory subsystem is faster, but claims the overall performance difference is overstated. He has reiterated the claim that you can add those numbers.

Nowhere did he claim that the Xbox One as a whole is faster than the PS4.

You say it is PR nonsense because GAF says so, but GAF says a lot of stupid things. I guess if the person who designed the APU says it is OK to add those numbers, then it is OK to add them.

Here is what we know about Microsoft's claims to date about memory (and I am paraphrasing from my admittedly bad memory):

- Theoretical peak of 272 GB/s

- but most games will hit something like 133 GB/s

- Their internal buses are wider/faster than the PS4's (according to your favorites: anonymous developers)

Sony has said PS4 has a theoretical peak of 176 GB/s

- What is the actual number most games will hit?

- Have you ever stopped and questioned if that rate is achievable / sustainable?

- What is their CPU clock? Is it faster, slower, or the same as the XBO's?

- If their audio hardware is same as XBO's SHAPE, why did Sony say that devs can use GPGPU in future to free up CPU?

- What is the exact

Sony still hasn't given more details about their hardware, or at least has not been as vocal as Microsoft. Why are they given a pass based on their initial reveal numbers?
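
For a feel of the peak-versus-achievable gap in Microsoft's own numbers above, here's a quick check; the last line is pure assumption, since Sony has published no equivalent "realistic" figure:

# Peak vs. stated-realistic bandwidth (Python); figures from the list above.
peak_xb1 = 272   # GB/s, Microsoft's on-paper peak
real_xb1 = 133   # GB/s, what Microsoft says most games will actually hit

efficiency = real_xb1 / peak_xb1
print(round(efficiency, 2))        # ~0.49: roughly half of peak is sustained
print(round(efficiency * 176, 1))  # ~86 GB/s *if* the PS4 sustained the same fraction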


It's interesting to see you guys argue over this. I see a lot of people who know fairly little about how CPU/GPU architecture works talking like they're experts. Then I see people drop in to bash it all as some fanboy attack against the PS4.

 

I feel bad for that Penello guy. He bends over backwards to say how good the PS4 is and how his technical points are not intended as a shot at it. And yet people here and on NeoGAF just want to bash him, I guess feeling threatened by his info for whatever reason.

 

I never see Mark Cerny's quotes torn apart in the same way. It's pretty interesting. Penello may not be the head architect like Cerny is, but it seems strange to claim we should ignore what he says yet take everything Cerny says as gold. I'd rather just see the technical points debated.

 

 

Why can't this debate be limited to a technical discussion? Why must people spew the same BS on both sides? No, the cloud doesn't offer a 600% boost (nor has MS ever claimed as much), and no, these technical specs aren't some Xbox fanboy plot against the PS4. Get a grip, guys.

 

 

 

Developers, however, say the PS4 memory is faster. ESRAM only accounts for 32MB; the PS4 has the full 8GB available at 176GB/s.

 

 

Correction: the PS4 does not have the full 8GB available. But your point stands otherwise.

 

I think the hardest part for those of us just coming into this thread looking for info is that you guys dump a whole bunch of info to support or deny the implications made. It's very hard to figure out who is right around here. I may know a lot about PC parts and such, but I'm no expert on the inner workings of CPUs and GPUs.


<SNIP>Yeah, so why would most of you (you being anyone who blindly accepts MS claims, especially the 600% cloud ones...)<SNIP>

I just want to point out that I don't think ANYBODY thinks the Xbox will gain 600% more power because of the cloud. It's either a coincidence that you used 600%, or you are actually referring to this:

 

Xbox One cloud processing gives Forza 5 600% more AI capability, says dev

 

 

 

"So we can now make our AI, instead of just being 20%, 10% of the box's capability, we can make it 600% of the box's capability," he went on. "Put it in the cloud and free up that 10% or 20% to make the graphics better, on a box that's already more powerful than we worked on before."

Source: http://www.vg247.com/2013/08/01/xbox-one-cloud-processing-gives-forza-5-600-more-ai-capability-says-dev/

 

Again, I don't think the dev is trying to say that the cloud will even make the box 10-20% better; rather, it frees up the 10-20% of local compute they would normally spend on AI for other things like graphics, physics, etc.
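
Read that way, the local-side arithmetic is modest; here's a sketch of the budget shuffle using the quote's own 10-20% range (15% is picked as a midpoint purely for illustration):

# Budget-shuffle arithmetic (Python); the 15% AI share is an assumed midpoint.
local_ai_share = 0.15                   # AI's old slice of local compute, per the quote
graphics_before = 1.0 - local_ai_share  # 0.85 of the box left for graphics
graphics_after = 1.0                    # AI moved to the cloud, whole budget is local
print(round(graphics_after / graphics_before - 1, 2))  # 0.18: ~18% more local headroom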


I just want to point out that I don't think ANYBODY thinks the Xbox will gain 600% more power because of the cloud. It's either a coincidence that you used 600%, or you are actually referring to this:

 

Xbox One cloud processing gives Forza 5 600% more AI capability, says dev

 

Source: http://www.vg247.com/2013/08/01/xbox-one-cloud-processing-gives-forza-5-600-more-ai-capability-says-dev/

 

Again, I don't think the dev is trying to say that the cloud will even make the box 10-20% better; rather, it frees up the 10-20% of local compute they would normally spend on AI for other things like graphics, physics, etc.

 

 

You're right on the mark: offloading tasks to the cloud frees up the local hardware to do other things; it's as simple as that. No one's talking about it magically making a game look way better, so not having an internet connection for the feature just means the local hardware has to work harder, probably a small dip in frames per second at worst.

 

I'm actually surprised this thread is still going, and that there's this deep need in some people to defend things. We still don't know everything about the XB1, though we have more info, and we also know very little about the PS4. Raw performance numbers on paper are never what you get in final real-world tests. It's like some people don't bother to look at all the benchmarks we've gotten over the years on the PC side. Many times we've had CPUs or even GPUs that are, on paper, 20% or 30% faster, yet in actual benchmarks the difference is smaller, often in the single-digit percentages.

 

That's basically what Albert is saying, and some want to jump on him and discredit him for it? The real world is calling; it's time to wake up.

 

This whole 30, 40, 50% thing has also become a joke; every time there's a new post on this subject the number seems to change. Whatever the case, games will run and look great. Do you even care if one game is, at best, 5 frames per second faster than the other? I bet when it's all said and done the two systems will be very close, and even 5+ years from now that's still not going to change. Any real-world performance difference is going to be smaller than some think. If you want a number, I'm betting on a real-world difference of 10-15% in the PS4's favor, nothing earth-shattering TBH, and much like how the current gen has turned out.


Jeez, they didn't just flip a switch. They removed or gimped a lot of the expected functionality because of uninformed outcry. Then they removed the check-in.

It seems like these are the same arguments about how "powerful" the PS3 was going to be. Sony's message this time around might as well be "Same as PS3! Except where we copied the 360 (architecture, Kinect)." Wow! That's so next-gen!


Correct, but message boards aren't where most average consumers spend their time, so BS PR and marketing speak doesn't survive on them the way it may in the real world. Especially in the last year or two, when so many gaming companies have been torn to shreds by simple investigation and fact-finding. EA, I'm looking at you: Sim City needs always-online for a persistent world? 10-year-old Johnny may believe that, but 25-year-old Victor cracked it and played it offline.

 

Major Nelson to Angry Joe at E3?

 


 

[Image: "Major Nelson flips the switch"]

 

Day 1 software patch, turns it off completely.

 

Tuck it away, Boxer. Your idiot is showing.

 

Maybe you missed the bit where they needed months to get a day-one patch ready? How about all the features they had to remove in order to make this happen? It certainly was not just the flip of a switch.

 

I have no allegiance to either MS or Sony, but posts like the above degrade the quality of all posts on Neowin and lower the overall IQ of the membership. Don't even get me started on taking the word of a handful of NeoGAFers over the word of real-world engineers. It beggars belief.


Okay, here is Ars's opinion, since a peasant like me can't possibly know anything about hardware:

http://arstechnica.com/gaming/2013/09/microsoft-exec-defends-xbox-one-from-accusations-its-underpowered/

 

 

"The entire point of GPU workloads is that they scale basically perfectly, so 50% more cores is in fact 50% faster."

 

Disproved by Digital Foundry: http://www.eurogamer.net/articles/digitalfoundry-can-xbox-one-multi-platform-games-compete-with-ps4

 

 

The results pretty much confirm the theory that more compute cores in the GCN architecture doesn't result in a linear scaling of performance. That's why AMD tends to increase core clock and memory speed on its higher-end cards, because it's clear that available core count alone won't do the job.

 

 

And for everything else, apparently they don't know what it means or how it works:

 

"What the hell does that even mean?"

 

 

"Just adding up bandwidth numbers is idiotic and meaningless. While the Xbox One's ESRAM is a little faster, we don't know how it's used, and the PS4's GDDR5 is obviously a lot bigger."

 

 

"Maybe true."

 

 

"Who cares about the API? It really doesn't make much difference."

 

 

"I don't know if that's even true."

 

LOL, don't ever post that again.
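
On the CU-scaling point specifically, a toy model shows how both quotes can be partly right: if a frame is capped by whichever of shader throughput and memory bandwidth runs out first, 50% more CUs without more effective bandwidth buys much less than 50%. The numbers below are invented for illustration only:

# Crude roofline-style model (Python): frame rate is limited by the scarcer resource.
def fps(compute_units, bandwidth_cap_fps, fps_per_cu=2.5):
    compute_cap_fps = compute_units * fps_per_cu
    return min(compute_cap_fps, bandwidth_cap_fps)

print(fps(12, 36.0))  # 30.0 fps: compute-bound with 12 CUs
print(fps(18, 36.0))  # 36.0 fps: +50% CUs yields only +20% once memory is the wall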


This crap has turned into a religion. Everyone is having fun arguing about their beliefs regarding something no one can really see or touch with their own eyes. It's so fun to run around on Google or wherever and find random quotes to try to prove a point about something that just cannot be disproved at this time! Carry on.


*snip*

I can't believe they've even posted that! When someone doesn't know what difference raising the clock speed across all the CUs makes, they really shouldn't be commenting on hardware. And the API comment? Well, I can't believe that passes for journalism on Ars Technica!


Browsing this article on the front page, it just dawned on me that Sony has not announced clock speeds for either the GPU or the CPU. Considering the size of the box, the amount of heat it'll need to draw out of the APU, and the direct relationship between heat and clock speed, you're producing a 50% number on pure speculation. The box would run very hot at 1.6GHz and 800MHz, with the PSU inside as well! It's either a very loud box, or they've underclocked it.


Browsing this article on the front page, it just dawned on me that Sony has not announced clock speeds for either the GPU or the CPU. Considering the size of the box, the amount of heat it'll need to draw out of the APU, and the direct relationship between heat and clock speed, you're producing a 50% number on pure speculation. The box would run very hot at 1.6GHz and 800MHz, with the PSU inside as well! It's either a very loud box, or they've underclocked it.

It's not a separate processor and GPU; it can run as hot as it needs to.

 

I don't expect the stupid arguments to calm down until mid-2014 at the earliest, heh.

