AnandTech: Xbox One vs. PS4 - Hardware comparison


No it won't; the ESRAM is there to make up for the lack of bandwidth and prevent bottlenecks, because they used DDR3 RAM instead of the much faster GDDR5.

No it won't; the ESRAM is there to make up for the lack of bandwidth and prevent bottlenecks, because they used DDR3 RAM instead of the much faster GDDR5.

Are you a console developer? I'm not. I imagine that how games use the ESRAM (and whether they use it at all), and how they optimize for GDDR5 or DDR3, is up to the developer. If that's wrong, I'll wait until a professional developer with console optimization experience tells me so. And as always: the ESRAM, the move engines, the touchpad on the PS4 controller, the extra CUs and ROPs, or whatever else, still don't bring the exclusives over to the "opponent".

I don't have time for this anymore. I've got better things to do, since I know WHY I'll buy a specific console instead of wondering what percentage of extra performance I may be getting. And I have a great internet connection, so I'll be able to take advantage of whatever the devs do in the Azure cloud.

The PS3 was better on paper, yes, but the CPU was a nightmare to work with. This generation they have identical CPUs and architectures, and both GPUs are of the same family, so it's an easy comparison to make.

[image: XboxOne.png]

Everything is on-die; it's a freaking SoC.

So you give me a user-drawn image to try to prove your point, from a website with headlines such as "Microsoft admits they lost the console war with Xbox One" and "Microsoft cedes console to Sony, gives up on gaming". Nice, but you're not fooling anyone.

They are the same family of GPU, AMD Southern Islands. We know they are using the Graphics Core Next architecture, and we know it can do 768 operations per clock, straight from the mouth of Microsoft, which means we can deduce that it is using 12 CUs versus the PS4's 18 CUs.
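For anyone who wants to check the deduction: on GCN each CU has 64 shader lanes, and each lane can do a fused multiply-add (2 FLOPs) per clock. Assuming the widely reported 800 MHz GPU clock for both consoles (not officially confirmed at this point), the numbers work out like this:

```python
# Back-of-the-envelope check of the CU counts and FLOPS, assuming
# standard GCN figures (64 lanes per CU, 2 FLOPs per lane per clock)
# and an assumed 800 MHz GPU clock for both consoles.

LANES_PER_CU = 64
FLOPS_PER_LANE = 2      # one fused multiply-add = 2 FLOPs
GPU_CLOCK_HZ = 800e6    # assumption: 800 MHz, not officially confirmed

xbox_ops_per_clock = 768                        # Microsoft's stated figure
xbox_cus = xbox_ops_per_clock // LANES_PER_CU   # deduced CU count
ps4_cus = 18

xbox_tflops = xbox_cus * LANES_PER_CU * FLOPS_PER_LANE * GPU_CLOCK_HZ / 1e12
ps4_tflops = ps4_cus * LANES_PER_CU * FLOPS_PER_LANE * GPU_CLOCK_HZ / 1e12

print(xbox_cus)                                           # 12
print(f"{xbox_tflops:.2f} vs {ps4_tflops:.2f} TFLOPS")    # 1.23 vs 1.84
print(f"PS4 advantage: {ps4_cus / xbox_cus - 1:.0%}")     # 50%
```

That 50% figure is exactly where the "50% faster GPU" claim in this thread comes from: it is the CU-count ratio, not a measured game benchmark.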

Geez, you really don't know how to read. I know about the compute units; that's not what I'm arguing.

You calling me ignorant is the same as you believing there are modifications, added parts, or extra logic. The only extra logic the Xbone has over the PS4 is the ESRAM and the move engines. While the move engines can take care of some of the missing GPGPU stuff, they don't make up for the 6 missing CUs.

That is opinion, not fact. Unless you have the silicon masks, everything is opinion until we get teardowns of the chips, or explanations of how things work inside from the guys who designed them. And what I find comical is you being adamant that there are no modifications to the Xbox One chip, while insisting Sony made modifications to the PS4 chip.

You keep repeating that you think it has "magic extra logic", but repetition doesn't make it fact either. Did you even watch the Microsoft Architectural Briefing?

I never claimed it's fact. I am saying it is possible, and giving reasons to back my opinion.

The first Xbox didn't have EDRAM, so your "they did it on the previous version" assumption is false.

The previous version to the Xbox One. I guess you're really reaching here to "get me".

Do you understand what an SoC is?

This should tell you that the ESRAM isn't external.

This doesn't tell you anything, and I do know what an SoC is. I also know buses exist, and the previous Xbox used a RAM chip/ASIC connected to the main CPU die externally through a dedicated bus, so there is precedent for this kind of thing. Again, this is called opinion; you are speaking like you designed the thing.

No I just read and listen to what Microsoft are saying and am not making assumptions based on what the previous console had.

Last comment from me :)

No I just read and listen to what Microsoft are saying and am not making assumptions based on what the previous console had.

This is all good. I agree that we can't assume that Xbox One's SoC would work the same way as the old chip(s) did.

The problem is that while you're not making assumptions based on previous console features, you're still drawing conclusions from the few details they shared! Did you consider that there may be details they DIDN'T share? Like how the ESRAM works, or whether they included some revolutionary optimization, etc. I'm sure they didn't reveal the whole chip design at the press event or at the post-show live event.

It's like a salesman revealing that the car has 1,000 hp. 1,000 hp doesn't help me much if he forgot to tell me that the car has no steering wheel. The point is: you don't know the full design/specifications, so it's wrong to draw conclusions about performance at this point. Both companies made custom modifications to their chips.

[image: 8BmZs7f.png]

ESRAM

Durango has no video memory (VRAM) in the traditional sense, but the GPU does contain 32 MB of fast embedded SRAM (ESRAM). ESRAM on Durango is free from many of the restrictions that affect EDRAM on Xbox 360. Durango supports the following scenarios:

  • Texturing from ESRAM
  • Rendering to surfaces in main RAM
  • Read back from render targets without performing a resolve (in certain cases)

The difference in throughput between ESRAM and main RAM is moderate: 102.4 GB/sec versus 68 GB/sec. The advantages of ESRAM are lower latency and lack of contention from other memory clients, for instance the CPU, I/O, and display output. Low latency is particularly important for sustaining peak performance of the color blocks (CBs) and depth blocks (DBs).
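Those two bandwidth figures are easy to sanity-check. Assuming the leaked specs (an 800 MHz GPU clock and a 256-bit DDR3-2133 main memory bus, neither officially confirmed), the quoted numbers fall out directly:

```python
# Sanity check of the quoted bandwidth figures, under assumed specs:
# 800 MHz GPU clock, 256-bit DDR3-2133 main memory bus.

GPU_CLOCK_HZ = 800e6   # assumption, not officially confirmed

# ESRAM: 102.4 GB/s at 800 MHz implies 128 bytes (1024 bits) per clock
esram_bytes_per_clock = 102.4e9 / GPU_CLOCK_HZ
print(esram_bytes_per_clock)   # 128.0

# Main RAM: DDR3-2133 on a 256-bit (32-byte) bus
ddr3_gbps = 2133e6 * 32 / 1e9
print(round(ddr3_gbps, 1))     # 68.3
```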

As I already stated, the ESRAM is there to negate the slow DDR3 RAM and the weaker GPU they are using to save money, so they can afford to bundle Kinect in every box.

The Durango GPU includes a number of fixed-function accelerators. Move engines are one of them.

Durango hardware has four move engines for fast direct memory access (DMA).

These accelerators are truly fixed-function, in the sense that their algorithms are embedded in hardware. They can usually be considered black boxes with no intermediate results that are visible to software. When used for their designed purpose, however, they can offload work from the rest of the system and obtain useful results at minimal cost.

Each move engine can read and write 256 bits of data per GPU clock cycle, which equates to a peak throughput of 25.6 GB/s both ways. Raw copy operations, as well as most forms of tiling and untiling, can occur at the peak rate. The four move engines share a single memory path, yielding a total maximum throughput for all the move engines that is the same as for a single move engine. The move engines share their bandwidth with other components of the GPU, for instance, video encode and decode, the command processor, and the display output. These other clients are generally only capable of consuming a small fraction of the shared bandwidth.

The advantage of the move engines lies in the fact that they can operate in parallel with computation. During times when the GPU is compute bound, move engine operations are effectively free. Even while the GPU is bandwidth bound, move engine operations may still be free if they use different pathways. For example, a move engine copy from RAM to RAM would not be impacted by a shader that only accesses ESRAM.
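The quoted throughput figure is internally consistent with an assumed (not officially confirmed) 800 MHz GPU clock: 256 bits is 32 bytes, and 32 bytes per clock at 800 MHz is exactly the stated 25.6 GB/s.

```python
# Checking the move-engine figure from the quote: 256 bits per GPU
# clock at an assumed 800 MHz clock equals the stated 25.6 GB/s peak.

GPU_CLOCK_HZ = 800e6   # assumption, not officially confirmed
bits_per_clock = 256

throughput_gbps = bits_per_clock / 8 * GPU_CLOCK_HZ / 1e9
print(throughput_gbps)   # 25.6
```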

As I already stated, the move engines are there purely to negate the slow memory and the bottlenecking. They help offload some work from the CPU/GPU, but they are in no way going to make up the difference against the 50% faster PS4 GPU.

And why do you all care about the hardware? The PS3 was better than the 360, but in my experience the multi-platform games were usually better on the 360, and they didn't s*ck even though the 360 was "holding the games back".

You came into a thread titled HARDWARE COMPARISON wondering why everyone is talking about the hardware. Not sure what you were expecting...

This thread is still going? lol.

No one will know which hardware is the "best" until the devices ship. No one. You can make assumptions based on historical trends, PC trends, and personal experience, but until you have actually used and compared these two devices, we're all just bitching and moaning about specs. Specs don't always translate to performance/value/fidelity/experience.

One of the most sought after cars today is a car that fails in so many specs and can barely drive two hours between charges, yet when you put it together as a package and see it for what it is, its value & functionality is simply incomparable to anything else out there.

Way too much assuming in here

Oh, it's you again. The guy who thought the only thing involved in rendering and its bandwidth needs is the size of the framebuffer and nothing else.

As I already stated, the ESRAM is there to negate the slow DDR3 RAM and the weaker GPU they are using to save money, so they can afford to bundle Kinect in every box.

I don't think Kinect is as costly as you think (parts wise). That can't be the only reason.

I don't think Kinect is as costly as you think (parts wise). That can't be the only reason.

Just because it isn't costly doesn't mean Microsoft won't jack up the price because of it. Look back at the 360: if you wanted one with an HDD, you had to pay $100 more than one without, despite the fact that a 250 GB HDD didn't cost anywhere near $100.

Just because it isn't costly doesn't mean Microsoft won't jack up the price because of it. Look back at the 360: if you wanted one with an HDD, you had to pay $100 more than one without, despite the fact that a 250 GB HDD didn't cost anywhere near $100.

We are talking material cost here, not retail pricing.

TheLegendofMart mentioned Microsoft saving on RAM to accommodate Kinect; I am saying Kinect's materials won't be as costly as they seem, so there has to be some other reason if they were trying to hit a price point.

In short, my post had nothing to do with overpriced accessories. Try to follow the conversation.

The raw processing power of these systems makes me cringe when I look at my desktop's raw power... devs are going to have to come up with some highly optimized code to make good use of them. Sure, they are faster in raw FLOPS than the 360 and PS3, but compared to an Intel Ivy Bridge i7 system with a good mid-range graphics card...

Would you not want fully optimized code to begin with?

I don't have time for this anymore. I've got better things to do, since I know WHY I'll buy a specific console instead of wondering what percentage of extra performance I may be getting.
The thread was never about why Graimer will buy a specific console in the first place. If you didn't want to discuss the performance differences between the consoles, why bother posting here?

It seems like the majority of people posting in these console comparison threads are only trying to justify their pre-determined choice of one or the other. Gee, be happy with your choice and let those who want to discuss objectively do so.

Could we maintain some level of maturity in these threads, hopefully a high one?

If this continues, warnings will be handed out, along with thread closure.

Pleeeeeeeease do not close this thread. I am enjoying the debates.

I know it's hard to restrain ourselves at times, but just try to drop the personal attacks.

Whoa, moderation in the gamer's hangout section!? That's a rarity.

Apparently it needs to become more frequent. ;)

Pleeeeeeeease do not close this thread. I am enjoying the debates.

I know it's hard to restrain ourselves at times, but just try to drop the personal attacks.

:shiftyninja:

Just stay within the confines of the community rules and it's all good. :)

I noticed some people keep comparing GDDR3 to GDDR5, but the Xbox One uses DDR3, NOT GDDR3 like the Xbox 360. GDDR3 is basically DDR2. DDR3 would be better for general-purpose computing, but if you're only doing games, GDDR5 is better. Basically, GDDR5 handles large amounts of data better than DDR3, while DDR3 handles small amounts of data better. Also, GDDR5 has horrible latency compared to DDR3, but for GPU-intensive tasks it shouldn't matter too much.
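To put rough numbers on that tradeoff, here's a peak-bandwidth comparison using the commonly cited (not officially confirmed) configurations: Xbox One with 256-bit DDR3-2133, PS4 with 256-bit GDDR5 at 5500 MT/s. Note this only captures bandwidth; the latency difference mentioned above doesn't show up in peak figures.

```python
# Rough peak-bandwidth comparison, assuming the commonly cited
# configurations: 256-bit DDR3-2133 (Xbox One main RAM) versus
# 256-bit GDDR5 at 5500 MT/s (PS4 unified RAM).

BUS_BYTES = 256 // 8   # 256-bit bus = 32 bytes per transfer

ddr3_gbps = 2133e6 * BUS_BYTES / 1e9    # Xbox One
gddr5_gbps = 5500e6 * BUS_BYTES / 1e9   # PS4

print(round(ddr3_gbps, 1))    # 68.3
print(round(gddr5_gbps, 1))   # 176.0
```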

This topic is now closed to further replies.