Substance Engine benchmark implies PS4 CPU is faster than Xbox One's



power consumption matters less than it does on portable devices.

That is true, but it still matters, since we are talking about closed systems whose cooling is built around a certain expected power draw.

Also, there are end users who watch this stuff very closely and do care. Some won't even use the 'standby' feature on the PS4/X1 due to the power draw.


To everyone here who's using this to draw a line at CPU performance between consoles, you're off your rocker. This is one use of one core on the same APUs, without any details of the benchmark being given. Is it simply flushing the data of the compressed textures after calculation, or storing it into RAM? Is it pulling assets from RAM to compress? We simply don't have this information, so it isn't correct to judge the CPUs based on this test alone.

 

Also, I very much doubt the "turbo boost" method is implemented on either console. You need a very predictable and stable environment on a console to ensure the correct timing of calculations within engines. Throw your CPU speed up suddenly and everything goes out of the window.

 

Definitely interesting, nevertheless.


This "texture creation" task sure sounds like something where memory bandwidth could make a difference. In which case, the CPUs may be the same, but the fact that the PS4 has DDR5 memory could make up for the difference.


This is likely from the PS4's greater memory bandwidth, as pointed out above. Any texture work relies heavily on bandwidth.

For games, developers have access to 6 of the 8 cores on both the XB1 and PS4. Someone said it was 5 for the XB1, but that is definitely wrong. Two cores are reserved for OS/apps on each console (see the sketch below).

Another thing: 10% of the GPU is reserved on the XB1 for Kinect/OS/snap mode. MS have said they will be freeing this up for developers in the future.
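On the 6-cores point, you can approximate that kind of split on a PC by pinning worker threads to a subset of cores. A rough Linux-only sketch; the core IDs and the idea of leaving cores 6-7 for the OS are assumptions for illustration, and the console SDKs expose this differently:

```c
#define _GNU_SOURCE
#include <pthread.h>
#include <sched.h>
#include <stdio.h>

/* Pin each worker thread to one of six "game" cores (0-5), leaving
   cores 6-7 notionally free for system software, the way the consoles
   reserve two cores for OS/apps. */
static void *worker(void *arg)
{
    int core = *(int *)arg;
    cpu_set_t set;
    CPU_ZERO(&set);
    CPU_SET(core, &set);
    pthread_setaffinity_np(pthread_self(), sizeof(set), &set);
    printf("worker pinned to core %d\n", core);
    return NULL;
}

int main(void)
{
    pthread_t threads[6];
    int cores[6] = {0, 1, 2, 3, 4, 5};

    for (int i = 0; i < 6; i++)
        pthread_create(&threads[i], NULL, worker, &cores[i]);
    for (int i = 0; i < 6; i++)
        pthread_join(threads[i], NULL);
    return 0;
}
```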


Also, I very much doubt the "turbo boost" method is implemented on either console. You need a very predictable and stable environment on a console to ensure the correct timing of calculations within engines. Throw your CPU speed up suddenly and everything goes out of the window.

Both Intel and AMD use it in PCs now. Software has sequences or timers which keep timings and calculations in order; a faster CPU clock frequency just allows instructions to execute sooner.

For example, if I upgrade my PC's CPU, does everything go out the window? No, the PC just runs a bit smoother and faster.
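That is essentially how game timing works in practice: the simulation is tied to the wall clock, not the CPU's frequency, so a faster clock just finishes each tick sooner. A minimal fixed-timestep sketch; the 60 Hz tick rate and the 2-second run are arbitrary choices:

```c
#include <stdio.h>
#include <time.h>

/* Fixed-timestep update loop: the simulation advances in 1/60 s ticks
   measured against a monotonic wall clock, so raising the CPU clock
   speed does not change game speed. */
static double now_seconds(void)
{
    struct timespec ts;
    clock_gettime(CLOCK_MONOTONIC, &ts);
    return ts.tv_sec + ts.tv_nsec / 1e9;
}

int main(void)
{
    const double tick = 1.0 / 60.0;  /* simulation step */
    double previous = now_seconds();
    double accumulator = 0.0;
    long ticks = 0;

    while (ticks < 120) {              /* run ~2 simulated seconds */
        double current = now_seconds();
        accumulator += current - previous;
        previous = current;

        while (accumulator >= tick) {  /* catch up in fixed steps */
            ticks++;                   /* update_game(tick) would go here */
            accumulator -= tick;
        }
    }
    printf("ran %ld fixed ticks\n", ticks);
    return 0;
}
```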


I'm pretty sure the CPU architectures are pretty much the same; Microsoft's has a faster clock speed, BUT Sony has one extra core for developers to use. Microsoft uses one for the OS, one for Kinect, and one for audio processing, giving game developers five to use. Sony uses one for audio and one for the OS as far as I'm aware. One extra core would explain this performance improvement.

Uh.. this is inaccurate.

The XBO reserves 2 cores and 3 GB of RAM for OS/apps. Both consoles have dedicated audio processors, and I don't see why they would need to reserve a CPU core for audio.


 

lol, interesting read.

 

Just in regard to the main post: it looks like lots of news sites are picking up the story now and posting on it:

 

http://www.cinemablend.com/games/PS4-CPU-More-Powerful-Than-Xbox-One-CPU-According-Benchmark-Test-61203.html

http://gamingbolt.com/substance-engine-increases-ps4-xbox-one-texture-generation-speed-to-14-mbs-12-mbs-respectively

http://n4g.com/news/1418699/developer-hints-at-ps4s-cpu-being-more-powerful-than-xbox-ones-cpu

 

http://www.extremetech.com/extreme/171375-reverse-engineered-ps4-apu-reveals-the-consoles-real-cpu-and-gpu-specs

Reverse engineering of the PS4 confirms it's a Jaguar CPU (same as the Xbox One) but also reveals the GPU has 20 compute units rather than the specified 18.

 

Also, regardless of whether the PS4's CPU is faster than the Xbox One's, I don't see the CPU being the bottleneck for game graphics. The PS4 also happens to have much better graphics specs, but the X1 has the HDMI pass-through and the NFL app.


DirectX Texture = DXT

 

If you're asking what games use DirectX APIs, then it's pretty much every game.

These results are for a single engine, not the whole console API.
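For context on what DXT actually produces: it encodes fixed 4x4 pixel blocks, so compressed sizes are simple arithmetic. A quick sketch using the standard DXT1/DXT5 block sizes; the 2048x2048 texture is just an example and nothing here is specific to Substance:

```c
#include <stdio.h>

/* DXT works on 4x4 pixel blocks: DXT1 stores each block in 8 bytes
   (4 bits/pixel), DXT5 in 16 bytes (8 bits/pixel). Compare with
   uncompressed RGBA8 at 4 bytes/pixel. */
static unsigned long dxt_size(unsigned w, unsigned h, unsigned block_bytes)
{
    unsigned long blocks_x = (w + 3) / 4;  /* round up to whole blocks */
    unsigned long blocks_y = (h + 3) / 4;
    return blocks_x * blocks_y * block_bytes;
}

int main(void)
{
    unsigned w = 2048, h = 2048;

    printf("RGBA8: %lu bytes\n", (unsigned long)w * h * 4);
    printf("DXT1:  %lu bytes\n", dxt_size(w, h, 8));   /* 8:1 vs RGBA8 */
    printf("DXT5:  %lu bytes\n", dxt_size(w, h, 16));  /* 4:1 vs RGBA8 */
    return 0;
}
```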


DirectX Texture = DXT

 

If you're asking what games use DirectX APIs, then it's pretty much every game.

 

Sorry, I got confused and thought it was a complete engine like UE and FB3.


http://www.extremetech.com/extreme/171375-reverse-engineered-ps4-apu-reveals-the-consoles-real-cpu-and-gpu-specs

Reverse engineering of the PS4 confirms it's a Jaguar CPU (same as the Xbox One) but also reveals the GPU has 20 compute units rather than the specified 18.

 

 

The PS4 isn't the only one with extra compute units. The X1 also has two more units than are specified.

 

The reason both have them is in case of defects during manufacturing.  Up to two units can be defective and the chip can still pass for production.

 

This happens in most chip production, especially with large chips like these. What that also means is that as yields improve, you could see a lot of PS4s and X1s where all of the compute units are fully functional, so they actually have more working units than the specs say, although neither system actually makes use of the spare ones.
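To put rough numbers on that: if each CU fails independently, the fraction of dies that still have at least 18 of 20 working CUs is a binomial sum. A sketch using an invented 5% per-CU defect rate, which is purely for illustration, not a real yield figure:

```c
#include <stdio.h>
#include <math.h>

/* Binomial coefficient via the multiplicative formula. */
static double choose(int n, int k)
{
    double result = 1.0;
    for (int i = 1; i <= k; i++)
        result *= (double)(n - k + i) / i;
    return result;
}

int main(void)
{
    const int total = 20;       /* CUs on the die */
    const int needed = 18;      /* CUs that must work for the chip to ship */
    const double defect = 0.05; /* assumed per-CU defect probability */

    /* P(at least `needed` good CUs) = sum over 0..2 defective units. */
    double pass = 0.0;
    for (int bad = 0; bad <= total - needed; bad++)
        pass += choose(total, bad) * pow(defect, bad)
                                   * pow(1.0 - defect, total - bad);

    printf("chips shippable with %d/%d CUs: %.1f%% of dies\n",
           needed, total, pass * 100.0);
    return 0;
}
```

With that assumed rate, roughly 92% of dies pass, versus only about 36% if all 20 CUs had to work, which is why the spare units pay for themselves.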


This happens in most chip production, especially with large chips like these. What that also means is that as yields improve, you could see a lot of PS4s and X1s where all of the compute units are fully functional, so they actually have more working units than the specs say, although neither system actually makes use of the spare ones.

Is that enabled via a software update, or does it need hardware adjustments to enable the extra CUs?


Is that enabled via a software update, or does it need hardware adjustments to enable the extra CUs?

 

They're probably disabled via hardware modifications, so it most likely isn't possible, and I'd certainly doubt that you'd be able to use them with the default system software even if you could.


Is that enabled via a software update, or does it need hardware adjustments to enable the extra CUs?

 

 

It depends on how it's manufactured, but I would say in this case it could be a software/firmware adjustment.

 

On the PC, there have been video cards 'hacked' to enable hardware features that were inaccessible for reasons like that. Those just required a custom firmware and the hope that the extra units were not in fact damaged.

 

I remember it being done for some AMD CPUs a long time ago, when they would have extra cache built onto the CPU in case of failure during manufacturing. Some risk-taking modders found a way to make it accessible.


It depends on how it's manufactured, but I would say in this case it could be a software/firmware adjustment.

On the PC, there have been video cards 'hacked' to enable hardware features that were inaccessible for reasons like that. Those just required a custom firmware and the hope that the extra units were not in fact damaged.

I remember it being done for some AMD CPUs a long time ago, when they would have extra cache built onto the CPU in case of failure during manufacturing. Some risk-taking modders found a way to make it accessible.

 

In the case of modern GPUs, the binning and ID are done via hardware settings. There have been physical modifications done on consumer cards to make them ID as workstation cards so that they can use the workstation drivers.

 

I feel that if this sort of thing were generally doable via firmware, we'd see a lot of software/firmware hacks around for Pentiums and Celerons to enable extra cores and HT.

 

Example (from consumer -> workstation)

http://www.tomshardware.com/news/Nvidia-GTX-690-Quadro-K5000,21656.html

