Will PS4 games really perform 50 percent better than Xbox One games?

The hardware battle between Sony's PlayStation 4 and Microsoft's Xbox One would seem, on paper at least, to be in Sony's favor. While both next-generation consoles use AMD processors, the PS4's chip has 1152 GPU cores capable of 1.84 teraflops, while the Xbox One's GPU has 768 cores that manage only 1.23 teraflops. On those numbers alone, the PS4 should outperform the Xbox One by 50 percent in GPU performance.
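That 50 percent figure falls straight out of the peak-FLOPS arithmetic. A rough sketch, assuming an 800 MHz shader clock for both parts and two floating-point operations per core per cycle (fused multiply-add) — the clock figure is an assumption based on commonly published specs, not something stated in the article:

```python
# Rough peak-FLOPS arithmetic for the two console GPUs.
# Assumes an 800 MHz shader clock for both parts and
# 2 FLOPs per core per cycle (fused multiply-add).
CLOCK_HZ = 800e6
FLOPS_PER_CORE_PER_CYCLE = 2

def peak_tflops(cores: int) -> float:
    return cores * FLOPS_PER_CORE_PER_CYCLE * CLOCK_HZ / 1e12

ps4 = peak_tflops(1152)               # ~1.84 TFLOPS
xb1 = peak_tflops(768)                # ~1.23 TFLOPS
advantage = (ps4 / xb1 - 1) * 100     # 50% with equal clocks
print(ps4, xb1, advantage)
```

With equal clocks the ratio reduces to the core counts, 1152/768 = 1.5, which is where the headline 50 percent comes from.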

That's the theory. In practice, the performance gap between the two systems on the same game may not be as large as the numbers suggest. Eurogamer reports that its Digital Foundry team ran an experiment, building PC rigs with hardware based on the same basic AMD architecture as the PS4 and Xbox One.

After the rigs were built, several PC gaming benchmarks were run on both systems. While the rig approximating the PS4's hardware generated higher frame rates than the Xbox One-like rig, the gains came nowhere near 50 percent: they ranged between 17.6 and 33 percent.
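The percentage advantages Digital Foundry quotes are just ratios of average frame rates. A quick sketch with placeholder fps numbers (the underlying benchmark figures are not reprinted in this article):

```python
# Percentage advantage of one rig over another from average fps.
# The fps values below are illustrative placeholders, not
# Digital Foundry's actual measurements.
def advantage_pct(fps_a: float, fps_b: float) -> float:
    return (fps_a / fps_b - 1) * 100

# e.g. a 60 fps result vs. a 45 fps result is a 33.3% lead
print(advantage_pct(60.0, 45.0))
```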

The team then ran the PC version of Crysis 3 on both rigs, since it is perhaps the closest thing to a next-generation console title currently available. Both rigs first ran the game at full 1080p resolution, and in that test the PS4-like rig managed only a 19.3 percent increase in frame rate over the Xbox One-like rig.

While the article admits it's still too early to draw firm conclusions, it does say that during the launch period the differences between PS4 games and their Xbox One counterparts will likely be minimal. The hardware gap between the consoles may only manifest itself in games around the third year of their lifecycles, as developers learn to exploit more of their capabilities.

Source: Eurogamer

122 Comments


Oh, and another little thing: maybe the X1 is able to achieve 60fps because it's not really 60 FRAMES per second. Since it's not loading the full frame, it's more like 100, 200, 300, 400 etc. TILES per second. So maybe there's food for thought!!

Blah blah blah... GDDR5, better bandwidth on PS4, big strong wow. It doesn't necessarily mean it'll be better. I'm not getting either console cus I don't like 'em, but I think what MS is going to be doing with tile-based rendering on 11.2 will defeat the PS4, or at least keep up with it...

Think of it like this... the PS4 has a 1m x 1m tile in the bathroom. The X1 has a 1m x 1m tile split into 100x 1cm tiles. If you crack the PS4's tile you've got to replace the whole thing, which takes a long time. If you crack one of the X1's, it only has to replace that 1cm tile... a really short time.

Now, AA and particle effects are a bloody killer. I remember playing BF3: when I got into the street with cop cars with flashing lights, my computer's gfx got annihilated and was pushing 2fps. The PS4 has to render the whole scene; the X1 only renders the lights and the extent to which the light reflections change the environment. Which means it doesn't need as much rendering power to do the same thing, and because these bits are smaller, maybe the eSRAM can speed these little changes through really fast.

That's how MS will get around it, in my opinion...
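The bathroom-tile analogy above is essentially dirty-tile rendering: split the frame into tiles and redraw only the tiles that changed. A minimal sketch of the idea, with arbitrary tile and frame sizes:

```python
# Dirty-tile update: instead of redrawing a whole frame when a few
# pixels change, mark only the tiles containing them as dirty and
# redraw just those. Tile and frame sizes here are arbitrary.
TILE = 8              # tile edge length in pixels
WIDTH, HEIGHT = 64, 64

def dirty_tiles(changed_pixels):
    """Return the set of (tile_x, tile_y) tiles that need redrawing."""
    return {(x // TILE, y // TILE) for x, y in changed_pixels}

# Two nearby pixel changes land in one tile; a distant one adds a second.
tiles = dirty_tiles([(3, 3), (5, 6), (60, 60)])
total_tiles = (WIDTH // TILE) * (HEIGHT // TILE)
print(len(tiles), "of", total_tiles, "tiles redrawn")  # 2 of 64
```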

It is similar, but Sony is still using a bunch of numbers to make their console look better.

The fact of it is that people have played the PS4 and were not impressed. Sub 30FPS for AAA titles shouldn't be happening.

At least the cloud is actually benefitting people at launch, with hosted servers for multiplayer. Just to name one benefit.

PS4 can do tiled rendering.

It's ridiculous the minute things people are hanging their hats on to avoid the giant specs lead PS4 has.

Why not talk about how amazing the Kinect is instead? I mean, it's the albatross that caused the X1 to be overpriced and underpowered, so you'd better hype it up.

How does the PS4 do tiled rendering? MS can hold back 11.2 from the PS4, or if they use OpenGL, does that do it as well?

Like I said, I ain't getting consoles cus I think they're crap, I'm just defending the Xbox a bit cus I like the different approach MS is going with.

Kind of like the Cell?

Ease of Development and sales will win. The Kinect, if it is what they say it is, will be in demand and people will want it.

Anything MS can do with their X1 software, Sony can do with PS4 software, with more hardware power behind it.

PS4 will be easier to develop for, no ESRAM to worry about.

I don't think Kinect is going to move X1s, I see it more as an albatross weighing it down.

startscreennope said,
Anything MS can do with their X1 software, Sony can do with PS4 software, with more hardware power behind it.

PS4 will be easier to develop for, no ESRAM to worry about.

I don't think Kinect is going to move X1s, I see it more as an albatross weighing it down.

You really have no idea what you're talking about, do you? It's getting laughable.

spenser.d said,

You really have no idea what you're talking about, do you. It's getting laughable.

You really have no argument, do you? It's getting laughable.

Sony is past the Krazy Ken Cell spec overinflation. MS on the other hand has handily picked up the mantle with the laughable "infinite power of the cloud" marketspeak.


quazl said,
It is similar, but Sony is still using a bunch of numbers to make their console look better. The fact of it is that people have played the PS4 and were not impressed. Sub 30FPS for AAA titles shouldn't be happening. At least the cloud is actually benefitting people at launch, with hosted servers for multiplayer. Just to name one benefit.
The vast majority of AAA games on PS3 and 360 were 30 FPS, sacrificing framerate for image quality. Expect the same next gen.

Only 33% faster - that's a lot!

DDR5 runs a lot faster than DDR3 - why do you think servers use DDR5? It can calculate a shed load faster than DDR3.

There is no such thing as DDR5. It's GDDR5 that the PS4 has, which is the graphics equivalent of DDR3 memory. GDDR5 isn't much faster than DDR3, especially when it comes to non-graphics things.

You need faster memory for transferring larger amounts of data; that is why GDDR5 is used, but DDR3 is better for transferring smaller amounts of data. If you notice, DDR3 video cards mainly have 64-bit memory buses or 128-bit (64x2), while GDDR5 cards have 256-bit or 384-bit memory buses.

nitroxhotshot said,
You need faster memory for transferring larger amounts of data, that is why Gddr5 is used, but DDR3 is better for transferring smaller amounts of data. If you notice, DDR3 video cards have mainly 64bit memory bandwidth or 128 (64x2), while gddr5 cards have 256 or 384 bit memory bandwidths.

That's all well and good if you want to brute-force large textures down the GPU pipeline. But MS has already demonstrated how you can take advantage of large, high-quality textures in games without needing a massive allocation of textures and high-bandwidth memory. Look at the new tiled resources feature in DX 11.2: it's made specifically to allow high-quality textures, or in this case parts of them, to be used when needed, so you don't have to load the whole thing up and brute-force it through the GPU.

If this feature is taken advantage of the way it's supposed to be by developers, then the bandwidth differences between the XB1 and PS4 aren't that important anymore. Where the PS4 has to load and push everything through its GPU, its faster GDDR5 helps with all the large textures/data etc. The XB1, meanwhile, can use just the bits it needs, freeing up the GPU and memory for other tasks without having to worry about how fast it can work through the large textures being rendered on screen.
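Conceptually, tiled resources (also known as partially resident textures) mean committing memory only for the texture tiles a frame actually samples. Here is a toy sketch of that bookkeeping — not the real DX 11.2 API, and the tile size and texture dimensions are made up:

```python
# Toy model of partially resident textures: a huge virtual texture is
# split into fixed-size tiles, and only the tiles actually sampled get
# backing memory. Not the real DX 11.2 API, just the idea behind it.
TILE = 128                      # tile edge in texels (made up)
TEX_W, TEX_H = 16384, 16384     # a big virtual texture

resident = {}                   # (tile_x, tile_y) -> tile data

def load_tile(tx, ty):
    return f"tile({tx},{ty})"   # stand-in for a disk/RAM fetch

def sample(u: float, v: float):
    """Map a UV coordinate to its tile, faulting the tile in if needed."""
    tx = int(u * TEX_W) // TILE
    ty = int(v * TEX_H) // TILE
    if (tx, ty) not in resident:
        resident[(tx, ty)] = load_tile(tx, ty)   # stream just this tile
    return resident[(tx, ty)]

sample(0.1, 0.1); sample(0.1, 0.1); sample(0.9, 0.9)
total = (TEX_W // TILE) * (TEX_H // TILE)
print(len(resident), "of", total, "tiles resident")  # 2 of 16384
```

Only two tiles of the 16384-tile texture ever get loaded, which is the whole point of the technique: memory and bandwidth scale with what is visible, not with the texture's total size.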

mnl1121 said,
There is no such thing as DDR5. Its GDDR5 that the PS4 has which is the graphics equivalent to the DDR3 memory. GDDR5 isn't much faster than DDR3, especially when it comes to non-graphics things.
Sure, 1600MHz DDR3 is almost as fast as 5500MHz GDDR5!

startscreennope said,
PS4 can do tiled resources, it's a software feature, it can be coded in.

Actually it can't, at least not in a way to speed up graphics.

Windows NT is the ONLY OS that has a kernel level GPU scheduler. This is what allows the new DX 11.2 tiled resource to work and be 'fast'.

Even Windows 7's WDDM/WDM technology can't do the tiled resources of DX11.2, and it is only one step back in the progression of the WDDM/WDM technologies.

Since only NT is capable of this GPU control and granularity at the kernel level, that means the FreeBSD-based OS the PS4 is using cannot do this. Even if tiled resources are coded into the gaming framework, it doesn't mean it will be fast enough to be an advantage over plain resource swapping.

***
When Vista was in development a lot of OS engineers like myself found that the GPU scheduling technology of the WDDM would be interesting and wondered if any other OS architecture would try to add this functionality.

Unfortunately, the rest of the world didn't realize what Microsoft was doing for a long time, and even now it would take such radical breaks in the kernel designs of Linux and OS X that implementing this technology won't happen anytime soon. (This is one way the object model of NT demonstrated it is highly extensible and flexible, as these changes were not hard for NT.)

So here we are with Windows having a lot of advantages and nothing else with kernel level GPU management, and it keeps stinging.

So the PS4, with FreeBSD and OpenGL, should just give up on tiled resources (also called PRT, a feature of the AMD GCN APUs found in both the PS4 and X1) because it doesn't have a "kernel level GPU scheduler", so even if they try to implement it, it won't be in a way that "speeds up graphics". Grade-A FUD and grasping at straws right there, seeing as the feature is already implemented and running in PS4 game engines.

But I suppose hair splitting and FUD about "tiled resources" is all that's left in the face of such an obvious hardware advantage.

In fact, doing a google search for "kernel level GPU scheduler" just leads to more FUD posts by you. LOL

This quote about the Xbox ESRAM is where it gets interesting.

"DICE's BF3 reflects the parity in compute power, offering virtually identical performance, suggesting that sans MSAA, the tech isn't hugely reliant on bandwidth. Transplanting those findings across to the next-gen consoles, developers for the Microsoft console have their work cut-out in utilising the DDR3 and ESRAM effectively in matching the sheer throughput of the PS4's memory bus. Getting good performance from the ESRAM is key in ensuring that Xbox One is competitive with the PS4."

Considering AA can be done for 'free' on ESRAM, this levels the playing field more than people are expecting.
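A quick size check shows why AA is the pressure point for the Xbox One's 32 MB of ESRAM: a 1080p 32-bit color target fits easily, but 4x MSAA multiplies the storage. A rough sketch — the layout is simplified (color only, no depth buffer or compression), so treat these as ballpark figures:

```python
# Rough render-target sizes vs. the Xbox One's 32 MB of ESRAM.
# Simplified to 4 bytes/pixel of color with no depth buffer or
# compression, so these are ballpark figures only.
ESRAM_MB = 32

def target_mb(width, height, bytes_per_pixel=4, msaa=1):
    return width * height * bytes_per_pixel * msaa / (1024 ** 2)

plain_1080p  = target_mb(1920, 1080)          # ~7.9 MB, fits easily
msaa4x_1080p = target_mb(1920, 1080, msaa=4)  # ~31.6 MB, nearly fills it
print(plain_1080p, msaa4x_1080p)
```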

shao said,
This quote about the Xbox ESRAM is where it gets interesting.

"DICE's BF3 reflects the parity in compute power, offering virtually identical performance, suggesting that sans MSAA, the tech isn't hugely reliant on bandwidth. Transplanting those findings across to the next-gen consoles, developers for the Microsoft console have their work cut-out in utilising the DDR3 and ESRAM effectively in matching the sheer throughput of the PS4's memory bus. Getting good performance from the ESRAM is key in ensuring that Xbox One is competitive with the PS4."

Considering AA can be done for 'free' on ESRAM this levels the playing field more than people are expecting.

Developers should be fine with taking advantage of the ESRAM in the XB1, since the 360 had its own, though smaller, slice of embedded memory in the GPU that was also used for free 4xAA. I don't think it'll be hard for them to take advantage of, and I bet MS will make it pretty easy with the tools they give them.

Maybe the next-gen consoles are entirely based on PC architecture! So developers don't need to worry about how to develop a game for a different architecture like the Cell. I just wish we could easily change the console graphics card; then we could stop arguing about which console is better, which is a waste of time.

I love both my PS3 and my Xbox 360, but the PS3 out-performed the 360 on any game that was made for both, hands down. So based on experience, I'm assuming the PS4 will be the same way, but I don't particularly care which has more power. I'm getting the PS4 first and I'll get an Xbox One in a year or two.

mattw891 said,
I love both my PS3 and my Xbox 360, but the PS3 out-performed the 360 on any game that was made for both, hands down. So based on experience, I'm assuming the PS4 will be the same way, but I don't particularly care which has more power. I'm getting the PS4 first and I'll get an Xbox One in a year or two.

Really? You ever check out professional reviews? Maybe a site like Lens of Truth or other sites that compare FPS and image quality?

There were very few games where the PS3 and Xbox 360 were close to equal, with the Xbox 360 almost always having higher-quality images.

The PS3's pre-rendered cut scenes are usually better, but that is just playing a video, and was possible because of the extra space on the Blu-ray.

Better hardware specs on paper mean nothing. IT professionals should know better, but hey, nothing written in here looks professional, so we're good.

BinaryDevotee said,
Better hardware specs on paper means nothing, IT professionals should know better but hey, nothing written in here looks professional, so we're good.

For video games (at the same quality), fps counts. In this case, the PS4 is about 20% superior.

BinaryDevotee said,
Better hardware specs on paper means nothing, IT professionals should know better but hey, nothing written in here looks professional, so we're good.
They are using nearly identical architectures, the main difference being RAM/ESRAM and GPU power. It's very easy to make a comparison in this case.

Brony said,

For videogames (and for the same quality), fps counts. In this case, ps4 is about 20% superior.

In this case? This case isn't actual hardware, so it means absolutely nothing. It's like drinking milk and judging the cheese it will make.

18 CUs don't matter when your GDDR5 is stalling your CPU, audio and I/O, making those extra CUs completely useless. There's a reason it isn't used in PCs outside of VRAM: DDR3 is faster at smaller data transfers and has less latency. There needs to be the right balance for hardware to perform optimally. There's also a reason most games running on PS4 hardware at E3 were running sub-30fps with horrible image quality, while Xbox One games on real hardware were running at 1080p 60fps.

Steve121178 said,
A lot of Xbox One games were running on PC's. Here is Microsoft confirming this:

http://www.cinemablend.com/gam...box-One-Dev-Kits-56834.html

A lot of my friends covering E3 confirmed this also.

As you would expect, a lot of games at E3 were 'works in progress' and thus not fully optimised. At no point were any Xbox One or PS4 games finished!


That was confirmed to only have been for one specific game. The rest were running on Xbox One architecture and were very much outperforming the PS4. Get your facts straight.

Steve121178 said,
A lot of Xbox One games were running on PC's. Here is Microsoft confirming this:

http://www.cinemablend.com/gam...box-One-Dev-Kits-56834.html

A lot of my friends covering E3 confirmed this also.

As you would expect, a lot of games at E3 were 'work in progress' thus not fully optimised. At no point where any Xbox One or PS4 games finished!

It's funny that you say a lot of games were "confirmed" to run on Windows 7 PCs, yet in the very link you reference, the only evidence they cite is LocoCycle, and they even reference Exophase, who played a handful of games and confirmed they were running on Xbox One hardware.

http://www.exophase.com/60348/...story-about-xbox-one-demos/

And this is what the LocoCycle dev says:


@PNF4LYFE @XboxP3 @Edwardmiles268 @DamonBaird1992 LocoCycle running on PC was my decision. We're still working on making it awesome.

Now, more evidence that most of the games were running on Xbox One hardware: here's a Digital Foundry article from a few days after their initial observations.


When it comes to the state of software development on PS4, the situation as it stands is surprising. On the one hand, freely playable first-party titles such as Knack and DriveClub suffer from noticeable frame-rate stutters down from 30fps, while on the other, "hands off" demos for the new Infamous and Assassin's Creed games appear to run without a perceptible hitch. This is in stark contrast to the playable software confirmed to be running direct from Xbox One hardware, such as Forza Motorsport 5 and Killer Instinct, which benefit to no end for targeting the 1080p60 gold standard, and largely succeed in doing so.

http://www.eurogamer.net/artic...hands-on-with-playstation-4

And it doesn't matter whether games are finished or not; it clearly shows the difference in platforms. If PS4 games are still not hitting 30fps at the exact same point in time as the Xbox One, which already has games running at 60fps 1080p, what does that tell you? PS4 games can still be refined just to hit their target of only 30fps, but Xbox One games are already hitting their target fps and will be refined even further. This is a big deal and can't be disputed.

vcfan said,
if ps4 games are still not hitting 30fps,while at the exact same place in time as xbox one,that has games running already at 60fps 1080p,what does that tell you? so ps4 games can still be refined just to hit their target fps of only 30fps, but xbox one games are already hitting their target fps,and will be even more refined. this is a big deal,and cant be disputed.
It means developers have different framerate targets and priorities on framerate or image quality. It also means you're using irrelevant info as a red herring to imply PS4's hardware advantage is somehow false.

vcfan said,
18 CU doesn't matter when your GDDR5 is stalling your cpu,audio and I/o thus making those extra CU completely useless. Theres a reason it isn't used in PCs besides in vram,because DDR3 is faster at smaller data transfers,and has less latency. There needs to be the right balance for hardware to perform optimally. Theres also a reason most games running on ps4 hardware at e3 were running sub 30fps with horrible image quality,while xbox one games on real hardware were running at 1080p 60fps.
Delusional "GDDR5 latency" nonsense that has been debunked over and over, combined with red herring "implications" about developer framerate targets. Stop by a serious gaming console forum like neogaf and post that, you'll get laughed at, torn apart, then banned.

lol at neogaf. So that's where electrical engineers hang out? Oh dear.

And rightttt... developers are intentionally targeting ****ty framerates. Uh huh. It's all part of the plan. LMAO. Nice excuse.

Edited by vcfan, Jul 30 2013, 5:42pm :

GDDR5 latency is not an issue.

30FPS is not a ****ty framerate, it's a common standard for AAA console games that sacrifice framerate for image quality.

Yes please post on neogaf so I can laugh when you get banned for spreading long debunked nonsense.

GDDR5 is only good for GPUs because of their parallel processing nature. CPUs work in a totally different way: they execute instructions linearly, and even with out-of-order execution, certain memory operations need the RAM right away. Conversation over.

30fps is a common standard... for last gen. The point is, if the PS4 hardware is so much more powerful, then why is the Xbox One running circles around it in performance? Because your precious GDDR5 is handicapping other areas of the system, which in turn handicaps the GPU. I think developers recently figured that out, and that's why they're having trouble even hitting 30fps, while Xbox One developers are already hitting 60fps with ease.

And I don't care what some gaming geeks on NeoGAF think. They play video games; they aren't electrical engineers. Why would I waste my time debating technical information on such a forum? I could tell you to post your little theories on engineering forums, and see who laughs at who.

GDDR5 latency is a complete red herring non-issue, grasping at straws at its best.
http://www.eurogamer.net/artic...ace-to-face-with-mark-cerny

"Latency in GDDR5 isn't particularly higher than the latency in DDR3. Also, GPUs are designed to be extraordinarily latency tolerant so I can't imagine that being much of a factor." In before ad hominem attacks on Cerny.

Xbox One is not running circles around PS4. 60 FPS games on X1 will have worse image quality than 60 FPS games on PS4. The 30 FPS games on PS4 have much higher IQ per frame. It's a developer decision to go with 30 or 60 FPS.

Trust me they take console wars seriously. You'd get laughed at, maybe corrected if they were feeling nice, then banned.

Ask yourself this: Intel and AMD are making these fast chips, so why are they using DDR3, and now moving to DDR4, instead of just using GDDR5? Because it's crap for CPU usage. The only proof you provided is a quote from Mark Cerny, who's deflecting the question by only speaking about the GPU, which is something I already said doesn't get affected by latency. You obviously don't know how these things work, and are just getting me random quotes. A non-technical quote from anybody means garbage. You had the nerve to say grasping at straws?

They will have worse image quality on X1? That's funny, because everyone is saying how amazing stuff like Forza and Titanfall is, yet they're saying a lot of PS4 games look like crap.

DriveClub

Unfortunately, this higher resolution only amplifies the low quality, blurry, flat-looking textures used across this level, which would easily look at home on current-gen hardware. It's also a shame that, while the scenery draw distance is broad, there's an incredible amount of pop-in for trees and waving NPCs as we approach at high speeds.

Knack


The Pixar aesthetic is let down by some muddy image quality, and heavily dithered shadows. We're promised 1080p native resolution here, but Knack doesn't look as crystal clear as we'd expect from such a pixel count - perhaps in part owing to the HDTV settings being used at the exhibition. It's a real disappointment on the grounds of image quality, and while the transparency effect on Knack and the big, beautiful ocean view during the first stage are visual treats, there isn't a whole lot to the rest of stages shown.

Thief


When it comes to performance, the game is v-synced, but very jittery in the frame-rate stakes for transitions into new areas, and throughout an entire sequence where a bridge burns to cinders. It also appears to run at native 1080p, though there's little being achieved here visually which we haven't already seen before

http://www.eurogamer.net/artic...hands-on-with-playstation-4

lol, maybe you should stop going to NeoGAF and believing all the BS they're feeding you, because it's obviously all wrong.

BTW, it's obvious which banned member you are.

Edited by vcfan, Jul 30 2013, 11:20pm :

More Cerny to educate you on the PS4's lack of bandwidth bottlenecks. Sorry, all the PS4 tech specs have been laid on the table, they are there for you to either understand or shout ignorant, false FUD about.

Once again, "I think PS4 launch games look bad therefore it has worse specs than Xbox One" is the most embarrassing red herring in the fanboy stable. You don't come right out and say "this means PS4 is inferior" because the facts are right in your face and you can't deny those, so you dance around with BS implications based on ignorance of software development and framerate/IQ targets.

Anyway, on to Cerny, or just use Google and learn a thing or two:

Oh, and I'll laugh if you declare that Cerny, the architect behind the PS4 and a certified genius, is somehow wrong and your blustering fanboy self knows the deep dark secrets of PS4 bandwidth bottlenecks. Even though, you know, devs have claimed the PS4 was designed specifically to avoid bottlenecks.

https://www.google.ca/search?q=gddr5+latency+ps4

"Just as an example…when the CPU and GPU exchange information in a generic PC, the CPU inputs information, and the GPU needs to read the information and clear the cache, initially. When returning the results, the GPU needs to clear the cache, then return the result to the CPU. We've created a cache bypass. The GPU can return the result using this bypass directly. By using this design, we can send data directly from the main memory to the GPU shader core. Essentially, we can bypass the GPU L1 and L2 cache. Of course, this isn't just for data read, but also for write. Because of this, we have an extremely high bandwidth of 10GB/sec.

Also, we've also added a little tag to the L2 cache. We call this the VOLATILE tag. We are able to control data in the cache based on whether the data is marked with VOLATILE or not. If this tag is used, this data can be written directly to the memory. As a result, the entirety of the cache can be used efficiently for graphics processing.

This function allows for harmonization of graphics processing and computing, and allows for efficient function of both. Essentially "Harmony" in Japanese. We're trying to replicate the SPU Runtime System (SPURS) of the PS3 by heavily customizing the cache and bus. SPURS is designed to virtualize and independently manage SPU resources. For the PS4 hardware, the GPU can also be used in an analogous manner as x86-64 to use resources at various levels. This idea has 8 pipes and each pipe(?) has 8 computation queues. Each queue can execute things such as physics computation middleware, and other proprietarily designed workflows. This, while simultaneously handling graphics processing."

You have absolutely no idea what that even means, and you're just pasting random crap. This has nothing to do with the CPU accessing the RAM itself, or the I/O, or the audio; this is talking about the CPU and GPU communicating over a cacheless bus. You can look up what a software engineer says all you want (Mark Cerny), but the fact is, you're not going to disprove physics. Please educate yourself. I suggest you start learning computer architecture instead of parsing Mark Cerny marketing interviews.

vcfan said,
you have absolutely no idea what that even means,and youre just pasting random crap. this has nothing to do with the cpu accessing the ram itself,or the i/o, or the audio. this is talking about the cpu and gpu communicating with a cacheless bus. you can go look up what a software engineer says all you want(mark cerny )but the fact is, you're not going to disprove physics. please educate yourself. I suggest you start learning computer architecture instead of parsing mark cerny marketing interviews.
LOL. Here we go with the personal attacks. Don't discuss anything Cerny said, just throw around wild accusations. Right, the PS4's performance will be utterly crippled because the CPU-to-RAM and audio (LOL) bandwidth is a tiny trickle, and all the devs working on PS4 couldn't figure it out.

You genius you, you've solved the mystery - and of course, personal attacks on Mark Cerny as a "marketer" - so are you accusing him of distorting the truth/lying? Oh boy, the rabbit hole goes deep when it comes to PS4 FUD.

I accept your admission of defeat by way of personal attacks instead of coherent points regarding PS4 bandwidth "bottlenecks" that don't actually exist except in your head.

Ok, here's your next test: Accuse ALL these people of lying about PS4 bottlenecks. Go.

http://www.ps4.sx/2013/07/ps4-...earned-at-develop-2013.html
http://n4g.com/news/1227577/ps...s-claims-killzone-developer
http://www.gamersnexus.net/gui...xplained-implications-on-pc

Edited by startscreennope, Jul 31 2013, 2:03pm :

I only read the article with the Killzone developer. If he is being honest, then he is a ****ing idiot. But he isn't being honest; he's smart. He knows people like you will believe anything, and case in point, you're posting these worthless articles that mean absolutely nothing. Have you learned nothing from the PS3 days, when they were pumping out the same marketing stuff that you guys ate up? How did that turn out? You still refuse to go learn the difference between the memories, and only keep posting more stupid marketing stuff. Until you do, I won't waste my time schooling you any longer. Go read about memory, bandwidth and architecture, come back and make a real argument, then we can debate. Until then, I'm not interested in reading more marketing mumbo jumbo from software developers.

LOL.

Irrelevant PS3 FUD - check.
Refusing to read - check.
Insulting developers/engineers/other people smarter than you - check.

You've said nothing about these magical, mysterious PS4 bottlenecks and linked to nothing.

Once again I accept your utter defeat as you continue to flounder, insult people smarter than you, and drop FUD instead of making any sort of technical point regarding PS4 hardware.

I already told you about the bottlenecks, but you were too busy looking up Mark Cerny quotes. Just look at DDR3 timings, compare them to GDDR5 timings, and understand what that means. GPUs are sort of immune to these latencies because, like I said, they work in a parallel nature: if they are waiting too long for data, they can do something else until the data is ready to be read. Everything else can't, so everything else will suffer from the wider timings of GDDR5, and if everything else is suffering, that in turn will cause delays in sending jobs to the GPU. Therefore, even though the GPU has this power, a lot of it is wasted waiting for other parts of the system to feed it jobs.

CPUs work in a linear fashion: they execute one instruction, then move to the next, and so on. They also work on smaller data sets, so latency is more important than bandwidth. Look at CPU benchmarks and compare the bandwidth of two different speeds; after a certain point, it doesn't matter how much you increase bandwidth, there is zero difference in performance, whereas latency will kill your performance pretty quickly.

You have no technical knowledge to even know what that means; that's why I told you to read up on how RAM works. Then I will gladly debate with you, if you have any objections to these findings.

And how am I insulting them? I'm calling them what they are: software developers. Are they not? They are not hardware engineers. They write software; they don't create silicon. Can you reference Mark Cerny's experience building hardware in the past?

Maybe you should just stick to NeoGAF. You fit right in.
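For what it's worth, the latency-tolerance point being argued back and forth here can be shown with a toy cycle model: a GPU-style core that switches between many threads hides a long memory latency, while a single-context core stalls for the full duration. All cycle counts below are invented purely for illustration:

```python
# Toy model of latency hiding. Each item needs one memory load
# (MEM_LATENCY cycles) plus WORK cycles of compute. A single-context
# core stalls on every load; a many-threaded core overlaps the loads.
# All cycle counts are invented for illustration.
MEM_LATENCY = 100   # cycles per memory access
WORK = 10           # compute cycles per item
ITEMS = 8           # items, each needing one load + compute

def single_context_cycles():
    # Stall for the full latency before every item's compute.
    return ITEMS * (MEM_LATENCY + WORK)

def multithreaded_cycles(threads=8):
    # With enough threads, loads overlap: after the first load
    # arrives, compute on one item while the others are in flight.
    assert threads >= ITEMS
    return MEM_LATENCY + ITEMS * WORK

print(single_context_cycles())   # 880 cycles, mostly stalled
print(multithreaded_cycles())    # 180 cycles, latency mostly hidden
```

This is the standard argument for why GPUs tolerate high-latency memory and latency-sensitive serial code does not; it takes no side on how much that matters for the PS4's CPU in practice.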

Between the two, the PS4 appears to be the developers' console of choice. Take Watch Dogs, for example: the lead platform is the PC, and then the PS4 seems to be getting a lot of love.

The fact that the PS4 is more powerful on paper is just another significant string to Sony's bow. Plus the PS4 was designed as a games console, not a jack-of-all-trades entertainment hub which is primarily aimed at the US audience like the Xbox One.

Microsoft have got it disastrously wrong so far.

Steve121178 said,
Between the two, the PS4 appears to be the developers console of choice. Take Watch Dogs for example, the lead platform is the PC and then the PS4 seems to be getting a lot of love.

The fact that the PS4 is more powerful on paper is just another significant string to Sony's bow. Plus the PS4 was designed as a games console, not a jack-of-all-trades entertainment hub which is primarily aimed at the US audience like the Xbox One.

Microsoft have got it disastrously wrong so far.

http://gengame.net/2013/07/xbo...dogs-city-ai-more-dynamism/

That's interesting, because it looks to me like they're giving X1 plenty of love.

The Xbox One is also a games console first. Not sure how that is so hard to understand.

Xbox One is Microsoft's attempt at being your "go to" box for all your entertainment needs, be that games, movies or any other form of media.

The PS4 was designed as a games machine. The PS4 has only been about games. During conferences Sony talked about nothing but games and gaming services. Microsoft on the other hand banged on about everything but games during their disastrous 'unveiling' conference. Microsoft then went from one PR disaster to another, infuriating gamers and the gaming press along the way.

Check out the cover of the highly respected Edge magazine:

http://i2.cdnds.net/13/27/618x...gaming-edgemagazine-ps3.jpg

And how do you feel about Microsoft's backtracking? They have bowed to pressure to reverse some of that nonsense but ultimately you can see their vision for how they expect you to play games. They want to offer you nothing more than a glorified licensing service where you never really own the game, just the license to play a game. It's all so very wrong.

Steve121178 said,
Xbox One is Microsoft's attempt at being your "go to" box for all your entertainment needs, be that games, movies or any other form of media.

The PS4 was designed as a games machine. The PS4 has only been about games. During conferences Sony talked about nothing but games and gaming services. Microsoft on the other hand banged on about everything but games during their disastrous 'unveiling' conference. Microsoft then went from one PR disaster to another, infuriating gamers and the gaming press along the way.

Check out the cover of the highly respected Edge magazine:

http://i2.cdnds.net/13/27/618x...gaming-edgemagazine-ps3.jpg

And how do you feel about Microsoft's backtracking? They have bowed to pressure to reverse some of that nonsense but ultimately you can see their vision for how they expect you to play games. They want to offer you nothing more than a glorified licensing service where you never really own the game, just the license to play a game. It's all so very wrong.

I wish they'd bring it back. The benefits to moving to digital were immense. People just got their panties in a bunch because they're afraid of DRM.

Again though, it's a games machine first. If you don't think Microsoft is concentrating on games as much as Sony, you're fooling yourself. They also have a bunch of extra features which I am most definitely looking forward to.

spenser.d said,
I wish they'd bring it back. The benefits to moving to digital were immense. People just got their panties in a bunch because they're afraid of DRM.
You just wish MS would send the Xbox One out to die?

Why are MS's most blind defenders the ones whose suggestions would cause it to fail the hardest?

startscreennope said,
You just wish MS would send the Xbox One out to die?

Why are MS's most blind defenders the ones whose suggestions would cause it to fail the hardest?

I'm not a blind defender. You're a blind detractor - disregarding any benefits for the sake of crying DRM. I see the benefits and they'd move the industry forward.

spenser.d said,

I'm not a blind defender. You're a blind detractor - disregarding any benefits for the sake of crying DRM. I see the benefits and they'd move the industry forward.

Let me guess, you're going to talk about the imaginary (never fully explained) "family sharing" next. There are no benefits to DRM.

Fanboys always start preaching about "forcing the industry to move forward", the same nonsense as with the Start screen, Surface, and Windows phones. All overpriced, underpowered, lacking in features, and market failures. You basically want MS to bang its face into a business-failure brick wall until there's nothing left but a bankrupt mess.

You really want MS to fail don't you? You're like the perfect Manchurian candidate for MS's competitors, telling MS to shoot itself in the feet again and again.

spenser.d said,

I'm not a blind defender. You're a blind detractor - disregarding any benefits for the sake of crying DRM. I see the benefits and they'd move the industry forward.


Even if you personally want the media sharing and original DRM structure of the X1, the vast majority of gamers wouldn't.
And where would they go? PS4 obviously.

But think of this, if the vast majority of gamers is on PS4, where will the game development go to?

The XBO could easily end up as an overpowered Android TV box.

Steve121178 said,
Xbox One is Microsoft's attempt at being your "go to" box for all your entertainment needs, be that games, movies or any other form of media.

The PS4 was designed as a games machine. The PS4 has only been about games. During conferences Sony talked about nothing but games and gaming services. Microsoft on the other hand banged on about everything but games during their disastrous 'unveiling' conference. Microsoft then went from one PR disaster to another, infuriating gamers and the gaming press along the way....

Being able to do more than one thing does not make a console's gaming performance worse.

You are using flawed logic.

Example: Windows 8 is a multi-purpose OS where gaming is NOT the #1 priority, yet it is the FASTEST OS for gaming.

Microsoft is in the business of kernels, compilers, OS design, and hardware design. Creating OS technology, drivers, and the gaming platform all on their own. They have a good chance of integrating these into a console to get the most performance out of it, no matter what other features they provide to the end user.

Sony is using a borrowed OS, a borrowed set of graphics frameworks, and a borrowed set of development technologies.


Let's use your logic for a moment...

You are correct that a Ferrari will usually outperform a Cadillac, as that is what a sports car is designed to do.

However, that doesn't mean a company can't build a Cadillac that is designed to be a luxury car, carry lots of groceries and people, and also be insanely fast.

Check out this video: http://www.youtube.com/watch?v=1sUjOiCj-bQ

(Skip to the 6:30 mark if you want to see the results.)

And not only is the Cadillac faster; it's the Cadillac station wagon that beats the Ferrari.

(I can't remember if the video also shows it, but the Cadillac even out-corners the Ferrari, the other 'staple' of what a sports car is supposed to do better.)

So would you rather have the sports-car-designed Ferrari, or be OK with the faster car that does everything?

Edited by Mobius Enigma, Jul 30 2013, 7:46pm :

Mobius Enigma said,
Being able to do more than one thing does not make a console's gaming performance worse.

You are using flawed logic.

Example: Windows 8 is a multi-purpose OS where gaming is NOT the #1 priority, yet it is the FASTEST OS for gaming.

Microsoft is in the business of kernels, compilers, OS design, and hardware design. Creating OS technology, drivers, and the gaming platform all on their own. They have a good chance of integrating these into a console to get the most performance out of it, no matter what other features they provide to the end user.

Sony is using a borrowed OS, a borrowed set of graphics frameworks, and a borrowed set of development technologies.

It is true in X1's case. They sacrificed a huge amount of silicon budget for ESRAM to make room for 8GB of DDR3 when they should have just gone with GDDR5, leaving them with a less powerful GPU and likely yield issues. They put a huge amount of their budget and R&D into Kinect, raising the price of the console.

Windows 8 is the fastest OS for gaming? Explain Left 4 Dead 2's hasty Linux port running faster than Windows on the same hardware. Windows is the most used OS for gaming, not necessarily the fastest.

Both Sony and MS will write software to maximize their console's performance, sure. It will be hardware specs that make the difference, where PS4 has the advantage.
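For what it's worth, the raw bandwidth side of the DDR3-vs-GDDR5 trade-off is simple arithmetic. The transfer rates below (5500MT/s GDDR5, 2133MT/s DDR3, both on 256-bit buses) are the widely reported launch specs, not official confirmations, so treat them as assumptions:

```python
# Peak bandwidth = transfer rate x bus width in bytes; figures are the
# commonly reported (unconfirmed) specs for each console's main RAM pool.
def peak_gbs(mt_per_s, bus_bits):
    return mt_per_s * (bus_bits / 8) / 1000  # GB/s

ps4_gddr5 = peak_gbs(5500, 256)  # 176.0 GB/s
x1_ddr3 = peak_gbs(2133, 256)    # ~68.3 GB/s
print(ps4_gddr5 / x1_ddr3)       # ~2.6x: the gap the 32MB ESRAM has to bridge
```

This is exactly the gap both sides of the thread are arguing about: the ESRAM exists to paper over it, at the cost of die area.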

startscreennope said,
It is true in X1's case. They sacrificed a huge amount of silicon budget for ESRAM to make room for 8GB of DDR3 when they should have just gone with GDDR5, leaving them with a less powerful GPU and likely yield issues. They put a huge amount of their budget and R&D into Kinect, raising the price of the console.

Windows 8 is the fastest OS for gaming? Explain Left 4 Dead 2's hasty Linux port running faster than Windows on the same hardware. Windows is the most used OS for gaming, not necessarily the fastest.

Both Sony and MS will write software to maximize their console's performance, sure. It will be hardware specs that make the difference, where PS4 has the advantage.

There are latency issues with GDDR5 that are being overlooked; the faster eSRAM design is a better option, and whether it is filled from DDR3 or GDDR5 would be irrelevant to the performance outcome.

As for Left 4 Dead 2, you forgot the tagline "Valve says..."

Valve has been amazingly misleading about Linux and OpenGL performance, and they are laughed out of any real engineering-level conversation.

They took a DX9 game engine, rewrote it for OpenGL 4.x with native unified shader support, and then proclaimed OpenGL was faster.

DX9 on modern GPUs runs through a translation/emulation path that is not as fast as native DX10/11 code. Coding for native unified shaders in OpenGL and then comparing against a 10-year-old DX9 engine is misleading at best.


The L4D2 story, as Valve tells it and as you are repeating it, is that the game was optimized over several years, yet was faster on the initial port to OpenGL on Linux.

What they are hinting at, but not saying, is that L4D2 is running on the OLD DirectX 9 engine. (It isn't more optimized or polished if it is running on an outdated graphics framework.) When they ported L4D2 to OpenGL 2.x, the same generation as DX9, it WAS NOT faster; in fact, it was slower than the DX9 version.

When they build a DX11 engine for an accurate comparison against an OpenGL 4.2 port of the game, then they can have a discussion on this topic.

Until then, they are just lying to people, and they are doing it on purpose with the intent of pushing Linux, as they want out of the Windows ecosystem because of their CEO's confusion about the distribution limitations of Windows 8.

Wow, I am truly sorry they are still conning people with this.

(Even setting aside the misleading comparison, the FPS difference was something like 270fps to 315fps, which is only about 17 per cent, in optimal circumstances and with a newer framework.)

GDDR5 latency is a complete red herring non-issue, grasping at straws at its best.

http://www.eurogamer.net/artic...ace-to-face-with-mark-cerny

"Latency in GDDR5 isn't particularly higher than the latency in DDR3. Also, GPUs are designed to be extraordinarily latency tolerant so I can't imagine that being much of a factor." In before ad hominem attacks on Cerny.

As for Left 4 Dead on Linux - splitting hairs. Are you saying if they went from DX9 to DX11.2, there'd be an FPS boost? I'd love to see that.

startscreennope said,
GDDR5 latency is a complete red herring non-issue, grasping at straws at its best.

http://www.eurogamer.net/artic...ace-to-face-with-mark-cerny

"Latency in GDDR5 isn't particularly higher than the latency in DDR3. Also, GPUs are designed to be extraordinarily latency tolerant so I can't imagine that being much of a factor." In before ad hominem attacks on Cerny.

As for Left 4 Dead on Linux - splitting hairs. Are you saying if they went from DX9 to DX11.2, there'd be an FPS boost? I'd love to see that.

Splitting hairs?

Do yourself a favor and look up the performance of DX9-based tools like 3DMark 03, 05, and 06, and notice that the newest GPUs do not have a massive advantage over native DX9-architecture cards that are more than six years old.

At best you will find that in six years the overall performance of GPUs has jumped by over 10x, yet in DX9 tests (3DMark 05 or 06) the newer GPUs are only 2-4x faster at running DX9 content.

This is not splitting hairs: modern GPUs have not had a massive performance improvement in DX9 gaming, as they are designed for DX10/DX11 and use unified shaders and other newer optimizations that DX9 and DX9 games DO NOT USE.

...
Do your own research: find the fastest DX9-architecture GPU from around 2006, then compare it to the fastest GPU this year.

Notice that the overall performance difference is over 10x, then look at the DX9-specific tests and notice that the difference is only 2-4x when running DX9.

The gap is closer to only 2x when using an earlier DX8/9 test like 3DMark03, which is EXACTLY the generation the L4D2 engine is based on.

This isn't splitting hairs, this is Valve lying to people, and very few people outside of the engineering world calling them on it or even doing basic research to see how they are lying.
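The scaling argument in that comment reduces to simple arithmetic. Using made-up round numbers in the ranges the poster cites (10x overall, 3x in DX9 tests), not measured data:

```python
# If a GPU is ~10x faster overall but only ~3x faster at DX9-era workloads
# (hypothetical round numbers), most of its headroom never shows up in a
# DX9 engine such as the one behind L4D2.
overall_speedup = 10.0  # modern GPU vs. ~2006 DX9 card, overall
dx9_speedup = 3.0       # the same pair measured in a DX9-era benchmark
unused_fraction = 1 - dx9_speedup / overall_speedup
print(f"{unused_fraction:.0%} of the raw gain is invisible to a DX9 test")
```

Whether those ratios are accurate is exactly what the two posters dispute; the sketch only shows what the claim would imply if they were.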

The original article and test are deeply flawed. I'm happy to discover which system truly performs better, but in this case every result is little more than a guess.

Once the two are out we'll start to see a broader range of more accurate figures, and then I'll make my mind up on which one to get.

This is unprofessional journalism to the maximum.

I can't believe you've posted a video with 'equivalent' PC specs; it's a complete chalk-and-cheese scenario.

Tha Bloo Monkee said,
We all know that the PS4's hardware is better on paper. Whether 50% is accurate, however, is another story.

So was the PS3's, and it didn't work out to be faster...

(Go look up the GPUs and CPU differences, the PS3 is clearly much faster on paper, yet in gaming was the slower console that had to sacrifice quality to match the Xbox 360's performance.)

dead.cell said,
You're comparing apples and oranges though...

The point is that speculation is seldom accurate, and the 'unknown' can create a radical difference from the 'expected' results.

The world didn't understand the Xenos GPU in the Xbox 360 or how it would impact game performance. They assumed that because the clock speeds were lower and it had a lower fill rate and fewer shaders, it would fall significantly behind the PS3.

The same arguments are being made again, based on the same type of 'specifications', and we already know a few of the changes Microsoft has made to the GPU, like the cache and buffer and how DX11.2 works on the newer GPU, that should at least give people pause before falling into the same trap as the PS3 speculation.

I agree they are apples and oranges, and either system could wow or disappoint everyone.

In case people are not reading the article...

This is a test of hardware based on speculation; however, it does not use the same hardware or even the closest available equivalent hardware, as they are running the tests on Intel i7 processors that are faster per core than the non-customized AMD CPU/APU.

(Which is kind of odd that a 'hardware' test would only try to replicate the GPU potential.)

As the article admits, OS optimization and framework performance will differ considerably between the PS4 and the Xbox One, with Microsoft demonstrating a rather large advantage with the Windows 8.1 kernel and DirectX 11.2.

The article also mentions that the effect of the eSRAM is unknown; however, anyone following the changes in DX11.2 should be able to see how it is used and why it alone should tip the performance and graphics quality to the Xbox One.

In the end all this test shows is that at best the PS4 is going to be about 20% faster, which the eSRAM on its own would easily overcome.

Microsoft has yet to fully reveal the GPU in THEIR custom APU, which has the new eSRAM and new memory pipelines and drops legacy bridge support, all adding a new level of performance that lowers internal RAM and CPU/GPU communication/transfer latency.

The Xbox One CPU/GPU is a big unknown, with only speculation based on the early developer units, which, if the Xbox 360 is any indication, could be up to half the performance of the final APU; the Xbox 360's final hardware, especially the GPU, was nearly 2x faster than the developer kit hardware.

I like that they tried to make a 'good guess' at the differences, but the test was poorly executed, as it gives the 'potential' PS4 a lot of advantages that the shipping PS4 will not have.

Even in the best possible outcome, with the tests favoring the PS4, the potential performance difference is 10 FPS at most.

This should concern the PS4 community, because their FreeBSD-based OS, drivers, and gaming framework are not equal to Windows NT 8.1, an optimized WDDM 1.3 driver, and DirectX 11.2, let alone the unknown changes Microsoft has made to the customized AMD APU.


I will offer my prediction, which is based on personal access that I cannot disclose, hence why I am calling it a prediction...

The PS4 will struggle to keep the same FPS as the Xbox One, will have less consistency in maintaining a solid FPS rate, and will have to use lower quality textures.

(A month or two after launch, I will be glad to eat crow if my prediction is incorrect.)

I agree fully with this. People forget how important optimization is.
I read an article a while back about the GPU design of the Xbox 360 (which was supposed to be slower than the PS3's), but MS went a new way, and that design and architecture is today found in most graphics cards you can buy. What I really want to know more about is the eSRAM and how it will work.
What I think people also tend to forget is services. I have a PS3 today because that was the best choice for me when I bought it. However, I have ordered an Xbox One because I think the services for it will be awesome. Microsoft is a services company, and I think they have a huge advantage over Sony. Even if I hate the word 'Cloud', I think cloud computing and the services around the Xbox One will be something people are drawn to. Even non-gamers could own an Xbox One and only use the other features (TV and so on). The only bad part is I live in Sweden, and Microsoft tends to forget that there is a world outside the US.
But only time will tell if I have placed my money right!!

PS4 is undeniably the more powerful system. ESRAM is an overengineered solution to a RAM bandwidth problem whose answer should have just been "go with GDDR5". The size of the ESRAM cache on the APU took up a huge amount of their silicon budget, leading not only to a less powerful GPU but also to a larger overall chip and more yield issues. Pretty much a lose-lose from a design/price/performance perspective.

PS4 performing worse than X1? Not happening unless devs are deliberately gimping the PS4 version.

This worries me a little with the first release-day games, but just like Naughty Dog did with the PS3, I expect that as games come out for it, the code will get better optimised and the games will make use of that extra processing power.

This is exactly what people said about the PS3 in regards to the Xbox 360. However, as we all know, graphics performance is not everything. Time will tell, but I think the Xbox One may surprise you.

Optimized code is not just about the SDK being better or worse; the devotion of the developers behind it is MUCH more important. And in the end, raw power will matter for the graphics and game complexity.

startscreennope said,
PS4 is undeniably the more powerful system. ESRAM is an overengineered solution to a RAM bandwidth problem whose answer should have just been "go with GDDR5". The size of the ESRAM cache on the APU took up a huge amount of their silicon budget, leading not only to a less powerful GPU but also to a larger overall chip and more yield issues. Pretty much a lose-lose from a design/price/performance perspective.

PS4 performing worse than X1? Not happening unless devs are deliberately gimping the PS4 version.

Without paying attention to the architectural changes, compare these two GPUs:

GPU 1 - PS3
http://en.wikipedia.org/wiki/RSX_%27Reality_Synthesizer%27

GPU 2 - Xbox 360
http://en.wikipedia.org/wiki/Xenos_(graphics_chip)

Just from the 'technical' specifications, the PS3 GPU is faster on paper:

PS3 - Core 550mhz
Xbox 360 - Core 500mhz

PS3 - RAM 700mhz
Xbox 360 - RAM 500mhz

PS3 GPU - 13.2 GigaTexels per second
Xbox 360 - 8 GigaTexels per second

So just based on this information, do you believe that for PS3 titles "devs were deliberately gimping" the PS3 as well?
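Turning the spec list above into percentages makes the 'on paper' gap concrete (same numbers as the comparison, nothing added):

```python
# PS3 (RSX) vs Xbox 360 (Xenos) paper specs from the comparison above.
specs = {
    "core clock (MHz)": (550, 500),
    "RAM clock (MHz)": (700, 500),
    "fill rate (GTexels/s)": (13.2, 8.0),
}
for name, (ps3, x360) in specs.items():
    print(f"{name}: PS3 ahead by {ps3 / x360 - 1:.0%}")
```

So the PS3 led by 10%, 40%, and 65% on paper, which is the whole point of the comparison: those margins did not translate into real-world results.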

Coming into the launch of the PS3 and the Xbox 360, the world was sure the PS3 would crush the Xbox 360, as the PS3 had a faster CPU and a faster GPU. There was lots of talk about how the PS3 would do 1080p gaming and could drive dual-screen gaming, while the Xbox 360 would be limited to 720p and wouldn't have the GPU power to supply two screens like the PS3, and so on.

In retrospect, the PS3 was graphically inferior to the Xbox 360, to the point that, to get comparable visual effects out of PS3 titles, developers had to use cores of the PS3's Cell processor for anti-aliasing and HDR effects just to keep up with the Xbox 360's GPU.

The Xbox 360 also was able to do GP-GPU operations which wasn't feasible on the PS3.

Even what was known of DirectX was misleading, as the DirectX 'subset' on the Xbox 360 was more of a 'superset', having features from DX10 and DX11 because of the unified shader architecture.

At best the PS3 barely held its own against the Xbox 360, with the only 'real' advantage of the PS3 being the prettier BluRay pre-rendered cutscenes.


So have a bit of doubt that faster on paper equates to faster in the real world.

Little was known about the technology in the Xenos GPU of the Xbox 360, and even AMD had no idea how it would work for gaming, as they were not going to use the architecture.

It was the speed of games on the Xbox 360 that pushed AMD and Nvidia to adopt the Xbox 360 GPU design, which is now at the heart of all modern GPUs.

The GPU differences of the Xbox 360 didn't seem important, nor were they expected to perform well; instead they changed the entire gaming industry with SEVERAL new technologies that paved the way for newer technologies to exist.

As NVidia has stated, CUDA 2.x would not exist without the technologies the Xbox 360 introduced. OpenCL would not exist without these technologies either.

The Xbox 360 not only outperformed expectations, but changed 3D and GPU usage.


Sony also has a long haul ahead getting performance out of their OS and getting an optimized driver and development set for the GPU. Sony also doesn't have DirectX, which is faster than OpenGL and has more features. (The lack of tiled resources alone is a MAJOR difference when using larger textures.)

I am surprised that, even being able to look back at what happened with the Xbox 360 and the PS3, people fall right back into the same old trap of buying 'technical' numbers instead of real-world performance.

You are aware the Cell CPU was designed to be able to support the GPU, right? So it's not an escape or an excuse to fix its 'shortcomings' compared to the 360, rather a different solution to the same problem.

And it isn't all about optimized drivers and development kits.

It doesn't help you to claim that the PS3 isn't graphically superior to the 360, because it flat out is. It might require some more effort, but I'd rather have developers who depend on their own skills rather than on an SDK or Microsoft for their coding know-how to make optimized games.

It's the little things that matter too, like having Blu-ray discs instead of multiple DVDs, and being able to load from the hard drive AND the optical disc at the same time to properly improve loading times.

But you should check out some PS3 exclusives and look more into PlayStation history.

For the PS3 it has been very, very important to use threading techniques to optimize performance. This will carry over to a new chipset that is also heavily dependent on threading.

Shadowzz said,
You are aware the Cell CPU was designed to be able to support the GPU, right? So it's not an escape or an excuse to fix its 'shortcomings' compared to the 360, rather a different solution to the same problem.

And it isn't all about optimized drivers and development kits.

It doesn't help you to claim that the PS3 isn't graphically superior to the 360, because it flat out is. It might require some more effort, but I'd rather have developers who depend on their own skills rather than on an SDK or Microsoft for their coding know-how to make optimized games.

It's the little things that matter too, like having Blu-ray discs instead of multiple DVDs, and being able to load from the hard drive AND the optical disc at the same time to properly improve loading times.

But you should check out some PS3 exclusives and look more into PlayStation history.

For the PS3 it has been very, very important to use threading techniques to optimize performance. This will carry over to a new chipset that is also heavily dependent on threading.

Somehow you are skipping over the points.

It is not wrong for the CPU to support the GPU, in fact Windows 8 was designed to use CPU cores when a GPU operation would be faster on the CPU than waiting for the GPU.

The point was that any CPU advantage the PS3 had was 'burned' trying to keep up with the GPU performance of the Xbox 360, even though 'technically' the Xbox 360 GPU was slower on 'paper'.

As for threading performance, you are forgetting that the Xbox 360 ALSO depended on threading optimization.

Everyone likes to talk about the PS3 having a six-core Cell processor, but they too easily forget that the Xbox 360's tri-core CPU was capable of running six threads at once, as each core used a Hyper-Threading-like technology to process two threads.

Microsoft's development tools were better at threading, and even as the PS3 dev tools got better, so did Microsoft's.

Note what I mentioned about Windows 8's use of the CPU, this also happens on the Xbox One, with the NT scheduler able to agnostically throw many calls to either the CPU or GPU side depending on the load. The PS4's OS and frameworks do not do this.

PS4/X1 are nearly identical in architecture aside from ESRAM+DDR3 vs GDDR5 and the PS4's far more powerful GPU. It's much easier to make direct comparisons this gen.

Posting an irrelevant wall of text about PS3/360 differences, which were very different architectures, proves nothing about PS4/X1.

startscreennope said,
PS4/X1 are nearly identical in architecture aside from ESRAM+DDR3 vs GDDR5 and the PS4's far more powerful GPU. It's much easier to make direct comparisons this gen.

Posting an irrelevant wall of text about PS3/360 differences, which were very different architectures, proves nothing about PS4/X1.

....far more powerful GPU...

You do not know this. There is NO information available to even do a 'base' comparison.

The reason for the PS3 vs 360 information is that if you look at the clock speeds, the number of shaders, and the bandwidth, the PS3 GPU is SIGNIFICANTLY faster than the 360 GPU on paper.

Which ended up NOT being reflective of real-world performance, as the comparative information was based on DX9- and OpenGL 2.x-generation GPU architecture.

The point being, we are here again comparing stream processors and clock speeds, while the things that REALLY matter to final performance are only partially known on the Xbox One and are severely misunderstood. The lower-latency adaptations MS made to the APU alone are a significant increase in GPU performance, as BOTH consoles depend on a shared-memory model with an integrated CPU/GPU combination.

The cache is the only other 'truly' known feature, and just in pushing pixels to the screen it is enough to add an easy 20% bump if the OS can keep it filled, which Windows was DESIGNED to do going back to the original WDDM technologies. If you then consider how DX11.2 was designed to handle textures in a new way that plays to this cache, it is another major performance gain that also allows significantly larger textures.

Additionally, the PS4 OS does NOT have kernel level control of the GPU to manage the shared RAM assets dynamically/transparently, and relies on the system or developers allocating a semi-fixed set of RAM to the GPU.

This means the PS4 will spend cycles and time copying assets from System RAM to the semi-fixed set of allocated VRAM.

In contrast, Windows NT's WDDM was designed to handle dynamic VRAM and does NOT have to move assets from system RAM to allocated VRAM, as NT treats it as one continuous pool, able to hand any chunk of RAM to the GPU and frame buffer directly.
(This is also a big performance difference, and is one reason the Xbox 360 did more with less hardware, as it was the first system to use this technology, which Vista added to the Windows fork.)
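The copy-vs-share distinction being argued here can be felt even in a high-level language. This is a software analogy only, nothing to do with either console's internals: handing a consumer a view of a buffer costs almost nothing, while a copy scales with the size of the asset.

```python
# Handing out a view of a buffer is free; copying scales with asset size.
import time

asset = bytearray(64 * 1024 * 1024)  # a 64MB stand-in for a texture

t0 = time.perf_counter()
shared = memoryview(asset)           # zero-copy: just a new reference
t1 = time.perf_counter()
copied = bytes(asset)                # full copy into a second buffer
t2 = time.perf_counter()

print(f"view: {(t1 - t0) * 1e6:.1f} us, copy: {(t2 - t1) * 1e6:.1f} us")
```

On any machine the copy is orders of magnitude slower than taking the view, which is the shape of the claimed difference between copying into a fixed VRAM pool and sharing one continuous pool.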

...

So you can try to state definitively that the PS4 GPU is faster, but there is NO released technical data that supports this, and even in the tests in this article, which give the PS4 the benefit of the doubt, the GPU difference is less than 10 FPS AT BEST.

Mobius Enigma said,
I will offer my prediction, which is based on personal access that I cannot disclose, hence why I am calling it a prediction...

The PS4 will struggle to keep the same FPS as the Xbox One, will have less consistency in maintaining a solid FPS rate, and will have to use lower quality textures.

(A month or two after launch, I will be glad to eat crow if my prediction is incorrect.)

PS4 is going to be weaker than X1 at multiplatform titles, only on Neowin.

You can be obstinate about hardware specs and 'we don't know the clock speed yet', but the reality is they are both nearly identical architectures and thus almost directly comparable. The fabled "secret sauce" doesn't exist to bring the X1 even to parity with the PS4, let alone ahead of it. ESRAM is there as a band-aid for the DDR3 bandwidth, to maintain parity at best with the PS4's bandwidth, not to give it some 20% performance boost.

So you admit PS4 is going to be more powerful - by "less than 10 FPS at best". Which is not a very useful figure, considering framerates are variable. All the specs show a 20-30% FPS difference. Maybe it will only be 15-20%, but it will be there.

"Kernel level control of the GPU" sounds a lot like secret sauce: some minor technical software point that is basically irrelevant, yet stated to deflect from obvious hardware advantages.

Edited by startscreennope, Jul 31 2013, 2:45pm :

hardware that are based on the same basic AMD architecture as the PS4 and Xbox One
and there is the problem. It's been confirmed by Microsoft that they worked with AMD to customize their chip.

Exactly. Like AMD's fabricator would release anything like this without getting into big trouble. Watch, it will be the same chip with ESRAM. Guess we'll find out at the end of the year.

Before publishing news stories like this, people should probably understand the specs of the machines as well as the cloud support provided by the XBox One...

Did Microsoft release these "custom" GPU specs, or are the 768 GPU cores speculation? Everywhere I read, they haven't released a thing about that SoC.

Patrick Danielson said,
Did Microsoft release these "custom" GPU specs, or are the 768 GPU cores speculation? Everywhere I read, they haven't released a thing about that SoC.

They did, but they didn't say anything about clock speed, and you're also not taking into account the other custom GPU tweaks they've made. It's NOT a stock GPU you can get for your PC, so comparing it to one isn't going to be the same. The GPU in the XB1 has the new move engines and 32MB of fast embedded ESRAM, IIRC. Plus there's custom DirectX 11.2 software/hardware support that helps make up for the difference in theoretical performance with the PS4, case in point the new tiled resources that DX11.2 supports in hardware.
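A back-of-envelope sketch of why tiled resources are brought up alongside a small fast-memory pool: only the tiles a frame actually samples need to be resident. The 64KB tile size matches the DX11.2 tiled-resource tile; the visible-tile count is a made-up example, not a measured figure.

```python
TILE = 64 * 1024                   # 64KB tile, as used by DX11.2 tiled resources
full_texture = 16384 * 16384 * 4   # 16k x 16k RGBA8: 1GB if fully resident
visible_tiles = 600                # hypothetical tiles sampled this frame
resident = visible_tiles * TILE
print(full_texture // 2**20, "MB full vs", resident / 2**20, "MB resident")
```

So a texture that would be 1GB fully resident needs only a few tens of MB in memory under this scheme, which is why the technique is relevant to a 32MB ESRAM budget at all.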

Clock speed is 800MHz at best, downclocked at worst. Given the yield and overheating rumors, it's extremely unlikely they tried to overclock it.

Again the ESRAM is a band-aid on the DDR3's slow width, and it ended up causing all sorts of other problems the PS4 doesn't suffer from. Should have just went for GDDR5.

Software is only going to play a big role if the devs are doing quick and dirty PC ports with little to no PS4 optimization. Even then devs would have to gimp PS4 games on purpose to make them look worse than X1. I doubt that will be the case. In terms of first party titles, the next God of War and Naughty Dog titles will blow away anything on X1 graphically.

"Move engines" are not some kind of secret special sauce, it's just DMA registers that are in all modern GPUs.

MS not revealing tech specs is grounds for pessimism, not optimism.

Basically MS sacrificed hardware power for their vision of kinect/cable box TV living room dominance. The market seems set to reject their vision, just like Win 8, Win phones/tablets have been rejected. Overpriced, underpowered, with 'features' almost nobody wants.

Edited by startscreennope, Jul 30 2013, 7:44am :

startscreennope said,
Clock speed is 800 at best, downclocked at worst. Given the yield and overheating rumors, it's extremely unlikely they tried to overclock it.

Again, the ESRAM is a band-aid for the DDR3's low bandwidth, and it ended up causing all sorts of other problems the PS4 doesn't suffer from. They should have just gone for GDDR5.

Software is only going to play a big role if devs do quick-and-dirty PC ports with little to no PS4 optimization. Even then, devs would have to gimp PS4 games on purpose to make them look worse than the X1. I doubt that will be the case. As for first-party titles, the next God of War and Naughty Dog games will blow away anything on the X1 graphically.

"Move engines" are not some kind of secret special sauce; they're just the DMA engines found in all modern GPUs.

MS not revealing tech specs is grounds for pessimism, not optimism.

Basically, MS sacrificed hardware power for their vision of Kinect/cable-box living-room dominance. The market seems set to reject that vision, just as it has rejected Win 8 and Windows phones/tablets. Overpriced, underpowered, with 'features' almost nobody wants.

That YOU don't want. A lot of people are excited about the huge feature set the Xbox One offers over the PS4.

Underpowered? LOL. With DDR3 and ESRAM they can achieve theoretical speeds as high as the GDDR5 in the PS4, and the DDR3 solution is cheaper.

Xbox One is a box that does everything, and it does everything very well. How can you not like that?
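The "as high as GDDR5" claim only works if you stack the ESRAM's peak on top of the DDR3's. A back-of-envelope sketch (the bus widths and transfer rates here are the widely reported figures, not officially confirmed):

```python
def peak_gb_per_s(bus_bits: int, transfers_per_s: float) -> float:
    """Theoretical peak bandwidth: bytes per transfer x transfers per second."""
    return bus_bits / 8 * transfers_per_s / 1e9

ddr3  = peak_gb_per_s(256, 2.133e9)  # XB1 main RAM: DDR3-2133 on a 256-bit bus
gddr5 = peak_gb_per_s(256, 5.5e9)    # PS4: GDDR5 at 5.5 GT/s on a 256-bit bus
esram = 102.0                        # XB1 ESRAM peak, the commonly cited number

print(f"XB1 DDR3 alone: {ddr3:.1f} GB/s")
print(f"PS4 GDDR5:      {gddr5:.1f} GB/s")
print(f"XB1 DDR3+ESRAM: {ddr3 + esram:.1f} GB/s")
```

The catch is that the combined figure only applies to whatever a developer manages to keep inside the 32MB ESRAM window; everything else sees the DDR3 figure alone.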

startscreennope said,
Clock speed is 800 at best, downclocked at worst. Given the yield and overheating rumors, it's extremely unlikely they tried to overclock it.

Again, the ESRAM is a band-aid for the DDR3's low bandwidth, and it ended up causing all sorts of other problems the PS4 doesn't suffer from. They should have just gone for GDDR5.

Software is only going to play a big role if devs do quick-and-dirty PC ports with little to no PS4 optimization. Even then, devs would have to gimp PS4 games on purpose to make them look worse than the X1. I doubt that will be the case. As for first-party titles, the next God of War and Naughty Dog games will blow away anything on the X1 graphically.

"Move engines" are not some kind of secret special sauce; they're just the DMA engines found in all modern GPUs.

MS not revealing tech specs is grounds for pessimism, not optimism.

Basically, MS sacrificed hardware power for their vision of Kinect/cable-box living-room dominance. The market seems set to reject that vision, just as it has rejected Win 8 and Windows phones/tablets. Overpriced, underpowered, with 'features' almost nobody wants.

You're taking rumors and speculation as fact now? I love how you post a wall of text that is nothing more than rehashed rumors and guesswork about what you think will be the case. I also like how you try to brush off the software side and then talk about first-party PS4 games blowing away the XB1 titles; right, because the first-party XB1 titles won't take full advantage of the software and DX changes to counter the bandwidth difference.

GP007 said,
You're taking rumors and speculation as fact now? I love how you post a wall of text that is nothing more than rehashed rumors and guesswork about what you think will be the case. I also like how you try to brush off the software side
Please point out where I claimed a rumor was a fact. False accusations and HURR HURR WALL O TEXT are the best you can do, now? Get off the playground, kid.

Any software dev kit optimization MS can do with their software, Sony can do, but with stronger hardware backing it up. DirectX does not have some kind of secret software magical sauce that's impossible to do elsewhere.

I think Kinect will be an expensive flop and consumers will reject the X1 for being overpriced and underpowered. "Xbox One is a box that does everything"? Marketing buzzwords.

Yes, the X1 is underpowered compared to the PS4. ESRAM + DDR3 seems to be pretty expensive for MS, as cramming the ESRAM onto the APU made the chip massive (5 billion transistors) and likely led to awful yields.

Ad Man Gamer said,
XBOX 360 was underpowered compared to the PS3, and look what a flop that was.
The 360 had better multiplatform titles most of the time. The gap between the PS4 and X1 is much wider; think the original Xbox vs. the GameCube.

Although you make a good point: J Allard was primarily responsible for the 360's success, and he's long gone, replaced by suits, bean counters, and "metrics".

startscreennope said,
Basically MS sacrificed hardware power for their vision of kinect/cable box TV living room dominance. The market seems set to reject their vision, just like Win 8, Win phones/tablets have been rejected. Overpriced, underpowered, with 'features' almost nobody wants.

Almost nobody? I constantly see people posting about those features being factored into why they would get an X1. I know that's a good part of why I would get one... I don't game often, and these are actual NEW features, not just the last generation's console with more horsepower.

I hope they bring in digital options as well, even if they're segregated from the disc-based options. This isn't just X1 vs PS4... they need to compete with Steam as well. Community base, easy transfer of games, constant sales and new features... if Steam released an actual console in the same ballpark as the X1/PS4, the other two wouldn't stand a chance.

startscreennope said,
Please point out where I claimed a rumor was a fact. False accusations and HURR HURR WALL O TEXT are the best you can do, now? Get off the playground, kid.

Any software dev kit optimization MS can do with their software, Sony can do, but with stronger hardware backing it up. DirectX does not have some kind of secret software magical sauce that's impossible to do elsewhere.

I think Kinect will be an expensive flop and consumers will reject the X1 for being overpriced and underpowered. "Xbox One is a box that does everything"? Marketing buzzwords.

Yes, the X1 is underpowered compared to the PS4. ESRAM + DDR3 seems to be pretty expensive for MS, as cramming the ESRAM onto the APU made the chip massive (5 billion transistors) and likely led to awful yields.

In your first line, maybe? Or do you forget what you type yourself? Here it is: "Clock speed is 800 at best, downclocked at worst. Given the yield and overheating rumors, it's extremely unlikely they tried to overclock it." You're going off of rumors; the 800MHz number is guesswork at best, no one knows the official clock speed, and the yield and overheating "rumors" you talk about are just that: rumors.

Whenever you talk about the hardware you're speculating: the ESRAM is expensive, the yields are bad. Where's the proof, or are you still sticking to that rumor from over a month ago?

And really, you think Sony can keep up with MS on the software side of things? Not to mention the fact that MS fully controls the DX API and works directly with the hardware makers to support its features and new implementations. You really think that's the same as Sony's ability to influence OpenGL and so on? With what, their vendor-specific extensions?

See the words "at worst" and "unlikely"? Those are words that signal uncertainty, i.e., an acknowledgment that I'm speculating on a rumor.

The yield issues are rumors, but they make sense. A 5-billion-transistor SoC (an official number) is huge by chip standards, and any newly designed chip that size is going to face serious yield issues, MS or not.

So all you've got is grammar-picking and claims that MS is capable of writing better dev software than Sony ever could. Even though, you know, a new OpenGL version was just released with pretty much all the features of DirectX 11.1. Do you even know what the latest OpenGL version number is, let alone its feature set? Don't go rushing to search for it all at once, now. To answer your question: hell yes, Sony will be able to write or commission software to squeeze every last byte out of the PS4's potential power.
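The yield worry isn't just hand-waving; the classic Poisson die-yield model shows why a huge die hurts. The defect density below is a made-up illustrative number, not anything known about this chip or its process:

```python
import math

def poisson_yield(die_area_mm2: float, defects_per_cm2: float) -> float:
    """Classic Poisson yield model: Y = exp(-A * D0), with area in cm^2."""
    return math.exp(-(die_area_mm2 / 100.0) * defects_per_cm2)

D0 = 0.5  # hypothetical defects per cm^2, for illustration only
for area in (100, 200, 350):  # die areas in mm^2; ~350 is the ballpark for a 5B-transistor SoC
    print(f"{area} mm^2 die -> {poisson_yield(area, D0):.0%} yield")
```

Under those assumptions, more than tripling the die area cuts yield by roughly two-thirds, which is why big monolithic SoCs are expensive to fabricate no matter who orders them.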

Sony has the favor of many developers, especially those outside America.
And raw power will be very important; a better or worse SDK is not that significant. At best, a better SDK encourages lazy development, resulting in less optimized code, and optimization is what will matter most in the end.

startscreennope said,
See the words "at worst" and "unlikely"? Those are words that signal uncertainty, i.e., an acknowledgment that I'm speculating on a rumor.

The yield issues are rumors, but they make sense. A 5-billion-transistor SoC (an official number) is huge by chip standards, and any newly designed chip that size is going to face serious yield issues, MS or not.

So all you've got is grammar-picking and claims that MS is capable of writing better dev software than Sony ever could. Even though, you know, a new OpenGL version was just released with pretty much all the features of DirectX 11.1. Do you even know what the latest OpenGL version number is, let alone its feature set? Don't go rushing to search for it all at once, now. To answer your question: hell yes, Sony will be able to write or commission software to squeeze every last byte out of the PS4's potential power.

So you're still speculating; "it makes sense"? Without proof, it's a guess about what you think is going on. And yippee for a new OpenGL version that has "pretty much all of the features of DX11.1"; that's good for them, but we're already talking about DX11.2 with the XB1.

Guesswork to the end on your part. You have theoretical performance numbers, speculation about yield issues and complexity, and hopes of matching software tools and features. And yet we still don't have solid figures for the XB1's hardware, but in your view not sharing everything right away spells doom and gloom for them.

MS not revealing tech specs is grounds for pessimism, not optimism. If they had a hardware bullet point to wave around they'd be doing so with wild abandon, given the generally negative public perception of the Xbox One thus far.

That's a pretty significant difference. Given that the architectures are so close, it'll be interesting to see how the FPS and visuals match up. I don't buy this "3 years until you notice" crap; this is x86, it just won't take as long.

Because people/idiots wrongly think the PS4's GPU is 50% more powerful. Even the idiots at Eurogamer have fallen for it. They go by the number of compute units (18 in the PS4 vs 12 in the Xbox One), which is wrong. It's not that simple; graphics performance does not magically scale like that.

The XBO has 33% less GPU compute power (1.84 TFLOPs in the PS4 vs 1.23 TFLOPs in the Xbox One).
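Both the "50% more" and "33% less" framings describe the same gap, just measured from opposite baselines:

```python
ps4, xb1 = 1.84, 1.23  # GPU compute in TFLOPs, the figures quoted above

more = (ps4 - xb1) / xb1 * 100  # PS4 measured against the XB1's baseline
less = (ps4 - xb1) / ps4 * 100  # XB1 measured against the PS4's baseline

print(f"PS4 has {more:.1f}% more compute than the XB1")
print(f"XB1 has {less:.1f}% less compute than the PS4")
```

The compute-unit count (18 vs 12) gives the same split, since the difference divided by the larger number is always smaller than the difference divided by the smaller one.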

NoClipMode said,
The XBO has 33% less GPU compute power (1.84 TFLOPs in the PS4 vs 1.23 TFLOPs in the Xbox One).
Even that's just on paper. Combined with the fact that the Xbox One has more, albeit slower, memory allocated for games, it really comes down to the overall architecture and not a single component.

But on the single-component level, I'm not confident that the PS4's GPU isn't bottlenecked by other hardware. To be fair, the same could be said about the Xbox One, but its GPU is clocked lower by comparison, so that's less likely.

Until we really see it, it's all a moot discussion. The Xbox One has its strengths, and the PS4 has its strengths. We won't know what they actually are until release.

In the end it will probably boil down to which console is easier to develop/code for, which one gets the sloppy cut-and-paste port, or which one pays out more monies!

They should both be about the same in difficulty to program for. I think, in the end, it will come down to 1) game 2) what system most of your friends have.

You do have a point. I gave up on consoles a while ago (since the GameCube days), but IIRC there were so many games released for the PS3 and X360 where the exact same game would run like ass on the PS3 and wonderfully on the 360, or vice versa.

For developing, Xbox has it much easier.
Since the PlayStation is made in Japan, it's harder for international programmers to develop for, as they struggle with the small fraction of the documentation that has been translated.

Oh, but Japanese games are great!

Vinylchan said,
For developing, Xbox has it much easier.
Since the PlayStation is made in Japan, it's harder for international programmers to develop for, as they struggle with the small fraction of the documentation that has been translated.

Oh, but Japanese games are great!

I can't imagine that's true. I'd assume Sony documents their hardware pretty well, and they'll be using fairly standard and well-known hardware and software (x86, OpenGL, etc.).

If I were to hazard a guess, I'd say they're going to be pretty similar to develop for.

Vinylchan said,
For developing, Xbox has it much easier.
Since the PlayStation is made in Japan, it's harder for international programmers to develop for, as they struggle with the small fraction of the documentation that has been translated.

Oh, but Japanese games are great!


I doubt this statement, partially. The reason the X1 will likely be easier to develop for is that MS has a very strong and well-known toolset in Visual Studio. Lots and lots of programmers are familiar with it, and MS has been working on it for ages over many iterations.
VS might very well also be the reason MS can get the most out of the ESRAM, overcoming the lower bandwidth of their DDR3 while still enjoying DDR3's lower latency.
Now, I'm not saying Sony does not have a good SDK, but at the very least it will not have the same user base to learn and improve from as VS.

Bizkit said,
They should both be about the same in difficulty to program for. I think, in the end, it will come down to 1) game 2) what system most of your friends have.

I would also point out that it will come down to the "additional" stuff as well, like Kinect and associated features such as TV controls. I, for one, lean towards the X1 as an addition to my cable box: "Xbox on, channel 37" and whatnot, while cooking dinner or doing dishes... The X1 has an actual "next gen" feel to me due to something that simple. Something actually different from the pack, and something different from the past: it's actually more than just the 360 with more horsepower.

The PS4, meanwhile, has a "look, it's an upgraded PS3" feel to me. Nothing really stands out from the PS3 other than more horsepower, which, while nice, isn't enough to get excited about. (I also don't trust Sony not to remove functionality (à la OtherOS) or to add unwanted functionality (à la rootkits). MS is "evil" as well, but not to the level and malevolence of Sony, IMO.)

Granted, I'll probably wait until after the initial rush, following the age-old wisdom of "don't buy Microsoft 1.0 stuff; wait until SP1".

Draconian Guppy said,
In the end it will probably boil down to which console is easier to develop/code for, which one gets the sloppy cut-and-paste port, or which one pays out more monies!

I'm going to go with whichever pays out more monies. Lol.

Vinylchan said,
For developing, Xbox has it much easier.
Since the PlayStation is made in Japan, it's harder for international programmers to develop for, as they struggle with the small fraction of the documentation that has been translated.

Oh, but Japanese games are great!

The PS4 is being developed with Western developer input, and documentation will be readily available in both languages. The culture has shifted heavily towards the West. The PS3 was a mess with its early Japanese-only documentation; not so this time.

You, my friend, have the same thought process I had, though I, on the other hand, will probably be getting the Xbox One at launch, even though half of me is screaming "don't do it, you ****ing sellout; there's no difference in waiting!"
What can I say? I'm a hype beast.
Also, the whole "next gen feel" thing is partly why I was looking forward to the old Xbox One digital policies with family sharing, but now that they're bringing those back (eventually, and hopefully with trading/selling of digital games?) I can't wait! Coupled with the TV stuff, Windows 8 app compatibility, window snapping, unlimited cloud storage and computing, the badass new controller, the rumor that it runs 10 years without dying, etc. Can't wait!