Next gen Xbox specs reportedly leak

Get out the salt and put on your skeptical hat, as next-gen specs for Microsoft's follow-up to the Xbox 360 have reportedly leaked. While the source does not have a long track record, it has been publishing incredibly detailed posts about the Wii U, the next-gen Xbox, and the PS4.

The full specs are posted below, but the highlights include an eight-core x64 CPU running at 1.6 GHz, 8 GB of DDR3 RAM, USB 3.0, and a 50 GB 6x Blu-ray Disc drive.

CPU:

  • x64 Architecture
  • 8 CPU cores running at 1.6 gigahertz (GHz)
  • each CPU thread has its own 32 KB L1 instruction cache and 32 KB L1 data cache
  • each module of four CPU cores has a 2 MB L2 cache resulting in a total of 4 MB of L2 cache
  • each core has one fully independent hardware thread with no shared execution resources
  • each hardware thread can issue two instructions per clock
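For readers who want to check the cache math in the list above, the totals fall straight out of the per-core figures. A quick sketch; the numbers are taken from the leak, the arithmetic is ours:

```python
# Sanity-check the leaked CPU cache figures.
cores = 8
l1_per_core_kb = 32 + 32        # 32 KB instruction + 32 KB data per hardware thread
modules = cores // 4            # each module groups four cores
l2_per_module_mb = 2            # one 2 MB L2 cache per module

total_l1_kb = cores * l1_per_core_kb   # one hardware thread per core
total_l2_mb = modules * l2_per_module_mb

print(total_l1_kb)  # 512 KB of L1 across all cores
print(total_l2_mb)  # 4, matching the "total of 4 MB of L2 cache" bullet
```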

GPU:

  • custom D3D11.1 class 800-MHz graphics processor
  • 12 shader cores providing a total of 768 threads
  • each thread can perform one scalar multiplication and addition operation (MADD) per clock cycle
  • at peak performance, the GPU can effectively issue 1.2 trillion floating-point operations per second
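The 1.2 trillion figure can be reproduced from the other bullets: 768 threads, each retiring one MADD (two floating-point operations) per cycle, at 800 MHz. A sketch of the arithmetic, not anything from the leak itself:

```python
# Reproduce the leaked peak-FLOPS figure.
threads = 768                   # 12 shader cores providing 768 threads total
flops_per_thread_per_clock = 2  # one MADD = one multiply + one add
clock_hz = 800e6                # 800 MHz graphics clock

peak_tflops = threads * flops_per_thread_per_clock * clock_hz / 1e12
print(peak_tflops)  # 1.2288, i.e. the quoted "1.2 trillion" rounded down
```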

High-fidelity Natural User Interface (NUI) sensor is always present

Storage and Memory:

  • 8 GB of DDR3 RAM (68 GB/s)
  • 32 MB of fast embedded SRAM (ESRAM) (102 GB/s)
  • from the GPU’s perspective, the bandwidths of system memory and ESRAM are parallel, providing a combined peak bandwidth of 170 GB/s
  • Hard drive is always present
  • 50 GB 6x Blu-ray Disc drive
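The 170 GB/s claim above is, per the leak, just the two pools' peak bandwidths summed, since the GPU can read from them in parallel:

```python
# Combined peak bandwidth as the leak describes it: DDR3 + ESRAM in parallel.
ddr3_gb_per_s = 68      # system memory bandwidth
esram_gb_per_s = 102    # embedded SRAM bandwidth

combined = ddr3_gb_per_s + esram_gb_per_s
print(combined)  # 170, the quoted combined peak in GB/s
```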

Networking:

  • Gigabit Ethernet
  • Wi-Fi and Wi-Fi Direct

Hardware Accelerators:

  • Move engines
  • Image, video, and audio codecs
  • Kinect multichannel echo cancellation (MEC) hardware
  • Cryptography engines for encryption and decryption, and hashing 

If true, it looks like Microsoft is putting some serious hardware under the hood to make sure its next-gen console performs as well during its life cycle as the Xbox 360 has.

The inclusion of a NUI sensor suggests that Kinect will be baked in at a deep level, and that we should expect the next generation of the device to debut during the platform's lifecycle.

Since the sensor will be built in rather than sold as an add-on, next-gen games should be able to take full advantage of it, as every end user will have the necessary equipment. At the moment, Kinect gaming is limited in scope because publishers have no guarantee that the install base they are selling to owns the required attachment.

It won't come as a surprise, but seeing Wi-Fi baked in this time around will appease many. When the Xbox 360 hit store shelves, it required an adapter to connect to wireless networks. The addition of gigabit Ethernet is also a nice touch for those who want to stream photos and videos to the console.

Finally, the HDMI input is a bit of a mystery. Could it be for set-top box functionality, or does Microsoft have DVR functionality in mind? At this point, it's open to interpretation.

Microsoft is expected to unveil the next gen Xbox sometime in the first half of 2013. Some reports suggest a March timeline while others point to a June announcement.

Source: VGleaks | Image courtesy of VGleaks

Thanks for the tip Audioboxer!


I see the importance of these specs and their effect on the future of console gaming, but for me, I'm more concerned with the next generation's approach to DRM. There had been rumors recently that Sony & MS plan to make it so used games cannot be played. ...or moreover, that once you purchase a new game, it's locked to that console. If either of these are true, I believe it spells disaster for the industry. However, if Sony takes this approach and Microsoft doesn't, I think the console wars will end abruptly.

Everyone, let's not forget that these are just rumours and speculation. Let's not get silly by throwing around numbers that could be completely made up.

Even if this is the AMD CPU, do NOT be surprised to find it is not a stock offering and will be a Microsoft Hardware revision design. (The CPU in the Xbox 360 was redesigned by Microsoft as well, it is not a stock or even evolutionary PowerPC design.)

Since Microsoft is the one that supplied AMD with their current generation of mobile and SoC design technologies, it is very possible Microsoft would build from that work.

I would be very surprised to see a generic implementation of the AMD CPU and GPU technology, as there are ways to pull more performance out of the architecture when no longer dealing with various hardware compatibility issues and designs that a PC architecture still requires.

I also would be surprised to see Microsoft not take 'ownership' of the CPU and GPU design, as they were burnt by NVidia and refused to do this with the Xbox 360. This would be why we are seeing stories about IBM producing the CPUs for Microsoft, even if they are based on an AMD design Microsoft re-licensed back.

NVidia got nearly 8 strong years of Microsoft designs for their DX9 generation GPUs, and then tried to sell Microsoft's own tech back to them at an increased cost. I just don't see Microsoft exposing themselves to this type of situation, unless they are not adding anything to the hardware technologies, which I would find very odd.

Are these specs real or right? Maybe. They could also change, and/or have been released to keep Sony off the track of Microsoft's overall goal for the console.

Also, specifications mean little without context. The Xbox 360 looked 'sad' compared to the PS3 when hardware was compared, but in reality it was the stronger console. Nobody had heard of the new GPU architecture at the time, nor understood how its CPU integration and virtualization worked.

Even the GPU design from the Xbox 360 was not well understood; ATI didn't think it was advantageous, had NO plans to create a PC GPU based on the design, and continued with their older-model GPU architecture. It wasn't until developers were 'wowed' by the Xbox 360's graphical performance and 'ease' that ATI and NVidia were encouraged to adopt the new design that came from Microsoft.

So specs are fun to look at, and can mean 'something' but cannot paint the overall picture.

A good example of where specs are meaningless: take a WP7 phone with a single-core 1 GHz Snapdragon and note that it can outperform a dual-core iPhone and even some quad-core Android phones. (And that's not even factoring in the usability speed advantages that give WP7/8 an additional performance edge.)

to make sure its next gen console performs during its life-cycle as well as the Xbox 360 has.

Joke?

The 360 has been holding back gaming since 2007. (the year Crysis was released, for anyone that didn't understand that)

This console, along with the PS4, will be outdated within 2 years; PC hardware is just moving way too fast for consoles to ever keep their old 7+ year cycle, let alone 3+.

Tell me a time when consoles were ahead of PC hardware for 3+ years. I don't think that was ever the purpose of a console.

DAOWAce said,

Joke?

The 360 has been holding back gaming since 2007. (the year Crysis was released, for anyone that didn't understand that)

This console, along with the PS4, will be outdated within 2 years; PC hardware is just moving way too fast for consoles to ever keep their old 7+ year cycle, let alone 3+.

Actually, if you want to honestly look at which platforms have been holding back the gaming industry, you need to start with the PS3 and then look to the surge in mobile gaming. The retention of Windows XP also hasn't helped, with developers still writing the low-level aspects of their engines against DX9 technologies.

The PS3 is a full technical 'generation' behind the Xbox 360 when it comes to the GPU design and its interoperation with the CPU. So developers who would like to build a latest-API game based on the latest GPU technologies cannot target the PS3 without a completely separate engine design.

The Xbox 360 is capable of hardware features that Windows didn't hit parity with until DX11 was released, meaning that an engine design could target DX11 and still run on the Xbox 360 with complexity scaled back.

So blaming the Xbox 360 for the lack of more advanced games is a serious misunderstanding of the DirectX subset in the Xbox 360, as the latest PCs offer very few additional features.

What about 4K support? I think it's something the next Xbox needs. Not for games, but for media support on UHD TVs.

Draken said,
if it becomes the norm, maybe they'll upscale while rendering natively at 1080p; that's what they do now.

All those wasted GPU cycles on anti-aliasing and the like. You know, the post-processing effects to smooth out rough artifacts due to low resolutions...

Those cycles could be better used elsewhere.

There was a photo of a slide that leaked a few months ago saying it was an IBM CPU, and other leaked info also suggests that, so I wouldn't be so sure they are using an AMD APU.

I'd guess that MS would have gone with IBM because their CPU is more powerful, has lower power consumption, or a combination of both. I highly doubt IBM would be cheaper: AMD is desperate for money, so they would offer a custom APU at a lower price than an IBM CPU plus an AMD GPU. Plus, it would likely be cheaper to manufacture an AMD APU than IBM+AMD, assuming they use two dies on one chip.

Sony, I'm sure, will use a full AMD APU as already rumoured; they spent a lot more on the PS3, and I'm sure that will remain the case with the PS4.

I'm a bit disappointed that the HDD is SATA II and the wireless isn't 802.11ac. The HDD being SATA II makes very little difference in speed, but there are a few minor improvements. Maybe MS will release a future Xbox with 802.11ac. I hope their Blu-ray drive doesn't support only 50 GB max; I was hoping for 100 GB or 128 GB BDXL support, as the PS3 is only just coping with 50 GB for a few games. A small number of PS3 games are 40-45 GB in size, but with much larger textures and 1080p videos being added to games, 50 GB will become a pain for developers in a few years' time.

torrentthief said,
I'd guess that MS would have gone with IBM because their cpu is either more powerful, lower power consumption or a combination of both..

I would say the reason is mainly because it won't be easy to install Linux/XBMC on it.

802.11ac is way too expensive for a console: 1) most adapters are huge (no nano versions yet), 2) the routers still cost at least twice as much as an 802.11n router, and 3) there just isn't meaningful market penetration of 802.11ac, and there won't be by the time the consoles are released, mainly due to cost.

MorganX said,
802.11ac is way too expensive for a console..

today, and for a launch system.

MS didn't support Wi-Fi on the launch 360, only via a B/A adapter that was $70.
The third revision of the console has B/G/N baked in.

DClark said,
Wifi Direct is critical for SmartGlass evolution.

The existing implementation works OK as it is.
Yes, Wi-Fi Direct will speed up the messaging.
SmartGlass has a greater dependency on HTML5 (WebSockets) than on network infrastructure.

For those concerned with clock speed: you don't understand clock speed.

The clock is just how many times per second the internal flip-flops sample the line. There is much more to the actual speed of a processor than clock speed.

Let's say a 1 GHz CPU can do one instruction per clock cycle, and a 4 GHz CPU takes 4 clock cycles per instruction; then these CPUs do work at exactly the same speed, provided they are both running the same CPU architecture.

x86 and x64 CPUs have stronger instruction sets than ARM, so an operation that needs 3, 4, 5 or more instructions on ARM can be done with one instruction on x86 or x64.

A Core 2 Duo with the same clock speed as a Pentium 4 will be 4-5 times faster.
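The commenter's point reduces to one line of arithmetic: effective throughput is clock rate times instructions per cycle (IPC). A quick sketch; the function name is ours, purely illustrative:

```python
# Effective throughput = clock rate x instructions per cycle (IPC).
def instructions_per_second(clock_hz, ipc):
    """Raw instruction throughput for a given clock and IPC."""
    return clock_hz * ipc

a = instructions_per_second(1e9, 1.0)    # 1 GHz, 1 instruction per cycle
b = instructions_per_second(4e9, 0.25)   # 4 GHz, 4 cycles per instruction

print(a == b)  # True: identical real-world instruction throughput
```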

vcfan said,
for those concerned with clockspeed, you don't understand clock speed

clock is just how many times per second the internal flip flops will sample the line. there is much more to the actual speed of the processor than clock speed.

lets say a 1Ghz CPU can do one instruction per clock cycle,and a 4Ghz cpu takes 4 clock cycles per instruction,then these CPUs do work at exactly the same speed,provided they are both running the same cpu architecture.

x86 and x64 cpus have stronger instruction sets than ARM,therefore an operation that needs 3,4,5 or more instructions on ARM,can be done with one instruction on the x86 or x64.

A core 2 duo with the same clock speed as a Pentium 4 will be 4-5 times faster.


The Core 2 Duo is a dual core versus the Pentium 4's hyper-threading. Also, the Core 2 Duo design is based on the P3, which was a stronger architecture than the P4.

What I find interesting is that it looks like MS is addressing the most serious issue I see for a console that is going to be an app platform: multitasking. The next Xbox is going to have 3 GB reserved for the system and the rest for apps/games. This should allow for fast application/service switching. If it's going to perform the functions of a DVR, communications (Skype), a media platform, and apps, having that much RAM reserved for the system could allow for fluid (or at least faster) multitasking. In comparison, I believe the PS4 will have only 512 MB to 1 GB of RAM reserved for the OS. It could be the deciding factor in who controls the living room. Either way, it will be interesting.

Sony dropped the ball by having separate RAM for the system and the GPU. They couldn't add support for cross-game voice chat because of that. It looks like Sony is going to make the same mistake again. 512MB for the OS isn't enough, even if it's GDDR5. As you said, more RAM will allow for faster switching between apps. And it'll allow for more apps to be open at once without slowing everything down.

Of course it's enough. The PS3 OS ran in 50 MB. The Orbis is going to be a pure gaming machine, the Durango a "multimedia hub".

Microsoft is making a mistake; people have lots of devices that do what the Durango does: smart TVs, streamers, etc.

TheLegendOfMart said,
Of course its enough. The PS3 OS ran in 50Mb. The Orbis is going to be a pure gaming machine, the Durango a "multimedia hub"

Microsoft is making a mistake, people have lots of devices that do what the Durango do, Smart TVs, streamers, etc..

More and more people want to use those media services on their TVs. People like my dad would prefer an all-in-one solution where they could watch cable, record shows, and have access to interactive content, including being able to surf the web. Add in cloud services for SkyDrive access and music/video services, and it could be enough to put it well ahead of the PS4 for the average user. If MS has set up their system to offer cable-like services, such as live TV in addition to on-demand content, they could be positioning the Xbox as a DVR alternative for cable companies. If cable companies like Comcast, Verizon, and others decide to adopt the Xbox as their next cable incentive, it could be the defining difference between the Xbox and PS4 in terms of sales. The PS4 will not be able to get into as many homes because of this limitation.

TheLegendOfMart said,
Of course its enough. The PS3 OS ran in 50Mb. The Orbis is going to be a pure gaming machine, the Durango a "multimedia hub"

Microsoft is making a mistake, people have lots of devices that do what the Durango do, Smart TVs, streamers, etc..


I don't think it's a mistake at all. I think it's the next logical step. The focus on multimedia capabilities will be even greater in the next generation. People want to use their gaming consoles for more than just gaming. They want to use it for streaming movies and music, browsing the web, and more. For example, a gaming console is a great device to use to stream videos from your PC. I frequently use my brother's PS3 for that reason alone. Sure, I could watch it on my PC. I could also transfer it onto a tablet or smartphone. Just because I have options doesn't mean I'd prefer anything other than watching it on a large screen TV (which doesn't have any streaming features).

As for the PS4's OS memory, it may be enough, but I think they'll run into problems a couple of years down the line if new features are added. If I remember correctly, the Xbox 360's OS had a smaller memory footprint than the PS3's. And despite that, Microsoft went for a unified memory approach where the OS had access to more than just 256 MB of RAM.

If Sony decides to allocate a measly 512MB for the OS, then they're shooting themselves in the foot. They should make the PS4 more than just a gaming console.

TheLegendOfMart said,
Of course its enough. The PS3 OS ran in 50Mb. The Orbis is going to be a pure gaming machine, the Durango a "multimedia hub"

Microsoft is making a mistake, people have lots of devices that do what the Durango do, Smart TVs, streamers, etc..

I find this absolutely hilarious to read.

The PS2 wasn't a Trojan horse for DVD and the PS3 wasn't a Trojan horse for Blu-ray. The PlayStation is a multimedia hub, even more so than the 360, but the 360 has evolved over the years to include video on demand, music, etc.

Not sure how you are speculating that Kinect will be inside the console. Seeing as most consoles are located out of the way (in a cabinet or somewhere), it would be dumb to put the sensor in the console itself. In fact, the diagram at the top even has "Kinect In" as a port.

Maybe they're thinking more along the lines of having the parts of the unit that do some of the work inside the Xbox while the sensors and cameras are still independent. It would allow for them to make Kinect 2 smaller in size.

GP007 said,
Maybe they're thinking more along the lines of having the parts of the unit that do some of the work inside the Xbox while the sensors and cameras are still independent. It would allow for them to make Kinect 2 smaller in size.

Exactly. There is no "sensor" so to speak, but rather the hardware required to decode the depth and video data at a hardware level. It means they will either be able to shrink the size or pack higher-capacity sensors, or more sensors, into the same space. The cable will essentially feed the raw sensor feeds straight into the dedicated NUI hardware!

Agreed. And honestly, though I don't believe rumors until they are proven true, Microsoft as a company can do a lot more with less than Sony can. Sony needs specs to make up for poor developer tools, inefficient console OSes, and service flaws. So even if the rumors are true, I wouldn't count Microsoft out.

I'll be really angry if this is true. Why can't they make it as powerful as Sony's? Sony uses GDDR5 RAM for the PS4, with almost twice the bandwidth of Durango's, and 1.8 teraflops instead of 1.2. Why can't Microsoft do these things?

Jarrichvdv said,
Why can't Microsoft do these things?

Price, performance, SDKs and tools.
The new PS4 will most likely be underutilized by game developers for the first half of its life, just like every other Sony console.

Microsoft's DX based technologies allow for consistent programming across their platform ecosystem.

Playing a pure spec game has never been MS tactic. They do more with software.
Look at every division and you will see this is the case.

deadonthefloor said,

Price, performance, SDKs and tools.
The new PS3 will most likely be underutilized by game developers for the first half of its life, just like every other Sony console.

Microsoft's DX based technologies allow for consistent programming across their platform ecosystem.

Playing a pure spec game has never been MS tactic. They do more with software.
Look at every division and you will see this is the case.


True. Microsoft might have the edge with D3D11.1. I don't think Sony is allowed to use that API with their next-gen console. My guess is they'd use a custom version of OpenGL.

People need to realise that until there is an official announcement, all of this is just speculation and rumours.

Anaron said,

True. Microsoft might have the edge with D3D11.1. I don't think Sony is allowed to use that API with their next-gen console. My guess is they'd use a custom version of OpenGL.

Might have the edge with DirectX? I'd say they definitely have the advantage with DirectX. It is light-years ahead of OpenGL! Start programming with both of them, then debug, and you will see the performance difference!

It is likely going to be like a Google TV style setup.

You plug your cable box into the HDMI IN and HDMI OUT to your TV, then you have the OS as an overlay to the TV.

TheLegendOfMart said,
It is likely going to be like a Google TV style setup.

You plug your cable box into the HDMI IN and HDMI OUT to your TV, then you have the OS as an overlay to the TV.

Or... it could be that the next Xbox is going to replace your cable box, and you'll get all of your live content on the Xbox directly, with a guide, including the ability to use Bing on the Xbox to search across all the video apps, the guide, and the web.

I could easily see the input being for other video sources or even mobile devices, and because it is HDMI 1.4, it could control those devices, like next-gen Windows Phones, PCs or tablets. We just don't know, and we'll have to wait until their special event or E3.

I still don't know if that CPU clock is right. This isn't a mobile device, so why clock it at 1.6 GHz and not a bit higher? I'd have at least aimed for 2 GHz.

I see how the HDMI-in port would get attention, but I think the bit about each CPU core having one independent thread that's not shared is more interesting.

GP007 said,
I still don't know if that cpu clock is right. This isn't a mobile device so why clock it at 1.6ghz and not a bit higher, I'd have at least aimed at 2ghz.

I see how the hdmi in port would get attention but I think the bit about each cpu core has one independent thread that's not shared as being more interesting.


Because it's using a mobile CPU; the Jaguar is a tablet CPU.

GP007 said,
I still don't know if that cpu clock is right. This isn't a mobile device so why clock it at 1.6ghz and not a bit higher, I'd have at least aimed at 2ghz.

I see how the hdmi in port would get attention but I think the bit about each cpu core has one independent thread that's not shared as being more interesting.

These days, higher clock speeds really only mean more heat. Keeping the heat down is pretty important for consumer devices. High-end gamers don't care, but moms and dads buying their kid an Xbox don't want to listen to it screaming half the house away.

GP007 said,
There's no real reason to use a mobile cpu in a non-mobile device. I'll wait for something more official.

Heat and price come to mind.

GP007 said,
There's no real reason to use a mobile cpu in a non-mobile device. I'll wait for something more official.

Wait all you want; it's an AMD Jaguar, GUARANTEED.

Heat at 2 GHz isn't going to be that much of a difference, but it'd help with performance in the end. The only thing they might be going for is less noise; still, we're not talking 3 GHz+ here, so the impact would be minor.

It's not just the CPU they have to worry about, it's the GPU as well; the Bobcat, which Jaguar is based on, runs at 1.5-1.7 GHz for a nice performance-to-power-efficiency ratio.

Sure they could overclock it but it's going to affect heat and power consumption.

By moving to a mobile-based CPU, you cut power consumption, heat, and price; this will probably mean that the console could be quite a bit smaller than the current Xbox 360 or the original 360.

This could be the Ethernet coming in from the TV over the HDMI cable. That isn't a list of external ports; it's more of a diagram of which hardware handles what.

Basically, the previously rumoured specs are true. It'll be interesting to see how it compares to the PS4 in real-world performance. It'll have more available memory than the PS4 but it's considerably slower (8GB DDR3 @ 68 GB/s vs. 4GB GDDR5 @ 192 GB/s).

Anaron said,
Basically, the previously rumored specs are true. It'll be interesting to see how it compares to the PS4 in real-world performance. It'll have more available memory than the PS4 but it's considerably slower (8GB DDR3 @ 68 GB/s vs. 4GB GDDR5 @ 192 GB/s).

I honestly don't think it will mean squat in real-world performance. If anything, Sony probably went with such fast RAM to compensate for skimping on the amount available, as it will need to be refilled more often.

I concur. These specs seem relatively weak and not far-sighted. The previous gen of consoles provided all sorts of CPU/GPU advancements; these provide nothing but fodder for the blogosphere.

The drawing seems rather amateurish compared to legit Microsoft documentation, too.

It sounds like you have to take into account the 32 MB ESRAM buffer; the specs say that, to the GPU, the peak bandwidth will be 170 GB/s.

Nas said,
The drawing seems rather amateurish compared to legit Microsoft documentation, too.

That was my first reaction too; it looks as though someone used Visio or something. That could mean an employee hand-drew it and then recreated it in Visio at home so he wouldn't get caught. Who knows?

The arrows even cover the "Gigabit Ethernet" label.

torrentthief said,

That was my first reaction too, looks as though someone used visio or something. That could mean an employee hand-drew it then used visio when he got home so that he doesn't get caught, who knows.

The arrows even cover "gigabit ethernet"'s text.

Yep, there is absolutely no doubt that this was made in Visio, pretty quickly!

Anaron said,
Basically, the previously rumoured specs are true. It'll be interesting to see how it compares to the PS4 in real-world performance. It'll have more available memory than the PS4 but it's considerably slower (8GB DDR3 @ 68 GB/s vs. 4GB GDDR5 @ 192 GB/s).
Yup, I think RAM will be a big advantage for the Xbox. With twice the memory, it'll once again be easier to program for (although nowhere near as radical a difference as it's been with the 360 vs. PS3), and it'll allow higher-res textures, more detail, more instances of everything, better caching of assets, and a lot more optimizations (the space-time tradeoff (http://en.wikipedia.org/wiki/Space%E2%80%93time_tradeoff)). With the size of CPU caches being so small compared to system memory, and the cost of invalidating them being so expensive, I don't think the memory speed will be a huge issue, except perhaps for streaming textures to the GPU. I'm not an expert, but we'll see.

Overall, both consoles are looking more powerful than I was anticipating, and they'll really push the technical envelope. Moreover, with a lot of headroom (especially on the Xbox), they'll be easier to program for, which could lead to higher-quality titles at less cost and with less development time.
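The space-time tradeoff linked above is easy to illustrate: memoization spends memory on cached results to buy back computation time. A minimal sketch using Python's standard library (the Fibonacci example is ours, not from the comment):

```python
# A toy space-time tradeoff: an unbounded cache trades RAM for speed.
from functools import lru_cache

@lru_cache(maxsize=None)   # memoize every result ever computed
def fib(n):
    """Naive recursive Fibonacci, made fast by caching subresults."""
    return n if n < 2 else fib(n - 1) + fib(n - 2)

print(fib(80))  # instant with the cache; exponential time without it
```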

Tpiom said,
They better make sure they don't get another fatal flaw like the Xbox 360 has (had?): Red Ring of Doom.

Well, as that has been resolved, I'd say they learned a lesson and we can all move on now...

Haha weak sauce.

The Durango GPU at 12 CUs (Orbis has 18 CUs) is already weaker, but the fact that each 'core' will ALSO have to do physics to offload some of the load from the CPU, which will have 2 cores reserved for the OS, whereas the Orbis has an APU capable of physics offloading from both the CPU AND GPU, means Microsoft is already on the back foot.

It's still going to drive just about any game at 60fps @ 1080p.

Most gamers don't care about graphics as much as most people think they do. I'm sure I'll get a bunch of people telling me I'm wrong there, but the vast majority of people would rather have an extra £100 in their pocket to buy games than have some extra polygons on their screens.

Consoles have succeeded because they bring a cheap, level playing field to gaming. You don't need to worry about upgrading just to run the latest games; it comes as a guarantee. Microsoft recognises this.

Pong said,
It's still going to drive just about any game at 60fps@1080p

Most gamers don't care about the graphics as much as most people think they do. I'm sure I'll get a bunch of people telling me I'm wrong there, but the vast majority of people would rather have an extra £100 on their pocket for buy games than have some extra polygons on their screens.

Console have succeeded because they bring a cheap, level playing field to gaming. You don't need to worry about upgrading just to run the latest games, it comes as a guarantee. Microsoft recognise this.

And I was quoting and meant you ^

Spirit Dave said,

And I was quoting and meant you ^


Clearly not. There is no way the Durango GPU, with 12 CUs that also have to do physics, reading from bandwidth-starved memory, with only 6 of the CPU's 8 cores available, is going to do 1080p60 at the graphical fidelity next gen is aiming at.

1080p30 is the baseline.

TheLegendOfMart said,

Clearly not, there is no way the Durango GPU with 12CU that has to do Physics as well as reading from bandwidth starved memory with 6 cores out of 8 available from the CPU is going to be able to do 1080p60 at the graphical fidelity next gen is aiming at.

1080p30 is the baseline.

I wonder if those future Xbox documents that were leaked still hold true.

VR or augmented reality headsets ... it would be interesting to see if that comes as a mid-life refresh, taking the overall performance into account.

I don't see why not, with the Oculus Rift on the way. I think they might hedge their bets and see if the Oculus sells.

I'd love both consoles to have some kind of VR like the Oculus.

TheLegendOfMart said,

Clearly not, there is no way the Durango GPU with 12CU that has to do Physics as well as reading from bandwidth starved memory with 6 cores out of 8 available from the CPU is going to be able to do 1080p60 at the graphical fidelity next gen is aiming at.

1080p30 is the baseline.

Sounds like you're just pulling that out of thin air?

Where do you get the 6-cores-out-of-8 bit? It doesn't say that anywhere in the specs. All it says is that each of the 8 cores has one independent "thread" that's not shared. It doesn't say anything about 2 cores being reserved just for the OS.

TheLegendOfMart said,

Clearly not, there is no way the Durango GPU with 12CU that has to do Physics as well as reading from bandwidth starved memory with 6 cores out of 8 available from the CPU is going to be able to do 1080p60 at the graphical fidelity next gen is aiming at.

1080p30 is the baseline.

Sorry man, my point was about consumers/gamers giving a crap. Gaming is not as much about polygons and shaders as it is about gaming. So I think the majority don't care whether it's 1.6 GHz or 2 GHz, as long as the games rock. And I'm sorry, but I still sit in awe when I look at Battlefield 3 on the PS3; it's gorgeous, and no one I know sits there complaining about the resolution or lack of polygons while playing it. I think the art team was exceptional, and that's all that matters.

TheLegendOfMart said,
I don't see why not with the occulus rift on the way, I think they might hedge their bets and see if Occulus sells.

I'd love both consoles to have some kind of VR like Occulus.

Like how the 360 is inferior to the PS3 generally, yet it's the most popular because content > slightly better specs.

>Sorry for quoting the wrong post.

I really wouldn't mind myself! I was just hoping that the GPU specs might reveal more as to whether it would be possible to support such add-ons.

I think the Rift has to run at 120 fps, but at an obviously lower resolution. The dev kit runs at 720p, and their aim is 1080p for the consumer version.

I wonder if it is somehow possible to attach an external GPU through the HDMI port, like some laptops do with PCIe.

And watch Sony's R&D screw up all that horsepower by laying out the system board while drunk and bottlenecking the hell out of it.

Brandon Live said,

Sounds like you're just pulling that out of thin air?


No, I just love this kind of stuff and have done a lot of reading on the internet, including lots of corroborating rumours and information.

AR556 said,
And watch Sony's R&D screw up all that horsepower by laying out the system board while drunk and bottlenecking the hell out of it.

Sony's stuff is always ridiculously difficult to program for. You'd think after all this time that they would have learned.

Um, they have; the PS4 is going to have the same, if not a very similar, CPU: 8x 1.6 GHz cores.

The PS4 also has GDDR5 as system ram which has way more bandwidth than Durango.

Xerax said,

Like how the 360 is inferior to the PS3 generally, yet it's the most popular because content > slightly better specs.

>Sorry for quoting the wrong post.

As it happens, of the three consoles, the Wii has sold more worldwide than both the PS3 and Xbox 360, yet the PS3 is slightly ahead of the Xbox 360. So in fact, the Xbox is the lowest-selling games console of this last generation.

Spirit Dave said,

As it happens, of the three consoles, the Wii has sold more worldwide than both the PS3 and Xbox 360, yet the PS3 is slightly ahead of the Xbox 360. So in fact, the Xbox is the lowest-selling games console of this last generation.

I was talking purely about 360 vs. PS3. But I had no idea the PS3 was slightly ahead of the 360. I guess I forgot about how well it sells in japan and the rest of asia.

Thanks for the correction.

Xerax said,

I was talking purely about 360 vs. PS3. But I had no idea the PS3 was slightly ahead of the 360. I guess I forgot about how well it sells in japan and the rest of asia.

Thanks for the correction.

The Japan thing is precisely it. MS can't shift units there for love nor money. The Japanese seem to hate Microsoft. Dunno why to be honest. Good console generally, with good games. I was a PS3 guy when I played games. Don't get me wrong, I owned Xbox 360, Wii and PS3, but I loved using my PS3 for the controller, and the exclusive games. But it doesn't seem to make any sense beyond a racial or cultural lack of tolerance as to why MS doesn't sell in Japan.

Spirit Dave said,

As it happens, of the three consoles, the Wii has sold more worldwide than both the PS3 and Xbox 360, yet the PS3 is slightly ahead of the Xbox 360. So in fact, the Xbox is the lowest-selling games console of this last generation.

VGChartz has Xbox 360 at 73.8M vs. 72.2M for PS3.

Spirit Dave said,

As it happens, of the three consoles, the Wii has sold more worldwide than both the PS3 and Xbox 360, yet the PS3 is slightly ahead of the Xbox 360. So in fact, the Xbox is the lowest-selling games console of this last generation.

The Wii sold more units, but Nintendo wasn't very happy about its long-term performance. They only made a bit of profit on each Wii, and the game-attachment rate was incredibly low compared to the others. The average X360 owner buys 12 games, the average PS3 owner buys 6 games and the average Wii owner buys 3 games.

The reason why the PS3 is so far behind the X360 is because core gamers are on the X360. Right now the PS3 is outselling the X360, but at cost price, and the people buying now aren't very active gamers.

And considering where they were coming from, Microsoft has had the best run this generation.

Seems highly speculative to base performance on 16 vs 12 shader units when there's so much more to making games. Furthermore, why would the physics be offloaded to the GPU when there are 8 CPU cores?

helios01 said,
Seems highly speculative to base performance on 16 vs 12 shader units when there's so much more to making games. Furthermore, why would the physics be offloaded to the GPU when there are 8 CPU cores?

Because he doesn't know as much as he thinks he does, and his memory information is faulty at a basic level as well.

But this has been argued elsewhere.

Spirit Dave said,
The Japanese seem to hate Microsoft. Dunno why to be honest.

It's not really about hating MS. Fact is that PS just has the exclusives that are most wanted/played in Japan. That's not going to change.

The Xbox 360 is an amazing console, arguably better than the PS3. And because of its architecture/OS, it's easier to develop for and can output the same or better performance than the PS3, which has more powerful hardware.
That said, I still prefer the PS3, just because it has the games that I want.

TheLegendOfMart said,
Haha weak sauce.

The Durango GPU at 12 CUs (Orbis has 18 CUs) is already weaker, but the fact that each 'core' will ALSO have to do physics to offload some of the load from the CPU (which will have 2 cores reserved for the OS), whereas the Orbis has an APU capable of offloading physics from both the CPU AND GPU, means Microsoft is already on the back foot.

Every line of that is rabid fanboyism added to pure speculation. It's best not to state such things until there are officially released facts; otherwise you may appear foolish.
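For anyone who wants to sanity-check the raw numbers being thrown around: assuming AMD GCN-style compute units (64 shader lanes each, doing one multiply-add, i.e. 2 FLOPs, per lane per clock) and the rumoured 800 MHz clock on both chips — all of which are assumptions at this point, not confirmed specs — the peak figures work out like this:

```python
# Rough peak-throughput comparison for the rumoured GPUs.
# Assumptions (unconfirmed): GCN-style CUs with 64 lanes each,
# 2 FLOPs per lane per clock (multiply + add), 800 MHz on both chips.

def peak_gflops(cus, lanes_per_cu=64, flops_per_lane=2, clock_ghz=0.8):
    return cus * lanes_per_cu * flops_per_lane * clock_ghz

durango = peak_gflops(12)  # rumoured Durango: 12 CUs
orbis = peak_gflops(18)    # rumoured Orbis: 18 CUs

print(f"Durango: {durango:.0f} GFLOPS")      # ~1229, i.e. the leak's "1.2 trillion"
print(f"Orbis:   {orbis:.0f} GFLOPS")        # ~1843
print(f"Raw ratio: {orbis / durango:.1f}x")  # 1.5x on paper
```

Note that 12 CUs x 64 lanes also matches the leak's 768 threads, so at least the rumour is internally consistent. Raw FLOPS is of course not the whole story.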

TheLegendOfMart said,
Haha weak sauce.

The Durango GPU at 12 CUs (Orbis has 18 CUs) is already weaker, but the fact that each 'core' will ALSO have to do physics to offload some of the load from the CPU (which will have 2 cores reserved for the OS), whereas the Orbis has an APU capable of offloading physics from both the CPU AND GPU, means Microsoft is already on the back foot.

If by APU you mean AMD's APU, then an APU is the combo of a CPU and GPU, so you're basically saying the PS4 has its normal CPU and GPU plus a secondary CPU/GPU?

TheLegendOfMart said,
Um, they have; the PS4 is going to have the same, if not a very similar, CPU: 8x 1.6 GHz cores.

The PS4 also has GDDR5 as system ram which has way more bandwidth than Durango.

GDDR5 is actually just the graphics version of DDR3, so I don't think it will have any more bandwidth.

evilsushi said,

GDDR5 is actually just the graphics version of DDR3, so I don't think it will have any more bandwidth.


It actually does, because by nature GDDR5 runs at much higher clockspeeds (think 7-9GHz) and thus is able to move a crap ton more data.

This is true, but by having more to work with, they're not in such a need to keep refilling it over and over. At least that's one possible way to look at it. These numbers are still rumours; we'll have to see if they hold up in the end.
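On the bandwidth point: peak memory bandwidth is just bus width times effective transfer rate, so the gap is easy to illustrate. The 256-bit buses and data rates below are illustrative assumptions, not confirmed specs for either console:

```python
# Peak memory bandwidth = (bus width in bytes) x (effective transfer rate).
# The bus widths and data rates here are illustrative assumptions only.

def bandwidth_gbs(bus_bits, gigatransfers_per_sec):
    return (bus_bits / 8) * gigatransfers_per_sec

ddr3 = bandwidth_gbs(256, 2.133)  # e.g. 256-bit DDR3-2133
gddr5 = bandwidth_gbs(256, 5.5)   # e.g. 256-bit GDDR5 at 5.5 GT/s

print(f"DDR3 example:  {ddr3:.1f} GB/s")   # ~68.3, lines up with the rumoured 68 GB/s
print(f"GDDR5 example: {gddr5:.1f} GB/s")  # 176.0
```

So the GDDR5 advantage comes from the effective data rate per pin, not from being a fundamentally different kind of memory.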

TheLegendOfMart said,
Um, they have; the PS4 is going to have the same, if not a very similar, CPU: 8x 1.6 GHz cores.

The PS4 also has GDDR5 as system ram which has way more bandwidth than Durango.

Not only are you comparing one unsubstantiated rumor to another, but you're overlooking the claim about embedded SRAM, much like the 360 has today which makes a huge difference (the 360 has 10MB of EDRAM/SRAM which is sized ideally for 720p output, but usable for a 1080p framebuffer with more complexity, i.e. tiling). That has the effect of taking the most bandwidth-intensive memory usage out of the equation for the main system RAM.
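The "sized ideally for 720p" point checks out with some back-of-the-envelope arithmetic, assuming a common 32-bit colour plus 32-bit depth/stencil layout per pixel (actual render-target formats vary):

```python
# Rough framebuffer sizes: 4 bytes of colour + 4 bytes of depth/stencil
# per pixel is assumed here; real layouts (MSAA, HDR formats) differ.

MB = 1024 * 1024

def framebuffer_mb(width, height, bytes_per_pixel=8):
    return width * height * bytes_per_pixel / MB

print(f"720p:  {framebuffer_mb(1280, 720):.2f} MB")   # 7.03 -> fits in 10 MB of EDRAM
print(f"1080p: {framebuffer_mb(1920, 1080):.2f} MB")  # 15.82 -> needs tiling
```

Which is exactly why a 1080p target on the 360 needs tiling, and why 32 MB of ESRAM would be a far more comfortable fit.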

FunkyMike said,
x64 will have a nice waterfall effect on PC gaming )

Why? The Xbox 360 was already a 64-bit machine, and so was the PS3; we've been coding in 64-bit in the gaming world since the N64, technically.

neufuse said,

Why? The Xbox 360 was already a 64-bit machine, and so was the PS3; we've been coding in 64-bit in the gaming world since the N64, technically.


Because this time the consoles will have x86-64 CPUs.

neufuse said,

Why? The Xbox 360 was already a 64-bit machine, and so was the PS3; we've been coding in 64-bit in the gaming world since the N64, technically.

x64 is a specific instruction set. It's used by most Intel and AMD CPUs in desktops and laptops. The Xbox 360 was 64-bit, yes, but it used a PowerPC instruction set.

We have no 100% confirmation this is the Intel/AMD x64 spec, and x64 has been a marker on all CPUs for a long time... we had Alpha x64 chips, not the same instruction set as x86-64 or AMD64...

And these could be PowerPC chips again; at 1.6 GHz I'd almost expect them to be PPC, not Intel/AMD based.

neufuse said,
We have no 100% confirmation this is the Intel/AMD x64 spec, and x64 has been a marker on all CPUs for a long time... we had Alpha x64 chips, not the same instruction set as x86-64 or AMD64...

And these could be PowerPC chips again; at 1.6 GHz I'd almost expect them to be PPC, not Intel/AMD based.

Yes of course we have no confirmation, but pretty much for the last 13 years the term "x64" has been related to the commonly used x64 instruction set. I don't know any computer scientist who would say "x64" when they meant "64-bit", and I know a lot of computer scientists. Those are completely different concepts.

neufuse said,
We have no 100% confirmation this is the Intel/AMD x64 spec, and x64 has been a marker on all CPUs for a long time... we had Alpha x64 chips, not the same instruction set as x86-64 or AMD64...

And these could be PowerPC chips again; at 1.6 GHz I'd almost expect them to be PPC, not Intel/AMD based.


They are AMD Jaguar chips; the fact that they come in groups of four cores with L2 cache shared amongst those 4 cores means the CPU will be made up of two Jaguar compute units.
http://www3.pcmag.com/media/im...ar-compute-unit.jpg?thumb=y

The only waterfall effect it'll have is encouraging developers (at least on the PC side) to not worry about keeping textures and content under 2 GB (virtual address limit on x86).

I would be surprised if Microsoft went back to the Intel platform for its console. If they do, I'd be interested in the anti-piracy technology they plan to use.

ModernMech said,

Yes of course we have no confirmation, but pretty much for the last 13 years the term "x64" has been related to the commonly used x64 instruction set. I don't know any computer scientist who would say "x64" when they meant "64-bit", and I know a lot of computer scientists. Those are completely different concepts.

Yet IBM calls the 64-bit version of a PPC chip a PPC x64 chip, using the x64 term... the "correct" terms are x86-64 and AMD64, not x64, which is a generic marker for 64-bit.

Yet IBM calls the 64-bit version of a PPC chip a PPC x64 chip, using the x64 term... the "correct" terms are x86-64 and AMD64, not x64, which is a generic marker for 64-bit.

Obviously I cannot prove IBM has never done this, but do you have any citations to support your claim? I cannot find anything searching Google. x64 isn't even mentioned in the Instruction Set Architecture manual, although "64-bit" is frequently mentioned.

Again, as I said in my last post, as a computer scientist who knows many computer scientists, we never use "x64" as a "generic marker for 64-bit". Never.
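To illustrate the distinction being argued here: word size and instruction set are separate questions a program can ask. For example, in Python (the output obviously varies by machine):

```python
# "64-bit" describes a word/pointer size; "x64" names a specific ISA
# (x86-64 / AMD64). The two can be queried independently.

import platform
import struct

word_size = struct.calcsize("P") * 8  # pointer width of this process: 32 or 64
isa = platform.machine()              # machine/ISA name: e.g. 'x86_64', 'AMD64', 'ppc64'

print(f"word size: {word_size}-bit, machine: {isa}")
```

A PowerPC box and an AMD box can both report a 64-bit word size while returning completely different machine strings, which is exactly the "64-bit is not the same as x64" point.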