EA VP expects Xbox Two and PS5 to launch in the next 5 to 6 years

One of the first people to get a PS4, seven years after the PS3 launch.

Sony launched the PlayStation 4 in the U.S. two weeks ago, seven years after the PS3 was first released. The wait for Microsoft's Xbox One was even longer, as it launched last week eight years to the day after the Xbox 360 first went on sale.

Until now, the gap between console hardware launches was about five to six years, and at least one game publishing executive believes we will return to that time frame for the successors of the Xbox One and PS4. In a chat with MCV, EA Studios executive vice-president Patrick Söderlund stated, "This console cycle may have gone on a little bit longer than I would have wanted." However, he added, "... a five, six year gap is what I expect going forward."

Although the Xbox 360 and PS3 have had longer lifespans than most other consoles, Söderlund also says the extended cycle led to some great games being released for those platforms just before the Xbox One and PS4 launched. He stated, "... you have seen games like The Last of Us and GTA V at the end of a cycle which perhaps you would not have expected a few years ago."

Source: MCV


45 Comments


This is something I covered a couple of months ago, which also states that we should expect to see something from Sony by 2020. With the Wii U not truly a next-gen system, Nintendo will probably release something around 2015/16, and I don't see Sony and Microsoft waiting another five or six years to answer back. Here's the link; it's very basic info. <snipped>

Edited by Eric, Dec 8 2013, 9:35pm

I somehow have my doubts. His whole premise is based on the idea that everything remains "business as usual". The industry is anything but.

Cloud gaming, plus the likelihood of both staying on the x86 platform, means that they can update the hardware and still retain backwards compatibility.

I'm not so sure about 5-6 years; if they can milk more years out of selling consoles, games and subscriptions, why bother spending money on R&D for a new console?

I'm hoping that the Steambox and a Nintendo Wii 3 take a decent market share to give those two boxes some competition.

If Nintendo wants any chance of staying alive in the non-handheld console market, they are going to have to release a console with specs better than the PS4's and use a regular controller.

Personally, I'm pretty sure the Xbox One / PS4 will stay around longer than the 360 / PS3, for a simple reason: CPUs/GPUs aren't evolving as fast as they were when the 360 / PS3 launched.

Another reason: launching a new console is expensive. Profits aren't there for the first two years, for the simple reason that not enough people own the console and buy games yet. Consoles generate profits over the long term.

Xbox Two? More like Xbox Four...

Xbox: Launched 2001
Xbox 360: Launched 2005
Xbox One: Launched 2013

Though, going by the above pattern (4 years between the Xbox and the 360, 8 years between the 360 and the Xbox One), one would expect the next version 12-16 years from now.

It might be supported for a decade, but likely MS and Sony will release "hardware revisions" that push the specs up just a bit more, gradually obsoleting the current-gen consoles.

spenser.d said,
I hope not, and kinda doubt it. Worked out pretty well for them this time around.

I don't know; when you think about it, they're not selling the hardware at a loss from the start, and upgrading the SoCs in these systems should be easier to do and require very little R&D spending to work out. Hell, in 5-6 years they could easily upgrade just the CPU part from 1.6-1.7GHz to something like 2.4-2.6GHz with little effort.

That aside, staying x86-based means you don't make a break with the established ecosystem like they've been doing. It keeps developers happy, costs go down, and they don't have to learn something new.

alwaysonacoffebreak said,
Oh, give me a break with the XBO chipset already. Just because MS had a say in the SoC, it's still a Jaguar, just a little tweaked.

It's not an off-the-shelf Jaguar SoC, not by far. MS is deploying a few new things in the XBO chipset that haven't really been seen before in a chipset...

No no, of course development has been free, with a slightly customized default chipset...

In 5 years, you won't need a new console. Games will be rendered in the cloud, and you could use many types of devices to play.

vcfan said,
In 5 years, you won't need a new console. Games will be rendered in the cloud, and you could use many types of devices to play.

But people will complain because that needs an always-on connection, not everybody has fast internet, NSA spying, and blah blah blah...

IvoFajardo said,

But people will complain because that needs an always-on connection, not everybody has fast internet, NSA spying, and blah blah blah...

In 5-6 years, though. Look at tech and infrastructure 5-6 years ago. Stuff moves fast.

vcfan said,
In 5 years, you won't need a new console. Games will be rendered in the cloud, and you could use many types of devices to play.

That already exists; it's called OnLive.

Prosidius said,

That already exists; it's called OnLive.

I mean that with the infrastructure advancements it will actually work, and be more accessible to a lot of people. OnLive is pretty shoddy.

vcfan said,
In 5 years, you won't need a new console. Games will be rendered in the cloud, and you could use many types of devices to play.

Latency will still be an issue, and that's dictated by the speed of light; so unless you can make light go faster (quantum entanglement is an option), you would need at least one cloud server per city to make this viable for gamers.

Take this study for example and you will see what I'm talking about: http://chimera.labs.oreilly.co...h01.html#LATENCY_COMPONENTS
Even in the ideal case you have 14ms latency between New York and San Francisco (28ms roundtrip), but being realistic with current fiber optics you get a 42ms roundtrip. That's about 40-50% of extra input latency added over most gaming-grade monitors, or even double the input lag of the well-renowned Asus VG248QE 144Hz screen (which has around 23-25ms of input lag). And all this without taking into account every other possible source of lag like the load balancer, server processing, network hiccups, and whatnot... and in a single country.
Now try to go global with this solution.
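
As a sanity check on those figures, here is a minimal Python sketch of the propagation-delay math. The great-circle distance and the fiber refractive index are approximate assumptions of mine, not numbers taken from the linked study:

```python
# Back-of-the-envelope check of the NYC-SF latency figures above.
# Distance and refractive index are approximate assumptions.

C_VACUUM_KM_S = 299_792          # speed of light in vacuum, km/s
FIBER_REFRACTIVE_INDEX = 1.47    # typical for optical fiber: light travels
                                 # ~1.47x slower than in vacuum
NYC_TO_SF_KM = 4_148             # rough great-circle distance

def one_way_ms(distance_km: float, refractive_index: float = 1.0) -> float:
    """Propagation delay in milliseconds along a straight-line path."""
    speed = C_VACUUM_KM_S / refractive_index
    return distance_km / speed * 1000

vacuum = one_way_ms(NYC_TO_SF_KM)                         # ~13.8 ms
fiber = one_way_ms(NYC_TO_SF_KM, FIBER_REFRACTIVE_INDEX)  # ~20.3 ms

print(f"vacuum: {vacuum:.1f} ms one-way, {2 * vacuum:.1f} ms roundtrip")
print(f"fiber:  {fiber:.1f} ms one-way, {2 * fiber:.1f} ms roundtrip")
# Real routes are longer than the great-circle path and add routing and
# queuing delay, which is why ~42 ms is quoted as the realistic roundtrip.
```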

gonchuki said,

Latency will still be an issue, and that's dictated by the speed of light; so unless you can make light go faster (quantum entanglement is an option), you would need at least one cloud server per city to make this viable for gamers.

Take this study for example and you will see what I'm talking about: http://chimera.labs.oreilly.co...h01.html#LATENCY_COMPONENTS
Even in the ideal case you have 14ms latency between New York and San Francisco (28ms roundtrip), but being realistic with current fiber optics you get a 42ms roundtrip. That's about 40-50% of extra input latency added over most gaming-grade monitors, or even double the input lag of the well-renowned Asus VG248QE 144Hz screen (which has around 23-25ms of input lag). And all this without taking into account every other possible source of lag like the load balancer, server processing, network hiccups, and whatnot... and in a single country.
Now try to go global with this solution.


Latency is only a real problem if you're "rendering" and streaming games sequentially as you play, which was your assumption. But you could also download small parts of a game (2-5% of the total game, for example) ahead of time, then download latency-insensitive parts later on as you play. OnLive didn't use that model, which is why the service was slow a lot of the time.

Games where latency isn't an issue could fall back to the other model: streaming on demand without an initial download.
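
For illustration, here is a rough sketch of that hybrid model. The chunk names, sizes, and the 3% up-front budget are all invented for the example, not taken from any real service:

```python
# Hypothetical sketch of the hybrid model described above: play starts
# once a small "critical" slice is local, while the rest streams in the
# background. All names and sizes are illustrative assumptions.

from collections import deque

def plan_downloads(chunks, critical_fraction=0.03):
    """Split game chunks into an up-front set and a background queue."""
    total = sum(size for _, size in chunks)
    budget = total * critical_fraction
    upfront, background = [], deque()
    acquired = 0
    for name, size in chunks:          # chunks assumed ordered by priority
        if acquired + size <= budget:
            upfront.append(name)
            acquired += size
        else:
            background.append(name)
    return upfront, background

chunks = [("boot+menus", 200), ("level1", 400), ("level2", 900),
          ("level3", 1200), ("cutscenes", 5000)]   # sizes in MB
upfront, background = plan_downloads(chunks)
print("download before play:", upfront)            # ['boot+menus']
print("stream while playing:", list(background))
```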

vcfan said,
In 5 years, you won't need a new console. Games will be rendered in the cloud, and you could use many types of devices to play.
I disagree; I think the trend in the next few years will be towards higher framerates and lower latencies: G-Sync, 60fps console games, 120+Hz monitors and VR are all gaining traction. Expectations will increase, and that will make it even harder than it has been so far to stream games across the internet. Even with internet infrastructure on the level of South Korea's, internet latencies are simply too high to provide the kind of reactivity gamers will expect.
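
To put rough numbers on that, a quick frame-budget calculation; the 42ms roundtrip figure comes from the latency discussion above, everything else is simple arithmetic:

```python
# Frame-time budgets at common refresh rates, versus a realistic
# cross-country network roundtrip (~42 ms, per the discussion above).

NETWORK_RTT_MS = 42

for hz in (30, 60, 120, 144):
    frame_ms = 1000 / hz
    frames_lost = NETWORK_RTT_MS / frame_ms
    print(f"{hz:>3} Hz: {frame_ms:5.1f} ms/frame; "
          f"the roundtrip alone costs ~{frames_lost:.1f} frames")
# At 144 Hz the network roundtrip swallows ~6 frames before encoding,
# decoding, or display latency are even counted.
```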

Andre S. said,
I disagree; I think the trend in the next few years will be towards higher framerates and lower latencies: G-Sync, 60fps console games, 120+Hz monitors and VR are all gaining traction. Expectations will increase, and that will make it even harder than it has been so far to stream games across the internet. Even with internet infrastructure on the level of South Korea's, internet latencies are simply too high to provide the kind of reactivity gamers will expect.

All those things may matter to PC gamers, who make up a small market compared to consoles. Until just a few weeks ago, console gamers were still playing a lot of sub-HD games, with tearing, and with disc drives that sound like jet engines.

Local hubs and servers, advancements in tech, and cost feasibility will kill the latency problem. Chips are getting smaller, cheaper, faster. It would be more convenient for these machines to be held offsite, with all we need being dumb terminals that can get access from anywhere, anytime.

gonchuki said,

Latency will still be an issue, and that's dictated by the speed of light; so unless you can make light go faster (quantum entanglement is an option), you would need at least one cloud server per city to make this viable for gamers.

Take this study for example and you will see what I'm talking about: http://chimera.labs.oreilly.co...h01.html#LATENCY_COMPONENTS
Even in the ideal case you have 14ms latency between New York and San Francisco (28ms roundtrip), but being realistic with current fiber optics you get a 42ms roundtrip. That's about 40-50% of extra input latency added over most gaming-grade monitors, or even double the input lag of the well-renowned Asus VG248QE 144Hz screen (which has around 23-25ms of input lag). And all this without taking into account every other possible source of lag like the load balancer, server processing, network hiccups, and whatnot... and in a single country.
Now try to go global with this solution.

This is all true, except in practice it only applies to multi-player games, which are already dealing with latency; having the game hosted on a centralized cloud server helps 99% of the time.

This applies to both server 'hosting' and server 'rendering/streaming' of a game. However, with server 'rendering/streaming', bandwidth becomes the bigger concern, since it has to handle high-resolution video.

In single-player games using server 'rendering/streaming', the latency/lag can usually be compensated for so that the rendering and the user's reaction time can be coordinated. This lets the user feel like the system is responding to them, when it is actually compensating for/waiting on their input to adjust for latency.


Multi-player is where latency is already a major problem, especially with peer hosting; it's why in some rounds of CoD or Halo you feel like a god who cannot die if you are on the good side of the latency compensation, and why you feel like you die easily if you are on the bad side of it.

In multi-player, 99% of the time a server-'hosted' game is better and more consistent, as latency can be managed by the authority/server and slight compensations can be made for each client equally (see the sketch after this comment).

So 'cloud'-based hosting for multi-player is better for managing latency.

This also applies to 'cloud'-based 'rendering/streaming', as it can compensate for latency in the same way; however, its bigger problem is bandwidth.

So multi-player server 'rendering/streaming' of a game has two issues to deal with, reducing how well it works.

This is where end-user bandwidth having to handle a huge chunk of video and the inherent latency of the speed of light become a compounded problem.

Cloud/server-hosted gaming is going to be better than the world of mostly peer solutions we are used to, and even 'cloud/server'-rendered/streamed gaming won't be horrible, but it won't be 'ideal' either.

I do think there is a near future where we will have insane bandwidth and no latency by having consoles that share entangled particles with the server. They would also have essentially 100% secure communication between the server and the clients.
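
Since several comments here lean on server-side latency compensation, here is a minimal Python sketch of the "rewind" technique as commonly described: the server re-checks a shot against where targets were when the shooter actually fired. All class and field names are hypothetical, and a real implementation tracks far more state:

```python
# Minimal sketch of server-side lag compensation ("rewind").
# Names, timings, and data shapes are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class Snapshot:
    time_ms: int
    positions: dict          # player_id -> (x, y)

class LagCompensator:
    def __init__(self, history_ms=1000):
        self.history = []            # recent world snapshots
        self.history_ms = history_ms

    def record(self, snap: Snapshot):
        self.history.append(snap)
        cutoff = snap.time_ms - self.history_ms
        self.history = [s for s in self.history if s.time_ms >= cutoff]

    def rewind(self, server_time_ms: int, shooter_latency_ms: int):
        """Return the snapshot closest to when the shooter fired."""
        fire_time = server_time_ms - shooter_latency_ms
        return min(self.history, key=lambda s: abs(s.time_ms - fire_time))

comp = LagCompensator()
comp.record(Snapshot(1000, {"target": (10, 0)}))
comp.record(Snapshot(1050, {"target": (12, 0)}))
comp.record(Snapshot(1100, {"target": (14, 0)}))

# A shooter with 60 ms latency fires; the server receives it at t=1100,
# so the hit is validated against the world as of roughly t=1040.
past = comp.rewind(server_time_ms=1100, shooter_latency_ms=60)
print(past.time_ms, past.positions["target"])   # 1050 (12, 0)
```

This is also why the "good side / bad side" feeling described above happens: the server honors the shooter's past view of the world, at the victim's expense.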

vcfan said,
In 5 years, you won't need a new console. Games will be rendered in the cloud, and you could use many types of devices to play.

Much of the US is still on dial-up.

Who knows; now that Microsoft and Sony are making a profit out of the gate with each console, they might not have to make each generation last as long.

McKay said,
Who knows; now that Microsoft and Sony are making a profit out of the gate with each console, they might not have to make each generation last as long.

It should be easier for them to update the hardware now that it's basically x86 with a few tweaks, really. For example, it shouldn't be hard to upgrade the CPU part to a faster one, and the GPU as well.

Shadowzz said,
Yeah, cause releasing a PS4 with a much-improved Cell CPU would've been such a hard task.

edit: /s just in case

Maybe they were fed up with inferior ports due to devs having difficulty with it.

McKay said,

Maybe they were fed up with inferior ports due to devs having difficulty with it.


Or the power of x86 increased enough to carry the weight of the games for years to come.

I totally expect this. Mostly the same machines, running on faster SoCs. This would allow for full backwards compatibility, without holding back progression.

SikSlayer said,
I totally expect this. Mostly the same machines, running on faster SoCs. This would allow for full backwards compatibility, without holding back progression.

That's what I was going to say: it's possible to update both these systems and not lose backwards compatibility this time around, since they're both x86-based. All they'd have to do is update the CPU/GPU specs like any PC and we'd be good to go.

SikSlayer said,
I totally expect this. Mostly the same machines, running on faster SoCs. This would allow for full backwards compatibility, without holding back progression.

It's what makes the most sense going forward, now that they've all converged on x86 and desktop-class GPUs.

That is so not true... Even if they keep the same processor, they can switch the architecture and break compatibility... There's a lot that makes up a computer/console architecture, like the CPU, GPU, memory controller, sound controller... I don't think they will maintain compatibility in the future, and from the look of things I think the next generation is probably going to be even further away than this one... We are likely to see more mobile and network/cloud in the meantime...

I sure hope so. Delaying the next generation too long prevents technical advancement and holds gaming back as a whole. 64-bit gaming is way overdue, for instance, and is only beginning to happen now because of the new consoles.

DClark said,
64-bit gaming? Nintendo 64 came out in 1996.
Yes, but all PC games were 32-bit back then, and they were still all 32-bit last year, because all these consoles used small amounts of memory and moving to 64-bit was irrelevant on PC. Now that the new consoles have 8GB of RAM, we're finally seeing games cross the 2GB barrier on PC (requiring a move to 64-bit), and that's a major technical advancement for gaming.
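
As a small aside, here is a quick Python check of the pointer-size/address-space point; the 2 GiB figure quoted is the default per-process user address space on 32-bit Windows:

```python
# Tiny illustration of the "2GB barrier" mentioned above: a 32-bit
# process can address at most 2^32 bytes (and Windows reserves a chunk
# of that for the system), while 64-bit pointers remove that ceiling.

import struct

pointer_bits = struct.calcsize("P") * 8   # size of a native pointer
print(f"this Python is {pointer_bits}-bit")

addressable = 2 ** pointer_bits
print(f"theoretical address space: {addressable / 2**30:.0f} GiB"
      if pointer_bits == 32 else
      f"theoretical address space: {addressable / 2**60:.0f} EiB")

# On 32-bit Windows a process normally gets only 2 GiB of its 4 GiB
# address space for user data, which is the wall games were hitting.
print(f"32-bit user space on Windows by default: {2**31 / 2**30:.0f} GiB")
```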

Andre S. said,
Yes, but all PC games were 32-bit back then, and they were still all 32-bit last year, because all these consoles used small amounts of memory and moving to 64-bit was irrelevant on PC. Now that the new consoles have 8GB of RAM, we're finally seeing games cross the 2GB barrier on PC (requiring a move to 64-bit), and that's a major technical advancement for gaming.

One slight note.

Moving to 64-bit is not just about accessing more RAM. This is the crap Apple spread for years when people complained OS X was only 32-bit on a 64-bit CPU.

With a 64-bit CPU and a 64-bit OS, there are a lot of performance improvements.

Windows x64 also speeds up 32-bit applications with various tricks, like combining RAM reads/writes, on top of the OS and its drivers just running faster.

That last detail about Windows x64 is why game developers didn't focus as much on 64-bit as they could have, as their 32-bit versions were already getting a boost from the Windows x64 OS.

However, you are correct that there is a place for 64-bit gaming that can bring a bump in performance and handle various game complexities.

Mobius Enigma said,
Moving to 64-bit is not just about accessing more RAM. This is the crap Apple spread for years when people complained OS X was only 32-bit on a 64-bit CPU.

With a 64-bit CPU and a 64-bit OS, there are a lot of performance improvements.

Windows x64 also speeds up 32-bit applications with various tricks, like combining RAM reads/writes, on top of the OS and its drivers just running faster.

Could you provide some kind of reference for your statements? This is not what I've usually been reading. From Intel's developer reference (http://software.intel.com/en-u...-to-64-bit-applications-en/):

Do we have to move to 64 bits?
Like with any other question related to performance, the answer depends on a particular situation. Anyway, the following pros and cons should be taken into account.

Pros:

the most important advantage of 64-bit processes is increased address space;
optimized 64-bit mathematics;
the 64-bit kernel of an operating system uses a larger amount of available memory to improve many aspects of work.

Cons:

you need more memory for many operations (pointers occupy a larger size, especially in managed code that contains references all over the code);
the effective part of processor [cache] is smaller (if we compare 32-bit and 64-bit modes) due to the same reason;
the size of code also increases because of additional prefixes and instructions containing 8-byte operands instead of 4-byte ones.

Therefore, any code that works well on 32 bits, contains no 64-bit arithmetic (i.e. does not otherwise use the new capabilities of the 64-bit processor), and does not require more than 2 GB of memory sees only disadvantages when launched on a 64-bit operating system: a larger amount of memory consumed and some operational slow-down.

However, in many cases the benefits outweigh the above-mentioned disadvantages, which is of crucial importance for developers. For instance, many applications reach the memory limit. Besides, a port to 64-bit mathematics provides a significant performance gain for some applications; for instance, this holds true for applications working with graphics, video coding, etc.
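
To make the "pointers occupy a larger size" con concrete, a back-of-the-envelope calculation; the million-entry figure is just an illustrative assumption:

```python
# Rough illustration of the pointer-size cost mentioned in the cons:
# the same array of a million pointers costs twice as much on 64-bit.

entries = 1_000_000
for bits in (32, 64):
    bytes_per_pointer = bits // 8
    print(f"{bits}-bit: {entries * bytes_per_pointer / 2**20:.0f} MiB "
          f"for {entries:,} pointers")
# ~4 MiB vs ~8 MiB: pointer-heavy data structures grow, and fewer of
# them fit in the same CPU cache, which is the cache con above.
```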

Andre S. said,
Could you provide some kind of reference for your statements? This is not what I've usually been reading. From Intel's developer reference (http://software.intel.com/en-u...-to-64-bit-applications-en/):

This article is talking about development choices, and is an opinion piece that isn't 100% accurate.

This article is NOT talking about existing 32-bit software and how it runs on Windows x64.

This can be a really deep and complex subject, as there are things in the compatibility layers of Windows that adjust 32-bit process flags all the way down to the kernel, transparently using 64-bit calls under WOW64.

The best way to research this would be to start by finding out how WOW64 works and how 32-bit processes are handled, especially ones that are driver-heavy, like games.

Be careful not to get caught up in older articles dealing with IA64, as WOW64 on that platform had a considerable amount of overhead when running 32-bit software.

This is an easy place to start, with a more general-audience type of read: http://msdn.microsoft.com/en-u...ktop/aa384274(v=vs.85).aspx

Do your own searches on WOW64, and also try to find the more 'technical' articles by some of the Microsoft Windows gurus, which often go into greater detail on why it works and its overall effect.

Look for material from the mid-'00s that goes into great detail. There are also some good Channel 9 and other videos that touch on this topic. However, some of the more comprehensive videos cover a wider topic, which makes finding the specific references to 32-bit processes on 64-bit not a quick search. So look for discussions on the NT kernel, subsystems, and WDDM/WDM as well.

There is also some general information that AMD and others have put out over the years about 64-bit advantages beyond just address space.

Good luck.
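
If you want to poke at WOW64 yourself, here is a small sketch using the documented IsWow64Process API via ctypes. It is Windows-only by nature; on other platforms this sketch simply assumes the answer is no:

```python
# Quick check of whether the current process is a 32-bit process
# running under WOW64 on a 64-bit Windows (the compatibility layer
# discussed in the comment above).

import ctypes
import sys

def running_under_wow64() -> bool:
    if sys.platform != "win32":
        return False                      # WOW64 is a Windows concept
    kernel32 = ctypes.windll.kernel32
    is_wow64 = ctypes.c_int(0)
    handle = kernel32.GetCurrentProcess() # pseudo-handle, no cleanup needed
    if kernel32.IsWow64Process(handle, ctypes.byref(is_wow64)):
        return bool(is_wow64.value)
    return False                          # API call failed; assume not

print("under WOW64:", running_under_wow64())
# True means: 32-bit code on a 64-bit OS -- exactly the case where the
# WOW64 translation layer described above is in play.
```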