Can't see PS4 or Xbox One lasting more than 5 years.



Didn't quote you because I don't agree with you. Vague bullet points do not establish that the 360 and PS3 were somehow different from any other cycle. They never exceeded medium to high-end PCs in terms of performance, nor did they spur technology development, but you obviously disagree. And yes, I also talk to TSMC and GlobalFoundries on a regular basis; I know the process is having issues. A 280X is the same as my 7950; I don't care that they did some tweaking to improve power efficiency or that it has some other minor improvements. We were supposed to have 22nm GPUs by now. And as you well said, Intel do their own chips, but they too are having problems. They can keep their tocks; I'm waiting for 14nm Broadwell. As I said, this discussion is moot. If you don't like the X1 and PS4, don't buy them; you should not spend money on products you feel are inadequate.

 

EDIT: if you mean specifically the launch window, then you are being very pedantic. The 360 was basically a 1950XT and the PS3 maybe a 6850GT in terms of graphics processing. By the time both arrived on the market, I already had a 7900GT which ran circles around both of them. Combined. The problem is expectation management. How can consoles possibly hope to outstrip machines that cost much more and, even to a layperson such as myself, clearly have much more processing power? And I really don't think either Sony or MS tried to say the new machines are the epitome of power, so I'm not sure why many are acting surprised. I'm not happy that 1080p is still not 100% the norm, but I'm willing to give them the benefit of the doubt. No one forced me to buy anything; I did so willingly. I do not feel deceived.

 

EDIT EDIT: didn't the Xbox 360 GPU have a 128-bit memory interface? I always thought it was 128-bit. The 800XL I had back in summer 2005 was already 256-bit and had 512MB of GDDR3 (i.e. the graphics card alone had the same memory buffer as the whole of either of those two consoles), so I fail to see the point of the 360 and PS3 somehow being superior to the PCs of their era. But you do make a very interesting point: you appear to despise the X1 and PS4 while heralding their predecessors as something they were not. That's quite intriguing, actually!


  • Xbox 360 launched in late 2005.
    • One of the earlier applications of eDRAM and a unified memory architecture (UMA) -- the former yielded much higher bandwidth to the GPU than was found on discrete cards of the era (256GB/s).

 

Technically, that 256GB/s was just the internal bandwidth of the eDRAM die, from the memory to the ROPs on the chip. The bandwidth between the GPU die and the eDRAM die was much lower, at around 32GB/s.


Before I get started, let's be clear here: I never said that the consoles were 'objectively' better in terms of raw performance numbers in all cases. What I said was: the last few generations of consoles spurred the development of new technology, both in terms of graphics processing and architecture. Though you may now want to make this an argument about raw performance, it was not and never has been. What it is, is an argument about the technical capabilities of the systems in comparison to the PCs of the era, and it will remain strictly that.

 

Didn't quote you because I don't agree with you. Vague bullet points do not establish that the 360 and PS3 were somehow different from any other cycle. They never exceeded medium to high-end PCs in terms of performance, nor did they spur technology development, but you obviously disagree.

I'm not saying that you didn't quote me... I'm saying that I've given you the means to see how the architectures differ and what was novel about them, and I was asking you to please research the topic and not restate factually incorrect information.

 

It's neither here nor there whether you agree -- this isn't an opinion-based discussion, and this isn't and never was an argument about PC superiority -- on my part it's only ever been a discussion of factual capabilities and architectural differences. Many of these are specific statistics (for example, large differences in processor count and pipelines); others are simple facts that you can verify easily (Roadrunner, the uses of Cell in HPC, ...). You made the claim that these systems didn't spur technological development -- I've given you factual evidence showing otherwise.

 

A 280X is the same as my 7950; I don't care that they did some tweaking to improve power efficiency or that it has some other minor improvements.

I'm not talking about a contrived example of a card that is well known to be a rebranded 7970 with tweaks. I'm talking about comparisons with the top-of-the-line cards. For example, in the R9 series you get the 290X, which is better than the 7970 (the top-of-the-line single-die GPU of that generation). The same relationship holds true for Nvidia (I'm not going to bother going into detail here). This is how they've always done it: offering new-generation cards with performance similar to the prior generation but with better efficiency and new features, plus a top card that performs better.

 

You'll note that I've specifically ignored the HD 7990 and GTX 690. Why? Because those were just two dies packed on a single board (7970 dies and GTX 680 dies, I believe). Both companies simply didn't bother to make newer dual-die board designs.

 

EDIT: if you mean specifically the launch window, then you are being very pedantic. The 360 was basically a 1950XT and the PS3 maybe a 6850GT in terms of graphics processing. By the time both arrived on the market, I already had a 7900GT which ran circles around both of them. Combined.

I'm not being pedantic with launch dates: the GPU you are referring to was launched months after the 360 (in early 2006). You can check the release dates yourself. Interesting proposal there about it being better than two GPUs combined, though.  :laugh:

 

I'll note that this discussion of your future XYZ card being better than the 360's is neither here nor there. I was talking about what was available at the time and discussing new architectural features that weren't available in the cards of the era. These, of course, were readily adopted and ended up in retail GPUs in the months and years that followed. Let's be clear: your initial claim was that these systems were 2-3 years behind the top-of-the-line systems of the era in terms of GPUs and CPUs, and that they didn't spur technological developments. Both claims are simply false.

 

EDIT EDIT: didn't the Xbox 360 GPU have a 128-bit memory interface?

To main memory? Yes. To eDRAM or L2? No. But that's not important anyway: what matters is the bandwidth, and you can't determine that from the bus width alone. I already stated that the bandwidth to eDRAM was considerably better than the memory bandwidths offered on discrete GPUs of the era (256GB/s vs 25.6GB/s). Main memory bandwidth on the 360, on the other hand, wasn't (22.6GB/s), but there is an important distinction here in how the system operated. Much of the processing work was done between the eDRAM and the secondary GPU IC (I mention this at the end of the post), effectively leaving only texture movements to go over the main memory bandwidth. In addition, numerous new texture compression algorithms were used in the system to help maximize bandwidth.

 

So, in terms of factual statistics: eDRAM (256GB/s) and L2 accesses (51.2GB/s) had higher bandwidths than discrete GPUs (25.6GB/s), but the 360's main memory bandwidth was lower (22.6GB/s).
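Since the point is that bus width alone doesn't tell you the bandwidth, here is a minimal back-of-the-envelope sketch of how peak bandwidth is usually estimated. The data rates used below are illustrative assumptions for the comparison, not exact specs for any of these parts:

```python
# Peak theoretical bandwidth ~= (bus width in bytes) x (effective data rate).
# The data rates below are illustrative assumptions, not exact specs.

def peak_bandwidth_gbs(bus_width_bits: int, effective_rate_gtps: float) -> float:
    """Peak bandwidth in GB/s from bus width (bits) and effective transfer rate (GT/s)."""
    return (bus_width_bits / 8) * effective_rate_gtps

print(peak_bandwidth_gbs(128, 1.4))  # 128-bit bus at ~1.4 GT/s -> ~22.4 GB/s
print(peak_bandwidth_gbs(256, 1.0))  # 256-bit bus at ~1.0 GT/s -> ~32 GB/s
print(peak_bandwidth_gbs(128, 2.0))  # a narrower but faster bus -> ~32 GB/s
# ...which is why a 128-bit interface doesn't automatically mean less bandwidth.
```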

 

To frame this in terms of what I've been arguing: the 360 GPU has numerous technological differences from your run-of-the-mill GPU, in addition to what I mentioned in my previous post about the unified pipelines that boost raw processing capability. And it should be fairly evident from the numbers above that its bandwidth was not years behind GPUs of the era; in a number of use cases it was actually significantly better.

 

The 800XL I had back in summer 2005 was already 256-bit and had 512MB of GDDR3 (i.e. the graphics card alone had the same memory buffer as the whole of either of those two consoles), so I fail to see the point of the 360 and PS3 somehow being superior to the PCs of their era.

Again, I never tried to make this point. My argument has been focused on capabilities and architectural features. But, I'll humor it somewhat anyway.

 

In terms of multi-core capabilities? This would certainly be true. I've already mentioned the notable differences between the PC and the consoles: 2 cores (2 HW threads) vs 3 cores with SMT (6 HW threads) vs 1 PPE/7 SPEs (effectively 8 HW threads -- not quite, but that would be a discussion in itself). I'll note that you haven't yet made any argument to say otherwise.

 

In terms of GPU performance? It's arguable. I've noted the drastic difference in pipeline width (24 in the top-of-the-line ATI card vs 48 in the Xbox 360) at the time the Xbox 360 landed. I've noted the bandwidth differences in detail. The better-performing card you gave as an example is just a card that came out later. The difference in memory sizes? I don't care about that, as it's not relevant to performance.

 

In terms of GPU capabilities? It's certainly true. Unified pipelines debuted here, and only discrete cards of subsequent generations contained them.

 

But you do make a very interesting point: you appear to despise the X1 and PS4 while heralding their predecessors as something they were not. That's quite intriguing, actually!

Please refrain from goading me. This isn't that kind of discussion...  :rolleyes:

 

Technically, that 256GB/s was just the internal bandwidth of the eDRAM die, from the memory to the ROPs on the chip. The bandwidth between the GPU die and the eDRAM die was much lower, at around 32GB/s.

Good point; that's true. For everyone else: there were two ICs, the primary GPU core and a secondary IC that handled z transfers (to the buffer), stencil transfers, and anti-aliasing. The secondary IC was connected at 256GB/s; the primary at 32GB/s. The data that went over the 32GB/s bus was compressed, so it achieved higher effective throughput than the raw number suggests. The idea is that you can process the transfers (to the display buffer) and the anti-aliasing at much faster rates.

 

EDIT: Correction -- the last rate should be 22.6GB/s, not 32GB/s.
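To illustrate the compression point above, here is a minimal sketch; the 2:1 ratio is a purely hypothetical assumption used for illustration, not a measured Xbox 360 figure:

```python
# Minimal sketch: a link carrying compressed data moves more logical data than
# its raw rate suggests. The 2:1 ratio below is a hypothetical illustration,
# not a measured figure for the console's interconnect.

def effective_throughput_gbs(raw_link_gbs: float, compression_ratio: float) -> float:
    """Logical (uncompressed-equivalent) throughput for a link carrying compressed data."""
    return raw_link_gbs * compression_ratio

# e.g. a ~22.6 GB/s link with assumed 2:1 compression behaves like ~45 GB/s of logical traffic
print(effective_throughput_gbs(22.6, 2.0))
```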


You think consoles are desktop computers that can be replaced every 3 to 4 years? Consoles last longer because they can be updated with firmware to improve the GUI and fix any lag or bugs and speed things up. Plus, the companies behind the games need time to create very good games; a game can take years to make. Imagine a console that came out last year being replaced in 2016, while several games coming to that platform won't be released until 2016... they would then be releasing a game on an already obsolete platform. This is why consoles usually last almost 8 years.

 

Consoles don't usually last 'almost 8 years' at all in terms of the gap between next gen releases.

 

The last generation did, and the reasons for that are plainly obvious.

 

It was the only console generation with such a lengthy gap, and it caused a lot of problems for a lot of people. Even a Sony exec (I forget which one now) said they don't expect such an enormous gap next time.


It's been said, but hardware specs are just half the picture for consoles; the games are ultimately what will matter more, and we're only at the start of this generation. You can argue that they're behind the PC, that's fine, but it's not like many games (or any games, depending on what res you want to play them at) are pushing the PC forward like back in the old days. It's been said time and again that a PC that's 3+ years old can play most if not all of today's games. My current i7 920 + HD7870 played BF4 fine, and I'm playing AC4 just fine right now. So the hardware is just one piece of the puzzle, and people are right that in a few cases the consoles introduced new tech before it made its way to the PC. The 360 having the first GPU with unified shaders is true; that's part of MS's ability to work with the hardware makers, push designs forward, and toss in DX to bring it all together.

 

I'm sure that some of the custom parts in the XB1's SoC will find their way back into AMD's APUs for PCs as well. Regardless of the technical aspects, though, the games will be the deciding factor. Right now games still target the 360/PS3, and it's hard not to when both have 80 million units out there (iirc); it's still a big market. Once developers start to cut those down and can focus just on the new systems (something exclusives can do from the start, so those aren't an issue), the games will get better overall and the hardware will lose focus. Because we don't see much of a difference right now, we're questioning the hardware choices; that's what it comes down to.

 

How long will they last? It depends on the money flow. Both have sold way more in the first few months than the originals ever did, mostly because people were ready for something new after 7 years. If both make their money back quicker, and the initial cost of bringing them to market is covered, then we can move ahead to something new. The price of the systems is what you have to look at: once their price drops to $200, you know a new system is on the way so they can start at $400-$500 again and make more money from us.


No, it's not. Ignorance or apathy simply aren't valid justifications; such thinking is regressive and harmful to the industry as a whole.

 

 

But what part of being able to enjoy games despite their technical shortcomings is regressive & harmful to the industry?

 

That doesn't mean people don't want a new generation of better-playing & better-looking games. Hell, it doesn't even mean they don't expect it. But it also doesn't necessarily mean folks should be boycotting games that aren't 1080p & 60fps.


Very well said. I don't even know what this is about anymore, with snaphat trying to pull me back into the maelstrom that is forum debates. Yes, I lost my cool, but I really don't remember what we were supposed to be talking about. At any rate, I agree with Lamp: we can enjoy gaming while asking for more. Just because the PS3 and 360 got long in the tooth doesn't mean the entertainment wasn't there. And we always have options. You can now game in 4K if you want to and have the resources, and why not? Enjoy it all. Right now the one I crave is the Wii U. We can all agree that it is not power incarnate... but I need to have her. After two console purchases since Nov, though... can't justify it!


I think the X1/PS4 will have a shorter life, BUT the X2/PS5 will support their games, which will make a difference.

This generation switched to a different CPU, but let's say the X2/PS5 works on the same hardware, just with a better GPU/CPU, 3-4 years from now. You could still play your PS4/X1 games on it, and theoretically you could even play PS5/X2 games on the PS4/X1, just at a lower resolution or frame rate. So publishers could release a new game with the same code for both consoles, and gamers could upgrade - if they want - to the new console for better graphics and other improvements, all while still being able to play the games they own.


BTW, much of the so-called factual information you posted is blatantly wrong, as pointed out to you before. And launched in early 2006? So a Nov 2005 launch was already outdone by a card that came out in spring 2006? That's not the same situation as now? That's not launch? Mind you, the 360 did not launch worldwide in November.

If the information is incorrect, you'll have to be more concrete. Anyone here is free to correct me if I've said anything blatantly wrong in my posts. Aside from the 32GB/s listed at the end (it should have been 22.6GB/s), I don't see how there could have been.

 

And no, it's not the same situation as now. These current systems took older-generation designs at the GPU level and low-power cores, which is why they were well matched by mid-range equipment even on release. The big difference is that the newer generation didn't drive R&D the way the originals did with their designs. Instead, their design is much more a case of off-the-shelf components put together. That's a summation of the details I was getting at in my prior posts.


Hopefully you are right about compatibility. It was hoped for in the current generation as well, but switching to a new architecture apparently provided a good reason to skip that. Not that cost and potential sales had anything to do with it...but we can hope that between streaming and shorter cycles, maybe compatibility will be considered.


The thing is, not every gamer has a gaming PC. I built my computer about 6 years ago; it cost me $1500 when I did it. It is still a good machine, but I can play games on my PS4 with much higher quality and framerate than on my PC. The parts of my computer would still cost $300-400, about the cost of my PS4, which outperforms it graphics- and game-wise. I mean, hell, when I built my PC the graphics card alone was like $300. I can still play most games on medium settings no problem, but definitely not at 1080p 60fps on high/ultra like my PS4 can.

 

So comparing a machine that can cost twice as much (and easily more) to a console at 1/2 or 1/3 the price or less is a poor comparison. It's like trying to compare a cell phone to a gaming computer: "but they both support GL". Or, hell, compare a consumer-grade $400 PC with an embedded graphics chip to a $1500 gaming machine. Both built in 2014, the $1500 machine will destroy the PC that costs the same as the console. The console would destroy the $400 PC.


People have been saying for a long time that PC gaming is dying, and time and time again they are wrong. We have Star Citizen, the largest Kickstarter project in history. We have the Oculus Rift. I can't remember a time when there was so much activity in PC gaming.

 

PC gaming is bigger now than it has ever been.


But what part of being able to enjoy games despite their technical shortcomings is regressive & harmful to the industry?

 

That doesn't mean people don't want a new generation of better-playing & better-looking games. Hell, it doesn't even mean they don't expect it. But it also doesn't necessarily mean folks should be boycotting games that aren't 1080p & 60fps.

 

I have answered this already.

 

Quite simply put, if you are an early adopter of the newest console generation, and you are making statements that "30fps is enough for most people" you are practicing a clear contradiction.

 

The objection is nothing more than the application of cognitive bias.


I'm getting so tired of the "which console is more powerful" debate. With the specs both systems have, I don't see them lasting more than 5 years. Technology is moving so fast, and the new consoles are already outdated by a few years. Can't see myself getting either. Does anyone else feel this way?

 

 

They seem stale and underwhelming already. I think 5 years is an overly optimistic expectation for either of them.

 

Compared to PCs, and, surprise! A lot of people don't give a *** about gaming on them :D


Opinion is irrelevant when it's based on a position of ignorance. I fully agree that many people might not be aware of the difference between 30fps and 60fps, but that's more a matter of further ignorance combined with a lack of equipment to perform comparisons.

I'm not talking about those that have an opinion based on ignorance. I'm referring to those that are fully aware of what the numbers mean and yet are willing to make trade-offs on one aspect or another. Ignorance is a completely separate point.

Just because they are willing to accept certain things does not mean they are happy with things staying as they are. It just means they are able to enjoy a game that may not hit all the marks at the highest level while at the same time appreciating games that push the boundaries of what had been limitations in the past. It's not all or nothing here; it's a mix.

In a way you are; "taking it too seriously" is a common hurdle for many new forms of entertainment. If you want a prime example happening today, just look at esports.

Look at the comments section on articles in mainstream gaming news sites covering big esports events like Valve's "The International" Dota 2 tournament, and you'll get a wall of "lol nerds taking games too serious lol get laid xDDD".

Oh sure, those guys are jerks, but when I said that, I was not making that point. My point was that sometimes people will get stuck on one piece of the puzzle and be unable to look at the entire picture. The internet allows us to lay all of this stuff out, which is great, but some people seem to get overexposed to it and lose sight of the bigger picture.

 

 

Result of running a TF2 server for many years, and encountering many kids who had "your gay" as the first port of call on their small vocabulary of insults.

 

They never were able to finish asking their question, I would ask them "what about my gay?" and they'd usually just scream incoherently and ragequit. :(

:laugh: Oh I know the feeling. I've moderated my share of forums over the years and it can be downright ugly. Makes you question the value of humanity sometimes :laugh:

 

Oh most certainly, but at the same time it doesn't mean you can't still choose your wording carefully. ;)

No arguments there. I've been trying to master that 'art' for a while. Be honest without sparking the bs :laugh:


Compared to PCs, and, surprise! A lot of people don't give a *** about gaming on them :D

Exactly. I swear I have 5 friends locally who have Xbox Ones, and 3 of them have PCs, but not a single one games on them, for one reason: they think it's too dang complicated to install the games, hope they work on your system, and figure out what the hell to do when they don't. PCs are way too fragmented, as almost no two PCs are alike. It's hard to explain on a tech forum, though, because everyone here is obviously going to think it's easy because we have done it a million times. I would not be surprised if most console gamers don't have a clue what the specs of their console are beyond "it looks better than my PS3" and some GameStop employee screaming "but 1080p 60fps!", and they go with it because they don't want to look stupid, lol. At least that's how most of the console gamers I know are.


Exactly. I swear I have 5 friends locally who have Xbox Ones, and 3 of them have PCs, but not a single one games on them, for one reason: they think it's too dang complicated to install the games, hope they work on your system, and figure out what the hell to do when they don't. PCs are way too fragmented, as almost no two PCs are alike. It's hard to explain on a tech forum, though, because everyone here is obviously going to think it's easy because we have done it a million times. I would not be surprised if most console gamers don't have a clue what the specs of their console are beyond "it looks better than my PS3" and some GameStop employee screaming "but 1080p 60fps!", and they go with it because they don't want to look stupid, lol. At least that's how most of the console gamers I know are.

If we are being reasonable tech folks, we really shouldn't be assuming it's necessarily easy for AAA games. I'd pretty much feel obligated to provide tech support to someone if I told them to buy a gaming PC over a console. I think anyone who does that is mostly just trying to push an agenda one way or the other. It's kind of like forcing someone to use your preferred OS. It's kind of totally an evil thing to do  :)


Exactly. I swear I have 5 friends locally who have Xbox Ones, and 3 of them have PCs, but not a single one games on them, for one reason: they think it's too dang complicated to install the games, hope they work on your system, and figure out what the hell to do when they don't. PCs are way too fragmented, as almost no two PCs are alike. It's hard to explain on a tech forum, though, because everyone here is obviously going to think it's easy because we have done it a million times. I would not be surprised if most console gamers don't have a clue what the specs of their console are beyond "it looks better than my PS3" and some GameStop employee screaming "but 1080p 60fps!", and they go with it because they don't want to look stupid, lol. At least that's how most of the console gamers I know are.

 

Completely agree. You have to be pretty narrow-minded to think that the current console gen won't last more than 5 years because PCs are uber powerful and advancing so much. Yeah, they are; good for them. I have a modest one, with some games on it (mostly moddable ones), but that doesn't stop me from enjoying all platforms, even the Wii. I'm a gamer, not a graphics snob :yes:


I have answered this already.

 

Quite simply put, if you are an early adopter of the newest console generation, and you are making statements that "30fps is enough for most people" you are practicing a clear contradiction.

 

The objection is nothing more than the application of cognitive bias.

 

 

Maybe it's just me, but it seems like you are half talking in riddles. I am finding it quite hard to grasp exactly what you're saying & how it applies here. I mean... I am not an early adopter; I don't have a new machine.


If the information is incorrect, you'll have to be more concrete. Anyone here is free to correct me if I've said anything blatantly wrong in my posts. Aside from the 32GB/s listed at the end (it should have been 22.6GB/s), I don't see how there could have been.

 

And no, it's not the same situation as now. These current systems took older-generation designs at the GPU level and low-power cores, which is why they were well matched by mid-range equipment even on release. The big difference is that the newer generation didn't drive R&D the way the originals did with their designs. Instead, their design is much more a case of off-the-shelf components put together. That's a summation of the details I was getting at in my prior posts.

 

I understand your point; I just really didn't like the lecturing tone you assumed, but I regret losing it. To what you are saying: I can see why you think these two machines are further behind current PC levels than the 360/PS3 were in 2005/2006, but I very much disagree, as someone who has been consuming this technology (and therefore supporting it) since the late '70s. The current consoles are a huge step over their predecessors, which I think is the reference point for Sony/Microsoft, more so than comparing them to PCs. As a technical person, you should know the compute power is very impressive.

 

Since the specs are no secret for either, we know the X1 has an 18-core chip with 5 billion transistors, and the PS4 a 20-core chip with over 7 billion transistors. It's the same 28nm process as NV and AMD GPUs. I can't remember the TFLOP numbers for either, but they are impressive. Off the top of my head, an i7 4770K has like 3 billion transistors? And the top-of-the-line discrete GPU has what, 6-7 billion? I think transistor count is an indication of performance, so both consoles compare favorably to PC components relative to their price.

 

Do they compete with PCs? Sure; if consumers have a choice, then there is competition. I recommend a PC if you must have just one platform. However, I don't think Sony and Microsoft intend for them to take on PCs, especially not MS. As engineers, they know there is no point in fighting PCs. Their goal was to deliver machines much more powerful than their predecessors (which they are), at prices that are tolerable (in the age of the $700 smartphone, they are), and that won't catch fire (hence the sub-2GHz SoCs).

 

Half the people at MS still wake up screaming about thermal paste and overheating in the middle of the night, so can you blame them? Regarding off-the-shelf: the PS4/X1 are slightly more so than the PS3/360, but those were also just modified PowerPC CPUs, and their GPUs were versions of existing ATI and NVIDIA silicon. I'm not an engineer, but I think you are overstating the previous generation's contribution to technical progress. It was a monster engine for sales and software/services innovation, yes, but even I, who loved those two consoles, knew from day one they WERE underpowered. A mere 512MB of RAM in 2005 was already low; 8GB today is average to high.

 

Staying on the OP's question, the bottom line is that these consoles will not be the main models for as long as their predecessors were. But I don't agree that they are underpowered; I like what I see a lot right now. They're off to a much, much better start than any "generation" before, and I've had them all. PC hardware made huge strides between 2008 and 2011, but the 22nm/28nm phase has also overstayed its welcome.


These things are so 'underpowered' compared to where they SHOULD be in today's world, but compared to the last gen consoles, they are miles ahead.

 

For me, I just want more multimedia and other uses; games... I've got a PC for that, and if I didn't, a phone, laptop, or tablet are all within purchasing reach.


I'm not talking about those that have an opinion based on ignorance. I'm referring to those that are fully aware of what the numbers mean and yet are willing to make trade-offs on one aspect or another. Ignorance is a completely separate point.

Just because they are willing to accept certain things does not mean they are happy with things staying as they are. It just means they are able to enjoy a game that may not hit all the marks at the highest level while at the same time appreciating games that push the boundaries of what had been limitations in the past. It's not all or nothing here; it's a mix.

 

Apathy is no less harmful; it's just the same thing wrapped up in a different package.

 

Maybe it's just me, but it seems like you are half talking in riddles. I am finding it quite hard to grasp exactly what you're saying & how it applies here. I mean... I am not an early adopter; I don't have a new machine.

 

I think it's more a case of not wanting to understand rather than not understanding, quite honestly. I was quite clear in my reasoning.


Apathy is no less harmful; it's just the same thing wrapped up in a different package.

Oh come on, it's not apathy.

It's in fact possible to enjoy games of varying levels of quality in certain areas. It's not a negative thing.

Heck, much of the 'indie' gaming scene is fueled by people who enjoy games of a certain type that may not be progressive in regards to graphics or even gameplay.


Oh come on, it's not apathy.

It's in fact possible to enjoy games of varying levels of quality in certain areas. It's not a negative thing.

Heck, much of the 'indie' gaming scene is fueled by people who enjoy games of a certain type that may not be progressive in regards to graphics or even gameplay.

 

False equivalence. Triple-As and indies are defined genres of games; that is not the same as "tolerating" poor performance in terms of framerate.


False equivalence. Triple-As and indies are defined genres of games; that is not the same as "tolerating" poor performance in terms of framerate.

Wait, are you talking about games that are meant to run at a certain frame rate but instead perform lower, or are you talking about games that are built to run at a frame rate lower than, say, what the PC platform can support?

For instance, a game that is released at 30fps and indeed runs steadily at that, versus a game released to run at 60 but that instead jumps between 30 and 60 regularly?

I think gamers tolerate poor performance when that performance hits an acceptable level. Now, if that poor performance means a frame rate that is all over the place or some other kind of visual performance issue, then I would say there is more universal disdain for that.

Removing the indie part of my comment, the rest still holds true.


This topic is now closed to further replies.