can't see PS4 or Xbox One lasting more than 5 years.



No, it's not. Ignorance or apathy simply isn't a valid justification; such thinking is regressive and harmful to the industry as a whole.

God forbid people enjoy what they have :rolleyes:


How is it harmful to the industry at all?

 

It should be blatantly obvious why any form of regressive thinking is harmful to any industry. It's especially ironic considering that we're having such a discussion in the context of a whole new generation of consoles.

 

God forbid people enjoy what they have :rolleyes:

 

God forbid people actually advocate for progress rather than stagnation.


It should be blatantly obvious why any form of regressive thinking is harmful to any industry. It's especially ironic considering that we're having such a discussion in the context of a whole new generation of consoles.

 

God forbid people actually advocate for progress rather than stagnation.

One question: Do you believe the PC side of gaming is about progress?


One question: Do you believe the PC side of gaming is about progress?

 

I fail to see the relevance of your question to the topic at hand. Are you attempting to construct a straw man, perchance?


It looks like Infamous will only be 30fps.  Talk about last gen. This gen is really looking pathetic.

 

Anyone who expected this gen to be able to always do 1080p60 is delusional.

 

Most single-GPU PCs can't do it properly (fps drops under 60), so why would the consoles be able to?

 

Most single-GPU PCs experience drops under 60 when running graphically intensive games. A console dev has two choices: either lock the fps at 30, or let it vary between 30 and 60 with triple-buffered vsync enabled. Honestly, I don't think there's much of a difference between 30 and 40-something. Sometimes a constant fps is better if the variation is too big (going from 30 to 60 on a regular basis). Lots of PC games these days let you lock the fps.

 

For online FPS, racing, and fighting games you want 60fps. For other games, 30 is easily playable. I prefer unlocked fps with triple-buffered vsync, but when it comes to offline adventure/action games I don't mind 30fps.

 

I'll take 1080p30 over 720p60 any time of the day for offline games where the action is not too fast, which is the case with Infamous.
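The "lock the fps at 30" option described above amounts to a software frame cap: measure how long the frame took, then sleep off the remainder of the ~33ms budget. A minimal Python sketch (the name `run_frames` is made up for illustration; real engines pace against the display's vsync interval rather than sleeping):

```python
import time

TARGET_FPS = 30
FRAME_BUDGET = 1.0 / TARGET_FPS  # ~33.3 ms per frame

def run_frames(render, n_frames):
    """Call render() n_frames times, sleeping off any unused budget
    so the pace never exceeds TARGET_FPS (a simple frame cap)."""
    for _ in range(n_frames):
        start = time.perf_counter()
        render()
        elapsed = time.perf_counter() - start
        if elapsed < FRAME_BUDGET:
            time.sleep(FRAME_BUDGET - elapsed)
```

The variable 30-60 option with triple buffering can't be shown this simply, since it depends on the display's refresh timing rather than a fixed budget.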


I fail to see the relevance of your question to the topic at hand. Are you attempting to construct a straw man, perchance?

I was just curious to see if your same line of thought applied to PC gaming; I merely meant to use it for a comparison.

The way I see it is that we have newer consoles capable of more than their predecessors and yet you find that is stagnation and not progress, so I wanted to find out if you think that each iteration of PC hardware is also stagnation and not progress.


Anyone who expected this gen to be able to always do 1080p60 is delusional.

 

Most single-GPU PCs can't do it properly (fps drops under 60), so why would the consoles be able to?

 

Most single-GPU PCs experience drops under 60 when running graphically intensive games. A console dev has two choices: either lock the fps at 30, or let it vary between 30 and 60 with triple-buffered vsync enabled. Honestly, I don't think there's much of a difference between 30 and 40-something. Sometimes a constant fps is better if the variation is too big (going from 30 to 60 on a regular basis). Lots of PC games these days let you lock the fps.

 

For online FPS, racing, and fighting games you want 60fps. For other games, 30 is easily playable. I prefer unlocked fps with triple-buffered vsync, but when it comes to offline adventure/action games I don't mind 30fps.

 

I'll take 1080p30 over 720p60 any time of the day for offline games where the action is not too fast, which is the case with Infamous.

 

 

 

I'd rather have 720p60 than 1080p30. On some TVs it's very hard to tell the difference between 1080p and 720p. Infamous doesn't look that great anyway.
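Whichever side one takes in this trade-off, the raw arithmetic is worth having on hand: 1080p30 actually pushes slightly more pixels per second than 720p60, so neither option is strictly the cheaper one. A quick sketch (pure arithmetic, no engine-specific assumptions):

```python
def pixels_per_second(width, height, fps):
    """Raw pixel throughput for a given resolution and frame-rate target."""
    return width * height * fps

# The two options debated above:
p720_60 = pixels_per_second(1280, 720, 60)    # 55,296,000 pixels/s
p1080_30 = pixels_per_second(1920, 1080, 30)  # 62,208,000 pixels/s
# 1080p30 pushes ~12.5% more raw pixels per second than 720p60.
```

Raw fill rate is of course only one cost among many (shading, geometry, and post-processing scale differently), so this is a lower bound on the comparison, not the whole story.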


I was just curious to see if your same line of thought applied to PC gaming; I merely meant to use it for a comparison.

The way I see it is that we have newer consoles capable of more than their predecessors and yet you find that is stagnation and not progress, so I wanted to find out if you think that each iteration of PC hardware is also stagnation and not progress.

 

PC or console doesn't come into it; what I was doing was pointing out the irony in people using such a defence in the context of a brand-new generation. You simply cannot use such a defence while being an early adopter; it's a fundamental contradiction.

 

It's because of this that I think if we're all brutally honest here, the real reason behind such statements is because of the cognitive bias people have towards their console of choice - and one of those said consoles is going through a rather rough spot in that regard.

 

The point rings true for every industry, some people don't notice the difference between 480p and 1080p, yet I doubt many here would agree 480p is acceptable today.


God forbid people actually advocate for progress rather than stagnation.

Progress is a loaded term that can mean different things to different people.

Does everyone agree that visuals in gaming are the most important thing to 'progress'? Look around and you will see there is not a single answer.

Honestly, I think we are taking things way too seriously when it comes to gaming.

The new consoles are many times more powerful than their predecessors. They offer advanced software and hardware components that are evolutions of past designs. They are in fact progressing forward. If you want to be precise, you could claim that they have not progressed enough, but to say they are stagnant would be misleading.

 

 

It's because of this that I think if we're all brutally honest here, the real reason behind such statements is because of the cognitive bias people have towards their console of choice - and one of those said consoles is going through a rather rough spot in that regard.

 

The point rings true for every industry, some people don't notice the difference between 480p and 1080p, yet I doubt many here would agree 480p is acceptable today.

Why not be really brutally honest. Your claiming that anyone that does not agree with your point is simply falling prey to their fanboy tendencies regarding the X1. No need to beat around the bush here. You certainly bring up an interesting point. Is it a true point? Not sure.


Anyone who expected this gen to be able to always do 1080p60 is delusional.

I wasn't aware that this was even the case these days. I thought many of the AAA titles with high fidelity graphics were already doing 30fps even on the ps4:

 

http://www.ign.com/wikis/xbox-one/PS4_vs._Xbox_One_Native_Resolutions_and_Framerates (this link has horrible formatting for me, you have to scroll to the right to see the ps4).


I know I don't speak for everyone, and no doubt someone is going to reply with "you're in the minority", but I was more than ready for new consoles by 2010. Especially when every game started to ship on UE3 and they all looked the same, XBL and PSN were being pushed to the limits of their possibilities, and I wouldn't say the visuals they are pulling off with PS4/X1 were impossible 4 years ago. It was enough to push me back to PCs, and as a consequence MS/Sony miss out on my money. It wasn't because they didn't want to make new consoles; it's because the industry is a complete mess with growing dev costs. I'll never understand how they managed to ship 2 unfinished, underpowered products when R&D started almost 8/9 years ago :no:

The very fact they've chosen the cheap parts is probably the biggest hint that they don't want another long generation. Shifts in technology and how we game change too much to stay locked in for so long. Both MS and Sony have said it themselves that they released the PS3 and 360 at a bad time, when everything started to shift, because their eyes were pretty much just set on the SD-to-HD leap. Everything else had to be hacked into the OS, which we all know ended up with sluggish experiences. They won't repeat that, and I'm sure future-proofing is very much planned for, but there's only so much you can do, especially if something totally left-field comes in and changes everything. The obvious pick right now would be VR, but who knows what else is around the corner.

 

Yes, we were screaming for new hardware back in 2010-2011, but of course both platform holders and devs/publishers were too busy making money hand over fist to notice. Like Andy and many others, I was pushed back to PC gaming because I wanted to move forward and support newer tech. I may not be an engineer or a tech creator, but I'm a great minion; I make those guys in Titanfall look bad. I support new tech all the time. This is how things move forward and humanity accomplishes more across different fields.

 

Remember that back in 2005-2006 spending $400 on a new device was considered a huge deal. Now people don't bat an eye at spending twice that on a new smartphone or tablet every freaking year. Sure, it's mostly subsidized with contracts and stuff, but not all - we all know plenty of people that change mobiles every year or even more often. I am one of those people!

 

Stop with the underpowered mantra - it is simply not true. Use your common sense, look at the specs. We have not seen what PS4 and X1 are capable of, with every passing day it's more obvious that developers are the culprits. They were so spoiled by the 360/PS3 aeon that they forgot what it's like to work beyond those specs - to wit horrible PC ports and now stuff like Titanfall that doesn't even have v-sync. Seriously, anyone think X1 REALLY can't do v-sync in a game as visually mediocre as Titanfall? It is a gameplay triumph but visually...oh boy.

 

So yes, bottom line is these two devices were not meant to be a repeat of the 360/PS3. That era has passed, its success has helped us and the industry immensely, but as we all know the leading cause of problems in any long marriage is a STALEMATE.


Progress is a loaded term that can mean different things to different people.

 

Progress is a loaded term when it comes to subjective matters like politics; however, we're discussing a matter of objective technological advancement, and progress is progress.

 

Does everyone agree that visuals in gaming are the most important thing to 'progress'? Look around and you will see there is not a single answer.

 

I'm not sure of the relevance of this statement, outside of the implication that advancement of graphical fidelity precludes that of gameplay and narrative. It also presumes that such advancement cannot have an impact on those categories either.

 

Ask any competitive Quake or CS player about the importance and impact of frame rate on gameplay for example.

 

Honestly, I think we are taking things way too seriously when it comes to gaming.

 

You could say that about any form of entertainment media. At the end of the day though, it's just more regressive thinking.

 

The new consoles are many times more powerful than their predecessors. They offer advanced software and hardware components that are evolutions of past designs. They are in fact progressing forward. If you want to be precise, you could claim that they have not progressed enough, but to say they are stagnant would be misleading.

 

That's why I didn't say the new consoles are stagnant, I said the notion of "x is enough for most people" leads to stagnation.

 

Why not be really brutally honest. Your claiming that anyone that does not agree with your point is simply falling prey to their fanboy tendencies regarding the X1. No need to beat around the bush here. You certainly bring up an interesting point. Is it a true point? Not sure.

 

(Quick aside, the contraction of "You are" is "You're", not "Your" - the latter is possessive)

 

That's not being more honest, that's just being more confrontational. It's a touchy topic so I worded it as passively as possible.


Stop with the underpowered mantra - it is simply not true. Use your common sense, look at the specs. We have not seen what PS4 and X1 are capable of, with every passing day it's more obvious that developers are the culprits. They were so spoiled by the 360/PS3 aeon that they forgot what it's like to work beyond those specs - to wit horrible PC ports and now stuff like Titanfall that doesn't even have v-sync. Seriously, anyone think X1 REALLY can't do v-sync in a game as visually mediocre as Titanfall? It is a gameplay triumph but visually...oh boy.

 

They are underpowered; there's no getting around it. They sport GPUs with the peak performance of higher-end PC GPUs from 3-5 years ago, and they have horribly low-end, low-performance CPU cores that you would find in an ultra-low-end laptop, not a gaming machine.


Nonsense, this is not fact. They have compute power equal to a nice gaming notebook, and their GPUs are above the min requirements for the latest PC releases.

 

EDIT: additionally, consoles have always been PC/home computer technology scaled down to the equivalent of 2-3 years back. PCs from 2-3 years ago could easily do 1080p, v-sync, etc. Therefore we are seeing the results of poor development rather than poor hardware. What exactly did you guys expect? How much are you willing to spend? You thought a $400 device would give you performance beyond a PC build where the graphics card alone costs more than that? Get real.


The whole problem here is that we are all on a tech forum, so obviously 99% of us are tech-spoiled and are going to look at specs, unlike the average person. The 360 and PS3 came out right before PC gaming really started advancing, pretty much right around the time Nvidia released the 8800, and things started exploding from there. At the rate PCs are advancing, any console would be outdated after a year. Specs are only everything to PC users on tech forums who buy the latest video cards every year and can't look past specs.

 

I will just add that, yeah, 3 years ago PCs might have been pushing 1080p easily, but I seriously doubt they would run games at 1080p with all the features and crap of today's games, unless of course you were running $600 video cards. Seems like 3 years ago I had a 480 or somewhere in there, and the games I ran didn't look near as good as the games on today's consoles, but I could play them at 1080p easily. Can't remember, because I'm one of those people that buys new video cards every year, but I still love my console games.


Nonsense, this is not fact. They have compute power equal to a nice gaming notebook, and their GPUs are above the min requirements for the latest PC releases.

You are talking about minimum requirements though. Mid-range cards from 2-3 years ago can do the same frame rates they do at 1080p. Go back 3-5 years and it skews to top of the line cards as Blackhearted said.

 

EDIT: actually a £100 card (~$166) can hit similar performance. I had forgotten about this: http://www.eurogamer.net/articles/digitalfoundry-2014-r7-260x-vs-next-gen-console

 

EDIT: additionally, consoles have always been PC/home computer technology scaled down to the equivalent of 2-3 years back. PCs from 2-3 years ago could easily do 1080p, v-sync, etc. Therefore we are seeing the results of poor development rather than poor hardware. What exactly did you guys expect? How much are you willing to spend? You thought a $400 device would give you performance beyond a PC build where the graphics card alone costs more than that? Get real.

Consoles weren't always the way they are these days. The last few generations of consoles spurred the development of new technology, both in terms of graphics processing and architecture. Examples are the NV2A in the Xbox, the Xenon in the Xbox 360, and the Cell processor found in the PS3 (you can actually get more specific with the GPUs in terms of particular hardware features that were later pushed into discrete GPUs, but I don't know these off the top of my head). How many people had truly multi-core systems (other than HT) when these launched? Basically nobody. Pentium Ds didn't launch until the same year, and those were only dual cores.

 

I'm not following your logic here: from what I can tell, you are saying that because the older consoles were equivalent to 2-3 year old PCs (which again isn't true), what we are seeing in today's consoles is the result of poor development efforts and not hardware? Are you trying to say that back then they didn't have issues with performance in comparison to the gaming systems of that time?

 

 

the whole problem here is we are all on a tech forum so obviously 99% of us tech spoiled and are going to look at specs unlike the average person.  360 and ps3 came out at a time right before pc gaming really started advancing pretty much right around the time the NVidia released the 8800 and things started exploding from there.  At the rate pcs are advancing any console would be outdated after a year.  Specs are only everything to  pc users on tech forums that buy the latest videos cards every year and cant look past specs.

Specs are still important when they translate into display resolution and frame rates, even for non-enthusiasts. I've seen normal folk complain about 720p on the XBone anyway. Numerical values per se? Not so much. But stuttering or things like that? Yeah.


Progress is a loaded term when it comes to subjective matters like politics; however, we're discussing a matter of objective technological advancement, and progress is progress.

Exactly, and the PS4 and X1 have progressed from the last gen. Argue it is too little, cool, but to deny any progress is incorrect.

 

 

I'm not sure of the relevance of this statement, outside of the implication that advancement of graphical fidelity precludes that of gameplay and narrative. It also presumes that such advancement cannot have an impact on those categories either.

 

Ask any competitive Quake or CS player about the importance and impact of frame rate on gameplay for example.

The relevance is that gamers have different opinions on what is acceptable in the various aspects of a game. Of course all aspects, including visuals, have an impact on the overall feel of a game.

 

 

You could say that about any form of entertainment media. At the end of the day though, it's just more regressive thinking.

No, it's not regressive thinking. Regressive thinking would be to advocate for no improvements. That is not what I was talking about at all.

 

That's why I didn't say the new consoles are stagnant, I said the notion of "x is enough for most people" leads to stagnation.

On this point, I agree. If your thought is that something is 'enough' and you then go the next step and say that no improvements should be pursued, then you are not advocating for progress.

However, it is possible to claim that we should strive for improvements and progress, while at the same time accepting certain minimum levels.

 

(Quick aside, the contraction of "You are" is "You're", not "Your" - the latter is possessive)

You got me, sorry about that. That must be a pet peeve of yours :shifty:

 

That's not being more honest, that's just being more confrontational. It's a touchy topic so I worded it as passively as possible.

Brutally honest can often be confrontational, so when you said you were trying to do that, I just thought I would help you clear it up.


You are talking about minimum requirements though. Mid-range cards from 2-3 years ago can do the same frame rates they do at 1080p. Go back 3-5 years and it skews to top of the line cards as Blackhearted said.

 

EDIT: actually a £100 card (~$166) can hit similar performance. I had forgotten about this: http://www.eurogamer.net/articles/digitalfoundry-2014-r7-260x-vs-next-gen-console

 

Consoles weren't always the way they are these days. The last few generations of consoles spurred the development of new technology, both in terms of graphics processing and architecture. Examples are the NV2A in the Xbox, the Xenon in the Xbox 360, and the Cell processor found in the PS3 (you can actually get more specific with the GPUs in terms of particular hardware features that were later pushed into discrete GPUs, but I don't know these off the top of my head). How many people had truly multi-core systems (other than HT) when these launched? Basically nobody. Pentium Ds didn't launch until the same year, and those were only dual cores.

 

I'm not following your logic here: from what I can tell, you are saying that because the older consoles were equivalent to 2-3 year old PCs (which again isn't true), what we are seeing in today's consoles is the result of poor development efforts and not hardware? Are you trying to say that back then they didn't have issues with performance in comparison to the gaming systems of that time?

 

 

 

Consoles were always like this. Amiga snobs used the same logic against the Genesis back in 1989, and it hasn't changed since. The Xbox 360 and PS3 didn't spur tech development by specs; they did so by sales, injecting much-needed revenue into the industry. They were outstripped by PCs of their time from day one. Who was using multi-core? Not sure about you, but I built a new PC with a Core 2 Duo less than a month before the PS3 launched. Again, nothing new.

 

Your $150 card cannot outdo a PS4 or X1. To claim those idiotic master-race rights you need a $300 card minimum, otherwise no high or ultra settings for you; medium at best. And I am specifically saying the games we have on the new consoles now are the way they are due to bad development, not bad hardware. They're all PS3/360 games pushed back and jazzed up a little. Be patient, give it a year. Every console cycle started like this, and it's the same on PC. DX11 GPUs launched in 2010, but how many games even noticed? Metro 2033 and... Metro 2033?


Exactly, and the PS4 and X1 have progressed from the last gen. Argue it is too little, cool, but to deny any progress is incorrect.

I never denied any progress either. 

 

The relevance is that gamers have different opinions on what is acceptable in the various aspects of a game. Of course all aspects, including visuals, have an impact on the overall feel of a game.

Opinion is irrelevant when it's based on a position of ignorance. I fully agree that many people might not be aware of the difference between 30fps and 60fps, but that's more a matter of further ignorance combined with a lack of equipment to perform comparisons.

 

No, it's not regressive thinking. Regressive thinking would be to advocate for no improvements. That is not what I was talking about at all.

In a way you are: "taking it too seriously" is a common hurdle for many new forms of entertainment. If you want a prime example happening today, just look at esports.

 

Look at the comments section on articles in mainstream gaming news sites covering big esports events like Valve's "The International" Dota 2 tournament, and you'll get a wall of "lol nerds taking games too serious lol get laid xDDD".

 

However, it is possible to claim that we should strive for improvements and progress, while at the same time accepting certain minimum levels.

That's not really relevant in this case though.

 

You got me, sorry about that. That must be a pet peeve of yours :shifty:

Result of running a TF2 server for many years, and encountering many kids who had "your gay" as the first port of call in their small vocabulary of insults.

 

They never were able to finish asking their question, I would ask them "what about my gay?" and they'd usually just scream incoherently and ragequit. :(

 

Brutally honest can often be confrontational, so when you said you were trying to do that, I just thought I would help you clear it up.

Oh most certainly, but at the same time it doesn't mean you can't still choose your wording carefully. ;)


Nonsense, this is not fact. They have compute power equal to a nice gaming notebook, and their GPUs are above the min requirements for the latest PC releases.

 

EDIT: additionally, consoles have always been PC/home computer technology scaled down to the equivalent of 2-3 years back. PCs from 2-3 years ago could easily do 1080p, v-sync, etc. Therefore we are seeing the results of poor development rather than poor hardware. What exactly did you guys expect? How much are you willing to spend? You thought a $400 device would give you performance beyond a PC build where the graphics card alone costs more than that? Get real.

 

Not really. The Xbox 360, for example, was quite comparable to gaming PCs of 2005.


We can keep saying stuff back and forth but it won't matter. You must be forgetting that people said exactly the same things about 360 in 2005, that it was underpowered, that 512MB total memory was a joke, that games won't be able to run since the OS will likely eat half the RAM...this all sounds very familiar. When you're 40 like me it sounds very familiar...like really familiar, cause you've been hearing it since 1979 when people compared coin op Space Invaders to VCS 2600 Space Invaders...

 

While this discussion is making me feel young again, I feel it is a moot point. So here is my summary, after which I am pulling rank and all discussion ceases j/k :rofl:

 

Love my X1 and PS4, also love my PC, my phone, my tablet... love my coffee mug, my central air, and fire. Love human technology. So it's great people are putting pressure on companies to step up and deliver more technology; that is the whole point: progress. However, I don't like double standards. PC snobs rarely complain that NV and AMD have been regurgitating the same 28nm technology since 2011, and keep rebranding it with different price tags.

 

At any rate, I'm not evangelizing anything as I'm platform agnostic... I just think that while definitely pushing for better performance, we should give new devices a chance. Remember the first months are always ######. Forget the old-geezer speeches I give about the 8-bit era; remember the first Resistance on PS3? PDZ on 360? Even the first Uncharted compared to the second. The earlier releases always look like someone vomited on your screen. Be patient and a little more forgiving while voicing criticism; don't write anything off. I love Titanfall as it's a very compelling shooter, but the graphics are a con: clearly 360-spec, and they didn't even bother doing v-sync after upping the resolution to 792p. Friggin 72 extra vertical pixels and they're too lazy to give us v-sync because "60fps reasons" :rolleyes:


People have been saying for a long time that PC gaming is dying, and time and time again they are wrong. We have Star Citizen, the largest Kickstarter project in history. We have the Oculus Rift. I can't remember a time when there was so much activity in PC gaming.


... Xbox 360 and PS3 didn't spur tech development by specs; they did so by sales, injecting much-needed revenue into the industry. They were outstripped by PCs of their time from day one. Who was using multi-core? Not sure about you, but I built a new PC with a Core 2 Duo less than a month before the PS3 launched. Again, nothing new.

Please don't just ignore that I gave you the means to familiarize yourself with how these architectures differed from what was offered for PCs during the era, and simply restate the incorrect premise that they didn't spur or push any sort of technological development.

 

Here is some relevant information:

  • Pentium D launched in Mid-2005.
    • The designs lacked SMT support, which means they were simple dual cores (2 hardware threads).
  • Core 2 didn't launch until Mid-2006.
    • The initial designs lacked SMT support, which means they were simple dual cores (2 hardware threads).
  • XBox360 launched in Late-2005.
    • The design is a triple core with SMT support -- meaning it has 6 hardware threads.
    • The GPU had a unified shader architecture that was not found in cards of the era. This unified the processing capabilities for pixel and vertex shading; prior architectures had to split processing capability between pixel and vertex computations in a very asymmetrical manner.
    • In terms of processing pipelines it had 48 in comparison to the top of the line discrete card of the era having 24.
    • One of the earlier applications of eDRAM and a unified memory architecture (UMA) -- the former yielded a much higher bandwidth to the GPU than was found on discrete cards of the era (256GB/s).
  • PS3 launched in Late-2006.
    • The design used a novel streaming architecture (Cell) with 1 general processing unit and 7 special purpose processing units.
    • The Cell processor was co-designed by Sony and IBM to be a high-performance processor.
    • It was used in high-performance computing (HPC) for that reason and even landed in Roadrunner, the first supercomputer to achieve a petaflop of performance.

If you want more information just look it up. The fact is that there are notable differences in the technical capabilities and designs of these systems from those of PCs of the era.
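The thread counts cited in the list above follow from simple arithmetic: hardware threads = physical cores × SMT ways per core. A tiny sketch with the figures from the list (the helper name is made up for illustration):

```python
def hardware_threads(cores, smt_ways_per_core):
    """Hardware threads exposed by a chip: physical cores x SMT ways."""
    return cores * smt_ways_per_core

xenon = hardware_threads(3, 2)      # Xbox 360's Xenon: 6 hardware threads
pentium_d = hardware_threads(2, 1)  # Pentium D: no SMT, so just 2
```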

 

Your $150 card cannot outdo a PS4 or X1. To claim those idiotic master-race rights you need a $300 card minimum, otherwise no high or ultra settings for you; medium at best. And I am specifically saying the games we have on the new consoles now are the way they are due to bad development, not bad hardware. They're all PS3/360 games pushed back and jazzed up a little. Be patient, give it a year. Every console cycle started like this, and it's the same on PC. DX11 GPUs launched in 2010, but how many games even noticed? Metro 2033 and... Metro 2033?

I just linked you to a ~$166 card that gets around equivalent or better performance than the performance targets of many XBone and PS4 games (at equivalent settings). I'm just going to ignore the master-race talk and the insinuations, as it's not relevant to what I was talking about or my link. :huh: I think you've mistaken me for someone else (check my title -- I work in HPC); I rarely play games and only have an HD 6870. :laugh:

 

PC snobs rarely complain that NV and AMD have been regurgitating the same 28nm technology since 2011, and keep rebranding it with different price tags.

They haven't been... there are notable improvements in the architectures every year. You don't get a die shrink just because you feel like it; you get die shrinks when fabs offer them at reasonable prices. Intel gets them because it operates its own fabs.


This topic is now closed to further replies.