Is the PC upgrade cycle dead?

A somewhat predictable PC upgrade cycle was a welcome fact of life back in the '80s and '90s, but it might be a thing of the past. Or it might not. Analysts are debating it, and we will find out what they conclude next year.

Here's the brief history. In those long-ago days, PC manufacturers--and hence software developers, chipmakers, and computer dealers--would see a spike in demand every three to four years. Microsoft and Intel would come out with major refreshes of their product lines at roughly that cadence. The whole system of steady upgrade cycles culminated with the release of Windows 95 in 1995, still one of the biggest orgies of technology binge buying on record. Start me up!

After Windows 95, however, operating system upgrades were no longer strong enough on their own to prompt upgrade cycles. A slight bump in sales might occur, but it was hard to say whether it was related to a new OS. Few were buying new PCs so they could get their hands on Windows 98, or 2000, according to analysts at the time. Instead, people were upgrading when their PCs started to seem too slow or started to have problems.

View: Full Article @ C|Net News


People may not be upgrading like they used to, but they probably should. Consider these points:

1) It is incredibly cheap to build a non-gaming computer that has plenty of power and memory.
2) Onboard graphics chips from AMD and Nvidia are quite good right now.
3) The power usage of a new machine can be significantly lower than that of an old machine if the right parts are used, and the savings could eventually pay for a large portion of the upgrade over time (for example, moving from a Prescott P4 with a GeForce 6600 to a low-power Athlon X2 with 780G or GeForce 8200 onboard graphics, plus a higher-efficiency PSU).
4) Disk space is always useful and the low-end drives are fairly large now.
5) Everyone should have a nice LCD.
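Point 3 can be sanity-checked with rough numbers. This is a minimal sketch; the wattages, electricity price, parts cost, and always-on usage pattern are all illustrative assumptions, not measurements:

```python
# Rough payback estimate for a lower-power PC upgrade.
# All figures below are assumptions for illustration only.

OLD_WATTS = 180       # assumed draw of an old Prescott P4 + GeForce 6600 system
NEW_WATTS = 90        # assumed draw of a low-power Athlon X2 + onboard graphics
HOURS_PER_DAY = 24    # assumes an always-on machine
PRICE_PER_KWH = 0.12  # assumed electricity rate in USD
UPGRADE_COST = 300.0  # assumed cost of the new parts in USD

# Energy saved per year, in kilowatt-hours.
saved_kwh_per_year = (OLD_WATTS - NEW_WATTS) / 1000 * HOURS_PER_DAY * 365

# Money saved per year, and years until the upgrade pays for itself.
saved_dollars_per_year = saved_kwh_per_year * PRICE_PER_KWH
years_to_pay_off = UPGRADE_COST / saved_dollars_per_year

print(f"Energy saved: {saved_kwh_per_year:.0f} kWh/year")
print(f"Savings: ${saved_dollars_per_year:.2f}/year")
print(f"Payback: {years_to_pay_off:.1f} years")
```

Under these assumptions the upgrade pays for itself in roughly three years; a machine used only a few hours a day would take proportionally longer.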

But I admit that almost nobody needs to get high-end parts anymore. The sole purpose of building a high-end rig at the moment is to see how fast and/or absurd you can go (or trying to play Crysis).

To be honest, not everyone can afford to upgrade. If I want to play the latest games on PC I need, at minimum, a 9800-series Nvidia card in SLI, a quad-core CPU, and 2 or 4 GB of RAM. Anything less would just be pointless. Let's be honest about it. When I was younger and living at home I could afford to spend money on the PC, but once you have a car, a mortgage, and monthly bills, you can't afford to blow loads of money on a PC.

Hardware manufacturers are too greedy, releasing too many products and contributing to electrical waste. Not only that, but people don't want to be continuously upgrading when they think, "Oh, in 6 months something newer will be along." In the current world economy there just isn't the money for these extravagant expenses.

And considering games consoles have caught up considerably with PCs, you think to yourself, why bother with PC gaming? Let's face it, most PC online games involve cheating. More or less everyone on a PC cheats at online gaming. F*ck that ****, play on a console where people can't cheat. And upgrading for operating systems? Hah, why bother. If you went out to upgrade your PC for Vista, I'd have to question why. Anyway, rant over. I'm happy with my 2-year-old computer and won't be buying for another 3 or 4 years.

We need the software industry to catch up with the hardware we have today. That way we could all see the benefits, and software could make a huge leap. Sadly, this is hard to achieve because only a handful of companies can afford those investments.

I'm ticking along on my quad core with 4 GB of RAM and not finding it noticeably faster than my previous dual core with 2 GB. There will always be tasks for which strong computational power is valuable, video encoding and the like, but many mainstream users don't demand that much processing power. If operating systems and applications remained lean and efficient in their coding, the ever-rising upgrade requirements would be a moot point. (See Vista.)

I'm quite happy with my 7900GS and see no reason to move to a newer card anytime soon... and for what I'd pay for a video card alone, I may as well buy a PS3 or X360. PC developers spend far too much time on graphics now and forget the storyline and the gameplay itself; console developers are better at keeping their focus on those points. (See Crysis.) Still, PC games won't die out anytime soon, and there are strong genres for the platform that aren't on console. (See RTS titles, e.g. the Total War series.)

A few groups still benefit from upgrading:

* Hardware enthusiasts.
* PC gamers (yeah, there are still people who prefer to play games on PC).
* People who do a lot of video editing.
* And then there are people who haven't upgraded their computers in a REALLY long time (as in they are still running Win98) and are going to pick up what's new today.

But if your computer is 4-5 years old, and you aren't in any of the above categories then I doubt you are hurting for an upgrade.

Edit: the smiley in Win98 was unintentional, but it made me laugh so I'll keep it.

+1 original post... On a console you don't have to fiddle around with drivers or worry about copy protection schemes, the game is tailored to look good and run well on your hardware automatically, and the controls are properly mapped by default. You used to not have to worry about patching your games but that's changing... Now the enthusiast can worry about bigger TVs and a nice gaming chair.

I used to upgrade every 3 - 6 months too when it really made a difference in games and performance, but these days I game mostly on consoles and when I upgrade it's because I actually need it for specific tasks in applications, and not because "I need a responsive system" ... I already have that.

I haven't upgraded my home computer since 2004...P4 Hyperthread, 2.8 ghz, sata raid array.
Does everything I need it to do, since I don't play games, it's fast enough.

(naap51stang said @ #6)
I haven't upgraded my home computer since 2004...P4 Hyperthread, 2.8 ghz, sata raid array.
Does everything I need it to do, since I don't play games, it's fast enough.

Funny how the article FAILS to mention this. I completely agree with you. Beyond games and specialized applications (HD video editing or 3D graphics, for instance) there is little for the average computer user to gain from a multi-core processor running Windows Vista, other than a fancier UI that doesn't necessarily increase productivity at all. The applications people use at home and in business are a bit static at the moment, whereas in the past every iteration of processing power brought new applications that most people could take advantage of.

So, that's why Vista is a resource hog, to drive hardware sales. Luckily I'm off that treadmill with Linux.

I use Linux too. But I guess I use it because I like the way it works with me, not because of some disgust I feel toward Microsoft or their products. Maybe that's the difference between you and me? The compulsion to slag Microsoft at every chance.

(markjensen said @ #5.1)
I use Linux too. But I guess I use it because I like the way it works with me, not because of some disgust I feel toward Microsoft or their products. Maybe that's the difference between you and me? The compulsion to slag Microsoft at every chance.

Nice one, Mark.

They are 5 years late with this article.

This is common knowledge for anybody who has owned PCs throughout the late '80s and '90s.

On the plus side, computers are pocket change compared to previous prices.

I think it just comes down to pure power. Computers are incredibly powerful; the only limit we seem to hit is storage. I have a 2006 MacBook and recently upgraded the HDD to a 320GB SATA drive; apart from that, the computer is still very quick: a 1.83GHz Core Duo with 2GB RAM. I know this is not exactly an aging bit of kit, but it's not exactly fresh either. Back when they were releasing computers in the late '90s, a one-and-a-half-year gap meant an incredible difference in speed.

My 2005 PowerBook is still running strong; there are no applications that I can't run on it. It can still play all my iTunes media (music and video) without a problem.

As I said before, the only limiting factor in today's world is storage, which has grown and developed, but not at the same rate as processors and most of the rest of the computer. Access times and transfer speeds need to be greatly increased, and I would also like to see a bigger jump in hard disk capacity (wouldn't we all?).

Yeah, I know people still using G4s (you can get them refurbished cheap now) and they work fine even with today's applications and Leopard installed.

Of course not. What do we have around the corner? Blu-ray. HD camcorders. Then you have the suggestion by a senior Epic developer that DirectX will become software based and run on multi-core processors by 2012, which would improve performance for systems with poor graphics (like most Dell systems) and could seriously boost the PC as a gaming platform. Plus I'm sure there are plenty of other factors that have yet to show themselves. People haven't "needed" to upgrade since the Pentium 4, but still they do.

(theyarecomingforyou said @ #2)
... Then you have the suggestion by a senior Epic developer that DirectX will become software based and run on multi-core processors by 2012, which would improve performance for systems with poor graphics (like most Dell systems) and could seriously boost the PC as a gaming platform...

This is opposed to the hardware DX that we all know and love, eh? LOLOLOLOL.

I know what you MEAN to say with this statement, but you really need to gain a better grasp of what you yourself are saying before trying to explain it to someone else.

Currently DX is hardware based (DX9, DX10 graphics cards), whereas it is predicted it may become software based (running on the processor). I think the distinction between hardware and software is very clear, hence why I didn't dumb down my comment; evidently I should have, to avoid comments like yours.

(theyarecomingforyou said @ #2.2)
Currently DX is hardware based (DX9, DX10 graphics cards), whereas it is predicted it may become software based (running on the processor).

Intel is going up against both Nvidia and AMD in this, and it is not going to win. Graphics cards will remain the primary type of hardware for 3D games. Graphics cards are already close to being a "computer on a card," so the CPU is becoming less and less important. This is part of the reason the upgrade cycle has been disrupted.

About the only thing pushing sales of new hardware now is Vista and all its bloat. But many people are going to look for alternatives to that. Most people don't play hardcore games, and don't do anything intensive with their PCs.

If the OEMs, Intel and Microsoft want to save the PC market, they have to do much more to promote PC gaming. Microsoft should kill off the Xbox entirely (or make it PC-compatible in the future) and pit the PC against the PS3.

Heck yeah, it's somewhat dead. Not EVERYONE thinks they have to play "keep up with the Joneses" so much anymore.

Except for, maybe, people who hang out in places like this!