What was it that made Vista so bad?



There were multiple reasons.

One reason was the low availability of drivers when Vista was released.

Device manufacturers had plenty of time while Vista was in beta to build drivers for the operating system. However, most manufacturers procrastinated and failed to deliver device drivers on time. Many only started releasing drivers several months after Vista's public release. Because drivers arrived so slowly, people complained that their devices would not work with Vista.

This also applies to software that integrates deeply with the operating system. For this kind of software, customers had to wait several months, a year, or more for the developer to release a Vista-compatible update.

Pretty much this, to be honest. I was a Vista alpha/beta tester, and during that time nVidia released a grand total of 2 drivers (there may have been more internally). And those drivers weren't any good for playing games, or anything else. The only thing they were "good" for was enabling Aero. Hell, as an example of just how poorly written nVidia's drivers were, I remember choosing to stick with the stock Microsoft drivers for my 6800 when Vista released, over the nVidia ones, because they worked better, delivered better frame rates and crashed about a billion times less.

And don't even get me started with Creative and their crap.

What I don't understand is why these hardware manufacturers didn't get off their asses sooner. Microsoft had documented their driver framework in great detail, and I remember getting a bunch of WDDM development kits / discs mailed to me during testing. I'm pretty sure OEMs had access to far more support / references / code than us testers had, and yet they basically did nothing until Vista released and then realized oh **** everything is different.


I remember that my graphics card driver took about half a year to get stable enough to handle Vista. Vista was also poorly optimized in its memory usage, and many people breathed a sigh of relief when they saw Windows 7's performance.

Edit: I agree Vista "shouldn't of been" Longhorn. Longhorn was some sort of ridiculous concept using .NET code for the Windows shell, among other things, which would have made performance even worse than Vista's. Also, the .NET Framework itself was far from mature during Longhorn's development. So I'm not sure how Longhorn even came to end up in an OS they were actually developing, rather than just remaining a project in the Microsoft Research labs.


The whole "Vista capable" thing is another reason why it got a bad rep. I wouldn't even run XP on those recommendations.

+100


The original Longhorn concepts were awful. I realize a lot of people like to look at anything that ever "didn't happen" and decide things would have been better 'if only', and that's cute, but naive.

Those early demos had tons of critics at the time, readily forgotten today and easily shrugged off with a cop-out like "well it couldn't have been as bad as Vista!!!1" It reminds me of people who praise the glory of XP, completely ignoring its first 2-3 years on the market when it was ridiculed for being the Fisher-Price OS and all the cool kids said they were 'downgrading' to Windows 2000.

Those videos make Windows look like Microsoft made an OS out of the old Microsoft Network from Windows 95. Everything about the visual style screams MS's online services and the old way they looked. No thanks.

/also, HAVE. Should HAVE been. There is no form of English in the history of all English-speaking peoples the world over in which "should of" makes sense.

Quoted for ****ing truth!


What I don't understand is why these hardware and software companies can't have new versions of their products ready at the same time the new Microsoft operating system is released... they wait until well after.

Microsoft releases previews of their operating systems months before release so things can be ready BEFORE THE RELEASE!!!! That way everything is good to go.

These companies are very lazy or stupid. Or both.

The whole "Vista capable" thing is another reason why it got a bad rep. I wouldn't even run XP on those recommendations.

Greedy manufacturers selling underpowered and overpriced hardware. Nothing more. Instead of loading an older operating system that no one wants to buy, they load the 'latest and greatest', which happened to be shiny new Vista... which made it easy to sell those $200 laptops for $600-700.

Manufacturers got really desperate because sales were declining as a result of the HUGE gap between new Windows versions. Vista was rushed out to satisfy that.


The main reason Vista was so bad was that it was a huge step change for Microsoft, introducing a new UI, which took a while to catch on. Plus it was extremely heavy on system resources, and most computers couldn't handle it.


It has always felt slow to me.

Going back to Vista from 7 is a pain; Vista just feels like every setting is out of place and hard to get to. Maybe I'm just not used to it.


No WDDM drivers for the Intel GMA 900 was my gripe. Yes, it was an awful graphics solution, but at the end of the day it was capable of running Aero Glass.

Other than that, after a clean install it ran like a dream on my Dothan Pentium M laptop with 1 GB of RAM.


I've just had to upgrade a Vista machine to Windows 7, and I'll go through the issues I dealt with that I'd forgotten about since working with Win 7:

I would have liked to make a partition so I could install Win7, move personal files over, then format the original partition and extend the Win7 one.

Despite having auto-defrag on, the hard drive was 30% fragmented. This was my gf's old desktop, so I know it was left on and idle enough for it to run in the background, but for whatever reason it didn't.

So, after a good 14 hours of defragmenting, the drive was good to go. I went into Disk Management and tried to shrink the volume. No go. No error given, just a "not successful". So I downloaded Defraggler and used that. The drive was still 10% fragmented, with stuff at the end of the drive, despite what the built-in defrag said. So I ran Defraggler for another 5 hours and tried again. Didn't work.

The next idea was to just do an in-place upgrade to Win 7 - I backed everything up and attempted that. Then I was told that this version of Windows cannot be upgraded. So I had to format the drive and start from scratch.

Aside from that - the OS is slower on the same hardware than Windows 7, even in the first two to three days of using 7 when it's indexing all the files and whatnot. Plug-and-play support in Vista is very poor (slightly improved from XP, but nowhere near 7).


Biggest problems I'd say were:

  • Manufacturers shipping PCs that were under-specced to handle Vista (Microsoft setting the bar too low after being forced / coerced by Intel)
  • People upgrading PCs that were simply far too under-specced to run Vista
  • Lack of drivers, despite the extended beta periods designed to allow everyone to get everything in working order.

The initial reviews were quite good too - but manufacturers shipping PCs with stupid things like 1.6 GHz processors and 512 MB of RAM destroyed its image. Windows 7 uses slightly fewer resources than Vista, yet still dictates higher minimum specs - because the Vista ones were simply too low.

Windows 7 isn't all that different from Windows Vista, beyond a fresh coat of paint. The biggest difference is that the average hardware is now more powerful and better at running it - and everyone has mostly sorted out their drivers by now.


I think after Vista everyone learned they had to provide a much better experience. Not only does providing a bad experience hurt MS, but it also hurts the OEMs, since that customer could move to another OEM who takes much better care of their installations. By the time 7 rolled around, most had learned from their mistakes and provided computers that looked better, operated better, and performed better. The last few laptops I've worked on that had 7 had hardly ANY crapware, which surprised me.


I guess I truly don't understand what it was that made Vista so bad. I have always been told that it was sooooo bad, but I just see it as a younger Windows 7. The one thing I have heard that has made sense in my head was that the problem with Vista was that it was installed on computers that weren't ready for it. They were still building computers to XP specs and tried to load Vista onto them.

From its launch until SP1 was out, Vista was quite unstable for me.

The biggest inconvenience for me centred around its incomplete file system. MS changed its plans for WinFS a few times, there were delays, and the final product had many faults. Oftentimes, simple operations like copying and moving files could take ages.

WDDM is another story. Coupled with the bugs in the file system, opening a folder of media files with the thumbnail preview setting on could easily crash my PC.


From its launch until SP1 was out, Vista was quite unstable for me.

The biggest inconvenience for me centred around its incomplete file system. MS changed its plans for WinFS a few times, there were delays, and the final product had many faults. Oftentimes, simple operations like copying and moving files could take ages.

WDDM is another story. Coupled with the bugs in the file system, opening a folder of media files with the thumbnail preview setting on could easily crash my PC.

File system bugs? That crash while viewing media was NOT a result of "file system bugs" or an "incomplete file system", but a result of crappy codecs. It's the reason I stopped using Nero's crap, since it would ALWAYS crash Explorer. Never mind the bloat.


So, from what I can tell, it was not so much Vista as it was the manufacturers. There were some things that needed changing in Vista, but what new, and I mean NEW, OS hasn't shipped with a few messed-up things here and there, right?


I agree about the crap computers that manufacturers were installing Vista on.

A lot of the laptops I encountered had 512 MB to 1 GB of RAM and 1.2-1.8 GHz single-core Celeron processors. No wonder it ran so bloody slow!


Some very good points above. I'd also add:

Viral opinion: Vista became something 'cool' to hate.

Very much so - and it had enough issues both in the basic consumer field and the business field for that to really catch on.

I would say, though, that I had it running on mid-to-high-end hardware for the time for a year, and it was a solid, stable and fast (pre-Windows 7) operating system.


I got Vista on launch day and it was nothing but trouble: compatibility issues, driver issues, etc. After Service Pack 1 it had improved, then when Service Pack 2 came along things improved a lot. It's not a bad OS now. I think the trouble was that its hardware requirements were too high and complex for the time; the stuff that's around now can handle it easily, but back then, no. I have one tower and one laptop on Vista, one tower on 7, a laptop on 7, and a trusty old Dell OptiPlex GX280 on XP Pro, and I'm pleased with all of them. Personally I think MS should do one last service pack for Vista and XP, because a lot of people won't give them up easily. I know plenty of people still using older Windows versions like 98/2000 (they must be mad, with no proper support and no decent antivirus support anymore). Oh well, it's up to them I suppose.


Too right. My first PC was a Time 699 (bubble) that had ME on it, but when I put XP on it, it ran much better. It was too old for Vista, so I gave it away, and it's still going after all this time, bless it.


The biggest problem with Vista was the driver support. I'll never forget being at the launch event for Vista in NYC and knowing that NVIDIA still didn't have Vista drivers out for my shiny new 8800 GTX video card at the time. I asked the NVIDIA rep when drivers would be out and he said he didn't know. So I asked him why the hell he was at a launch for an OS they clearly didn't support.


Yeah, there were loads of driver issues with Vista at launch, though it did have its own niggles as well. The slow file transfers were one, plus the graphics memory allocation issues, Superfetch (brilliant idea, but did it HAVE to hammer the drive as soon as it booted?), the redundant UAC prompts and the increase in memory usage.

I had built a PC with an E6600, 2 GB of DDR2 RAM and an 8800 GTS for when Vista was released, so I only really had issues with the 8800 drivers :/. After SP1 and changing Superfetch to delayed start, Vista was a hell of a lot better to use. And Nvidia had stable drivers by that time. YAY!

But wait, Vista sounds like another OS Microsoft released in 2001 that ran like crap on bare minimum specs, was buggy as hell, had a large learning curve because everything had changed, and everyone hated it and downgraded to the previous release, and it wasn't fixed till its SP2. Damn it! The name's on the tip of my tongue...


The version change had no effect on the drivers. It was the security features and other API changes that broke old/poor drivers.

Yet to this day MS refuses to change the major version number because, and this is from Microsoft, it "will break some driver compatibility with the NT kernel"... There is some truth to the version change idea, but it only had a bad impact on drivers that were coded to work with a specific kernel version, like some anti-virus kernel drivers (remember them complaining about having to rewrite them because "the major version changed"?)... The graphics and network driver models changed, but nothing really drastic besides how low-level you could hook into the kernel... which ****ed off a few rootkit makers...
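To show the kind of brittle check I mean, here's a minimal, hypothetical C sketch (not taken from any real driver; it uses the user-mode GetVersionEx call for simplicity, where a kernel driver would have done the equivalent with RtlGetVersion). Code like this sails through every XP-era service pack and then refuses to run the moment Vista reports version 6.0, even though nothing it actually depends on went away:

/* Hypothetical example of hard-coding the NT major version instead of
   checking for the feature you actually need. Works on 2000/XP (5.x),
   bails out the moment Vista reports 6.0. */
#include <windows.h>
#include <stdio.h>

int main(void)
{
    OSVERSIONINFOW vi = { sizeof(vi) };   /* dwOSVersionInfoSize must be set */
    if (!GetVersionExW(&vi)) {
        fprintf(stderr, "GetVersionExW failed: %lu\n", GetLastError());
        return 1;
    }

    if (vi.dwMajorVersion != 5) {
        /* Brittle: rejects Vista (6.0) outright even though the APIs
           this code relies on are still present. */
        fprintf(stderr, "Unsupported Windows version %lu.%lu\n",
                vi.dwMajorVersion, vi.dwMinorVersion);
        return 1;
    }

    printf("Running on NT 5.x, carrying on as usual\n");
    return 0;
}

Checking for >= instead of ==, or better, not gating on the version number at all, would have avoided the whole "the major version changed" panic.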


but nothing really drastic besides how low-level you could hook into the kernel... which ****ed off a few rootkit makers...

Norton? :rofl:

