Bioshock PC Demo


Oh, so you bought Bioshock, ANova? What happened to stickin' it to the man and showing him you're not going to support games that don't support three generation old hardware? :laugh:

Two words: Sour Grapes.

Oh look, most of the shaders have been converted by a few determined people also in my situation.

http://www.paolofranchini.com/shshock/view...p;start=10#p549

I tried it and guess what, the game looks damn near what it should and plays like a dream on my system. Imagine that. :rolleyes:

:o Oh man, those are ugly GFX! I'm glad my card can do SM3.0 then! Wow, talk about crap....

Bravo, you know your stuff, but you forgot something important: there are variables in everything. More memory can certainly help when a bottleneck forms and the processor is stuck waiting for the memory to catch up at high resolutions and/or texture sizes, but that memory can just as easily end up waiting on the processor if the processor can't keep up with the demands of the game; then the memory has nothing to do because the situation is reversed. It depends on the game in question, but low-end and some mid-range cards that try to run demanding games typically suffer from the latter, so lots of RAM does them no good at all.

I understand what you're trying to say. My X850 XT only has 256 MB of RAM, which undoubtedly does hurt its performance in games that make use of high-quality texturing, but that is precisely because its processor is fast enough to keep up even in today's games. I really don't care if I have to turn the texture resolution down so that a game doesn't stutter or drop frames; I would just like to be able to play it. Bioshock's minimum specs call for 256 MB of video memory, which is what my card has, so technically that isn't really an issue here.

Of course your framerates and image quality improved going from a 9700 Pro to an X1650 Pro; the rendering power nearly doubled and the memory quadrupled. That would not be the case at all in my situation; it would be the reverse. So yes, I could waste money and downgrade just to play one game, improving only image quality because of the extra memory, but I do not see that as a worthwhile use of my funds, nor an efficient or smart solution. It is just a workaround, and a bad one at that. Sorry, but I will not waste money at my own expense because of the developer's decisions.

As for the human eye, it is far more complex than anything we can conceive of, much more so than even our best and newest cameras. The 30 fps limit is a myth based only on unproven tests. I myself notice a difference, but once again, how the eye interprets things is very subjective and dependent on the situation.

Current hardware is not the be-all and end-all. My hardware is stuck in limbo, somewhere between the industry's latest, which they always want you to keep buying, and the point at which software drops support. My hardware is not old enough to be outdated; it may not be the fastest by any means, but it is fast enough to run everything out there today, including all games. The ONLY, and I mean ONLY, reason this game will not run is precisely one aspect of the game's rendering process, an aspect that makes no difference in quality, speed or otherwise. That is why I'm annoyed. If my system had a 6800 Ultra in it, I would be playing the game. Guess what: an X850 XT is faster than a 6800 Ultra in every category; fill rate, memory bandwidth, you name it.

I never said that you had to upgrade to the latest and greatest (in fact, I pointed out that I specifically *didn't* upgrade that way, and why). However, there comes a point where you either upgrade your hardware, or do without the latest software (whether it be a game or an application). I upgraded my GPU hardware when I literally had no other option (GPU failure), but I most certainly didn't get an X1950PRO (let alone an X1950XT), even though it is *possible* my current system could swallow such a card (and I also explained why I didn't go that far; too expensive, and I am planning a completely new build around Christmas). I gave *several* options for possible upgrades (I specifically did *not* recommend the X1550, though it also supports SM 3.0, because the performance would have been much worse; however, I did include both the X1950PRO and XT because, like the X1650, they are available in AGP and don't require driver upgrades or changes; however, both cards may require PSU changes due to their increased power needs). Instead, you decided to bash 2K Boston for requiring SM 3.0, and resorted to a hack so you could play Bioshock anyway.

Oh, so you bought Bioshock, ANova? What happened to stickin' it to the man and showing him you're not going to support games that don't support three generation old hardware? :laugh:

I downloaded the demo, Mr. Einstein. Is that the best response you could come up with? :laugh: My framerates averaged in the 50s as well; not bad for three-generation-old hardware on a game that "needs" newer SM3 hardware. :rolleyes: Even when you place evidence right in front of people's faces they still don't get it.

:o Oh man, those are ugly GFX! I'm glad my card can do SM3.0 then! Wow, talk about crap....

Oh come now, the only things that aren't yet working are the distortion shaders and some particle effects. The conversion is a work in progress.

Oh come now, the only things that aren't yet working are the distortion shaders and some particle effects. The conversion is a work in progress.

At least I don't have to wait for a crap conversion to play the game with all details. I got the game and enjoy it. Just stop all that whining and upgrade your damn hardware.

Please stop this madness.

50s? Is that all, ANova? I know plenty of users with 8600 cards getting far more than that, and the 8600 isn't even a top-of-the-line card; yours was at the time. I would expect better frames than that, especially given that you're not even running all of the SM 3.0 effects.

And, ANova, you're right... even when you place evidence in front of people's faces, they don't get it. And you didn't even realize that the evidence you're giving implies that you're at least contemplating getting the game when this project is complete, despite the lack of SM 3.0... so much for your moral crusade of fighting for developers to support aging hardware :)

ANova, you crack me up. You accuse others of personal attacks when you contribute just as many.

And stop dismissing every single post someone makes as invalid because you don't agree with it. I could easily say the same thing about all your posts, but that wouldn't make me right. Some of your posts have had valid points, but they're all essentially just opinions. Some of my posts have valid points, but they're all essentially opinions.

Get over it. Step down off your high horse, pal.

My points are not opinions, they are based on fact. Until you understand that, you cannot understand my situation. If you have money to throw around you are always free to buy me a new system since you think mine is outdated.

I don't own a horse.

Your points were opinions based on facts. As was everything I said. As was everything PGHammer said. But when it comes down to it, they're all still opinions, which is what you don't seem to grasp. There are always going to be different ways to view things. That doesn't necessarily make any party right or wrong, any party stupid (or "an ignoramus") or smart, any party superior or inferior.

I don't have money to throw around. I saved up money for my new system for almost 7 years, thank you. Maybe you should do the same every once in a while and upgrade in between, since technology is always going to improve.

Sorry, but 2K choosing to support SM3 is not an opinion, and Bioshock's shaders being converted to SM2 code without any difference in graphical effects is not an opinion. You blaming ATI for 2K's actions is an opinion, and so is you thinking I need to upgrade over a variation in some code. Just because OS X only supports OpenGL doesn't mean the system running it is too slow for a DirectX game. UT3 runs with DirectX in Windows and OpenGL in OS X because the developers supported both formats. 2K dropped support for my card because they dropped support for a format in exchange for a format that's practically the same thing. I have every right to be disappointed; if you can't accept that, oh well.
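To illustrate what "supported both formats" means in practice: the usual approach is to hide the two APIs behind one common interface and pick a back end at startup. Here is a rough sketch of that idea; the class and function names are hypothetical, not UT3's actual code.

```cpp
#include <memory>
#include <stdexcept>
#include <string>

// Hypothetical abstraction: every rendering back end implements the same interface,
// so the rest of the game never cares which API is underneath.
class Renderer {
public:
    virtual ~Renderer() = default;
    virtual void DrawFrame() = 0;
};

class D3D9Renderer : public Renderer {     // Windows path
public:
    void DrawFrame() override { /* issue Direct3D 9 calls here */ }
};

class OpenGLRenderer : public Renderer {   // OS X / alternative path
public:
    void DrawFrame() override { /* issue OpenGL calls here */ }
};

// Pick a back end once at startup (from a config file, platform detection, etc.).
std::unique_ptr<Renderer> CreateRenderer(const std::string& api) {
    if (api == "d3d9")   return std::make_unique<D3D9Renderer>();
    if (api == "opengl") return std::make_unique<OpenGLRenderer>();
    throw std::runtime_error("unsupported rendering API: " + api);
}

int main() {
    auto renderer = CreateRenderer("opengl");
    renderer->DrawFrame();   // the real game loop would call this every frame
    return 0;
}
```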

Where are the moderators when you need them?

This has gotten out of hand and has nothing to do with the original subject. Please, ANova and Ayepecks, clean up your mess in PM....

Sorry, but 2K choosing to support SM3 is not an opinion, and Bioshock's shaders being converted to SM2 code without any difference in graphical effects is not an opinion. You blaming ATI for 2K's actions is an opinion, and so is you thinking I need to upgrade over a variation in some code. Just because OS X only supports OpenGL doesn't mean the system running it is too slow for a DirectX game. UT3 runs with DirectX in Windows and OpenGL in OS X because the developers supported both formats. 2K dropped support for my card because they dropped support for a format in exchange for a format that's practically the same thing. I have every right to be disappointed; if you can't accept that, oh well.

Only true ATI fanbois will say that SM3.0 is the same as SM2.0. OPEN YOUR EYES (the shots of SM2.0 look ugly) and upgrade that pos card you have.

Sorry, but 2K choosing to support SM3 is not an opinion, and Bioshock's shaders being converted to SM2 code without any difference in graphical effects is not an opinion. You blaming ATI for 2K's actions is an opinion, and so is you thinking I need to upgrade over a variation in some code. Just because OS X only supports OpenGL doesn't mean the system running it is too slow for a DirectX game. UT3 runs with DirectX in Windows and OpenGL in OS X because the developers supported both formats. 2K dropped support for my card because they dropped support for a format in exchange for a format that's practically the same thing. I have every right to be disappointed; if you can't accept that, oh well.

OK, now you're taking things out of context and comparing things that are not equal.

2K chose to support shader model 3.0 only.

ATI chose not to support shader model 3.0 in your card.

nVidia chose to support shader model 3.0 in cards from the same generation.

You said it's 2K's and nVidia's fault.

I said it's ATI's fault.

This is simple common sense. You think your statements are the only ones with a factual basis, even when proof states otherwise. Don't compare two dissimilar statements. It's not hard to differentiate between opinion and fact; you're just mixing the two up on purpose to lend some sort of credence to your point.

It's your opinion that 2K Games should have supported shader model 2.0. It's my opinion that they shouldn't have, and that they were right to ignore a technology that's three generations old. I supported this with statements from ATI that they knew shader model 3.0 would eventually be the only option in some games. That's fact.

And you aren't even taking into account all of the confounding variables. For one, the game was being ported to the PC as it was being developed for the 360. They wanted to ship both at the same time, and clearly implementing shader model 2.0 support would have taken them longer on the PC version. 2K Australia/Boston has already said they'll consider supporting it in a later patch.

And it's your opinion that the formats are practically the same thing. That is not by any means a fact, so don't pretend it is.

Listen, I know where you're coming from. I've said that throughout this entire debate. But to think your side is the only one with the potentially right opinion is silly. I'm sure the Nazis thought the same thing ;)

Only true ATI fanbois will say that SM3.0 is the same as SM2.0. OPEN YOUR EYES (the shots of SM2.0 look ugly) and upgrade that pos card you have.

Heh yep I think somebody needs to see what Bioshock SHOULD look like :)

[screenshot: epic_awesome.jpg]

(This was before I upgraded to a quad core; same GFX card, an 8800 GTS 320MB, by the way.)

The debate that's going on in this thread is absolutely ridiculous.

For a start, ANova, I don't like taking sides in ultimately pointless debates (like this one), but you sound more like a disgruntled consumer than someone who actually knows what they're talking about.

For example, your claim that "SM3 is just a clone of SM2" is absolutely ludicrous. Just take a look at the wiki for it and there's a table that clearly shows SM3.0 has a LOT of differences from 2.0.

That's not to say that you couldn't possibly make a similar effect in SM2.0, but SM3.0 gives developers a LOT more options and can make those effects a LOT nicer. SM3 IS a big deal, and if you seriously think that it's just a clone of SM2, then ask yourself why ATI deliberately didn't implement it a few years ago when nvidia did.
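For the curious, many of those differences show up directly in the Direct3D 9 caps a game can query at runtime. A rough, illustrative sketch that prints a few of the numbers separating SM2.0 hardware from SM3.0 hardware (this is not any particular game's startup code):

```cpp
#include <windows.h>
#include <d3d9.h>
#include <cstdio>
// Build on Windows and link against d3d9.lib.

int main() {
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    D3DCAPS9 caps;
    if (FAILED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps))) {
        d3d->Release();
        return 1;
    }

    // Shader model reported by the driver: an X850 reports 2.0, a GeForce 6800 reports 3.0.
    std::printf("Pixel shader version          : %lu.%lu\n",
                D3DSHADER_VERSION_MAJOR(caps.PixelShaderVersion),
                D3DSHADER_VERSION_MINOR(caps.PixelShaderVersion));

    // A few of the limits behind that version number: ps_2_0 guarantees far fewer
    // instruction slots and little to no dynamic flow control, while ps_3_0
    // guarantees at least 512 slots plus real branching and looping.
    std::printf("PS 2.x instruction slots      : %d\n",  caps.PS20Caps.NumInstructionSlots);
    std::printf("PS 2.x dynamic flow ctl depth : %d\n",  caps.PS20Caps.DynamicFlowControlDepth);
    std::printf("PS 3.0 instruction slots      : %lu\n", caps.MaxPixelShader30InstructionSlots);

    d3d->Release();
    return 0;
}
```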

So hopefully that's that cleared up.

But who's to blame for all of this?

Some people seem happy to blame 2K Games for this, and in a very small way, they're probably right. The game COULD have had an SM2 path in it, but that would mean the developers had to limit their design a bit. Think of it this way: say they made a really kickass effect that was actually important to the game, and it couldn't be reproduced in SM2. Then what do you do?

But ultimately, the people losing out here are 2K themselves. They've just alienated potential customers. They're a big company, they know what they're doing, and at some point they've almost certainly worked out the figures and decided it was cheaper to just chop off SM2.0 rather than pay to have people rewrite all of the shaders.

Now what about ATI? They're the next people to blame; after all, nvidia supported SM3 with the GeForce 6 series, so why couldn't ATI do it with the X8xx series?

Here's the thing: their reasoning at the time was "Not many games currently use SM3, so we're not going to support it." Fair enough; it's the same sort of thing Valve is saying about DirectX 10 (i.e. not many gamers have DX10 cards plus Vista, so they're not going to support it).

So ATI's stance was pretty clear - at the time, nothing used it so people wouldn't notice a problem and they could save a few $$$ in development costs.

A worthy candidate for the blame here, if you ask me.

But.

There's another party people haven't considered here - The disgruntled owners.

It's quite simple, people (and listen closely, ANova, since you know so much about shaders): when you all went out and bought your shiny new X800s or whatever, you should have known full well that they didn't support SM3.0. If you KNEW this and bought the card anyway, then you should have expected this day to come, and it's your own fault and not 2K's. If you want to blame ATI, go ahead, but at the end of the day, they sold you what was on the front of the box and you have no reason to complain.

I'm really ****ed guys, you know, I have my GeFarce 4 MX and it doesn't support shading at all! So why didn't they include support for fixed-function cards?! ZOMGZORZ, 2K are bastards!

Moreover, my 16MB Cirrus Logic PCI card doesn't have 3D acceleration at all! Why is there no support for software rendering?! All those SM1.0/1.1/1.4/2.0/3.0/4.0 is just copy of software rendering!!1onetwothree!

:rolleyes:

The debate that's going on in this thread is absolutely ridiculous.

For a start, ANova, I don't like taking sides in ultimately pointless debates (like this one), but you sound more like a disgruntled consumer than someone who actually knows what they're talking about.

For example, your claim that "SM3 is just a clone of SM2" is absolutely ludicrous. Just take a look at the wiki for it and there's a table that clearly shows SM3.0 has a LOT of differences from 2.0.

That's not to say that you couldn't possibly make a similar effect in SM2.0, but SM3.0 gives developers a LOT more options and can make those effects a LOT nicer. SM3 IS a big deal, and if you seriously think that it's just a clone of SM2, then ask yourself why ATI deliberately didn't implement it a few years ago when nvidia did.

No, that is not the case. SM3 simplifies the code and increases some maximums. Any shader created for SM3 can also be duplicated for SM2 without much, if any, loss in performance; it just requires more passes and thus a little more code to achieve. It is a small evolution born out of SM2.0b.
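To make the "more passes" point concrete, here is a rough, hypothetical sketch of the idea (this is not Bioshock's code, and SetLightConstants and DrawSceneGeometry are stand-in stubs): an effect that loops over all lights in one ps_3_0 pass can be rebuilt as several additively blended ps_2_0 passes.

```cpp
#include <windows.h>
#include <d3d9.h>
#include <vector>

struct Light { /* position, colour, radius, etc. */ };

// Stand-in stubs for engine functions that would exist elsewhere.
void SetLightConstants(IDirect3DDevice9* /*dev*/, const Light& /*light*/) { /* upload per-light shader constants */ }
void DrawSceneGeometry(IDirect3DDevice9* /*dev*/) { /* draw the scene's meshes */ }

// SM 3.0 path: one ps_3_0 shader loops over every light in a single pass.
void DrawLitSceneSM3(IDirect3DDevice9* dev, IDirect3DPixelShader9* multiLightPS) {
    dev->SetPixelShader(multiLightPS);
    DrawSceneGeometry(dev);
}

// SM 2.0 fallback: the same visual result, but built up one light per pass with
// additive blending (assumes the render target was cleared to black first).
// More draw calls and state changes, not a different look.
void DrawLitSceneSM2(IDirect3DDevice9* dev, IDirect3DPixelShader9* singleLightPS,
                     const std::vector<Light>& lights) {
    dev->SetRenderState(D3DRS_ALPHABLENDENABLE, TRUE);
    dev->SetRenderState(D3DRS_SRCBLEND,  D3DBLEND_ONE);
    dev->SetRenderState(D3DRS_DESTBLEND, D3DBLEND_ONE);
    dev->SetPixelShader(singleLightPS);
    for (const Light& light : lights) {
        SetLightConstants(dev, light);  // per-pass constants instead of a shader loop
        DrawSceneGeometry(dev);
    }
    dev->SetRenderState(D3DRS_ALPHABLENDENABLE, FALSE);
}
```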

Why did ATI "deliberately" decide not to implement it? They didn't deliberately decide anything. The R4xx was based on the R300, and that architecture would have required a major overhaul to implement SM3. The 6800, on the other hand, was a completely new design; nvidia implemented SM3 because they could and because it gave them a checkbox feature to add for marketing.
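And that checkbox is literally one field in the D3D9 caps. A minimal sketch, assuming a typical startup test rather than 2K's actual code, of the kind of check that ends up rejecting an R4xx card:

```cpp
#include <windows.h>
#include <d3d9.h>

// Returns true when the primary adapter exposes Shader Model 3.0 or better.
// An X850 XT reports ps_2_0 (the 2.0b extensions don't bump the version number),
// so a check like this rejects it outright, however fast the card is.
bool SupportsShaderModel3() {
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return false;

    D3DCAPS9 caps;
    bool ok = SUCCEEDED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps))
              && caps.PixelShaderVersion  >= D3DPS_VERSION(3, 0)
              && caps.VertexShaderVersion >= D3DVS_VERSION(3, 0);

    d3d->Release();
    return ok;
}
```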

Leo, the white textures were a result of the preliminary hacks; those were fixed long ago. Don't be a dumbass.

This topic is now closed to further replies.