Recommended Posts

I'd like to think they'll bring out a DX12 renderer and you guys will stop bitching, but I doubt it.  Maybe given how close it is, but I still doubt it.


Wandering around, I happened upon some drowners basically at the start of the game...

Game: Tip:  You can flee battle by....

Me:  lol flee, witchers don't flee fool!

Me:  dead >.<

 

Game is good ole witcher fun; graphics are good but, as LaP said, not really better than any other game.  I don't rate them much higher than DAI, for example.

 

Also runs surprisingly well on a 280X @ 3440x1440 on high settings with texture and detail set to Ultra.


Sorry, but this was absolute deception, no way to spin this. 2013 screenshot (and this was from gameplay footage):

 

the-witcher-3-downgrade-143136316157.png

 

Retail screenshot:


 

Such a removal of atmosphere. The day-one patch even removes quite a bit of filtering on consoles because they still couldn't run the game at a stable 30fps, and it looks only slightly better than The Witcher 2 did (which had a good port to Xbox 360).

Now if the options to make it look like the 2013 build were still in the game (it doesn't matter if they require massive power), nobody would have cried foul. But they aren't, and that's why people are mad: people want things to keep moving forward, not stagnate to the lowest common denominator just because it sells well on hype alone.

Let's see if someone finds some of the options and hacks them back in like they did with Watch Dogs. Otherwise I have no doubt they will return in an enhanced edition, for PC only.
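If someone does go digging, most of The Witcher 3's rendering values live in a plain-text config file rather than the in-game menu. A hedged sketch of the kind of tweak people try (the file path is the standard one, but the exact keys and safe values shown here are assumptions — back the file up first):

```ini
; Documents\The Witcher 3\user.settings -- back this file up before editing
[Rendering]
; Illustrative values only; higher numbers push foliage density and draw
; distance beyond what the menu exposes, at a real performance cost
GrassDensity=2400
FoliageDistanceScale=3
```

Values above the menu's range may be clamped or ignored by the engine, so results can vary by patch.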


Default control is not very good on pc. I might play it with a controller.

Tried that during my playthrough, when I wanted to put my feet up and just relax with a gamepad. It was horrible.

Controls with mouse and keyboard were much more accurate for me, especially during looting, not only combat.


Using a 360 pad and it's probably best. Though swimming is a p.o.s., the rest I'm fine with.

Not even giving a hoot about any gfx downgrades.. the game's stunning on Ultra on PC.

 


I haven't really even noticed the graphics (they are nothing special), but the GAMEPLAY has been awesome. Working flawlessly here on the Xbox One.


Tried that during my playthrough, when I wanted to put my feet up and just relax with a gamepad. It was horrible.

Controls with mouse and keyboard were much more accurate for me, especially during looting, not only combat.

 

really?  i am a huge fan of keyboard and mouse, but it has been really weird in witcher 3.     switched to the xbox one gamepad, played for 10 minutes, and it felt much better.

 

i will see how it goes in major combat, but overall it felt more precise.


Plays fine on KB&M for me, but then again I disliked DAI with the controller and preferred it with KB&M despite popular opinion.


Plays fine on KB&M for me, but then again I disliked DAI with the controller and preferred it with KB&M despite popular opinion.

 

i take that back. after some combat, i realized the controller does not help.   swimming is terrible.

 

back to kb&m here for combat and using the controller for general wandering.   it is nice that you can switch seamlessly and use both at the same time :)

i finished da:i with kb&m too, even though i bought the controller after the first few hours of playing just to improve my gaming experience.   kb&m players are hard converts.

 

i use my controller in platformers mostly.    in 1st/3rd person games it is not as great.      had to tweak the mouse sensitivity a lot before i got it to where it feels natural.

 

 

overall, it seems a great game, i see why reviews rate it so high.  you really have all that you could possibly want from an RPG.

have to play it much longer to see if i like it better than da:i, but it is looking to be that way so far.


I haven't really even noticed the graphics (they are nothing special), but the GAMEPLAY has been awesome. Working flawlessly here on the Xbox One.

 

yeah combat is really fun, especially on the last 2 levels of difficulty.


really?  i am a huge fan of keyboard and mouse, but it has been really weird in witcher 3.     switched to the xbox one gamepad, played for 10 minutes, and it felt much better.

 

i will see how it goes in major combat, but overall it felt more precise.

 

It's not really the kb/m control that is weird. It's the default key bindings that are kind of **** imo. I did my own key bindings and it works well now.


i played an hour or so this morning and i just dont get people's comments about the graphics.  i played the witcher 2 as well on max settings, and in my opinion the only area where i can't say the witcher 3 kills it is the trees and grass.

 

Was designed for a controller this time around for sure but that works for me as i prefer a controller.


Anyone know how the DLC installation works on Galaxy? When i click on them it downloads 27MB, but then nothing happens; i have no feedback and i don't know if they are installed or not. They really need to work on that game extras page and add feedback to show what is installed or not.

 

Also, anyone know what the bonus pack is?


i played an hour or so this morning and i just dont get people comments about the graphics.

Was designed for a controller this time around for sure but that works for me as i prefer a controller.

 

Well CDPR kind of marketed the game as a "next gen game" that would push PCs to their limit and require an upgrade to play on ultra.

 

To be honest i can put most of the settings on ultra and it still runs fine enough at 1080p for an offline rpg, even when fighting multiple mobs. I have an overclocked core i5 750 with 8GB of ram and a single 970. Not really a high-end pc ...

 

It looks good, but unless they make an enhanced edition it won't age very well. I just wish the game supported better AA. I would gladly turn some settings down to high to get less aliasing.


Anyone know how the DLC installation works on Galaxy? When i click on them it downloads 27MB, but then nothing happens; i have no feedback and i don't know if they are installed or not. They really need to work on that game extras page and add feedback to show what is installed or not.

 

Also, anyone know what the bonus pack is?

i got two dlcs on GOG Galaxy:

temerian armour set = 25mb

beard and hairstyle set for Geralt = 19mb

 

 

clicking on them downloads them somewhere... where i cannot find them, and i have no indication that they were ever installed.

 

 

some kind of bug maybe?


Also anyone knows what the bonus pack is?

 

On May 20th we will release the first bundle of DLCs (2 of the planned 16) -- the Temerian Armor Set (horse armor included) and a Beard and Hairstyle Set for Geralt, the game

No Cursing or Swear Words
We encourage you to use our communities as a forum to debate topics, but please use proper adjectives to express yourself. We do not tolerate circumvention of our word filter or abuse upon another member. As stated above, we are a forum for all ages and expect posts to be family / work friendly.

Edited by Andrew

Well CDPR kind of marketed the game as a "next gen game" that would push PCs to their limit and require an upgrade to play on ultra @4k

 

To be honest i can put most of the settings on ultra and it still runs fine enough at 1080p for an offline rpg, even when fighting multiple mobs. I have an overclocked core i5 750 with 8GB of ram and a single 970. Not really a high-end pc ...

 

It looks good, but unless they make an enhanced edition it won't age very well. I just wish the game supported better AA. I would gladly turn some settings down to high to get less aliasing.

 

Fixed that for you.

Sure, if you're playing on PC hardware and you're just playing at 1080p, it doesn't take a super system to ramp up the settings.

You're totally right that the game isn't going to age very well if your criterion is how good it looks at 1080p.

They designed the game to encourage PC gamers to push up the resolution, eventually to 4k.

So in a few years maybe there'll be reasonably priced single-card GPUs that can play the game at 4k, and it will look better for them than 1080p Ultra does now on existing hardware.

Also, your GeForce 970 is a second-gen Maxwell GPU, so it's the current top-of-the-line architecture even if it isn't the top SKU.  That part at least IS considered "high end".


Fixed that for you.

Sure, if you're playing on PC hardware and you're just playing at 1080p, it doesn't take a super system to ramp up the settings.

You're totally right that the game isn't going to age very well if your criterion is how good it looks at 1080p.

They designed the game to encourage PC gamers to push up the resolution, eventually to 4k.

So in a few years maybe there'll be reasonably priced single-card GPUs that can play the game at 4k, and it will look better for them than 1080p Ultra does now on existing hardware.

Also, your GeForce 970 is a second-gen Maxwell GPU, so it's the current top-of-the-line architecture even if it isn't the top SKU.  That part at least IS considered "high end".

Perhaps but not with current video cards. A GeForce GTX 980 can't run the game at 1080p/60 on Ultra. For that, you'll need a GeForce Titan X. Also, it appears to favour AMD cards which isn't too surprising. Next-gen consoles use AMD GPUs with the same architecture as the HD 7K and R9 200 series cards.

 

It'll be a while before 4K gaming is possible at 60 FPS with single-GPU video cards. At least with The Witcher 3, considering how demanding it is on Ultra.
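The gap between 1080p/60 and 4K/60 is easy to quantify from pixel counts alone: 4K shades four times the pixels per frame, so at the same frame rate the GPU needs roughly four times the pixel throughput. A quick back-of-envelope sketch (pixel counts only; real-world scaling is never perfectly linear):

```python
# Pixels per frame at each resolution
px_1080p = 1920 * 1080   # 2,073,600 pixels
px_4k = 3840 * 2160      # 8,294,400 pixels

# 4K pushes 4x the pixels of 1080p per frame...
scale = px_4k / px_1080p
print(scale)  # 4.0

# ...so at a fixed 60 fps the pixel throughput requirement is also 4x
throughput_ratio = (px_4k * 60) / (px_1080p * 60)
print(throughput_ratio)  # 4.0
```

That factor of four is why a card comfortable at 1080p/60 Ultra struggles badly at 4K on the same settings.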


it appears to favour AMD cards which isn't too surprising. Next-gen consoles use AMD GPUs with the same architecture as the HD 7K and R9 200 series cards.

That's actually a bit surprising given that it's an nVidia GameWorks enhanced game.

http://wccftech.com/nvidia-responds-witcher-3-gameworks-controversy/

It'll be a while before 4K gaming is possible at 60 FPS with single-GPU video cards. At least with The Witcher 3, considering how demanding it is on Ultra.

It depends on what you mean by a while, I guess. He was talking about how it was going to age, though, so my comments weren't directed at existing hardware. GPU manufacturing has stagnated for the last few years at the 28nm process because fabs had issues with 20nm designs for high-power chips (20nm low-power mobile chips are fine).

So GPUs appear to be about to make a major jump from 28nm to 14nm/16nm FinFET designs as well as 3D memory. When that backlog finally breaks (likely later this year or next), there is likely to be a large jump in performance. Combine that with DX12 improvements and I'd bet a higher-end single card (non-Titan... something like a GeForce 1080, maybe) will be able to do 4k@30 consistently next year.

Maybe a year or two later for 60fps. Mid-range cards in the sub-28nm/3D-memory generation will likely (again, later this year or next) have little trouble with 1080p@60fps.
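The size of the expected jump can be ballparked from ideal area scaling: transistor density goes roughly with the square of the feature-size ratio. This is an idealization (real FinFET nodes don't scale this cleanly, and density is not the same as performance), but it shows why a 28nm-to-16nm move is a much bigger step than the usual generation-over-generation bump:

```python
def density_gain(old_nm, new_nm):
    """Idealized transistor-density gain from a process shrink: (old/new)^2."""
    return (old_nm / new_nm) ** 2

# 28nm -> 16nm: (28/16)^2 = 3.0625, about 3x the transistors per area
print(round(density_gain(28, 16), 2))  # 3.06

# 28nm -> 14nm: (28/14)^2 = 4.0, about 4x
print(round(density_gain(28, 14), 2))  # 4.0
```

Even if only part of that density gain turns into real performance, it would be the largest single-generation jump in years.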


That's actually a bit surprising given that it's an nVidia GameWorks enhanced game.

http://wccftech.com/nvidia-responds-witcher-3-gameworks-controversy/

 

It's easy to assume that, but when you take a look at benchmarks, it paints a different picture. The GameWorks features are just that: features. They're going to run better on NVIDIA hardware because that's what they're designed for, but improving performance isn't the goal. AMD has a slight advantage now because of the next-gen consoles. Games like Ryse: Son of Rome and Far Cry 4 perform better on AMD hardware (despite the latter featuring GameWorks effects).

 

It depends on what you mean by a while, I guess. He was talking about how it was going to age, though, so my comments weren't directed at existing hardware. GPU manufacturing has stagnated for the last few years at the 28nm process because fabs had issues with 20nm designs for high-power chips (20nm low-power mobile chips are fine).

So GPUs appear to be about to make a major jump from 28nm to 14nm/16nm FinFet designs as well as 3D memory. When that backlog finally breaks (likely later this year or next) there is likely to be a large jump in performance. Combine that with DX12 improvements and I'd bet higher end single card (non-Titan... something like a GeForce 1080 maybe) will be able to do 4k@30 consistently next year.

Maybe a year or two later for 60fps. Mid-range cards in the sub-28nm/3D-memory generation will likely (again, later this year or next) have little trouble with 1080p@60fps.

4K/30 is doable now with current high-end cards (e.g. GTX 970, R9 290X). 4K/60, on the other hand, requires dual-GPU configurations. It'll be interesting to see how DirectX 12 benefits end users. As for aging, that's covered by ubersampling. It's intended not to run well on current hardware, and it serves as a bit of future-proofing in terms of image quality.

 

I forgot about 14nm/16nm GPUs. I guess we'll have to wait and see. I said it'd be a while because AMD still hasn't released their new video cards and we don't know what NVIDIA will bring out aside from the GTX 980 Ti. I'm currently running 2x R9 280Xs in CrossFire and I plan on running those until AMD and NVIDIA release their next generation of cards.


This topic is now closed to further replies.