Switched monitor from DVI to HDMI, annoying underscan bug



The monitor attached to my rig (all listed in my signature) was previously connected via DVI. I decided to switch it to HDMI as an experiment (I wanted to see how well it worked; I aim to get a Blu-ray drive in future and would like to use HDMI for HDCP), and I seemed to get better colour reproduction, so I decided to stay with HDMI. I fixed the underscan problem on the desktop (and in most games) through the ATI Catalyst Control Center, but for some reason Unreal Tournament 2004 reverts to underscanning by 15%. The problem seems unique to it; none of my other games fall back to the false underscan mode. It's not a massive issue as I don't play UT2004 that often, but I'd still like to fix it if possible, so any tips would help. (I run all my games at my monitor's native resolution, 1920x1080.)


On the "better colour reproduction" part, have you got the output set to RGB Full? It defaults to the inferior YCbCr otherwise.

 

You shouldn't otherwise notice any difference between HDMI and DVI; they carry the exact same digital video signal.
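For what it's worth, the practical difference usually comes down to levels: limited-range output squeezes the full-range 0-255 values into 16-235. A minimal Python sketch of that level mapping (nothing CCC-specific, just the standard video-levels arithmetic, assuming that's what the driver does in YCbCr mode):

    # Map a full-range (0-255) level to limited-range (16-235) video levels.
    def to_limited(level: int) -> int:
        return 16 + round(level * 219 / 255)

    # 256 distinct input levels collapse to 220 distinct outputs, so roughly
    # 36 shades are lost -- one source of banding or a washed-out look.
    distinct = len({to_limited(v) for v in range(256)})
    print(f"256 -> {distinct} distinct levels")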



 

You don't have a profile for UT2k4 set to underscan at 15%, right?

 

Also, the obligatory just-in-case link: http://www.justin.my/2011/12/cannot-display-full-screen-using-ati-amd-radeon/ ;-) (assuming they didn't change the configuration again; I haven't checked on my ATI card in a while)



 

Didn't know HDMI and DVI were the same signal, thanks.

 


 

On the profile question: I'm using the default game profiles that you can optionally install with the drivers, and there wasn't one for UT2004 by the looks of it. In any case, you can't control the underscan option from game profiles; it's only available in the "My Digital Flat-Panels" section. I've already done the adjustment in CCC that your link suggests, and it works for every game except UT2004. For some reason it just doesn't like UT2004 and I can't work out why. Ironically it's fine with UT99. I've also tried adjusting GPU scaling, but that makes no difference either.


Are you sure you're running UT2004 in 1080p?
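One quick way to check: UT2004 keeps its fullscreen mode in UT2004.ini in the game's System folder, under [WinDrv.WindowsClient] if I remember the layout right. Something like this should pin it to 1080p:

    ; UT2004\System\UT2004.ini
    [WinDrv.WindowsClient]
    FullscreenViewportX=1920
    FullscreenViewportY=1080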

 

ATI saves underscan settings per resolution. My guess is the game is running at a different resolution. To fix it, switch the desktop to whatever resolution the game is using (e.g. 1680x1050), change the underscan setting there, then go back to your native resolution.
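If it is reverting, the 15% figure lines up with the driver's default underscan. Rough numbers, assuming the percentage applies per axis:

    # Back-of-the-envelope for a 15% underscan on a 1920x1080 panel:
    # the image shrinks per axis and the remainder becomes black border.
    native_w, native_h = 1920, 1080
    underscan = 0.15
    print(f"{round(native_w * (1 - underscan))}x{round(native_h * (1 - underscan))}")
    # -> 1632x918 image centred on the 1920x1080 panel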

