Video ports not working



Hi,
I just got a new work computer. It's a Fujitsu ESPRIMO P520 E85+ with a Core i7-4790 and a Radeon R9 255 graphics card, running Windows 8.1 64-bit. I already had a 30-inch monitor from before, a Dell 3007WFP, and it only has one Dual-Link DVI-D port.

The computer has a slew of video ports on the back, one of which is a DVI-D, but for some reason it doesn't output anything. It's totally dead. There's also a VGA out, which also doesn't work, and a DVI-I (positioned lower on the back panel), which is the only one that actually outputs anything. (Oh, and there's one more, but I don't know exactly what it is; it kind of resembles a VGA port but without the blue.)

When I plug the DVI-D cable from my monitor into the DVI-I port, the best resolution I can get is 1280x800, which looks terrible on such a huge screen, and I know from my old computer that it can manage resolutions far higher than 1080p.

I don't know why this is, so I need help getting the other ports to actually output something (seriously, it's like they're not even there), or, failing that, getting the DVI-I to output a higher resolution. But since I really want to use a two-monitor setup, I'd rather have all the outputs working.
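For what it's worth, here's a small script that lists which display outputs Windows even detects as attached, in case that narrows things down. It's just a rough sketch, assuming Python 3 with the pywin32 package installed (the flag values are the standard ones from wingdi.h):

# Rough sketch: list the display adapters/outputs Windows knows about and
# whether each one is attached to the desktop (assumes pywin32 is installed).
import win32api

DISPLAY_DEVICE_ATTACHED_TO_DESKTOP = 0x00000001  # from wingdi.h
DISPLAY_DEVICE_PRIMARY_DEVICE = 0x00000004

i = 0
while True:
    try:
        dev = win32api.EnumDisplayDevices(None, i)
    except Exception:
        break  # pywin32 raises once there are no more devices to enumerate
    if dev is None:
        break
    attached = bool(dev.StateFlags & DISPLAY_DEVICE_ATTACHED_TO_DESKTOP)
    primary = bool(dev.StateFlags & DISPLAY_DEVICE_PRIMARY_DEVICE)
    print("%s (%s): attached=%s, primary=%s"
          % (dev.DeviceName, dev.DeviceString, attached, primary))
    i += 1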

Does anyone have any suggestions?

Thanks in advance,
Raz

P.S. I've updated all the drivers; it didn't help.

[Attachment: photo of the video ports on the back of the computer]


Are you connecting to the top one or the bottom one? By default it should be using your graphics card, not the onboard video. You might want to look in the BIOS and make sure the PCIe card is selected as the default video device.
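If you want to double-check which adapter Windows is actually running on, something like this would show it. Just a rough sketch, assuming Python 3 with the third-party wmi package (which needs pywin32) installed:

# Rough sketch: print every video controller Windows knows about, its driver
# version, and the resolution it is currently driving (assumes the 'wmi' package).
import wmi

for gpu in wmi.WMI().Win32_VideoController():
    print(gpu.Name)
    print("  driver:", gpu.DriverVersion)
    print("  current mode: %s x %s"
          % (gpu.CurrentHorizontalResolution, gpu.CurrentVerticalResolution))

If the only adapter it lists is the Intel onboard graphics, the BIOS is probably not defaulting to the add-in card.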


Usually, when you plug in an add-in card it disables the motherboard video ports. If it doesn't disable them, the system defaults to the new card you installed and the motherboard video becomes secondary.

 

If you want to default back to the mainboard video, remove the added video card.


As everyone said above, you're most likely using the ports you focused on in the picture instead of the graphics card right below all of the onboard ports. It's the card with the white DVI-D and what look like an HDMI port and a DisplayPort.


Are you connecting to the top one or the bottom one? By default it should be using your graphics card, not the onboard video. You might want to look in the BIOS and make sure the PCIe card is selected as the default video device.

I'm using the bottom one; that's the DVI-I, but I use a DVI-D cable since that's what my monitor uses. In theory it should work just as well... but it doesn't.

I'm not terribly familiar with the BIOS and all that technical stuff, but this is what I found (see attachments). Any suggestions? Oh, by the way, I'm not sure why it says AMD Radeon R7/HD 9000. I'm pretty sure the computer I got has an R9 255, according to the component list on the delivery package and the page I ordered it from: https://www.dustin.se/product/5010824321/esprimo-p520

 

Usually, when you plug in an add-in card it disables the motherboard video ports. If it doesn't disable them, the system defaults to the new card you installed and the motherboard video becomes secondary.

 

If you want to default back to the mainboard video, remove the added video card.

 

Umm, I'm not 100% sure I understood all that, but... how do I change the default?

As everyone said above, you're most likely using the ports you focused on in the picture instead of the graphics card right below all of the onboard ports. It's the card with the white DVI-D and what look like an HDMI port and a DisplayPort.

I've tried them both. Only the DVI-I port outputs anything (at 1280x800); the DVI-D port doesn't output any signal at all.

[Attachments: BIOS screenshots]
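For reference, here's a quick way to list the modes Windows is actually offering on the current connection, so you can see whether anything above 1280x800 is even on the table. A rough sketch, assuming Python 3 with the pywin32 package installed:

# Rough sketch: enumerate every display mode Windows exposes on the primary
# output and print the unique width/height/refresh combinations.
import win32api

modes = set()
i = 0
while True:
    try:
        dm = win32api.EnumDisplaySettings(None, i)  # None = current display device
    except Exception:
        break  # pywin32 raises once the mode list is exhausted
    if dm is None:
        break
    modes.add((dm.PelsWidth, dm.PelsHeight, dm.DisplayFrequency))
    i += 1

for w, h, hz in sorted(modes):
    print("%dx%d @ %d Hz" % (w, h, hz))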


Does your 30-inch have HDMI or DisplayPort? I would use those over DVI-D anyway. Your focus should be on using your video card; there's a reason it's included in the computer. I would also drive your 30-inch using the video card. So your 30-inch SHOULD have HDMI or DisplayPort, and you should run a cable from the video card to your monitor and see if that fixes it.


Does your 30-inch have HDMI or DisplayPort? I would use those over DVI-D anyway. Your focus should be on using your video card; there's a reason it's included in the computer. I would also drive your 30-inch using the video card. So your 30-inch SHOULD have HDMI or DisplayPort, and you should run a cable from the video card to your monitor and see if that fixes it.

 

 

Open the box and pull out the PCIe card.

 

 

As I wrote in the first post, the monitor only has one Dual Link DVI-D port, nothing else.

 

P.S. This is the monitor. As you can see, it only has that one video port, and it supports HDCP, which I suspect is the reason why I only get 1280x800 when I hook it up to the DVI-I on the computer.

http://www.dell.com/content/topics/topic.aspx/global/products/monitors/topics/en/monitor_3007wfp?c=us&l=en&s=gen&~section=specs


The DVI port on the motherboard is DVI-D.

The DVI port on the card is DVI-I (it carries both DVI-D and DVI-A).

Pulling out the video card would make the motherboard's DVI-D port the primary output.

[Attachment: picture of the rear video ports]



 

The DVI port on the motherboard is DVI-D.

The DVI port on the card is DVI-I (it carries both DVI-D and DVI-A).

Pulling out the video card would make the motherboard's DVI-D port the primary output.

I'm sorry, but I really don't want to pull out the graphics card. I mean, if I wanted a computer without a graphics card, I could have bought a much cheaper one. Can't I just enable the other ports, like in the boot menu I took pictures of?


What difference would it make? If you are trying to use the onboard video, it would negate any benefit of having the PCIe video card, so removing it shouldn't be a concern. If you want a video card... get a better one that supports what you want to do.


I'm not sure I follow your logic. My question is whether I can use both. I've looked around and even saw a YouTube video where a guy enabled both in his BIOS. I have not been able to do what he did.

 

I think one of the main problems right now is that the monitor has HDCP, which I think is affecting the DVI-D signal (coming from the DVI-I port on the computer) and giving me only 1280x800. So is there any way to avoid this HDCP problem? Something I can disable or enable so I can get a higher resolution?
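If it helps, here is one way to see what the monitor itself is reporting to Windows, by reading its EDID block out of the registry and decoding the preferred (native) timing. A rough sketch, assuming Python 3 on Windows; note that it lists every monitor that has ever been connected to the machine, not just the one plugged in right now:

# Rough sketch: read EDID blocks from the registry and print each monitor's
# preferred mode, taken from the first detailed timing descriptor (offset 54).
import winreg

def edid_blocks():
    base = r"SYSTEM\CurrentControlSet\Enum\DISPLAY"
    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, base) as display:
        for i in range(winreg.QueryInfoKey(display)[0]):
            model = winreg.EnumKey(display, i)
            with winreg.OpenKey(display, model) as mkey:
                for j in range(winreg.QueryInfoKey(mkey)[0]):
                    inst = winreg.EnumKey(mkey, j)
                    try:
                        with winreg.OpenKey(mkey, inst + r"\Device Parameters") as params:
                            edid, _ = winreg.QueryValueEx(params, "EDID")
                            yield model, bytes(edid)
                    except OSError:
                        pass  # this instance has no readable EDID value

for model, edid in edid_blocks():
    hactive = edid[56] | ((edid[58] & 0xF0) << 4)
    vactive = edid[59] | ((edid[61] & 0xF0) << 4)
    print("%s: preferred mode %dx%d" % (model, hactive, vactive))

If the 3007WFP shows up here with a preferred mode of 2560x1600, the monitor side is fine and the limit is coming from the link or the driver rather than from the panel.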


The logic is simple... If you are using the ports on the motherboard and not the ports on the video card, you are not using the video card's resources. That makes the video card pointless and useless, which brought me to my suggestion to remove it, which would enable and default to the onboard video on the motherboard.

 

Then you mentioned that you wanted to keep your video card because you bought a PC with a video card in order to use one, which brought me to my next suggestion: purchase a video card that meets or exceeds your requirements and remove the one that was purchased with the system. That would allow you to keep using an added video card and run the resolution that you want.


Are you under the impression that, by putting a dedicated graphics card in, you can still connect to the motherboard ports but get the benefits of the graphics card's processor?


The card should meet my requirements; that's my point. There's nothing wrong with the specs of the card, and I highly doubt that 1280x800 is all it's capable of. In fact, it should be able to do far more than even 1080p. But I think the resolution is somehow being limited by HDCP. Could that be the case?

 

As for using multiple screens, I'm pretty sure there are DisplayPort-to-VGA adapters, so that's probably not going to be a problem.

 

 

P.S. Fujitsu also sent a DVI-to-VGA adapter. I plugged it into the DVI-I slot (on the graphics card) and then used a regular VGA cable plugged into a really old 20-inch monitor. I doubt there's any HDCP there, because it's showing options of up to 1920x1200. So I'm getting more and more convinced that HDCP is the culprit. Any ideas?


The card should meet my requirements; that's my point. There's nothing wrong with the specs of the card, and I highly doubt that 1280x800 is all it's capable of. In fact, it should be able to do far more than even 1080p. But I think the resolution is somehow being limited by HDCP. Could that be the case?

 

As for using multiple screens, I'm pretty sure there are DisplayPort-to-VGA adapters, so that's probably not going to be a problem.

 

 

P.S. Fujitsu also sent a DVI-to-VGA adapter. I plugged it into the DVI-I slot (on the graphics card) and then used a regular VGA cable plugged into a really old 20-inch monitor. I doubt there's any HDCP there, because it's showing options of up to 1920x1200. So I'm getting more and more convinced that HDCP is the culprit. Any ideas?

 

 

A little research would have given you an answer: the cheap card cannot do more than 1280x720.

 

http://www.techpowerup.com/gpudb/2464/radeon-r9-255-oem.html 


A little research would have given you an answer: the cheap card cannot do more than 1280x720.

 

That is their recommended resolution for that card when you want to have all detail settings in your game on maximum. I'm a big fan of research. I'm an even bigger fan of its far superior cousin: careful research. That GPU supports resolutions up to 4096x2160, depending on which output port you use.

Now, just for the fun of it, I'll try to continue without ignoring the original poster's multiple statements that the discrete graphics card's DVI-I port is the one being used (I'm not sure why pretending that hasn't been said is a thing in this thread).

 

With a digital signal, the monitor is falling back to the next lower resolution it can show without distortion: 1280x800, exactly half of its 2560x1600 native resolution in both dimensions. There are a few possible reasons I can think of for why this might be happening:

1) Maybe the DVI cable is not Dual-Link. The presence of all the pins on the connector doesn't necessarily mean all the wires are there. But if this cable came with the monitor, it's unlikely not to be Dual-Link. (There's a rough pixel-clock estimate after this list showing why 2560x1600 needs a Dual-Link connection in the first place.)

2) Maybe you have to disable HDMI Audio in your BIOS (if the option is there). Yes, I know that suggestion sounds totally weird, but it has fixed this problem for some people. For example: http://forumserver.twoplustwo.com/48/computer-technical-help/cant-sellect-correct-resolution-2560x1600-only-1280x800-908169/

 

3) Maybe some other reason I'm not yet familiar with, but which we will all no doubt find quite amusing.
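And since point 1 is the one I would check first, here is a back-of-the-envelope estimate of why 2560x1600 simply cannot come up over a Single-Link DVI connection. The 165 MHz single-link limit comes from the DVI spec; the ~20% blanking overhead is a rough assumption rather than an exact CVT timing:

# Rough estimate: compare the approximate pixel clock each mode needs at 60 Hz
# against the 165 MHz Single-Link DVI limit (blanking overhead assumed ~20%).
SINGLE_LINK_MAX_MHZ = 165.0

def approx_pixel_clock_mhz(width, height, refresh_hz=60, blanking_overhead=1.20):
    return width * height * refresh_hz * blanking_overhead / 1e6

for w, h in [(1280, 800), (2560, 1600)]:
    clk = approx_pixel_clock_mhz(w, h)
    verdict = "fits in Single-Link" if clk <= SINGLE_LINK_MAX_MHZ else "needs Dual-Link"
    print("%dx%d @ 60 Hz ~ %.0f MHz -> %s" % (w, h, clk, verdict))

That works out to roughly 74 MHz for 1280x800 versus roughly 295 MHz for 2560x1600, which is why the monitor drops to its half-resolution mode whenever only a single link is established.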

