Use Intel HD 4000 & Radeon HD 7970



I've built my computer: an Ivy Bridge CPU and a Radeon HD 7970 (http://www.newegg.com/Product/Product.aspx?Item=N82E16814131468&nm_mc=AFC-C8Junction&cm_mmc=AFC-C8Junction-_-na-_-na-_-na&cm_sp=), along with a Gigabyte GA-Z77X-UD5H-WB motherboard.

Is it at all possible to use the Radeon for one display and the Intel HD 4000 for another display? One monitor supports VGA & DVI; the other only supports VGA. Right now I'm using DVI with the Radeon and want to use VGA with the Intel GPU. Running Windows 7 Ultimate 64-bit.


In short, no. You can't use two different GPUs at once.

NVIDIA has a technology called SLI that allows two compatible GeForce GPUs to be used simultaneously, and AMD has a similar technology called CrossFire that allows two Radeon GPUs to work together. Running two GPUs from different vendors (or from the same vendor if they are not SLI or CrossFire capable) is possible in some operating systems, but it is generally very difficult and somewhat limited. Technically it is possible to run two unrelated GPUs simultaneously in GNU/Linux (the only operating system I am aware of that has such a capability), but even that doesn't do exactly what you want. It requires running two X servers - one for each GPU - which effectively limits interaction between the two without a complicated bridging setup.


Some Asus and ASRock motherboards have a feature called Virtu MVP, provided by Lucid Logix; maybe that's what you're looking for. I don't know whether your board supports it.

That technology merely saves power by dynamically switching from the discrete to the integrated graphics card, similar to NVIDIA's Optimus. It does not allow you to use both graphics cards at the same time!


I have the same motherboard and an NVIDIA GTX 670. Go into the BIOS and set "Internal Graphics" to "Enabled" and "Init Display First" to "PEG". Install both sets of drivers (AMD's and Intel's). They should work simultaneously unless AMD really messed something up. Mine works fine (with NVIDIA).
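
If you want to sanity-check that Windows actually sees both adapters once that BIOS setting is flipped, something like this quick DXGI enumeration should list both. This is just a minimal sketch of my own (untested here, and the adapter names in the comment are only what I'd expect on this hardware):

```cpp
// Enumerate all graphics adapters Windows can see via DXGI.
// Build on Vista or later; link against dxgi.lib.
#include <dxgi.h>
#include <cstdio>

int main()
{
    IDXGIFactory* factory = nullptr;
    if (FAILED(CreateDXGIFactory(__uuidof(IDXGIFactory), (void**)&factory)))
        return 1;

    IDXGIAdapter* adapter = nullptr;
    for (UINT i = 0; factory->EnumAdapters(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i)
    {
        DXGI_ADAPTER_DESC desc;
        adapter->GetDesc(&desc);
        // On a setup like the OP's, both "AMD Radeon HD 7970" and
        // "Intel(R) HD Graphics 4000" should show up here.
        wprintf(L"Adapter %u: %s\n", i, desc.Description);
        adapter->Release();
    }
    factory->Release();
    return 0;
}
```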



In short, no. You can't use two different GPUs at once.

...

Are you sure about that? My HP N40L suggests otherwise: I can run the built-in AMD 4200, an NVIDIA GT 640, AND an ASPEED IPMI graphics adaptor all at once.

You'll probably have to change the settings in the BIOS to set the built-in graphics to "Enabled"/"Always On"/"Primary", or something along those lines, in order to keep it visible to the system when you install a PCIe graphics card.


I have the same motherboard and an NVIDIA GTX 670. Go into the BIOS and set "Internal Graphics" to "Enabled" and "Init Display First" to "PEG". Install both sets of drivers (AMD's and Intel's). They should work simultaneously unless AMD really messed something up. Mine works fine (with NVIDIA).

That's interesting. How does it work? Is the same thing displayed on both screens? Are they completely separate (so you can't drag windows between them)? Or do they behave like using both outputs on the same graphics card? Does the performance suffer as a result? Which versions of Windows can do this? How long has it been available?

Apparently I've been out of the Windows world for too long. I generally try to keep up with what's going on, but I don't normally try things like this out myself. That said, I haven't heard of this technology before, and I can find very little about it online.


You can use one GPU to drive one display and another GPU for a second display. The GPUs themselves cannot communicate with each other, so you won't get a performance improvement out of it if you were expecting that. But if you're using more than two monitors (or even three, since AMD has its Eyefinity setup), then you can use another graphics card to enable more monitors to be connected.


I've built my computer: an Ivy Bridge CPU and a Radeon HD 7970 (http://www.newegg.co...-na-_-na&cm_sp=), along with a Gigabyte GA-Z77X-UD5H-WB motherboard.

Is it at all possible to use the Radeon for one display and the Intel HD 4000 for another display? One monitor supports VGA & DVI; the other only supports VGA. Right now I'm using DVI with the Radeon and want to use VGA with the Intel GPU. Running Windows 7 Ultimate 64-bit.

Why not just get a DVI to VGA adapter?


@xorangekiller

  • The same thing can be displayed on both screens - backgrounds, for example. You can also have one thing displayed on one screen and something different on another. Right now I have Firefox on my main display and foobar/Steam on the monitor to my left.
  • I can drag windows and items from one screen to the other - just drag from one side to the other.
  • I'd think (in my opinion) that this has been possible since at least Vista, if not XP.

Even though the card came with a VGA-to-DVI adapter, it's a stupid type - DVI-A. This monitor doesn't seem to want to let the thing attach to its DVI port, and it seems the DVI cable I'm using doesn't either.


In short, no. You can't use two different GPUs at once.

NVIDIA has a technology called SLI that allows two compatible GeForce GPUs to be used simultaneously, and AMD has a similar technology called CrossFire that allows two Radeon GPUs to work together. Running two GPUs from different vendors (or from the same vendor if they are not SLI or CrossFire capable) is possible in some operating systems, but it is generally very difficult and somewhat limited. Technically it is possible to run two unrelated GPUs simultaneously in GNU/Linux (the only operating system I am aware of that has such a capability), but even that doesn't do exactly what you want. It requires running two X servers - one for each GPU - which effectively limits interaction between the two without a complicated bridging setup.

Running multiple separate (non-SLI/CrossFire) GPUs on a Windows PC has worked seamlessly since at least Vista (I never tried it on XP). Right now at work I'm driving my two monitors with two different NVIDIA graphics cards (not in SLI), and my colleague has two monitors on an AMD card and another on an NVIDIA one.

It's a change that was introduced with the reworking of the driver and display model (WDDM) in Vista.

Since Vista you can use two separate GPUs in a single machine independently (they won't "work together" to drive a single screen, but they can run independent monitors easily enough).

The one thing to be aware of is that you might see a flicker as a window moves between screens, and/or you might see different colors (due to monitor/GPU color profiles, etc.).

In short, should work fine :)
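
If you're curious which adapter ends up driving which display, a minimal sketch along these lines (my own illustration, untested, using the stock Win32 EnumDisplayDevices API) will print the mapping:

```cpp
// List which adapter drives each display attached to the desktop.
// Build with any Windows compiler; links against user32.lib.
#include <windows.h>
#include <cstdio>

int main()
{
    DISPLAY_DEVICEA dd;
    dd.cb = sizeof(dd);
    for (DWORD i = 0; EnumDisplayDevicesA(nullptr, i, &dd, 0); ++i)
    {
        // Only devices attached to the desktop are actually driving a monitor;
        // DeviceString is the adapter name (e.g. the Radeon or the Intel iGPU).
        if (dd.StateFlags & DISPLAY_DEVICE_ATTACHED_TO_DESKTOP)
            printf("%s -> %s\n", dd.DeviceName, dd.DeviceString);
    }
    return 0;
}
```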


Yes, this will work. They will be independent of one another, meaning you won't get improved performance. You will be able to use all the regular Windows monitor settings, such as cloned/extended displays.

How do I know this? I have a PC with HD 4000 graphics and GTX 560 graphics driving multiple displays.


He'll get slightly better performance by offloading one screen onto a different GPU, but as above >.<

Nice display pic btw :p


So if I'm understanding this right, you can use two GPUs at once in Windows, and have been able to since at least Vista, but there is no communication between the GPUs so windows cannot be seamlessly dragged between the two. I stand corrected.


Windows can be seamlessly dragged between... kind of.

There's a flicker as memory is transferred from one GPU to the other, but other than that it's seamless >.<

If you drag something like a Windowed game, it might take a second or so to move, but other than that it should be fine :o


With my setup I've set my discrete card as the primary output and the HD 4000 as secondary (set PEG in the BIOS). When playing games the discrete card is used, and when watching movies Intel's Quick Sync is used, without me changing any settings. Intel Quick Sync encoding works as well.


Windows can be seamlessly dragged between... kind of.

There's a flicker as memory is transferred from one GPU to the other, but other than that it's seamless >.<

If you drag something like a Windowed game, it might take a second or so to move, but other than that it should be fine :o

Alright, someone get a video of a game running windowed at some monster settings in one monitor, then transfer it over to the HD4000 monitor so we can find out what happens. :laugh:


You can use one GPU to drive one display and another GPU for a second display. The GPUs themselves cannot communicate with each other, so you won't get a performance improvement out of it if you were expecting that. But if you're using more than two monitors (or even three, since AMD has its Eyefinity setup), then you can use another graphics card to enable more monitors to be connected.

Is it at all possible to use the Radeon for one display and the Intel HD 4000 for another display?

So if I'm understanding this right, you can use two GPUs at once in Windows, and have been able to since at least Vista, but there is no communication between the GPUs so windows cannot be seamlessly dragged between the two. I stand corrected.

Yes they can. Sometimes there's a slight hiccup when the window starts crossing over, but my secondary card is the cheapest Quadro in existence, so I'm not sure whether that's actually Windows' fault.

At least on the desktop, driving each monitor on a separate GPU is exactly the same experience as putting them all on one.


The GPUs themselves cannot communicate with each other
Of course they can. There's a cost to that, but two video cards can very well exchange data. D3D and similar APIs provide everything needed for that.

What they won't do is automatically split D3D workloads between them as in SLI/CrossFire; I suppose that's what the quote meant. SLI/CrossFire is basically a way for multiple video cards to act like a single logical D3D device and internally split the work issued through that device. So if you wanted to use two video cards without SLI, you'd simply create two D3D devices, and there are APIs for sharing resources between devices.

Running separate GPUs means each monitor uses its own independent graphics resources. This means, for instance, that you could run a different GPU-accelerated movie or video game on each monitor, and each could (optimally) use its own video card.
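
To make that concrete, here's a rough sketch of the non-SLI route in Direct3D 11: two independent devices plus a shared surface. This is only my illustration of the idea (error checking omitted, adapter indices assumed), not code from anyone's actual setup:

```cpp
// Create one D3D11 device per adapter and share a texture between them
// via a DXGI shared handle. Error checking omitted for brevity.
#include <d3d11.h>
#include <dxgi.h>

int main()
{
    IDXGIFactory* factory = nullptr;
    CreateDXGIFactory(__uuidof(IDXGIFactory), (void**)&factory);

    // Adapters 0 and 1 would be, e.g., the Radeon and the Intel iGPU.
    IDXGIAdapter* adapters[2] = {};
    factory->EnumAdapters(0, &adapters[0]);
    factory->EnumAdapters(1, &adapters[1]);

    // Two fully independent devices - no SLI/CrossFire involved.
    ID3D11Device* dev[2] = {};
    for (int i = 0; i < 2; ++i)
        D3D11CreateDevice(adapters[i], D3D_DRIVER_TYPE_UNKNOWN, nullptr, 0,
                          nullptr, 0, D3D11_SDK_VERSION,
                          &dev[i], nullptr, nullptr);

    // A texture created as SHARED on device 0...
    D3D11_TEXTURE2D_DESC desc = {};
    desc.Width = 1920;
    desc.Height = 1080;
    desc.MipLevels = 1;
    desc.ArraySize = 1;
    desc.Format = DXGI_FORMAT_B8G8R8A8_UNORM;
    desc.SampleDesc.Count = 1;
    desc.Usage = D3D11_USAGE_DEFAULT;
    desc.BindFlags = D3D11_BIND_RENDER_TARGET | D3D11_BIND_SHADER_RESOURCE;
    desc.MiscFlags = D3D11_RESOURCE_MISC_SHARED;

    ID3D11Texture2D* tex = nullptr;
    dev[0]->CreateTexture2D(&desc, nullptr, &tex);

    // ...exposes a handle that device 1 can open. Any copies between the
    // cards go over the PCIe bus, which is the "cost" mentioned above and
    // the reason there's no free SLI-style work splitting.
    IDXGIResource* res = nullptr;
    tex->QueryInterface(__uuidof(IDXGIResource), (void**)&res);
    HANDLE shared = nullptr;
    res->GetSharedHandle(&shared);

    ID3D11Texture2D* texOnDev1 = nullptr;
    dev[1]->OpenSharedResource(shared, __uuidof(ID3D11Texture2D),
                               (void**)&texOnDev1);
    return 0;
}
```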


Alright, someone get a video of a game running windowed at some monster settings in one monitor, then transfer it over to the HD4000 monitor so we can find out what happens. :laugh:

I just tried this with Battlefield 3. I'm surprised at how well it performed on the monitor hooked up to the motherboard (P8Z68-V Pro with a Sandy Bridge 2700K): I still got a decent frame rate of 30 (a 15 fps loss), and there was no flicker when dragging the window from my other monitors powered by a 6870. I was able to drag the Battlefield window out to use all monitors, but it was too laggy to play at that point. It worked, though.

