Intel integrated G35 supports DirectX 10

It looks like Q3 2007 will be very exciting for the industry. A lot of people will be on their holidays, but nevertheless Intel plans to introduce its first DirectX 10 chipset then. It is called G35, and this appears to be a brand name, not a codename.

It supports Intel Clear Video technology and DirectX 10. It supports CPUs with a 1333MHz front-side bus, and probably slower bus speeds as well. It supports both DDR3 1066 and DDR2 800 memory at the same time. It can and will be paired with the ICH9, ICH9R and ICH9DH southbridges.

View: The full story
News source: The Inq



Let's hope Intel maintain their excellent track-record of releasing open-source drivers for their integrated graphics cards. Right now, they're far and away the best choice for people intending to use Linux or other OSes whose priority isn't gaming.

Although putting an end to the occasionally sh**ty VBIOSes some OEMs ship with it would be nice.

Going forward, I think graphics are going to be emphasized more in newer operating systems, in which case it would be good to get a certain level of graphically capable hardware into as many people's hands as possible, so that application developers can leverage it to provide new and better applications. Stuff we might not even have thought of with operating systems, like XP, that keep 2D and 3D segregated. 3D graphics can and will be used in more places than just games.

It seems to me most people who downplay GPUs think the state of graphics in operating systems is going to stay the same. I've even had friends claim that more than 32MB of VRAM is overkill, but in reality, as you offload more of the drawing to the GPU, it requires more VRAM. So if you are going to stick with XP forever, 32MB of VRAM or less may be fine.

That should be the reason for faster hardware and better, more feature-packed operating systems in the first place. If most of the market has low-end graphics hardware, that's less of an incentive for innovation. Not that Intel should put high-end hardware into their integrated solutions, but that doesn't mean they can't aim for the mid-range... which would be more than the bare minimum.

On the other hand, if what some people predict about multi-core CPUs taking over the role of the GPU is true, they may one day be able to compete with or displace the discrete GPU. Systems of that type might at least replace the integrated GPU someday, like the one AMD is working on, for example.

It's only a matter of time before integrated video catches up with nVidia and AMD/ATI. After all, a GFX card is just a bunch of chips and special drivers packaged together. Intel has the money to invest in developing a really good chip. Look at how far they've come in the last two or three years.

I guess the only thing to consider at this point is how much a full-blown uber DX10 chip would cost to include on the motherboard, and whether or not they'll be able to cool it adequately.

Nope, never, not a chance in hell. Intel video will never catch up with nVidia or ATI, simply because they are not going for the rather small market that is computer gaming. Intel sells graphics chips to businesses and individuals who do not need high-powered graphics. And they are the biggest seller of GPUs for a reason: they have a formula that works well, and they are not about to change it.

Quote - DrunkenMaster said @ #8
It's only a matter of time before integrated video catches up with nVidia and AMD/ATI. After all, a GFX card is just a bunch of chips and special drivers packaged together. Intel has the money to invest in developing a really good chip. Look at how far they've come in the last two or three years.

I guess the only thing to consider at this point is how much a full-blown uber DX10 chip would cost to include on the motherboard, and whether or not they'll be able to cool it adequately.

U r smrt lolz~!

Intel could catch up if they wanted to, but saying "it's only a matter of time" is a tad misleading. Currently they have shown no intention whatsoever of catching up, let alone overtaking the competition. They have the money and the engineers to do it, but not the will or desire. And despite this, they sell more graphics solutions than AMD and nVidia, so do they really need to spend money trying to compete in the high-end market?

It's probably best for them to invest in efficient low end chips, maintain their niche, and let the two established high end companies duke it out in what really is a smaller segment of the market.

You're saying this only because you think you can predict the future of computing? 3D graphics are no longer used only for games. There are a lot of CAD, scientific modeling and computer arts applications, and a whole lot more that no one has thought of yet. Virtual reality? That needs a 3D chip too.

Everyone was saying 10 years ago that nothing would kill SGI. Same idea here. Intel could beat ATI and NVIDIA at some point. The chipsets on laptops and desktops are what make the money.

Integrated GPUs are fine for some people; I don't know why people can't see this. My dad will never get 10 seconds of real value from a higher-end GPU. If current and next-gen Intel parts support Aero with some speed, that is enough for some.

Intel definitely makes some of the best integrated video out there; it's the only brand I trust. But why bother supporting DX10 when it won't get any use? I guess the consumer is willing to pay $20 for another sticker on the box.

Exactly. Not everybody wants to play 3D games, but they don't need or want some slouch of a GFX card either, so integrated graphics are perfect for them. (Think of businesses: how many of them really need high-performance GFX cards?)

Well, the huge problem I always had with integrated video is that you've got the combination of:

-It's a low-end part, likely used on machines with poor specs

and

-It steals system memory

So your 256MB or 512MB bargain box is dropping 32MB on video before you even load anything useful.

Years ago, "integrated video" had dedicated memory, like old OEM boxes and some server boards.

I could see a 'premium' integrated video board comparing well with low-end discrete parts. It might also allow better flexibility for cooling, since the video hardware wouldn't need to be designed to fit in a narrow slot.

Instead of spending $70 on a mainboard and $50 on a 6200- or 7300-class video card, why not solder the 6200/7300 onto the board with 128MB or 256MB of memory and sell the pair for $110?

Quote - Hak Foo said @ #7.3
Instead of spending $70 on a mainboard and $50 on a 6200- or 7300-class video card, why not solder the 6200/7300 onto the board with 128MB or 256MB of memory and sell the pair for $110?

We'll have that with AMD's Fusion in two years' time... really useful processors specialized for whatever tasks you want!

Intel's extremely crappy graphics are completely useless. I think that, like ATI and AMD, Intel should put more effort into integrating the video card and processor, giving better performance and energy savings.

Q3 2007? Well, by then I doubt integrated Direct3D 10 (I believe there isn't really a "DirectX 10"?) will be as special as it sounds now. But it sounds like a nice integrated chip, though.