AMD preps DirectX 10.1 'Radeon HD 3600' GPU pair

AMD is expected to launch the successor to its ATI Radeon HD 2600 in January, and now the two new parts' speed details have emerged. Graphics card maker sources cited by DigiTimes point to two versions of the 'RV635' GPU: one an XT model, the other a Pro. The former will be clocked at 800MHz, the latter at 600MHz. The chips support DirectX 10.1, due to be released with Windows Vista Service Pack 1 early next year. Both GPUs will connect to memory across a 128-bit bus, but while the Pro will appear on cards containing at least 512MB of GDDR2 SDRAM, XT-based cards will have 256MB of GDDR3. As yet, memory clock speeds are not known.

View: Full Story @ RegHardware
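Since the article leaves the memory clocks open, peak memory bandwidth is easy to estimate once they are announced: bus width in bytes times the effective (double-data-rate) transfer rate. Here is a minimal sketch in C; the 700MHz and 400MHz clock figures are hypothetical placeholders, not announced specs.

```c
#include <stdio.h>

/* Peak memory bandwidth = (bus width in bytes) * (effective transfer rate).
 * GDDR2/GDDR3 are double-data-rate, so the effective rate is 2x the clock.
 * The clock figures below are placeholders -- the article notes that the
 * RV635 boards' memory speeds have not been announced. */
static double peak_bandwidth_gbs(int bus_width_bits, double mem_clock_mhz)
{
    double bytes_per_transfer = bus_width_bits / 8.0;      /* 128-bit -> 16 bytes */
    double transfers_per_sec  = mem_clock_mhz * 2.0 * 1e6; /* DDR: 2 per clock   */
    return bytes_per_transfer * transfers_per_sec / 1e9;   /* GB/s               */
}

int main(void)
{
    /* e.g. a 128-bit bus with hypothetical 700MHz GDDR3 on the XT... */
    printf("XT  (GDDR3 @ 700MHz): %.1f GB/s\n", peak_bandwidth_gbs(128, 700.0));
    /* ...and hypothetical 400MHz GDDR2 on the Pro */
    printf("Pro (GDDR2 @ 400MHz): %.1f GB/s\n", peak_bandwidth_gbs(128, 400.0));
    return 0;
}
```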


14 Comments


I doubt it. It's just not gonna be here in time for Christmas '07, that's all. I'm guessing they'll launch it with a new chipset that takes advantage of PCIe 2.0 and DDR3, possibly one that leverages Intel's "Skulltrail" platform as well. We'll just have to wait and see what they release.

Yeah, I understand it will eventually come out. It just seems that, with ATI unable to top the GTX, Nvidia will be in no rush to release it.

I actually think that ATI makes much better gfx cards than Nvidia. I have never had a problem with their cards, and will be waiting for the 3800.

It's possible that ATI makes better cards, but if you don't have the drivers to back them up, you lose performance. And that's exactly what has been happening to ATI.

WICKO said,
It's possible that ATI makes better cards, but if you don't have the drivers to back them up, you lose performance. And that's exactly what has been happening to ATI.

Nvidia has been bribing most of the top PC developers to favor their cards. Nvidia and ATI cards have slightly different strengths, so favoring Nvidia's shader optimizations over ATI's will give you significantly decreased performance on ATI's cards. ATI cards also have better precision and image quality (IQ), which results in decreased performance.

Nvidia might be playing a little dirty, but there is also nothing stopping ATI from coming out with different designs to address these problems -- or from bribing more developers, like Nvidia is doing. Even giving developers better tech support and QA services for their code would help ATI win some over. Nvidia works very closely with devs; ATI, not as much. I guess that's where all those Nvidia game optimization fixes come from?
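One concrete illustration of those "different strengths" (a generic example, not something confirmed for any specific title): HLSL's half type requests partial precision, which GeForce FX/6/7-class hardware executes faster, while Radeons of the same era run every pixel shader at one fixed internal precision, so half gains nothing there. A sketch of the kind of tuning choice that can end up favouring one vendor, with the shader body made up for illustration:

```c
/* Illustrative HLSL pixel shader, embedded as a C string the way it would
 * be handed to a runtime shader compiler. 'half' requests partial
 * precision: faster on GeForce FX/6/7-class hardware, a no-op on Radeons,
 * which run pixel shaders at one fixed internal precision. */
static const char *pixel_shader_hlsl =
    "float4 main(float2 uv : TEXCOORD0) : COLOR\n"
    "{\n"
    "    half3  tint = half3(1.0, 0.5, 0.25);  // partial precision\n"
    "    float3 base = float3(uv, 0.0);        // full precision\n"
    "    return float4(base * tint, 1.0);\n"
    "}\n";
```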

toadeater said,

Nvidia has been bribing most of the top PC developers to favor their cards. Nvidia and ATI cards have slightly different strengths, so favoring Nvidia's shader optimizations over ATI's will give you significantly decreased performance on ATI's cards. ATI cards also have better precision and image quality (IQ), which results in decreased performance.

Nvidia might be playing a little dirty, but there is also nothing stopping ATI from coming out with different designs to address these problems -- or from bribing more developers, like Nvidia is doing. Even giving developers better tech support and QA services for their code would help ATI win some over. Nvidia works very closely with devs; ATI, not as much. I guess that's where all those Nvidia game optimization fixes come from?

Hmm, that's probably true now that I think about it. But you have to wonder how far that takes them. I'm not sure how shaders work with DirectX 9/10; I was under the impression that you had to use separate languages for the two card makers. But then I'm pretty sure OpenGL has an open standard for that kind of thing, so I wonder if DirectX also has something like that. It would sure be a pain for developers having to write twice as much shader code and optimize it for each set.

I bought the new OpenGL SuperBible but I've yet to crack it open and really read up on it. But I still believe drivers have a big part as well. I mean, we've seen new beta drivers from nVidia every time a triple-A game comes out, but what about ATI? I haven't heard anything like that. While this definitely ties into how closely they work with game devs, like you said, drivers definitely account for a sizeable chunk.
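For what it's worth, on the shader-language question above: neither API requires separate languages per vendor. Direct3D uses HLSL and OpenGL uses GLSL, both vendor-neutral; it's each vendor's driver that compiles the shader down to its own hardware instructions, which is also why a driver update alone can change a game's performance. A minimal OpenGL 2.0 sketch in C (assuming a context and an extension loader such as GLEW are already set up); the shader itself is a trivial placeholder:

```c
#include <stdio.h>
#include <GL/glew.h>  /* assumes a GL 2.0 context + GLEW are initialised */

/* One vendor-neutral GLSL source; nVidia's and ATI's drivers each compile
 * it to their own hardware instructions at runtime. */
static const char *frag_src =
    "void main() {\n"
    "    gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0);\n"
    "}\n";

GLuint compile_fragment_shader(void)
{
    GLuint shader = glCreateShader(GL_FRAGMENT_SHADER);
    glShaderSource(shader, 1, &frag_src, NULL);
    glCompileShader(shader);              /* the driver does the real work */

    GLint ok = GL_FALSE;
    glGetShaderiv(shader, GL_COMPILE_STATUS, &ok);
    if (ok != GL_TRUE) {
        char log[1024];
        glGetShaderInfoLog(shader, sizeof log, NULL, log);
        fprintf(stderr, "driver rejected shader:\n%s\n", log);
    }
    return shader;
}
```

Direct3D works the same way, with HLSL compiled to bytecode that the driver then translates, so any per-vendor tuning happens inside the driver rather than in the developer's source.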

WICKO said,

Hmm, that's probably true now that I think about it. But you have to wonder how far that takes them. I'm not sure how shaders work with DirectX 9/10; I was under the impression that you had to use separate languages for the two card makers. But then I'm pretty sure OpenGL has an open standard for that kind of thing, so I wonder if DirectX also has something like that. It would sure be a pain for developers having to write twice as much shader code and optimize it for each set.

You didn't believe that developers make money through sales, didja? :P No one buys software anymore; they pirate it. What else would be keeping those software development companies afloat?

I don't think DirectX is biased towards either nVidia or ATI. I think it's done by programming the shader transformations such that they pass through nVidia's architecture more efficiently than ATI's. I could be wrong though.

Huh... so is ATI/AMD planning on skipping the high-end models like they are with the Phenom processors? Because the 3800 doesn't really compare to the 8800GT or even the old GTS series, and the 3800 to me would have been the high end. But the 3800 seems more like high midrange.

I guess Intel and nVidia hit them pretty hard then...

The Radeon 3870 is a midrange card; its price point is $250 max, probably as low as $200. It's cheaper than the GeForce 8800GT. So while the 8800GT wins by 10-15%, the 3870 is 10-15% cheaper. The price difference makes up for the performance difference.

Also, the first benchmarks for the 3870 and 3850 that I saw used the current Catalyst 7.11 drivers, which are older than the GeForce 169.01/04 drivers nVidia released with its new card. So I expect that with the next Catalyst update the 3870 could inch up to be equal to the 8800GT but still be around $50 cheaper. If this holds true, AMD has a winner. Also, in late December or the start of January they'll have the 3870 X2 out on sale.
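GP007's trade-off is easy to make concrete as performance per dollar. A quick sketch in C; the prices and frame rates are made-up figures chosen only to match the rough percentages in the post:

```c
#include <stdio.h>

int main(void)
{
    /* Hypothetical figures matching the post's rough percentages:
     * 8800GT ~12% faster, 3870 ~$50 cheaper. */
    double gt_price = 250.0, gt_fps = 56.0;  /* GeForce 8800GT */
    double hd_price = 200.0, hd_fps = 50.0;  /* Radeon HD 3870 */

    printf("8800GT: %.3f fps per dollar\n", gt_fps / gt_price); /* 0.224 */
    printf("HD3870: %.3f fps per dollar\n", hd_fps / hd_price); /* 0.250 */
    return 0;
}
```

On those assumed numbers the 3870 actually delivers more frames per dollar despite losing the raw benchmark, which is the sense in which the price difference makes up for the performance difference.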

GP007 said,
The Radeon 3870 is a midrange card; its price point is $250 max, probably as low as $200. It's cheaper than the GeForce 8800GT. So while the 8800GT wins by 10-15%, the 3870 is 10-15% cheaper. The price difference makes up for the performance difference.

Also, the first benchmarks for the 3870 and 3850 that I saw used the current Catalyst 7.11 drivers, which are older than the GeForce 169.01/04 drivers nVidia released with its new card. So I expect that with the next Catalyst update the 3870 could inch up to be equal to the 8800GT but still be around $50 cheaper. If this holds true, AMD has a winner. Also, in late December or the start of January they'll have the 3870 X2 out on sale.

Well, let's hope so. I definitely think they made the right move; like the first guy that replied to me said, focusing on the midrange/low end is the best idea. I just find the naming awkward for a brand-new generation.

I hadn't heard of this X2 stuff yet. I'm assuming that's like the 7950 GX2 that nVidia had?

I think they need to focus on making the 3870 single-slot though, especially if they want people to start buying four of them, haha. Otherwise there's no room left for expansion cards!