Nvidia Owning ATI at next-gen



Tim Sweeney, founder of Epic Games, the maker of the Unreal and Unreal Tournament games, has said he believes that the G70, Nvidia's GeForce 7800 GTX, will actually end up better than ATI's upcoming R520.

Sweeney is also the key man behind the mysterious game we heard about at E3 and mentioned here.

We now know it's named Gears of War and that it's an Xbox 360 title, a game that will, ironically, run only on ATI's R500 chip.

In a video interview, a chap asked Tim, "I have to ask you: R520 or G70?"

Tim smiled and said, "Oh, G70 for sure." You can download his video confession here.

I have to advise you to treat this with caution, as Tim and Epic are in bed with Nvidia through its "TWIMTBP" (The Way It's Meant to Be Played) marketing program. Epic also happened to be showing its Unreal 3 engine at Nvidia's booth.

I am not suggesting he is not telling the truth, just that things can turn out differently; you never know. Life has taught us Bosnians and Scots to be sceptical.

I guess we will learn the truth in just a few weeks, as Nvidia plans to introduce its chip soon.

Source

http://www.theinquirer.net/?article=23471


lol @ the G70 Nvidia GeForce 7800 GTX name. It sounds like a rocket to space or something.


His opinion is automatically biased if he has business dealings with Nvidia. Neither company is going to see me pay what they ask for a "top of the line, demise already planned before its release" gfx card.


No, please explain, if you would be so kind? :))


ATI has been working on the Xbox 360 GPU for approximately two years, and it has been developed independently of any PC GPU. So despite what you may have heard elsewhere, the Xbox 360 GPU is not based on ATI's R5xx architecture.

Unlike any of their current-gen desktop GPUs, the 360 GPU supports FP32 from start to finish (as opposed to the current FP24 spec that ATI has implemented). Full FP32 support puts this aspect of the 360 GPU on par with NVIDIA's RSX.
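As a rough illustration of what full FP32 buys over FP24 (the bit layouts below are my assumption of the usual formats, not something the article spells out: FP24 with a 16-bit mantissa versus IEEE-754 single precision with a 23-bit mantissa), here's a quick sketch of the relative precision of each:

```python
# Back-of-the-envelope comparison of FP24 vs FP32 shader precision.
# Assumed layouts (not from the article): FP24 = 1 sign + 7 exponent + 16 mantissa bits,
# FP32 (IEEE 754 single) = 1 sign + 8 exponent + 23 mantissa bits.

def relative_precision(mantissa_bits: int) -> float:
    """Smallest relative step a normalized float with this many mantissa bits can resolve."""
    return 2.0 ** -mantissa_bits

fp24_eps = relative_precision(16)   # ~1.5e-5
fp32_eps = relative_precision(23)   # ~1.2e-7

print(f"FP24 relative precision: {fp24_eps:.1e}")
print(f"FP32 relative precision: {fp32_eps:.1e}")
print(f"FP32 resolves ~{fp24_eps / fp32_eps:.0f}x finer steps than FP24")
```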

ATI was very light on details of their pipeline implementation on the 360's GPU, but we were able to get some more clarification on some items. Each of the 48 shader pipelines is able to process two shader operations per cycle (one scalar and one vector), offering a total of 96 shader ops per cycle across the entire array. Remember that because the GPU implements a Unified Shader Architecture, each of these pipelines features execution units that can operate on either pixel or vertex shader instructions.
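The 96-ops-per-cycle figure is easy to sanity-check. A minimal sketch (the clock speed below is purely an assumption to show scale; the article doesn't give one):

```python
# Sanity check of the quoted shader throughput: 48 unified pipelines,
# each issuing one scalar and one vector operation per cycle.
pipelines = 48
ops_per_pipeline_per_cycle = 2              # 1 scalar + 1 vector, per the article

ops_per_cycle = pipelines * ops_per_pipeline_per_cycle
print(ops_per_cycle)                        # 96, matching the article

assumed_clock_hz = 500e6                    # assumed GPU clock, NOT stated in the article
print(f"~{ops_per_cycle * assumed_clock_hz / 1e9:.0f} billion shader ops/sec "
      f"at an assumed {assumed_clock_hz / 1e6:.0f} MHz")
```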

Both consoles are built on a 90nm process, and thus ATI's GPU is also built on a 90nm process at TSMC. ATI isn't talking transistor counts just yet, but given that the chip has a full 10MB of DRAM on it, we'd expect the chip to be fairly large.

One thing that ATI did shed some light on is that the Xbox 360 GPU is actually a multi-die design, referring to it as a parent-daughter die relationship. Because the GPU's die is so big, ATI had to split it into two separate die on the same package - connected by a "very wide" bus operating at 2GHz.

The daughter die is where the 10MB of embedded DRAM resides, but there is also a great deal of logic on the daughter die alongside the memory. The daughter die features 192 floating point units that are responsible for a lot of the work in sampling for AA among other things.

Remember the 256GB/s bandwidth figure from earlier? It turns out that that's not how much bandwidth is between the parent and daughter die, but rather the bandwidth available to this array of 192 floating point units on the daughter die itself. Clever use of words, no?
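To see why the wording matters, it helps to put 256GB/s on a per-cycle footing (the daughter-die clock below is an assumption, not a figure from the article):

```python
# What 256 GB/s of on-die bandwidth works out to per cycle and per FPU.
bandwidth_bytes_per_sec = 256e9
fpu_count = 192                      # from the article
assumed_clock_hz = 500e6             # assumed daughter-die clock, NOT from the article

bytes_per_cycle = bandwidth_bytes_per_sec / assumed_clock_hz
print(f"~{bytes_per_cycle:.0f} bytes/cycle shared across the daughter die")  # ~512
print(f"~{bytes_per_cycle / fpu_count:.1f} bytes/cycle per FPU")             # ~2.7
# In other words, the headline figure describes eDRAM <-> FPU traffic inside
# the daughter die, not the link back to the parent die.
```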

Because of the extremely large amount of bandwidth available both between the parent and daughter die as well as between the embedded DRAM and its FPUs, multi-sample AA is essentially free at 720p and 1080p in the Xbox 360. If you're wondering why Microsoft is insisting that all games will have AA enabled, this is why.

ATI did clarify that although Microsoft isn't targeting 1080p (1920 x 1080) as a resolution for games, their GPU would be able to handle the resolution with 4X AA enabled at no performance penalty.

ATI has also implemented a number of intelligent algorithms on the daughter die to handle situations where you need more memory than the 10MB of DRAM on-die. The daughter die has the ability to split the frame into two sections if the frame itself can't fit into the embedded memory. A z-pass is done to determine the location of all of the pixels of the screen and the daughter die then fetches only what is going to be a part of the scene that is being drawn at that particular time.
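A back-of-the-envelope framebuffer calculation shows why that tiling step is needed at all (the per-sample byte counts are typical values I'm assuming, not figures from the article):

```python
# Rough check: does a 720p framebuffer with 4x MSAA fit in 10MB of eDRAM?
# Assumed storage per sample (not from the article): 4 bytes color + 4 bytes depth/stencil.
width, height = 1280, 720
bytes_per_sample = 4 + 4
msaa_samples = 4

framebuffer_mb = width * height * bytes_per_sample * msaa_samples / (1024 ** 2)
print(f"720p with 4x MSAA: ~{framebuffer_mb:.1f} MB")   # ~28 MB, well over 10 MB

# Hence the z-pass plus frame splitting: each section's color and depth
# samples have to fit inside the 10MB of embedded DRAM at a time.
```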

On the physical side, unlike ATI's Flipper GPU in the Gamecube, the 360 GPU does not use 1T-SRAM for its on-die memory. The memory on-die is actually DRAM. By using regular DRAM on-die, latencies are higher than SRAM or 1T-SRAM but costs should be kept to a minimum thanks to a smaller die than either of the aforementioned technologies.

Remember that in addition to functioning as a GPU, ATI's chip must also function as a memory controller for the 3-core PPC CPU in the Xbox 360. The memory controller services both the GPU and the CPU's needs, and as we mentioned before the controller is 256-bits wide and interfaces to 512MB of unified GDDR3 memory running at 700MHz. The memory controller resides on the parent die.
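Those numbers also let you estimate the peak bandwidth of the external memory bus, although the article doesn't say whether 700MHz is the effective transfer rate or the base clock doubled by GDDR3's DDR signalling, so both readings are shown here:

```python
# Peak bandwidth of the unified memory interface described above:
# a 256-bit controller to GDDR3 "running at 700MHz".
bus_width_bytes = 256 / 8            # 32 bytes per transfer
memory_clock_hz = 700e6

for label, transfers_per_clock in [("700 MT/s effective", 1),
                                   ("DDR, 1400 MT/s effective", 2)]:
    bandwidth_gb_s = bus_width_bytes * memory_clock_hz * transfers_per_clock / 1e9
    print(f"{label}: ~{bandwidth_gb_s:.1f} GB/s")        # ~22.4 GB/s or ~44.8 GB/s
```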

http://anandtech.com/tradeshows/showdoc.aspx?i=2423&p=2


I have to advise you to treat this with caution, as Tim and Epic are in bed with Nvidia through its "TWIMTBP" (The Way It's Meant to Be Played) marketing program. Epic also happened to be showing its Unreal 3 engine at Nvidia's booth.


Jeez... when the article itself warns you to take the statement with truckloads of salt, you don't get it? :pinch:

The irony is that even though Nvidia contributes a significant amount of resources to Epic, its games run faster on ATI cards. :D


I hope it's true. I used to think ATI was great when they outmatched Nvidia in half the space. Now they're spitting out these stupid double-decker power guzzlers too. If that's the way it works, I might as well just support an American company.


The Unreal games advertise nVidia when you start them up, so if you think this is news or take it as fact, you're mistaken.


:yes:


people take this ati vs nvidia thing way too seriously...buy whatever card you want. you won't go to heaven for defending their l33t hardware


The Unreal games advertise nVidia when you start them up, so if you think this is news or take it as fact, you're mistaken.


Correct. However, if you read the TweakTown UT2004 tweak guide, there's a link to replace the nVidia logo with an ATI 'Get in the Game' logo (it doesn't look snappy, but it pleases the nVidia haters).

I hope it's true. I used to think ATI was great when they outmatched Nvidia in half the space. Now they're spitting out these stupid double-decker power guzzlers too. If that's the way it works, I might as well just support an American company.


And I'll do the opposite. :yes:


Correct. However, if you read the TweakTown UT2004 tweak guide, there's a link to replace the nVidia logo with an ATI 'Get in the Game' logo (it doesn't look snappy, but it pleases the nVidia haters).


:rofl: :rofl: :rofl: That is hilarious!!! People are a little crazy about some things.

