
The 64-bit/128-bit label attached to consoles is not the same bit rating that gets attached to processors. The Nintendo 64 (claimed to have 64-bit power) had a 32-bit (G3/Motorola-class) processor.

What I'm not sure about is exactly what these console ratings refer to. From the posts above it sounds like there are two choices: video bandwidth or bus bandwidth. Video bandwidth would seem to make sense, because current high-end video cards are 128-bit (256-bit?), they're newer than those 64-bit machines, and they're similar in timeline to the current 128-bit consoles (Xbox/PS2/etc.).

By my reasoning, I'm going to assume consoles quote bit ratings that describe their graphical abilities, while computing systems refer to their processing abilities.

That make sense?

I think the bit rating on consoles has to do with the graphics. In many game consoles the games are already pre-loaded, so to speak, on the media (CD, cartridge, DVD), which is why they load faster and why consoles have little memory. This pre-loaded state also reduces the amount of information that has to be processed in the CPU, as most of it is already processed, everything except the graphics and possibly variable data. Therefore console CPUs don't need many instructions and can be fed 32-bit streams.

32-bit and 64-bit processors, on the other hand, are extremely powerful and process nearly all of the operations a PC requires, since much of the time a PC has many variables and things that change or can differ. These processors need a lot of instructions to achieve that flexibility (the best-known instruction-set extension is probably Intel's MMX, which is simply a set of instructions that speed up multimedia operations).

32-bit processors can in theory have around 4 billion (2^32) distinct instruction encodings that change the input data in some way; adding two input numbers together may be one such instruction. (16 million is actually 2^24.)

64-bit processors are much the same but can handle a lot more encodings: 2^64 works out to 18,446,744,073,709,551,616, about 18 quintillion.

In short, processors in PCs are more powerful than processors in consoles regardless of bit rating.

I'd also like to point out that most console processors are RISC-based, which also makes a huge difference.

*I may be wrong on some of this, please feel free to correct*

XBOX has a 733MHz Pentium III CPU (32-bit x86, CISC).

Playstation 2 has a 266MHz 128-bit RISC CPU.

N64 has a 64-bit graphics processor.

The bits for a processor are the size of its general-purpose registers.

The Pentium II is a 32-bit CPU but has a 64-bit data bus.

  Radium said:
XBOX has a 733MHz Pentium III CPU (32-bit x86, CISC).

Playstation 2 has a 266MHz 128-bit RISC CPU.

N64 has a 64-bit graphics processor.

The bits for a processor are the size of its general-purpose registers.

The Pentium II is a 32-bit CPU but has a 64-bit data bus.

Sounds like an educated answer to me. It makes sense in my head.

As far as I know the N64 used a Motorola/IBM chip similar to the chip Apple used in its G3-series computers. I think the speed was something like 66MHz.

The Xbox was the first console to have a fast processor. Any intelligent person knows raw clock speed isn't what's important, and it seems gaming companies have understood this. The general public likes bigger numbers, so PC processor companies have opted to raise the MHz. Apple has been a long-time supporter of the higher-bandwidth processor, and AMD has followed suit. Intel still sucks.
