PS2 vs xbox argument. Having done some research, I thought some of you might
like to read this, as apparently it is all true (however, it came from 'the
internet' so it may be blatantly wrong... <g>). Let's start a nice, peaceful
thread, shall we? [Like that's gonna ever happen :-) ].
People keep telling me that the PS2 has aliasing problems. On my tv - a
Panasonic pure flat 28" - there was a fuzziness around bright colours, but
that wasn't aliasing, it was the composite video connection (when I stick my
vcr into my tv card I get the same problem). When I connected them up with a
'real' cable, there were no problems at all. The aliasing is a lot worse on
a regular tv program than in a game. It is only noticeable on recent games if
you look really hard. Some release games (such as SSX) do exhibit quite a
lot of aliasing, but read on...
And by the way - to dispel the myth - the PS2 does have an antialiasing unit
on board, but just like the vector units it is really hard to program
efficiently. Because Sony just gave developers a list of assembler op-codes
and no C libraries whatsoever, everybody has had to start from scratch.
That is why PS2 games are getting better and better all
the time. Big software houses, like Konami, Square, EA and so forth, have
developed their own kits which are vastly different to each other. I know EA
use one of the vector units to generate real-time Dolby Digital surround
sound in games like SSX Tricky and NHL '02 while the other unit is running
the entire game all by itself. This shows just how much potential there is
in there.
When the games can fully utilise all of the PS2 hardware (I estimate about
another year or so), then we can truly compare the performance (and
graphical capability) of the consoles. The xbox was at its peak from day
one, because everybody is already used to getting the most out of PCs, and
that is what it is (more or less). I have no idea about the GC dev
setup, as I have no intention of writing a game on it (the Nintendo QC dept
is *way* too strict for my liking - they don't like niche games at all).
I was looking up how the PS2 and xbox compare under benchmarking (for the
CPU, graphics chips, etc.) to see which one really is better. We all already
know how MHz is a completely pointless way of comparing RISC and CISC chips
(CISC chips don't usually execute 1 instruction per clock cycle, and they
have really weird microcoded operations and other random stuff going on in
there), and I like to have facts rather than sales hype. I found
some interesting facts:
1) The PS2 has a 2,560-bit data bus in its parallel rendering engine. All I
can say is '!!!' (actually, I doubt I can say '!!!', but you see the point).
The xbox GPU has a 128-bit bus by way of comparison.
2) The 4.0 GPixels/s quoted for the xbox GPU is in antialiased mode (the
xbox is hardwired for 4xFSAA). That means each pixel gets sampled 4 times,
so in actual fact only 1.0 GPixels/s of distinct pixels are rendered - and
distinct pixels are what count (there's a quick sketch of this arithmetic
after point 4).
3) The differences between a GeForce 3 and the xbox GPU: the xbox GPU
shares 64MB with the rest of the system, and that memory is 100MHz slower
than the dedicated 64MB on the GF3. The xbox GPU has 2 vertex shaders,
rather than the GF3's one vertex shader. However, Microsoft claims that
this second vertex shader instantly bumps the xbox's theoretical max poly
count from the 31 million pps that Nvidia lists for the GeForce 3 all the
way up to 125 million pps. Uh? I can't see one extra vertex shader suddenly
quadrupling the pps count, especially when combined with less memory *and*
slower memory. This kinda proves point 2. So the xbox tops out at 31
million pps (nothing will ever hit the maximum anyway; this is just the
absolute peak throughput). The PS2 can do 75 million pps (again, no game
will ever hit the maximum; see the per-frame sketch after this list). That
may seem like a huge difference, but these are 32-pixel untextured
polygons, rather than anything you would see in a game, unless you are
playing Virtua Fighter (the original, flat-shaded one) in 320x240 on your
console :-).
4) This is something I thought up, which may be completely useless, but it
seems to have a shred of potential in it... With regards to straight
comparisons, remember that MHz alone is useless. Think of this: the PSone
ran at 33MHz. Tomb Raider was identical on the PSone and the PC. The PC
version needed a P133 and a 3DFX graphics chip to run properly, or
something like a 166/200MHz Pentium without one. You could look up the
recommended specs for the game if you want to check. That is a scale factor
of about 4 with the 3DFX. The PS2 processor is an evolution of the PSone
processor, just as the Celeron in the xbox is an evolution of the Pentium
(and don't let Microsoft tell you it's a P3, because it isn't). Using the
same scale factor, the 295MHz PS2 chip would equal roughly a 1.2GHz Intel
chip (sketched out below). Obviously that is a very rough example; you
could try this with a whole bunch of games to see the differences. You have
to run the game at the same resolution and the same FPS to get an accurate
reading. As another example, Quake 3: Revolution, Medal of Honor: Frontline
and James Bond 007 in Agent Under Fire all use the PS2 Q3 engine. Quake 3:
Arena, Medal of Honor: Allied Assault and Jedi Outcast all use the PC Q3
engine. Running the games on my Celeron 500 with an almost identical
graphics card to the xbox GPU at 640x480 got me about 15fps. I now have a
1.8GHz P4 so I can't compare, but I don't think a 733 would run it at
60fps. More like 1GHz. That is a quick and dirty (and somewhat inaccurate)
method of comparing the two processors in terms of real-world performance
(i.e. the only thing that really matters).
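
To put the point 2 arithmetic in code - this is just my own back-of-the-
envelope sketch using the figures quoted above, nothing official from
Microsoft or Nvidia:

# Point 2: if a GPU quotes its fill rate with 4xFSAA baked in, the number
# of *distinct* pixels per second is the quoted figure divided by the
# number of samples taken per pixel.

def distinct_fill_rate(quoted_gpixels_per_sec, samples_per_pixel):
    """Distinct pixels/sec when every pixel is sampled N times."""
    return quoted_gpixels_per_sec / samples_per_pixel

# xbox GPU as described above: 4.0 GPixels/s quoted, hardwired 4xFSAA.
print(distinct_fill_rate(4.0, 4))   # -> 1.0 GPixels/s of unique pixels
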
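Same idea for the point 3 peak polygon figures, turned into a per-frame
budget at 60fps (again just my own arithmetic on the quoted numbers; these
are theoretical ceilings for tiny untextured polys, not game performance):

def polys_per_frame(peak_pps, fps=60):
    """Theoretical polygon budget per frame at a given frame rate."""
    return peak_pps / fps

for name, peak_pps in [("xbox (Nvidia's 31M pps GF3 figure)", 31_000_000),
                       ("xbox (Microsoft's 125M pps claim)", 125_000_000),
                       ("PS2 (75M pps)", 75_000_000)]:
    print(f"{name}: about {polys_per_frame(peak_pps):,.0f} polys per frame")
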
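And the point 4 scale-factor idea spelled out. The P133 equivalence is my
own eyeball estimate from Tomb Raider's recommended specs, so everything
downstream inherits that assumption - treat the output as hand-waving, not
a benchmark:

# Point 4: scale a console clock to an 'Intel-equivalent' clock using one
# game that ran identically on both platforms.

PSONE_MHZ = 33
PC_EQUIVALENT_MHZ = 133     # assumed: P133 + 3DFX ran Tomb Raider comparably

scale_factor = PC_EQUIVALENT_MHZ / PSONE_MHZ     # roughly 4

PS2_MHZ = 295
print(f"scale factor: {scale_factor:.2f}")
print(f"PS2 'Intel-equivalent' clock: {PS2_MHZ * scale_factor:.0f} MHz")
# prints about 1189 MHz, i.e. roughly 1.2GHz
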
It's interesting stuff, the actual hardware details between the three
consoles. I understand these numbers and how they are calculated, but I
can't verify them, as it is all based on what the companies say (i.e. they
might, well, they do lie - it's just a matter of not believing anything you
read unless it comes from an independent source).
As always, correct me if I am wrong...
--
Nick
"The overriding purpose of software is
to be useful, rather than correct."
John Carmack, id Software