Okay, so I was having a little "discussion" with some jerk on YouTube claiming that the PS2 had more graphical power than the GameCube, especially when it comes to polygon/vertex throughput. I was telling him that while the PS2 had a higher peak count, that was only with no textures or lighting whatsoever; those are just flat-shaded polygons. After reading this (PlayStation 2 - Wikipedia, the free encyclopedia), I gathered that once textures and lighting are applied, the count drops to ~15 million or so, if even that. He also told me that the PS2's CPU was more powerful than the GameCube's PowerPC, which also isn't true. Is it safe to assume its CPU is almost as powerful as the Xbox's, or what? Now I'd really like to know whether the PS2 really is more powerful, graphics- and processor-wise, or not. I'd appreciate some insight on this, because this guy is being a real rabid fanboy about it.
PS2 GPU Specifications (pulled from Wikipedia)
Pixel pipelines: 16
Video output resolution: variable from 256x224 to 1280x1024 pixels
4 MB embedded DRAM video memory with 48 GB/s bandwidth (the main system's 32 MB can be dedicated as VRAM for off-screen materials)
Texture buffer bandwidth: 9.6 GB/s
Frame buffer bandwidth: 38.4 GB/s
DRAM Bus width: 2560-bit (composed of three independent buses: 1024-bit write, 1024-bit read, 512-bit read/write)
Pixel configuration: RGB:Alpha:Z-buffer (24:8 or 15:1 for RGB; 16-, 24-, or 32-bit Z-buffer)
Dedicated connection to: Main CPU and VU1
Overall pixel fillrate: 16 pipelines Ă— 147 MHz = 2.352 Gpixels/s (rounded to 2.4 Gpixels/s)
Pixel fillrate with no texture, flat shaded: 2.4 Gpixels/s (75,000,000 32-pixel raster triangles/s)
Pixel fillrate with 1 full texture (diffuse map), Gouraud shaded: 1.2 Gpixels/s (37,500,000 32-pixel raster triangles/s)
Pixel fillrate with 2 full textures (diffuse map + specular, alpha, or other), Gouraud shaded: 0.6 Gpixels/s (18,750,000 32-pixel raster triangles/s; see the quick check after this list)
GS effects: AAx2 (poly sorting required),[45] bilinear, trilinear, multi-pass, palettizing (4-bit = 6:1 ratio, 8-bit = 4:1)
Multi-pass rendering ability
Four passes = 300 Mpixels/s (300 Mpixels/s divided by 32 pixels = 9,375,000 triangles/s lost every four passes)[46]
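To sanity-check the fillrate-to-triangle math in that list, here's a quick back-of-the-envelope script. The 147 MHz GS clock, 16 pipelines, the 32-pixel "average" triangle, and the "each extra texture layer halves fillrate" behavior are all taken from the figures above, not from official Sony documentation, so treat it as a rough sketch:

```python
# Back-of-the-envelope check of the PS2 fillrate arithmetic above.
# Assumptions (inferred from the list, not official Sony specs):
# 16 pixel pipelines at 147 MHz, a 32-pixel "average" raster triangle,
# and each extra texture layer roughly halving effective fillrate.

GS_CLOCK_HZ = 147_000_000
PIPELINES = 16
PIXELS_PER_TRIANGLE = 32

fillrate = GS_CLOCK_HZ * PIPELINES  # peak pixels per second
print(f"Peak fillrate: {fillrate / 1e9:.3f} Gpixels/s")  # -> 2.352

# Triangle throughput at each texturing level, starting from the
# rounded 2.4 Gpixels/s peak the Wikipedia figures use:
for label, gpixels in [("flat shaded, no texture", 2.4e9),
                       ("1 texture, Gouraud", 1.2e9),
                       ("2 textures, Gouraud", 0.6e9)]:
    print(f"{label}: {gpixels / PIXELS_PER_TRIANGLE:,.0f} triangles/s")
```

Running it reproduces the 75 million, 37.5 million, and 18.75 million triangles/s figures exactly, so the big headline number really does assume untextured, unlit triangles.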
And the GameCube's:
162 MHz "Flipper" LSI (co-developed by Nintendo and ArtX, acquired by ATI)
180 nm NEC eDRAM-compatible process
8 GFLOPS
4 pixel pipelines with 1 texture unit each
TEV "Texture EnVironment" engine (similar to Nvidia's GeForce-class "register combiners")
Fixed-function hardware transform and lighting (T&L), 20+ million polygons in-game
648 megapixels/second (162 MHz Ă— 4 pipelines), 648 megatexels/second (648 Mpixels Ă— 1 texture unit) (peak)
Peak triangle performance: 20,250,000 32-pixel triangles/s, raw and with 1 texture and lit (see the sketch after this list)
337,500 triangles a frame at 60 FPS
675,000 triangles a frame at 30 FPS
8 texture layers per pass, texture compression, full scene anti-aliasing[15]
8 simultaneous hardware light sources
Bilinear, trilinear, and anisotropic texture filtering
Multi-texturing, bump mapping, reflection mapping, 24-bit z-buffer
24-bit RGB/32-bit RGBA color depth
Hardware limitations sometimes require a 6r+6g+6b+6a mode (18-bit color), resulting in color banding.
720 Ă— 480 interlaced (480i) or progressive scan (480p) at 60 Hz, 720 Ă— 576 interlaced (576i) at 50 Hz
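Same quick check for Flipper, using the same assumed 32-pixel average triangle as the PS2 math (that per-triangle size is carried over from the list, not something Nintendo published):

```python
# Back-of-the-envelope check of the GameCube figures above.
# Assumptions: 162 MHz Flipper clock, 4 pixel pipelines with 1 texture
# unit each, and the same 32-pixel average triangle as the PS2 math.

FLIPPER_CLOCK_HZ = 162_000_000
PIPELINES = 4
PIXELS_PER_TRIANGLE = 32

fillrate = FLIPPER_CLOCK_HZ * PIPELINES        # -> 648,000,000 pixels/s
tris_per_sec = fillrate / PIXELS_PER_TRIANGLE  # -> 20,250,000 triangles/s

print(f"Peak fillrate: {fillrate / 1e6:.0f} Mpixels/s")
print(f"Peak triangles: {tris_per_sec:,.0f}/s")
for fps in (60, 30):
    print(f"{tris_per_sec / fps:,.0f} triangles per frame at {fps} FPS")
```

That reproduces the 648 Mpixels/s, 20,250,000 triangles/s, and the 337,500/675,000 per-frame numbers. If both sets of Wikipedia figures are right, the notable part is that Flipper's 20.25M/s already includes a texture and lighting, while the GS drops to ~37.5M/s with one texture and ~18.75M/s with two.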
How accurate these figures really are is another matter in itself. But again, I'd really like to know which console was actually more powerful.