Player-X said:
Next week I may be upgrading to a 9700 non-Pro.
How much faster is a 9700 compared to a 9600 Pro?
The Radeon 9700 is quite a bit faster (roughly 20-30%). Its FSAA and AF are also faster than the 9600 Pro's, thanks to its 8 pixel pipelines and 256-bit memory bus (versus 4 pipes and a 128-bit bus on the 9600 Pro). The difference won't be staggering, though; you can probably play games with one more level of detail than you do now.
Cerberus said:
I have a GeForce4 Ti4600 & I play DX9 games just fine. I play the new Tron 2.0 game at 800x600 with FSAA (Quincunx) & every single graphics option turned on & I never drop below 40fps.
As chp said, you're playing in DX8 mode. Most DX9 games include a fallback path for older hardware: at startup the game checks what shader version your card reports and picks a rendering path accordingly.
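If you're curious how that works, here's a minimal sketch, assuming a plain Direct3D 9 app (the function name ChooseRenderPath is just made up for illustration):

    #include <d3d9.h>

    enum RenderPath { PATH_DX8, PATH_DX9 };

    // Ask the driver what the card can do and pick a code path.
    RenderPath ChooseRenderPath(IDirect3D9* d3d)
    {
        D3DCAPS9 caps;
        d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps);

        // A GeForce4 Ti reports pixel shader 1.3, so it lands in the DX8 path;
        // a Radeon 9600/9700 reports 2.0 and gets the full DX9 effects.
        if (caps.PixelShaderVersion >= D3DPS_VERSION(2, 0))
            return PATH_DX9;
        return PATH_DX8;
    }

Presumably Tron 2.0 is doing something along these lines, which is why it runs fine on a Ti4600 -- it just isn't showing you the DX9 effects.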
Cerberus said:
Not to mention that nVidia's dominance in OpenGL is undisputed. OpenGL's a better API than DirectX anyway (although MS is catching up with recent versions of DX). I use it on anything that will support it. You will always get better rendering quality out of OpenGL, especially when it comes to lighting.
I hope you're willing to back up your claims. Personally I think DirectX is equal to, if not better than, OpenGL. DirectX games like NOLF2 and UT 2003 run beautifully and with awesome graphics, and there's also the upcoming Half-Life 2, which is written in pure DX9 code. The best examples of OpenGL I can think of are Quake 3 and its derivatives (RTCW, MoH, etc.), and they're not really that different from other games in terms of image quality or rendering capabilities.
Oh, and Nvidia's performance on the vendor-neutral ARB fragment program path (the "ARB2" path, not really OpenGL 2.0) sucks. Their floating-point pixel shader units don't run any faster in OpenGL than they do in Direct3D. ATI's pixel shader units, on the other hand, are far more powerful, and the GeForce FX is just slow by comparison.
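For reference, here's a rough sketch, assuming a plain OpenGL app, of how an engine decides whether that path is even available (HasExtension is a made-up helper, and the substring check is deliberately naive):

    #include <GL/gl.h>
    #include <string.h>

    // Naive substring check against the driver's extension string
    // (fine for a sketch; a real engine tokenizes the string properly).
    // Requires a current GL context.
    bool HasExtension(const char* name)
    {
        const char* exts = (const char*)glGetString(GL_EXTENSIONS);
        return exts != NULL && strstr(exts, name) != NULL;
    }

    // Both a Radeon 9500+ and a GeForce FX will report these extensions,
    // so the path is "supported" either way -- how fast the floating-point
    // fragment programs actually run is entirely up to the hardware.
    bool CanUseARB2Path()
    {
        return HasExtension("GL_ARB_vertex_program") &&
               HasExtension("GL_ARB_fragment_program");
    }

That's the whole point: the FX passes the capability check just fine, it's the execution speed where it falls down.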
Cerberus said:
One thing I will put to rest is that graphics benchmark results can't really be trusted when debating video cards. Both manufacturers "tweak" their drivers specifically to get good scores from those programs. The only real evidence of speed comes from the FPS recorded during an actual video game. It's important, too, to remember that speed isn't everything. I've never thought that the rendering on ATI cards was as pretty as on nVidia cards, but maybe I'm just weird.
Well, ATI hardly needs to tweak their drivers for benchmarks. Their cards have been shown to perform consistently whether you're running a synthetic benchmark or an actual game, so I think it's safe to say they don't have much tweaking going on, which is more than I can say for Nvidia.
I've owned two Nvidia cards before (a TNT2 and a GeForce 2 Pro), and at stock settings there isn't much image quality difference. With FSAA and AF, however, ATI definitely has the edge. I've used 16x AF since my Radeon 9000 Pro and I can't stand blurry textures anymore. FSAA I've only recently started using seriously (all my previous cards could only do slow supersampling), but it's pretty awesome as well. You won't notice too much difference between Nvidia's and ATI's FSAA in actual gameplay, but various sites have shown that ATI's rotated-grid multisampling produces a marginally better image than Nvidia's algorithm.
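If you've never tried AF, it's basically two GL calls to turn on. Sketch below assumes the EXT_texture_filter_anisotropic extension is present (you'd check for it first, like in the earlier sketch) and that the texture is already loaded; most games just expose this as a slider:

    #include <GL/gl.h>
    #include <GL/glext.h>   // for the anisotropy enums

    // Ask the driver for its maximum supported anisotropy (16.0 on a
    // Radeon 9x00) and apply it to one texture. Repeat per texture.
    void EnableMaxAnisotropy(GLuint texture)
    {
        GLfloat maxAniso = 1.0f;
        glGetFloatv(GL_MAX_TEXTURE_MAX_ANISOTROPY_EXT, &maxAniso);

        glBindTexture(GL_TEXTURE_2D, texture);
        glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAX_ANISOTROPY_EXT, maxAniso);
    }

Once you've seen distant textures stay sharp like that, going back to plain trilinear is painful.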