
Ati vs Nvidia

Sorry, I should have said this: notice that screen smoothing and 2xSaI are enabled. That's a big performance hit, and they improve quality a lot. I doubt your card can handle these settings; as I said, my Ti 4600 could not. Perhaps at 1024x768, and not in all games, of course.

Oh yeah, and to put your GF4 to shame, I must tell you I use 8xS FSAA, not just 8x FSAA.

I'm using these settings:

Plugin: Pete's OpenGL Driver 1.1.74
Author: Pete Bernert
Card vendor: NVIDIA Corporation
GFX card: GeForce FX 5900 Ultra/AGP/SSE2

Resolution/Color:
- 1280x1024 Fullscreen - NO desktop changing [32 Bit]
- Keep psx aspect ratio: off

Textures:
- R8G8B8A8
- Filtering: 5
- Hi-Res textures: 1
- VRam size: 256 MBytes

Framerate:
- FPS limitation: on
- Frame skipping: off
- FPS limit: 60

Compatibility:
- Offscreen drawing: 4
- Framebuffer texture: 3
- Framebuffer access: 4
- Alpha multipass: on
- Mask bit: on
- Advanced blending: on

Misc:
- Scanlines: off
- Line mode: off
- Unfiltered FB: off
- 15 bit FB: off
- Dithering: off
- Screen smoothing: on
- Screen cushion: off
- Game fixes: off [00000000]

And for N64 emulation:

I use the 1964 emulator and the glN64 plugin. It's OpenGL. I just love OpenGL; it looks the best!

With N64 emulation I can even use 8xS FSAA, which is a bigger performance hit than plain 8x FSAA but looks the best, plus 8x AF, and still get full speed at 1280x1024. I doubt it will do 1600x1200 at full speed, though.
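For what it's worth, here is the rough arithmetic behind that 1600x1200 doubt, as a hedged sketch: if the emulator is fill-rate bound, frame time scales roughly with the pixel count (and partly with the sample count for 8xS). The scaling assumption is mine, not a measurement.

```python
# Back-of-envelope pixel math for the resolution claim above.
# Assumption (not a measurement): a fill-rate-bound emulator slows down
# roughly in proportion to the number of pixels being shaded.

def pixels(width, height):
    return width * height

base = pixels(1280, 1024)   # resolution that runs at full speed
high = pixels(1600, 1200)   # resolution in doubt

print(f"1600x1200 draws {high / base:.2f}x the pixels of 1280x1024")
# 8xS also supersamples part of each pixel, so unlike plain 8x
# multisampling some of the shading cost scales with the sample count too.
```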
 
Just ask yourself why the benchmarks weren't done with the Det 50s, and why they don't want reviewers to use the Det 50s.

I mean, they won't give the same performance as the ATI Radeon 9800 Pro, but they sure as hell won't make it look as bad.
Please... just stop trying to bring that point up. The reason the Det 50s weren't used is simple: they wanted to stick to existing, publicly available drivers. What if ATI suddenly decided to say, "Oh, we want you to use these hot new 4.0 Cats that will smoke this benchy"? That would leave the public with no reference to draw from, since they'd have no way of knowing whether the drivers being used were free of cheats.

----------------------------

EDIT: oh wait....

saulin... just for future reference, what would you define as "full speed"?
 
Re: Re: Ati vs Nvidia

D.D. said:
Please... just stop trying to bring that point up. The reason the Det 50s weren't used is simple: they wanted to stick to existing, publicly available drivers. What if ATI suddenly decided to say, "Oh, we want you to use these hot new 4.0 Cats that will smoke this benchy"? That would leave the public with no reference to draw from, since they'd have no way of knowing whether the drivers being used were free of cheats.

----------------------------

EDIT: oh wait....

saulin... just for future reference, what would you define as "full speed"?
I agree. If I were a developer, I would only use publicly available drivers. If you ask me, Nvidia, despite what other people say, has something to fix. It's not Valve's fault. It would actually be very dumb if they intended to favor ATI cards, because then they would be alienating the majority of the market and consumers, who have Nvidia cards.
 
Just ask yourself why the benchmarks weren't done with the Det 50s, and why they don't want reviewers to use the Det 50s.

I mean, they won't give the same performance as the ATI Radeon 9800 Pro, but they sure as hell won't make it look as bad.
Did you read what the main differences between the Det 5.0s and the Det 4.0s will be for the FX cards? They mainly let the driver recompile shader code to skip useless temp fetches and reads that aren't necessary, making optimal use of a very thin Nvidia register file.


If you read my previous post in this thread about the NV architecture, covering the lack of dedicated floating point units and the shortage of pipelines for shader routines, you will understand why Half-Life 2 is killing the GeForce cards.

The GeForce cards are inferior when it comes to shaders. The Det 5.0s are going to improve and reorder shader instructions, which is fine and nice. But I don't see a card with 4 non-dedicated floating point shader units that also handle texel addressing competing with a card with 8 dedicated floating point units.

This isn't about precision. If the Nvidia card had 8 dedicated units, it would be up to twice as fast as the ATI card in 16-bit floating point calculations. Currently your best bet on GeForce FX cards is to run a mixture of DX 8.0/9.0 functions for optimal use of the Nvidia register file.

Also, need I remind you, the special NV30 pathway already makes optimal use of the Nvidia architecture: partial reads, partial writes, eliminating unneeded temp fetches. This is exactly what the Det 5.0s were supposed to do for shader instructions.
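To make that "reorder the shader, trim the temp fetches" idea concrete, here is a toy sketch of the kind of thing a driver-side shader scheduler would be measuring. The instruction sequences and register names are invented for illustration; this is not NV30 shader code or anything from the actual Detonator compiler.

```python
# Toy model: reordering the same shader work so fewer temporaries are live
# at once. On hardware with a very small register file, lower peak register
# pressure means fewer stalls. Everything here is illustrative.

def peak_live_temps(schedule):
    """Max number of written values live at once.

    Each instruction is (dest, sources); a value is live from the
    instruction that writes it until the last instruction that reads it.
    """
    last_use = {}
    for i, (_, srcs) in enumerate(schedule):
        for s in srcs:
            last_use[s] = i
    live, peak = set(), 0
    for i, (dest, srcs) in enumerate(schedule):
        live.add(dest)
        peak = max(peak, len(live))
        live -= {s for s in srcs if last_use.get(s) == i}   # last read here
        if last_use.get(dest, i) == i:                      # never read later
            live.discard(dest)
    return peak

# Naive order: compute every per-light term first, combine at the end.
naive = [
    ("t0", ["diffuse", "light0"]),
    ("t1", ["diffuse", "light1"]),
    ("t2", ["spec", "light0"]),
    ("t3", ["spec", "light1"]),
    ("t4", ["t0", "t1"]),
    ("out", ["t4", "t2", "t3"]),
]

# Reordered: consume each intermediate as soon as possible.
reordered = [
    ("t0", ["diffuse", "light0"]),
    ("t1", ["spec", "light0"]),
    ("t2", ["t0", "t1"]),
    ("t3", ["diffuse", "light1"]),
    ("t4", ["spec", "light1"]),
    ("t5", ["t3", "t4"]),
    ("out", ["t2", "t5"]),
]

print("naive peak temps    :", peak_live_temps(naive))      # 5
print("reordered peak temps:", peak_live_temps(reordered))  # 4
```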


And now all the ATI fans just think that the FX 5900U performs at 50% of what the Radeon performs, lol.
No one has stated that the GeForce FX is a bad performer. It has massive texel fill rate, great stencil shadow acceleration, and is a highly programmable GPU.

All we have been saying is that the Nvidia cards will never match the shader output of a Radeon with dedicated floating point units, because the NV30/NV35 architecture has only 4 floating point units, and those units are not dedicated the way they are in an 8-pipeline solution.

And the 9600 Pro offers only 4 FP units, mind you, but they are dedicated: it does not share texel addressing duties with its shader routines.
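As a rough back-of-envelope sketch of that unit-count argument (the unit counts come from this thread; the fraction of time lost to texel addressing is purely an assumption of mine, not vendor data):

```python
# Back-of-envelope peak FP shader throughput per clock. Unit counts follow
# the discussion above; the share of time the non-dedicated units spend on
# texel addressing is a made-up illustrative assumption.

def fp_ops_per_clock(units, texture_share=0.0):
    """Peak FP ops per clock if each unit issues one op per clock and
    `texture_share` of its time goes to texel addressing instead."""
    return units * (1.0 - texture_share)

print("8 dedicated units (9700/9800 class):", fp_ops_per_clock(8))
print("4 dedicated units (9600 Pro class) :", fp_ops_per_clock(4))
print("4 shared units, half busy texturing:", fp_ops_per_clock(4, texture_share=0.5))
```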
 
[ATI] offers better FSAA and AF algorithms than Nvidia's. All of my games work very well with it and I see no major issues with them.
Sorry, I disagree with your assessment of ATI's AF. IMO, ATI's AF has some real problems in some games.

The most notable type of gameplay is MMORPGs. Since ATI's AF only filters fully at 90-degree angles and neglects the angles to the left and right, you can see clearly down a straight path in a wide open area, but if you look left or right you will notice significant blurring at those angles.

IMO, ATI's AF is what needs work now. Graphics cards finally offer the pixel fill rate to push AF (even quasi-trilinear/bilinear, which is what ATI does, and I think that's fine) at all angles.

The improvement needs to be made to cover all angles, because I found this extremely distracting in wide open rendered areas.

Now, in corridor shooters you will not see this problem.
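A simplified, assumption-laden sketch of that complaint: effective anisotropy falls off as the surface angle moves away from the axis-aligned cases. The falloff curve below is invented purely for illustration; it is not ATI's actual algorithm.

```python
# Toy model of angle-dependent AF: full anisotropy near 0/90 degree
# surfaces, dropping toward a floor at 45 degrees. The linear falloff and
# the 2x floor are assumptions, not measured hardware behaviour.

def effective_af(requested_af, surface_angle_deg):
    off_axis = min(surface_angle_deg % 90, 90 - (surface_angle_deg % 90))
    factor = 1.0 - off_axis / 45.0           # 1.0 on-axis, 0.0 at 45 degrees
    return max(2, round(requested_af * factor))

for angle in (0, 15, 30, 45, 60, 90):
    print(f"{angle:3d} degrees -> {effective_af(16, angle):2d}x AF")
```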
 
saulin... just for future reference, what would you define as "full speed"?
Full speed is 60 fps with not a single frame skipped.

In other words, a game should run at 60 fps, not 59 fps, and use no frame skipping whatsoever.

If you use frame skipping to reach 60 fps, you are cheating and not getting full speed.
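To pin that definition down, here is a minimal sketch (my own illustration, not emulator source) of the difference between limiting to 60 fps and skipping frames to claim 60 fps:

```python
# Minimal sketch: a frame limiter renders every frame and sleeps away the
# rest of the 1/60 s budget, while frame skipping reaches "60 fps" by not
# drawing some frames at all. render() stands in for real emulation work.

import time

TARGET_FPS = 60
FRAME_TIME = 1.0 / TARGET_FPS

def run_limited(render, total_frames=60):
    """Full speed: every frame is rendered, pacing is done with sleep."""
    drawn = 0
    for _ in range(total_frames):
        start = time.perf_counter()
        render()
        drawn += 1
        leftover = FRAME_TIME - (time.perf_counter() - start)
        if leftover > 0:
            time.sleep(leftover)
    return drawn                      # == total_frames

def run_with_frameskip(render, total_frames=60, skip_every=2):
    """Emulation still advances 60 frames, but only some are drawn."""
    drawn = 0
    for frame in range(total_frames):
        if frame % skip_every:        # every second frame is never rendered
            render()
            drawn += 1
    return drawn                      # < total_frames: not "full speed"
```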

In other words, there is no guarantee the Radeon 9800 Pro will do 60 fps on every map in HL2 at 1024x768, 32-bit, with the highest quality settings.

But see, that resolution is too low for people who have big monitors. So are they screwed?

Also, no FSAA can be used, I would assume, as the frame rate would drop like crazy.

So yeah, the Radeon 9800 Pro runs HL2 faster. But is it fast enough?

Will future games run fast enough?

Looks like everyone who has a high-end card might need an upgrade sooner than expected after all.

No one has stated that the GeForce FX is a bad performer. It has massive texel fill rate, great stencil shadow acceleration, and is a highly programmable GPU.
That is not what I have been hearing, and that is what makes me step up and state things as they are, not as some people think or assume.

I have heard, "OMG, the FX 5900U is a piece of crap and even a Radeon 9600 Pro beats it."

I have heard that Nvidia's image quality is garbage. Not so: I have seen both cards with my own eyes, and the image quality on both is excellent. What I did notice, and said, is that ATI's FSAA works better.

So if you are going to talk trash about Nvidia, get your facts straight.

If you say ATI will run HL2 and similar games faster than Nvidia, I totally agree. But when people say Nvidia is garbage and their video cards perform like crap, that just isn't true.

Like someone said, by the time other games start having requirements as insane as HL2's or higher, there will already be new video cards out from both ATI and Nvidia.

Oh yeah, the excuse about not using the Det 50s because they are not officially out is BS, since the game is not officially out either. And we will see the drivers released before the game hits the stores.

Besides, Valve did have access to the drivers prior to the benchmarks.
 
That is not what I have been hearing, and that is what makes me step up and state things as they are, not as some people think or assume.

I have heard, "OMG, the FX 5900U is a piece of crap and even a Radeon 9600 Pro beats it."
Well, the 9600 is a better card than an FX 5900 Ultra when it comes to high-precision work such as shaders. How big that difference turns out to be will depend on how well shaders catch on.

In Half-Life 2, the difference is absurd, unfortunately for Nvidia owners. Currently Nvidia cards can draw the diffuse and combine components pretty well. The problem is (as we have stated) that the floating point shader units per pipeline are mini-processors within a processor, and currently the Nvidia architecture just doesn't have enough of them to handle high shader instruction counts.

For example, each floating point unit could be compared to a miniature Athlon (though the programmability of these floating point units still has a long way to go to reach that of a CPU).

Now combine that with the fact that one line of cards has 8 dedicated units to handle shader data (9800, 9700, 9500 Pro), the 9600 Pro has 4 dedicated units, and the 5600/5800/5200 have only 4 units which also double as texel addressing units.

You can see clearly why the 5900 can't handle the high-precision instruction counts of DirectX 9.0 shaders as well as an R300-class card.
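To put rough numbers on that (all of them hypothetical: the instruction mix is invented and real scheduling is far more complicated), here is the kind of per-pixel cost comparison being argued:

```python
# Hypothetical per-pixel shader cost. The instruction counts (40 math ops,
# 8 texture ops) are invented stand-ins for a long DX9 pixel shader; the
# point is only the ratio between dedicated and shared FP units.

def clocks_per_pixel(math_ops, tex_ops, fp_units, shared_with_texturing):
    if shared_with_texturing:
        # Texel addressing competes with math for the same FP units.
        return (math_ops + tex_ops) / fp_units
    # Dedicated units: texel addressing is handled by separate hardware.
    return math_ops / fp_units

MATH_OPS, TEX_OPS = 40, 8   # made-up "HL2-style" PS 2.0 instruction mix

print("8 dedicated units:", clocks_per_pixel(MATH_OPS, TEX_OPS, 8, False))
print("4 dedicated units:", clocks_per_pixel(MATH_OPS, TEX_OPS, 4, False))
print("4 shared units   :", clocks_per_pixel(MATH_OPS, TEX_OPS, 4, True))
```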


Oh yeah, the excuse about not using the Det 50s because they are not officially out is BS, since the game is not officially out either. And we will see the drivers released before the game hits the stores.

Besides, Valve did have access to the drivers prior to the benchmarks.
Gabe actually implied that the Det 5.0 results were not realistic performance, because Nvidia is probably lowering precision to integer. And so far, from many of the preliminary shots I have seen, this holds true: Nvidia has been replacing a great deal of floating point calculations with integer ones to save on instruction counts.
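For a feel of what "lowering precision to integer" means numerically, here is a small sketch. The 10-fraction-bit fixed-point format is a stand-in for a low-precision integer path; it is not a statement of exactly which format the drivers substitute.

```python
# Round-trip a few shader-ish values through a low-precision fixed-point
# format versus 16-bit float. The fixed-point layout (10 fraction bits) is
# an assumption for illustration, not documented driver behaviour.

import struct

def to_fixed(x, frac_bits=10):
    """Quantize to signed fixed point with `frac_bits` fraction bits."""
    scale = 1 << frac_bits
    return round(x * scale) / scale

def to_fp16(x):
    """Round-trip through IEEE half precision."""
    return struct.unpack("e", struct.pack("e", x))[0]

for value in (0.5, 0.01, 0.0005):    # e.g. progressively dimmer light terms
    print(f"{value:7.4f}  fixed: {to_fixed(value):.6f}  fp16: {to_fp16(value):.6f}")
```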
 
saulin said:
So yeah, the Radeon 9800 Pro runs HL2 faster. But is it fast enough?

Will future games run fast enough?

Looks like everyone who has a high-end card might need an upgrade sooner than expected after all.
Why wouldn't it be fast enough? It runs at over 60 fps at high resolutions and max details, which is more than I can say for that weak shader processor called the GeForce FX.

The bottom line in gaming speed for most enthusiasts is 30 fps at 1024x768 and medium details. Judging by the scores of the Radeon 9800 Pro, I'd say it'll last at least the next two generations of games (HL2 and the following). By that time there will be significantly more powerful graphics processors on the market to replace it.

saulin said:
If you say ATI will run HL2 and similar games faster than Nvidia, I totally agree. But when people say Nvidia is garbage and their video cards perform like crap, that just isn't true.
No one here is saying that. We're only saying that the GeForce FX is crap in terms of its shader capabilities. The Half-Life 2 results clearly show this, as well as ChrisRay's research on the GeForce FX architecture. There is really no denying it. People can say all they want about it (immature drivers, unfair ATI optimizations, etc.) but the cold, hard fact is that the GeForce FX is just an inferior shader product to the R300.
 
Re: Re: Ati vs Nvidia

Demigod said:
Why wouldn't it be fast enough? It runs at over 60 fps at high resolutions and max details, which is more than I can say for that weak shader processor called the GeForce FX.

The bottom line in gaming speed for most enthusiasts is 30 fps at 1024x768 and medium details. Judging by the scores of the Radeon 9800 Pro, I'd say it'll last at least the next two generations of games (HL2 and the following). By that time there will be significantly more powerful graphics processors on the market to replace it.

No one here is saying that. We're only saying that the GeForce FX is crap in terms of its shader capabilities. The Half-Life 2 results clearly show this, as well as ChrisRay's research on the GeForce FX architecture. There is really no denying it. People can say all they want about it (immature drivers, unfair ATI optimizations, etc.) but the cold, hard fact is that the GeForce FX is just an inferior shader product to the R300.
I agree, DemiGod. This makes me wonder, though. How would Doom 3 run on Nvidia cards, and would Doom 3 be playable AND look great at the same time on GeForce FX line cards?
 
Re: Re: Re: Ati vs Nvidia

dogulation000 said:
I agree, DemiGod. This makes me wonder, though. How would Doom 3 run on Nvidia cards, and would Doom 3 be playable AND look great at the same time on GeForce FX line cards?
I explained this above, if you read my comparison of the Doom 3 engine.

Doom 3 does use shaders, but it doesn't require nearly as many calculations, or as much precision, since the shading is generally used for specular lighting effects. So scenes are not always being rendered with these shaders anyway.
 
saulin said:
Also, no FSAA can be used, I would assume, as the frame rate would drop like crazy.
http://www.tomshardware.com/graphic/20030912/half-life-05.html
Not sure if you realized it, but because the game is so computationally expensive, the cards aren't as memory-bandwidth bottlenecked as they are in other games; they are GPU bottlenecked. Therefore, FSAA and AF won't make as large a difference as one might expect.
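A toy version of that bottleneck argument (numbers invented, model deliberately crude: the frame takes as long as the slower of shader work and memory traffic):

```python
# Crude bottleneck model: the frame takes as long as the slower of shader
# (GPU) work and memory traffic. When shaders dominate, MSAA's mostly
# bandwidth-side cost barely moves the total. All numbers are made up.

def frame_time_ms(shader_ms, bandwidth_ms):
    return max(shader_ms, bandwidth_ms)

MSAA_BW_FACTOR = 2.0    # assume 4x MSAA roughly doubles memory traffic

scenes = {
    "shader-bound (HL2-style)": (20.0, 8.0),
    "bandwidth-bound (older game)": (6.0, 12.0),
}

for name, (shader_ms, bw_ms) in scenes.items():
    base = frame_time_ms(shader_ms, bw_ms)
    with_aa = frame_time_ms(shader_ms, bw_ms * MSAA_BW_FACTOR)
    print(f"{name:30s} {base:5.1f} ms -> {with_aa:5.1f} ms with 4x MSAA")
```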

saulin said:
So yeah, the Radeon 9800 Pro runs HL2 faster. But is it fast enough?

Will future games run fast enough?
Maybe, if you want to live on the bleeding edge of performance, in which case the situation is no different than it's always been: you have to upgrade every six months.

saulin said:
Looks like everyone who has a high-end card might need an upgrade sooner than expected after all.
Not as soon as GeForce FX owners will, DX9-wise anyway.

saulin said:
Oh yeah, the excuse about not using the Det 50s because they are not officially out is BS, since the game is not officially out either. And we will see the drivers released before the game hits the stores.

Besides, Valve did have access to the drivers prior to the benchmarks.
This one is just foolish. ATI was tested with the unofficial game beta too, but I don't hear you complaining about that. Official or not, both were run under similar conditions: the same game benchmark and the latest PUBLICLY available drivers.

And don't give us this BS about the Det 50s. How come Nvidia can use unreleased drivers while ATI can't? "Because they are slowly unlocking the potential of the FX architecture"? Right. Unlocking it on an app-by-app basis. More like workarounds for the weaker architecture :rolleyes:
 
Guys, there's no reason to turn this into a flame war. I see enough of that on the Nvidia/ATI forums.

Remember, respect is the key to discussion.











Demigod you are an ATI loving dog sucking worm!



Ok everyone keep up the fruitful discussion!
 
I'm sorry, but 30 fps just doesn't cut it for me. I'm a Quake player, and believe me, 30 fps just makes you a newbie when others get over 60 fps.

From the looks of things, I doubt the R9800 Pro will get me over 60 fps all the time on all maps, especially with FSAA. Also, if I have the top-of-the-line video card, I don't think I'd want to run my games at medium detail.

And I guess we will see how the Det 5s perform in HL2 once it's released, but I really don't think it will be just 50% of the R9800 Pro's performance.
 
saulin said:
I'm sorry, but 30 fps just doesn't cut it for me. I'm a Quake player, and believe me, 30 fps just makes you a newbie when others get over 60 fps.
As I said, it's the BOTTOM LINE for gaming enthusiasts. When you drop below 30 fps you should seriously consider an upgrade, but until then it's playable. It's not perfect, but you can still enjoy the game.

saulin said:
From the looks of things, I doubt the R9800 Pro will get me over 60 fps all the time on all maps, especially with FSAA. Also, if I have the top-of-the-line video card, I don't think I'd want to run my games at medium detail.
Of course not. It's a fact that frame rates rise and drop at certain times; that's just the average frame rate. In UT2003 I get anything from 30 fps to 350 fps on a single map, and around 120 fps on average.

And what else are you going to get that's faster at HL2 (and most upcoming shader-intensive apps) than the Radeon 9800 Pro? The answer is obviously nothing. You can get your GeForce FX and play at 30 fps if you want, but I'll vouch for my Radeon at 60 fps, thank you.

I don't get your comment on detail. By the time you're forced to run games at medium detail, the Radeon 9800 Pro will no longer be top of the line anyway, right? (Note that most gamers drop detail as a last resort; they usually turn off FSAA/AF and lower the resolution first.)

ChrisRay said:
Demigod you are an ATI loving dog sucking worm
Heh, damn straight :D
 
saulin said:
I'm sorry, but 30 fps just doesn't cut it for me. I'm a Quake player, and believe me, 30 fps just makes you a newbie when others get over 60 fps.

From the looks of things, I doubt the R9800 Pro will get me over 60 fps all the time on all maps, especially with FSAA. Also, if I have the top-of-the-line video card, I don't think I'd want to run my games at medium detail.

And I guess we will see how the Det 5s perform in HL2 once it's released, but I really don't think it will be just 50% of the R9800 Pro's performance.

The Dets are also forcing low integer precision. But, honestly, the new Dets, from everything I have seen, have only given the NV30/35 about a 15% performance increase, which is a little less than even I predicted from the reordering of shaders. That being said, the information I have given you practically lays out what is limiting the pixel shaders of the NV30 series. It's up to you either to evaluate that and draw your own conclusions, or to ignore it and say I'm just spouting ATI fanboy rhetoric.

The NV35 needs more floating point units; it just doesn't have them. And in many cases, when the shader units are acting as texel addressing units, the card becomes more of a 3x1 or a 2x1 card, depending on how many FP units are being used for texel addressing.

If you doubt my background, you could speak to Demigod. I actually support many of the decisions Nvidia has made, and I think Nvidia had some unique ideas in the NV30 architecture; they were just too little, too late at the time. But it's clear that the R300 has many more floating point units available to it.
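Applying that 15% figure to hypothetical numbers (the frame rates below are placeholders, not benchmark results) shows why it doesn't close the kind of gap being discussed in this thread:

```python
# Placeholder numbers only: a 15% driver-side gain applied to a card that
# starts at roughly half the competing card's frame rate.

nv35_fps = 30.0     # hypothetical GeForce FX HL2 DX9 result
r350_fps = 60.0     # hypothetical Radeon 9800 Pro result
det50_gain = 0.15   # the ~15% improvement mentioned above

boosted = nv35_fps * (1 + det50_gain)
print(f"{nv35_fps:.0f} fps -> {boosted:.1f} fps with Det 50s, vs {r350_fps:.0f} fps")
```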
 
Re: Re: Ati vs Nvidia

ChrisRay said:
The Dets are also forcing low integer precision. But, honestly, the new Dets, from everything I have seen, have only given the NV30/35 about a 15% performance increase, which is a little less than even I predicted from the reordering of shaders. That being said, the information I have given you practically lays out what is limiting the pixel shaders of the NV30 series. It's up to you either to evaluate that and draw your own conclusions, or to ignore it and say I'm just spouting ATI fanboy rhetoric.

The NV35 needs more floating point units; it just doesn't have them. And in many cases, when the shader units are acting as texel addressing units, the card becomes more of a 3x1 or a 2x1 card, depending on how many FP units are being used for texel addressing.

If you doubt my background, you could speak to Demigod. I actually support many of the decisions Nvidia has made, and I think Nvidia had some unique ideas in the NV30 architecture; they were just too little, too late at the time. But it's clear that the R300 has many more floating point units available to it.
I concur. I'm also an Nvidia fan and have bought mostly Nvidia cards in the past; the only ATI card in my past was the rather horrible ATI Radeon 7200. I have an ATI card now, an ATI Radeon 9500 Pro. I only have it because Nvidia has been disappointing me as of late. I'm sure they will eventually get back on track. I think. :onthepull
 
The 7200 isn't that bad anymore... the new Catalysts finally made it stable :p

Well, I received a 7200 last week from a friend who couldn't stand it anymore, and since my sis needed a computer, I built her one with the 7200 in it (1 GHz T-Bird, all spare parts... kind of funny how spare parts build up until they finally make a new computer). It played Quake 3, BF1942, and UT2K3 just fine using the 3.7s.
 
Re: Re: Ati vs Nvidia

Gamer1 said:
The 7200 isn't that bad anymore... the new Catalysts finally made it stable :p

Well, I received a 7200 last week from a friend who couldn't stand it anymore, and since my sis needed a computer, I built her one with the 7200 in it (1 GHz T-Bird, all spare parts... kind of funny how spare parts build up until they finally make a new computer). It played Quake 3, BF1942, and UT2K3 just fine using the 3.7s.
Well, they WERE horrible due to their crappy drivers at the time. :cuss2:
 