
InnarX · 2,756 Posts · Discussion Starter · #1
Hey all.

I recently had to switch to a GeForce 5600, and I'm having a small problem with Pete's OpenGL2 plugin and its 'shader effects' option. As anyone who uses Pete's plugin knows, you can load custom shader programs that others have written, and they are available on Pete's website.

To be more specific: the moment I enable 'shader effects' in the plugin's properties and choose ANY shader, whether Fullscreen Smoothing, an ARB program, etc., my games run slower, i.e., where the FPS should be 60, it's at 40 or so.

And the higher I set the shader level, the slower it runs. I have tested many shader effects and they all behave the same way.

Why does this happen?

r2rX :(
 

Registered · 654 Posts
From checking a little on the 'net, the FX 5600 is not all that different from the FX 5200 (which I used to own). While these cards support shaders, the support is minimal and the performance is poor.

The more shader code any card runs, the longer it takes to render a frame, and shader code has a much bigger impact on rendering time on these cards than on newer ones.

Clearly your card can only just finish rendering a frame (without shaders) in 1/60th of a second. Add shader code and apparently you add another 8 milliseconds or so: 60 fps is ~16.7 ms per frame, while 40 fps means ~25 ms per frame.
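To put rough numbers on that, here's a quick back-of-the-envelope sketch (plain Python, nothing plugin-specific; the millisecond figure is illustrative, not measured):

```python
# Frame-budget math: a fixed per-frame shader cost drops the frame rate.
def fps_with_shader(base_fps, shader_ms):
    """FPS after adding a fixed per-frame shader cost (in milliseconds)."""
    frame_ms = 1000.0 / base_fps + shader_ms
    return 1000.0 / frame_ms

# 60 fps is ~16.7 ms per frame; ~8 ms of extra shader work lands at ~40 fps.
print(fps_with_shader(60, 8.3))  # ~40.0
```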


Dan
 

Premium Member · bsnes, ePSXe · 21,982 Posts
ah yes, the 5600. i remember that beast..

most shaders didn't even work (no foolies), and the few that did worked horribly. the standard fullscreen smoothing one was the exception; i even played FF9 at full speed with it set at 3 (4 being the highest) and at 1152x864.

but that card is at the bottom of the bucket as far as shaders go. it runs them, sometimes, and very very slowly. get an ATi card or at least a 5900.
 

InnarX · 2,756 Posts · Discussion Starter · #4
Well, thanks for the input. I figured as much, and it's alright, I guess. Besides, since I can increase the internal Y resolution up to Very High, I can see the difference in the quality of the games, so it won't bug me too much... although downgrading from an ATi Radeon 9800 Pro to an nVidia GeForce 5600 sucks.

But at least my upgrade at Christmas will rock... a 7800GTX at minimum... that should be able to render the shaders just fine. :)

r2rX :D
 

Registered · 654 Posts
Incidentally, shaders that do full-screen stuff like AA or smoothing work on a per-pixel basis, so you may be able to trade extra shader levels against a lower desktop resolution. That's what I used to do with Doom 3 to get it running okay on an FX 5200: I'd hack the config file to make it run at 560x420 or even 400x300.
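To put numbers on the per-pixel point (a rough sketch in Python; the linear cost model and the 1152x864 reference are assumptions, but a full-screen pass does scale with pixel count):

```python
# Full-screen shader cost scales with the number of shaded pixels.
def relative_cost(width, height, base=(1152, 864)):
    """Shaded-pixel count relative to a reference resolution."""
    return (width * height) / (base[0] * base[1])

for res in [(1152, 864), (800, 600), (560, 420), (400, 300)]:
    print(res, f"{relative_cost(*res):.2f}x")
# 560x420 is ~0.24x the work of 1152x864, so the same frame budget
# buys roughly four times as much per-pixel shader work.
```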


Dan
 

Premium Member · bsnes, ePSXe · 21,982 Posts
oO i think playing at a high res with very high X and Y is much better than playing at a low res with smoothing.

high res is nice and sharp... whereas with a low res and smoothing, you just get a huge blocky display with blur everywhere. i never found extreme blur to be that great, maybe juuuust a tiny bit, but otherwise i like to have the feeling that i'm seeing things correctly :p
 

Premium Member · bsnes, ePSXe · 21,982 Posts
well, i had a 5600 back when OGL2 was first released (before "better" shaders like scale2x and the improved smoothing one came out), but all the shaders back in the day either didn't work or worked badly.

like, the black and white (novelty) shader didn't work at all.
the smoothing shader (the first one) was ok, but it was tough to maintain full speed at 3 or 4.
and let's see... a few of the other shaders just produced a glitched screen.

the luigi's shader and the luminance shader weren't out when i had the 5600, though. the blur AA one sounds interesting...
 