1 - 10 of 10 Posts

Is it a dream... · 616 Posts · Discussion Starter #1 (Edited)
I noticed something peculiar last night while I was messing around with the latest PCSX2. Normally I use GSdx (unless I'm playing Odin Sphere) and I have both the resolution and internal resolution set to 1280x720. Most of my games look and run great (aside from Need for Speed: Most Wanted, which is just a bit too slow to be fully playable). But I figured, for the hell of it, I'd crank both resolution options to 1920x1080.

Well, for starters, everything looked great. But while playing Shadow of the Colossus I noticed that the framerate was a bit more stable. I figured it was a fluke, but while playing RE4 I noticed the same thing. Then I tried NFS: Most Wanted, and while it still wasn't fully playable, the framerate was improved and the audio was barely skipping and popping. Now I was getting intrigued...

So I switched to Odin Sphere. This game usually chugs hard with GSdx. At 1280x720 it usually starts to tank at the "directed by" animation, getting worse in the cinemas, where it drops to ~30 fps; the audio chops like crazy and everything gets out of sync. At 1920x1080, the game was completely smooth up until the scene where Griselda dies. There it dropped to ~40 fps, still getting out of sync and chopping, but not nearly as bad.

Now, I've seen PC game benchmarks where, at lower resolutions, the framerate is usually capped by the processor, and it takes higher settings for the GPU to start flexing its muscle. But for something as CPU-dependent as PCSX2, this didn't seem to make any sense. Is there a bit more offloading at higher resolutions? Is it just a fluke? I'm certainly not complaining. Better framerate AND graphics? Score!

Here are my specs, if it helps:

Intel Q6600 @ 2.7 GHz
Kingston 2 GB 800 MHz DDR2 @ 840 MHz
NVIDIA 9600GT @ 733/1033

And as I tend to be horribly long-winded, congratulations to everyone who read all of this...
 

Final Fantasy XXX · 2,413 Posts
Based on my previous experiences with PCSX2, the higher you set the resolution, the lower the fps you'll get, because you are now rendering the game at a much higher res with better graphics and a sharper image. The remedy for this is simply to get yourself a faster, more modern GPU.

Anyway, it's always best to use FRAPS for fps comparisons.
 

troubleshooter · 7,514 Posts
Some games run better with GSdx and some with ZeroGS.
Games which need a lot of CPU power, like Tekken and Soulcalibur, run better with ZeroGS, whereas games like FFX run better with GSdx.
 

Registered · 4 Posts
Yup, actually it does. I noticed this in GSdx while tweaking the D3D internal res. The higher the resolution, the better the graphics, but it costs some fps.
 

Registered · 2,882 Posts
Actually, he's describing the opposite: he says his performance increases when he ups the internal resolution. That's quite bizarre. Maybe it's because of the quad core paired with a decent video card.
 

Registered · 5 Posts
The reason that's happening is that your GPU's internal clock kicks up to its full speed when you put a heavier workload on it by turning up your game's settings. My GPUs do the same thing once they pass about 20% usage. Otherwise the GPU sits in power-down mode, which lowers the clock speeds considerably.
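On current NVIDIA drivers you can watch this throttling directly with `nvidia-smi` (a tool that postdates the 9600GT era). A rough sketch that reads the SM clock and utilization; the `sample` parameter is an assumption added here so the parsing can be exercised on machines without an NVIDIA GPU:

```python
import csv
import io
import subprocess

def read_gpu_clock(sample=None):
    """Return the current SM clock (MHz) and GPU utilization (%).

    If `sample` is given, parse that CSV line instead of invoking
    nvidia-smi -- handy for testing without NVIDIA hardware.
    """
    if sample is None:
        sample = subprocess.check_output(
            ["nvidia-smi",
             "--query-gpu=clocks.sm,utilization.gpu",
             "--format=csv,noheader,nounits"],
            text=True)
    sm_clock, util = next(csv.reader(io.StringIO(sample)))
    return {"sm_clock_mhz": int(sm_clock), "gpu_util_pct": int(util)}

# Simulated reading from an idle card: clocks dropped to a power-save state.
idle = read_gpu_clock("300, 4")
# Simulated reading under emulator load: clocks ramped to full speed.
busy = read_gpu_clock("1650, 97")
```

Polling this in a loop while raising the internal resolution would show whether the clocks are actually ramping up with the heavier GPU load.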
 

Heroes Might & Magic Champ · 4,713 Posts
Hmm, interesting, Epsilon.

Tell us all your settings so that we can replicate your setup as closely as possible.

I'd be interested to see if anyone can recreate this phenomenon.

The reason that's happening is that your GPU's internal clock kicks up to its full speed when you put a heavier workload on it by turning up your game's settings. My GPUs do the same thing once they pass about 20% usage. Otherwise the GPU sits in power-down mode, which lowers the clock speeds considerably.
Ahh true, that may be it... this is a possibility I doubt many have considered... even the PCSX2 team.

He's noticing this on his G94-based 9600GT; I wonder if I'll notice it on my E6600 & G80-based 8800GTX.

*runs off to test*
 

Registered · 5 Posts
The fact is, ever since power consumption became an issue, ATI and NVIDIA have taken "secret" measures to make more "efficient" GPUs. Both, however, only integrated the feature into recent chipsets: ATI started using the technology in the HD3000 series and NVIDIA started with the 7000 series. ATI calls their technology PowerPlay (ATI PowerPlay™) and NVIDIA calls theirs PowerMizer (NVIDIA PowerMizer Technology). What everyone has neglected to notice is that these power guidelines even exist, and that they are built into the BIOS of the newest GPUs, whereas on older hardware they're enforced by the drivers. I do, however, know how to circumvent both ATI's and NVIDIA's power controls; let me know if I should write a how-to guide.
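For what it's worth, on modern Linux NVIDIA drivers this can be done without any hacks; a sketch using the driver's own settings tool (the Windows drivers of that era instead needed registry tweaks or third-party tuning tools, not shown here):

```shell
# Force PowerMizer to "Prefer Maximum Performance" on the first GPU
# (0 = adaptive clocking, 1 = prefer maximum performance).
nvidia-settings -a "[gpu:0]/GPUPowerMizerMode=1"
```

This is a driver configuration fragment, not a portable script; the attribute only exists on NVIDIA hardware with the proprietary driver loaded.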
 