
GTA 2 doesn't work with PGXP XD. I was wondering if it's similar to DOOM, in that it's not actually 3D, but "Mem + CPU logic" mode stretches everything, like it did with most games I played before, so maybe the buildings are 3D after all.

PGXP "Memory only" mode: PGXP "Mem + CPU logic" mode:
Does the Perspective Correct Texturing not work with GTA? Those textures are so wobbly lol
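For reference, this is roughly what I understand perspective-correct texturing to fix (just a toy comparison, not how PGXP actually implements it): the PS1 interpolates texture coordinates linearly in screen space, while the corrected version interpolates u/z and 1/z and divides at the end.

```python
# Toy illustration: affine vs perspective-correct texture mapping.
# Two vertices of a polygon edge, seen at different depths.
u0, z0 = 0.0, 1.0    # near vertex: texture coord u=0, depth z=1
u1, z1 = 1.0, 4.0    # far vertex:  texture coord u=1, depth z=4

t = 0.5  # halfway across the edge *in screen space*

# Affine (what the PS1 GPU does): interpolate u directly.
u_affine = (1 - t) * u0 + t * u1                      # -> 0.5

# Perspective-correct: interpolate u/z and 1/z, then divide.
u_over_z = (1 - t) * (u0 / z0) + t * (u1 / z1)        # -> 0.125
one_over_z = (1 - t) * (1 / z0) + t * (1 / z1)        # -> 0.625
u_correct = u_over_z / one_over_z                     # -> 0.2

print(u_affine, u_correct)  # 0.5 vs 0.2 -- that gap is the texture "swimming"
```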
 

So, I got kind of nostalgic and decided to dust off my old PlayStation to see if it still worked. It did, so I decided to play some Crash Bandicoot in its original form, the way it was meant to be played. The disc I have has Crash 1, 2 and 3 on it, but 1 didn't work, so I pressed reset and it booted Crash 3. Besides my head not being able to shake the feeling of "why does everything shake?" (I got too used to PGXP XD), I immediately noticed something: those "holes" that appear in the distance, even with PGXP, happen on the real hardware as well!

It's very weird actually, I don't remember that happening at all. Unless my PlayStation is too old (it only works standing vertically, actually), or maybe the disc I have was compressed or modified somehow to fit the 3 Crash games on it. But still, maybe this issue can't actually be fixed with PGXP. Some kind of ancient draw distance/LoD bias method? Here are some pics and a video (I'm used to it now). I had to use a Windows Phone to record, and sorry about my old 14" CRT, it sucks, but I still play my Wii on it XD.

Crash Bandicoot 3: Warped on a real PlayStation. Original SCPH-1001 model, with the separate RCA outputs and the parallel I/O port, if that's of any interest. It also has that issue where the power supply sits right under the CD drive, so over time it heated up the lens: its thin metal support expanded and shifted the lens out of focus, which is why it only works standing up.
Also some pics. EDIT: The pics were too big :oops:.
Nice find with the terrain holes. So there's no problem with PGXP then; it's just a bug that was there in the first place.

I was wondering, how does PGXP behave on the jet ski level? The Crash developers programmed a software z-buffer just for that level. Would PGXP play nicely with that, or spit out corruption?
 

@Reventon2010 I think it works as it should. PGXP mode shows a white circle around the jet ski, maybe that's the z-buffer part? Nevertheless, the moving water looks like what I think water would look like on the PSX...
I think the z-buffer is there to prevent z-fighting on the water mesh. There are a lot of overlapping waves with transparency, and without a z-buffer the PS1 doesn't know what depth order to draw the polygons in, so you would get a lot of flickering and flashing.
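Roughly the difference, as a toy sketch (nothing like the actual Crash code, just the idea):

```python
# Toy sketch: two overlapping "wave" polygons at nearly the same depth.
waves = [("wave_A", 10.02), ("wave_B", 10.01)]

# Painter's algorithm (PS1-style ordering table): sort whole polygons
# back-to-front using a single depth value per polygon. If the two averages
# jitter slightly between frames, A and B keep swapping places -> flicker
# wherever they overlap.
draw_order = sorted(waves, key=lambda poly: poly[1], reverse=True)
print([name for name, _ in draw_order])          # ['wave_A', 'wave_B']

# A z-buffer instead resolves the overlap per pixel: for any pixel both
# polygons cover, only the closest depth wins, no matter the draw order.
z_buffer, visible = float("inf"), None
for name, depth in waves:
    if depth < z_buffer:
        z_buffer, visible = depth, name
print(visible)                                    # 'wave_B' every time
```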

It's amazing how Naughty Dog could leverage the hardware potential of the PS1. Also, it looks good with PGXP.
 

@the_randomizer My case is the same as @unreal676's, but if his GTX 260 is ancient, my GeForce 9600 GT is prehistoric XD. For emulation the CPU is what gets used the most, and while recording, most other software compresses on the fly to keep the file size small; since FRAPS does practically no compression, it won't use much of the CPU while recording. If my video card supported any kind of hardware encoding it would be easier for me, I could just pick an encoder and hand the compression task to the video card instead, but alas, this s**t is old XD.

The best other recording software that worked for me was Bandicam, even while compressing, but the end quality will never be better than FRAPS plus re-encoding with XMedia Recode afterwards, since compressing while recording, to keep the framerate steady, results in s**t quality as well. OBS is just too complicated for me, and ShadowPlay needs a GTX 600 series or newer card to work.
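Just to put rough numbers on why FRAPS files get huge while barely touching the CPU (back-of-the-envelope only, assuming a 720p 60 fps capture):

```python
# Back-of-the-envelope: raw capture data rate if you barely compress at all.
width, height = 1280, 720     # say, a 720p capture
fps = 60
bytes_per_pixel = 3           # 24-bit RGB

bytes_per_second = width * height * bytes_per_pixel * fps
print(f"{bytes_per_second / 1024**2:.0f} MiB/s")                # ~158 MiB/s
print(f"{bytes_per_second * 60 / 1024**3:.1f} GiB per minute")  # ~9.3 GiB/min
# Dumping that straight to disk costs almost no CPU (the FRAPS approach);
# squeezing it down in real time is where the CPU (or a hardware encoder,
# if the card had one) has to do the work.
```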
Lol, I'm using a 9500 GT after my GTX 560 Ti kicked the dirt. I'm looking at getting a GTX 750 Ti though, so I'll get to experience OBS/ShadowPlay.
 

Using the OSD in MSI Afterburner shows an accurate internal FPS, because it counts the actual frames coming through the GPU, so you would see any dips that occur in games.

Edit: The GUI counter is actually a good indicator of your PC's performance. I'm pretty sure that when it drops low, it's your PC that's struggling to render the frames, so if it stays at 59.99, any stutter is in the game itself.
 

I have a few problems I'd like to ask about.

The first one is ResHack. To my understanding, 1x1 means 320x240, so native. Then 8x8 would be 320x240 scaled by 8 on each axis, which should roughly be 2560x1920.
Now, I run my desktop at 1440x900, so 8x8 should be 2560x1920 downscaled to 1440x900. But it just looks like plain 1440x900. In fact, anything over 5x5 doesn't benefit me at all.
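Here's the rough math I'm going by (my own back-of-the-envelope, in case my understanding is off):

```python
# Rough numbers: internal render size vs. what my 1440x900 desktop can show.
base_w, base_h = 320, 240          # typical PS1 internal resolution
disp_w, disp_h = 1440, 900         # my desktop resolution

print(f"display needs {disp_w / base_w:.2f}x horizontally, "
      f"{disp_h / base_h:.2f}x vertically")        # 4.50x / 3.75x
for scale in (1, 5, 8):
    print(f"{scale}x{scale} -> renders at {base_w * scale}x{base_h * scale}")
# 5x5 (1600x1200) already covers 1440x900 on both axes, so 8x8 (2560x1920)
# is only extra supersampling before the downscale -- which would explain
# why I can't see a difference past 5x5.
```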

This is 5x5:
[Screenshot at 5x5]


This is 8x8:
[Screenshot at 8x8]


Which brings me to my second problem. Shaders.

I have the PsxFX shader. I drop both files (gpuPeteOGL2.slf, gpuPeteOGL2.slv) into the Shaders folder, edit the .slf to enable FXAA, then enable the shader in the OGL2Tweak plugin (GLSlang files, Shader level = 4: Maximum).

But nothing happens at all. The picture is just as aliased as before.
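In case it helps anyone spot what I'm missing, this is the throwaway check I ran to confirm the files are in place and what the FXAA lines in the .slf currently say (the path is just a placeholder, not my actual install):

```python
# Throwaway check: are both PsxFX files in the plugin's Shaders folder,
# and which FXAA-related lines are currently set in the .slf?
import os

shader_dir = r"C:\path\to\your\plugins\Shaders"   # placeholder, adjust to your install

for name in ("gpuPeteOGL2.slf", "gpuPeteOGL2.slv"):
    path = os.path.join(shader_dir, name)
    print(name, "found" if os.path.isfile(path) else "MISSING")

slf_path = os.path.join(shader_dir, "gpuPeteOGL2.slf")
if os.path.isfile(slf_path):
    with open(slf_path, errors="ignore") as slf:
        for line in slf:
            if "fxaa" in line.lower():
                print(line.rstrip())   # shows whether the FXAA toggle is on or off
```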

And the last one isn't as important: trying to select the OGL 1.78 plugin gives me this error:
[Screenshot of the error dialog]

I have the Visual C++ 2015 x86 thingy installed, so I don't know what's wrong there.

So could anyone help me with these strange problems? Thank you.
 

@otherman Yes, I get that emulation needs hardware more powerful than the original (not only PCs, but other console hardware, and even the "smart" devices of today), and as new technology appears, devs will use it to improve accuracy or to enhance the experience in their emulators. But what I meant was: if they implement new features, why would they "turn off" features that worked on older hardware? That's why there are tons of different branches of emulators out there when they're open source, and not only because of the different things one branch does that the "main" branch doesn't (like PCSXR-PGXP); most of the branches I've seen are performance related.

I do love what PGXP does right now, I just don't want it to become unplayable for me in the future because of features that none of my devices support. And it's not that easy to buy new devices...
Maintaining multiple features in the source code can be hard work. But I agree with you: why turn them off when they are working perfectly fine?

Also, it would be worth getting hold of a Core i3 if you're ever looking for an upgrade. There's a cheapish one that clocks above 4 GHz; it's one of the fastest available and would probably be a huge upgrade over your current AMD.
But I understand that prices in some countries are rough, so whatever is possible.

Edit: I was wrong, it's not an i3, it's the Pentium G3258. It can easily overclock past 4 GHz and is the most powerful dual core for the price.
 

Just wanted to show a picture of Crash 2. I wanted to take a picture in the same area you did, but... well, I didn't want to play all the way until I got there :3.

Anyhow, the only difference is that I have 2xSaI high-res textures enabled, alongside 8x8 ResHack and FXAA from Asmodean's PsxFX shaders.

That looks amazing. It's strange how PsxFX works for you but not for me. What GPU are you using? Also, what resolution do you run at?
 

@Reventon2010 Here is a picture of my settings; the only thing that matters is the boxed-in location.

http://i.picpar.com/svac.png


Here is what my PsxFX settings look like; you can copy them 1:1 if you'd like. I also have Phong Shading and Color Correction on; they have a negligible impact on performance, but the slight difference they make to the image is noticeable.

http://pastebin.com/9s8tEdHj

It should work like that. FXAA isn't perfect and you will notice aliasing in the image at certain angles, but most of the time you should get really clean lines on anything 3D-based.

Here's what the game looks like with no filtering, no high-res textures and xBRZ disabled.

http://i.picpar.com/0vac.jpg
I will give your settings a try next time I use PCSXR. Looking in the shader though, I just found this line:

GL_ARB_shading_language_420pack

That is OpenGL/GLSL 4.2. My 9500 GT only supports 3.3, which could explain why it doesn't work for me.
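If anyone wants to check what their own driver actually exposes, something like this prints it (rough sketch, assumes the glfw and PyOpenGL Python packages are installed; a tool like OpenGL Extensions Viewer shows the same list):

```python
# Rough sketch: create a hidden GL context and see which extensions the
# driver reports. Needs the glfw and PyOpenGL packages.
import glfw
from OpenGL.GL import glGetString, GL_VERSION, GL_EXTENSIONS

glfw.init()
glfw.window_hint(glfw.VISIBLE, glfw.FALSE)          # offscreen context is enough
window = glfw.create_window(64, 64, "ext-check", None, None)
glfw.make_context_current(window)

extensions = glGetString(GL_EXTENSIONS).decode().split()
print("GL version:", glGetString(GL_VERSION).decode())
print("420pack supported:",
      "GL_ARB_shading_language_420pack" in extensions)

glfw.terminate()
```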

What is your GPU?
 

I run a GTX 260, and I don't believe the entire shader would stop working just because you're missing that feature.

If you still can't get FXAA working with the settings above, set that line to disabled and see if anything happens/changes (unless it's already disabled automagically).

Note: I decided to check up on that feature. Here's a page I found that suggests your GPU supports that specific OpenGL extension, so there shouldn't be any issues on your end. http://feedback.wildfiregames.com/report/opengl/feature/GL_ARB_shading_language_420pack

http://feedback.wildfiregames.com/report/opengl/device/GeForce 9500 GT - for your GPU specifically

Hope this helps you!
I used your config and it works; mine was actually missing a few lines. But the FXAA is so minimal that it's hardly noticeable. In fact, I get better results by forcing FXAA through the Nvidia Control Panel.

What I've settled on is using "GLSLang Smoothing" on Max and "Fullscreen Smoothing" together. No more jagged edges :)
 

Why do you have the shader level set to 1 in the OGL2 settings dialog? Of course the effect will be minimal, it even says "minimal" ;p

It's only interpolated at 25% with the base image at that level. Turn the shader level up to 4, and use the shader settings themselves to adjust the strength of the effects.
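If it really is just a linear blend with the untouched frame (that's my reading of it, I haven't checked the plugin source), level 1 of 4 works out to something like:

```python
# Toy illustration of a linear blend: level 1 of 4 would mean only 25% of
# the shaded result makes it into the final pixel (my assumption, not the
# plugin's documented behaviour).
def blend(base: float, shaded: float, level: int) -> float:
    weight = level / 4.0            # level 1 -> 0.25, level 4 -> 1.0
    return (1.0 - weight) * base + weight * shaded

print(blend(0.2, 0.8, 1))   # 0.35 -- barely moved away from the base pixel
print(blend(0.2, 0.8, 4))   # 0.8  -- full-strength effect
```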
When I tried it, mine was set to Maximum, along with Ultra Quality set in the shader file. Still, the FXAA was hardly noticeable.
 

Well, xBRZ tries to "filter" FMVs as well, and since they're low-res, with heavy blocking made even more visible by the resolution upscaling, it gets very slow because of the constant real-time filtering, depending on how "detailed" the scene is. Some of the slowest intro FMVs for me are the ones in Capcom games after Resident Evil 2; they were slow even without xBRZ, and it will crash if xBRZ is set too high (above 4x for me). And it's not only FMVs: during framebuffer effects as well, there's sometimes a random error message, sometimes related to C++, other times a proper PCSX error or something.
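Some rough numbers on why the FMVs choke (back-of-the-envelope only, the real cost depends on the scene like I said):

```python
# Back-of-the-envelope: output pixels xBRZ has to produce during an FMV.
src_w, src_h = 320, 240     # typical PS1 FMV resolution
fps = 15                    # many PS1 FMVs play at 15 fps

for factor in (2, 4, 6):
    out_pixels = (src_w * factor) * (src_h * factor)
    print(f"x{factor}: {out_pixels:,} px/frame, "
          f"{out_pixels * fps / 1e6:.1f} Mpx/s")
# x2:   307,200 px/frame ->  4.6 Mpx/s
# x4: 1,228,800 px/frame -> 18.4 Mpx/s
# x6: 2,764,800 px/frame -> 41.5 Mpx/s
# Every one of those output pixels comes from xBRZ's pattern-matching rules
# running on the CPU, every frame, which is why high factors fall apart on
# FMVs and framebuffer effects.
```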

I like the look that xBR(Z) gives to games, but the many issues it currently has with the Tweak make it almost unusable (after playing many games, only a few are really playable with xBRZ). I hope @tapeq comes back to the Tweaks sometime, his work on them was great. But that discussion would be better suited to the Tweaks thread, so @Thirteen1355, if you have any other questions about xBRZ, I'd recommend asking there, since there's a better chance of tapeq seeing it (I guess).

Or maybe a small feature request to "turn off" xBRZ during MDECs (or something), like "FastFBE" did for framebuffer effects, would be nice... Too hard maybe :p? (Give a 'Yay' for "FastFMV"!!)
YAY!
 

Yea, for sure it is super stable; the CPU overclock and widescreen hack also display properly with King's Field. I think I'm just going to bite the bullet and upgrade my computer, as I've been dawdling on it for a while.

Also, please forgive my wording. I didn't mean to come across as a blowhard haha. I was more wondering whether I was doing something wrong, rather than mednafen's OpenGL implementation not working correctly.
What are your current computer specs? Perhaps we can advise you on parts :)
 

Yea, I wanted to hold out, but Kaby Lake looks only marginally better than Skylake, and I don't really want to wait until ~2018 for a high-end Cannonlake when I'm already CPU-bottlenecked with my 1070.

Vulkan and DX12 do alleviate a lot of my CPU concerns for regular gaming, but when I went from a 28nm GPU to 16nm, it became apparent that my current CPUs are holding me back. I was considering getting dual E5-2680s because they're cheap as chips right now, but I'd really like a motherboard that supports USB-C 3.1 (preferably Thunderbolt 3) and 2+ M.2 slots with NVMe support.

I'm on the fence about whether I should get a 6800K/6850K just to have more PCI-E lanes.
Don't buy Kaby Lake if you're on Windows 7/8; it's strictly Windows 10 support only.
 