Always use the best connection you can get, but in the case of VGA vs DVI/HDMI (HDMI is just DVI with sound), the difference isn't that big. That's actually putting it mildly; it's basically next to nothing. It literally is nothing on a CRT display (since the display is analog anyway), and it's next to nothing on an LCD. Most people can't even tell the difference, so don't worry about it. You would, however, see a large difference between VGA/DVI/HDMI and anything lower (like component, S-Video, or composite), so just make sure you stick to VGA, DVI, or HDMI. In all but a rare few cases, you'll get about the same end results.
 

I'd opt for a real monitor over a television myself.

With connections, it really depends. There's supposed to be very little difference between DVI and VGA, or VGA and component, but some displays look better with one than with the other, and this is especially true of televisions. There are even cases of people getting blurry pictures over DVI and then really sharp ones over VGA with an adapter. Try it and see; that's the only way you'll find the best picture.
 

Clarity has less to do with size and more to do with resolution and display quality. Don't think that the biggest display will give you the best clarity. On the contrary, a large display with a very low resolution (such as yours) will look more blurred. On top of that, televisions often aren't as clear as monitors for PC use.

I have what's technically considered an HD television, but it's a rear projection 1080i 56" CRT type, so component input from my PC gives a very blurry image (at least when I tried 1280x720, which it's supposed to upscale to 1080i very smoothly), while HD broadcasting over the same input comes in very clear. In fact, the PC image looked no better than an S-Video connection, even though the display otherwise looks clear with broadcasting. I haven't tried its DVI port yet. The port is DVI-D, and all of my adapters are DVI-I, so they won't fit. I'm thinking of picking up an adapter from Newegg to try it out, but I doubt it'll help, and I hardly use my PC on my television anyway. I tried it just to try it (though Dead Space looked pretty good).

I've found that most televisions don't do so well as PC monitors. Some of the better true HD LCD ones may, but at that point, unless the PC is being used solely as an HTPC or something, you're better off saving half the money and getting a real PC display.

Also, I'd second your idea of seeing the display in person before buying it. Beauty is in the eye of the beholder, so buy something you think looks good.
 

Almost every 1080i TV I've ever seen isn't 1080 native. They're 1366x768 (some are even 1024x768... bleh). They only say they accept 1080i video for marketing purposes.

Change your resolution to 1360x768 and try again.
If it was blurry at 1280x720, I don't think 1366x768 will be much better. Remember, this is a rear projection screen, and a CRT type at that, and while I'll be the first to admit I don't know much about them, they don't seem to work the same way native HD displays do. It doesn't appear to have a real native resolution, and everything I've found about its specifications lists nothing for one. It just highlights that it'll do 720p and 1080i and upconvert all sources to 1080i (and yes, this is often in its description or features section, not the specifications, which makes it seem like the usual marketing stuff). I tried 720p via component from my PC, and it looked only barely better than S-Video. That same component connection from HD broadcasting, however, looks pretty good (though probably not as good as a true native HD display would show 720p/1080i).
Crysis doesn't count as a game, Shadow; it's only used for e-penis stroking contests.
Personally, I love the game. Just because it's hard on performance and people use it as a benchmark, or because it may not be the best optimized game, doesn't mean it isn't a game. I don't use it for benchmarks at all. I use it to play, and personally, I've loved both of the games. I'm awaiting the next two.

Edit: Televisions aren't my specialty, but besides the older, larger CRTs that could do 2048x1536, there are only two displays that come to mind that go over 1920x1200: the Sony GDM-FW900 (a 2304x1440 widescreen CRT) and those large 2560x1600 LCDs.
 

1080p seems to signify that it does 1920x1080 as its native/maximum resolution. I'm honestly having a hard time buying that a 42" LCD television does 2880x2660. Not only is that too high for that size, but there's no such standard resolution or aspect ratio. 1080p means it has a 16:9 ratio, and 2880x2660 is nowhere near widescreen; it works out to roughly 1.08:1, an odd, almost square shape (more square even than 4:3 or 5:4), and one Google has no results for. What's the brand and model of the television?
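For anyone who wants to sanity check the numbers, here's a quick back-of-the-envelope snippet (plain Python arithmetic, not pulled from any TV's spec sheet) that reduces a resolution to its simplest ratio:

from math import gcd

def aspect(w, h):
    d = gcd(w, h)
    return f"{w // d}:{h // d} (about {w / h:.2f}:1)"

print(aspect(1920, 1080))  # 16:9 (about 1.78:1)
print(aspect(1024, 768))   # 4:3 (about 1.33:1)
print(aspect(2880, 2660))  # 144:133 (about 1.08:1) - nearly square, nowhere near widescreen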

By the way, the polygon count doesn't go up with a resolution increase. The on-screen size of each polygon (in pixels) does, but not the count. It's the pixel count that goes up.
 

A 1080p display cannot physically display anything more than 1920x1080. Even if the source is higher, you're not getting anything more than 1920x1080, and if the source isn't 16:9, you'll get even less, since the image is scaled down to fit and black bars cover the space a non-16:9 source doesn't fill.

I'm also finding it hard to believe that a Radeon HD 3650 is able to output 2880x2660. That card can't go higher than 2560x1600.

Speaking of 2880x2660, a ratio like that is just downright odd. It's almost square. That format downsized onto a 16:9 screen would leave huge black bars on both the left and right (roughly 40% of the screen area would go unused), and I doubt anyone would use it like that.
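To put rough numbers on that (a minimal sketch; this is just the usual fit-inside-the-panel scaling math, not anything measured from a particular TV or scaler):

# Fit a 2880x2660 image onto a 1920x1080 panel while keeping its shape.
src_w, src_h = 2880, 2660
panel_w, panel_h = 1920, 1080

scale = min(panel_w / src_w, panel_h / src_h)  # limited by the panel's height here
shown_w = round(src_w * scale)                 # ~1169 of the 1920 horizontal pixels get used
bar = (panel_w - shown_w) // 2                 # ~375-pixel black bar on each side
unused = 1 - shown_w / panel_w                 # ~39% of the panel's width sits empty

print(shown_w, bar, f"{unused:.0%}")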
 

I'm not talking about Microsoft pushing DirectX 10 or EA's use of protection software. All I'm saying is that there were those of us who genuinely liked Crysis as a game and as more than a show-off benchmark. The story, visuals, sound, and gameplay were all at least average or above for me, so I enjoyed it. That's all I'm saying.
 

Well, for me, I found the games a major disappointment. Same with FEAR 2. All the hype was for nothing IMO, since none of those games bring anything really innovative.

Sure, in Crysis, using the depth component of a framebuffer image to calculate ambient occlusion entirely on the GPU sounds neat, but the concept of ambient occlusion itself is quite old. So nothing really innovative in the technical sense either.
Well, that all may be true to you, and I can definitely see how one could be disappointed with the Crysis games, but I had a different opinion of them.

First of all, I'm not one of those people who will call a game out if it isn't original or innovative, because new does not automatically equal good, and been-done does not automatically equal bad. Yes, it's extra points if something is unique, but it's not a factor in a game's greatness itself. Really, being cliché isn't as bad as everyone says.

That being said, while the storyline wasn't the best you'll find, it was acceptable. I'm mixed on the aliens and the second half of the game. I'm not someone who thinks it shouldn't have had aliens (they're a key part of its story), but fighting them wasn't as fun as the first half against the KPA. There should have been more KPA in the second half.

The visuals were harder on performance than I felt they should be, but they still looked damn good, and are still among the best.

The sound was pretty underrated (at least the music). I loved the music in this game. The sound effects were your standard fare.

As for the often criticized gameplay, it was there. The thing with Crysis is that the gameplay depended on you, the player, not on the game itself, so a lot of people rushed through (and I really mean rushed through) and thought the game was bad. If you play it that way, yeah, it won't be fun. I found it more fun to turn the difficulty up (hard at the least) and take your time. There's more than one way to do almost any task, so it has replayability. Examine a situation and then decide how to carry it out. The game gives you the ingredients, and you decide how to play. The nanosuit and sandbox-style levels opened up a lot of possibilities.

I really liked this game, but then again, I didn't go in expecting the hype. In fact, before I played it, I had more of a negative view, both from hearing all of the hype and from knowing its performance toll, but it turns out I ended up really liking the games. Were they the best? No. Were they good? Hell yes.
 

Yes, that's because of the dot pitch/pixel size, but Spyhop is right that resolution and screen size are not officially related in any way.

No, stretching does not happen, LCD or CRT, period, unless the source resolution and the resolution the television wants to display at are different.
 

I know you like a big display, but at the cost of using 640x480!? Most websites don't even display properly that low. A lot of sites may work at 800x600, but you really need 1024x768 for the internet to fully look right. 1024x768 and 1280x1024 are the standards right now, so 1024x768 is what sites are designed to fit in.

You already have a large television display that does fine for you, so here's what I'd do. Keep the television as a secondary display dedicated to emulation and such. Use the new display, which would be a dedicated monitor, as an actual monitor. This way, you get the best of both, so to speak.

This has been dragged out long enough. Just give a price range, and others will recommend the best 1680x1050 or 1920x1200 monitors that fall within it.

Edit:
Ever seen this crap, Gameman? It's a 17" CRT.


Your 42" TV would be able to display almost the same amount of information as that piece of s**t.
You want to pay a bunch of cash for something that displays the same resolution as a craptastic monitor? Go ahead, I'm sure people won't laugh at you.
Actually, I have a 17" CRT in my closet (a Gateway VX700), and it's a pretty decent monitor. It's from 1998, but it was high(er) end at the time and goes up to 1600x1200. That's infinitely more than his 640x480 interlaced, blurry display is giving him.
 

No kidding about the weight of these things. The one I'm using now and the older 17" one are both aperture grille types, and they weigh a whole lot more than the more common (I want to say "standard") shadow mask types. The 17" one had to weigh at least twice as much as an older 19" Dell (shadow mask, so it was lighter) I had at one point.

The size isn't so much a factor for me, since I don't need the room on my desk for much besides the monitor, keyboard, mouse, and speakers, but I did have to remove the top shelf of my desk when I went from the older one to this one (the shelf's weight limit was obviously lower than what the main desk surface can take). In fact, if weight weren't a factor, I'd put a second one up next to it.
 

Every time you've posted a screenshot of your desktop, it's been a 640x480 image, so either the desktop is being scaled down and you're really getting 800x600 shown at 640x480 (which would be the reason it's so blurry; drop it back to 640x480 and it'll likely get clearer), or you've changed it higher since you posted your last desktop image.
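If you want to double-check what size a screenshot actually got saved at, something like this does it (assuming you have Python with the Pillow library handy; the filename here is just a stand-in):

from PIL import Image

# Prints the saved screenshot's real pixel dimensions, e.g. (640, 480).
print(Image.open("desktop_screenshot.png").size)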

Either way, 800x600 is still very small. It really only still exists in Windows as a legacy option for older monitors that can't go much, if any, higher (though I think it should have been dropped with Windows Vista, just like Windows XP dropped 640x480).
 

The standard isn't 1680x1050 (unless you meant for your monitor size). The standards are 1024x768 and 1280x1024. I'd guess it'll move to 1680x1050 next, which would begin the shift towards a widescreen standard.
 

Correct.

And to expand on this topic, 1024x768 is a 4:3 resolution, but only if square pixels are used. Widescreen displays that have a 1024x768 native resolution use rectangular pixels to make them widescreen.
Ah, I've heard of those, but forgot about them. They're not really common, especially with plasma displays being phased out in favor of LCDs, so I assumed he was listing the more standard ones. I still think it's odd that you didn't mention 1280x720, though.

By the way, technically, 1024x768 is a 4:3 resolution no matter what. It's all in the math. Making the pixels rectangular may make the panel natively 1024x768 in a wide shape, but it's still 1024 pixels wide by 768 pixels tall, which is a 4:3 relationship. The fact that the pixels are rectangular doesn't change that. It just makes the image appear widescreen, but it's still 4:3 stretched; the "stretching" comes from the rectangular pixels making the picture wider than it "should be", not necessarily from the television scaling the image from one resolution to another.
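The back-and-forth here really comes down to the resolution's ratio versus the screen's shape, which differ by the pixel aspect ratio. A tiny illustration (generic math, not tied to any particular panel):

# Screen shape = (pixel grid ratio) x (pixel aspect ratio).
def display_ar(width, height, par=1.0):
    return (width / height) * par

print(display_ar(1024, 768))           # ~1.33 -> 4:3 with square pixels
print(display_ar(1024, 768, par=4/3))  # ~1.78 -> 16:9 "widescreen" from the very same 1024x768 grid
print(display_ar(1024, 768, par=2.0))  # ~2.67 -> roughly 8:3; pixels twice as wide as they are tall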
 

A 1024x768 display will only have a 4:3 aspect ratio if the individual pixels are 1:1 (square). When you say that all 1024x768 displays have an AR of 4:3, that's only true if square pixels are used.

Your math *does* work, but you're not taking into account the pixel aspect ratio (PAR), which affects a display's overall aspect ratio. Pics:

View attachment 202148

View attachment 202149

As you can see, the pixel aspect ratio affects the overall AR of a display: two displays can have the same resolution but end up with different overall aspect ratios if their PARs differ.
I know all of this. What I was saying, though, is that the resolution itself is still truly a 4:3 ratio no matter what, even if the physical dimensions of the screen are different.

You're talking about the aspect ratio of the physical screen dimensions.

I'm talking about the true aspect ratio of the resolution.

I was calling it out as "fake widescreen", especially if it's as extreme as the example you posted. A 1024x768 screen shaped as 8:3? That would take pixels twice as wide as they are tall ((8/3) divided by (4/3) is 2), and it would look incredibly stretched/widened/fattened/whatever.
 

This is ordered from best to worst.

HDMI/DVI
VGA
Component
S-Video
Composite/RCA

There's a pretty big gap between the top three and bottom two.
 

I used the same connection method when I tried connecting my PC to my television: the S-Video to component adapter. If you look at the S-Video out on the video card and at the adapter, though, it has more pins than regular S-Video, so some of them are obviously there for when component is used. I'm not sure if any quality is lost that way, or if it was just the television, but it looked blurry for me too when I connected mine the same way (yet an actual component connection from my television box looks fine for HD viewing).

My television has a DVI input, but it's a DVI-D connection (which is odd, since it's a rear projection CRT, which I thought was analog), and I don't have any DVI-D cables or adapters (just DVI-I to VGA and DVI-A to VGA adapters). I'm contemplating picking up a DVI-D to DVI-D cable to see whether it's the connection or just the television, but I don't use my PC on my television much anyway.
 

Component and composite/RCA were mixed up. :p
No, I had it right. Component (I'm referring to YPbPr, obviously) is definitely better than S-Video, and composite (the yellow plug, usually found with red and white audio plugs) is definitely the worst. I probably made things confusing by adding RCA to the end of composite, since RCA is a type of connector (which both component and composite use), not a connection type itself.

I'm not sure what resolution you are or were using, Gameman. I just stated that I thought it was always 640x480, since your screenshots were always that size and showed no signs of having been resized.
 