1080p vs. 720p vs. 576p vs. 480p vs. Etc.



One of the questions I have always asked myself (related to consoles) is: how on EARTH do people know which of the above resolutions a game is running at? I know that when my Xbox 360 and PS3 start up, my TV reports the resolution of the dashboards. But what about when you are in a game?

Say, for example, people claim a game running on the PS3 is rendering at a sub-HD resolution (600p, 570p, whatever), while the Xbox 360 version is running at 720p. Or that the PS3 version runs at 1080p vs. 720p on the 360. Or that a game runs at 720p at some points but drops below HD at others.

How on earth are you seeing all this happen? My TV doesn't show the signal changes. Not to mention, maybe I am thinking about this wrong, but wouldn't it look dramatically different? I think about my desktop resolution: when I change my desktop to a higher resolution, my icons get smaller, I get more real estate, etc. But when I see websites doing side-by-side game comparisons, all of the images are the same size when they have the screenshots cut in half with each respective version on each side. Say it's the back of a car:

[attached image: gg_gta4platforms.jpg]

And even then, how does one just look at the above photo and say "that is not 1080p"?

Is there some magic I am missing on how to see all this? I have been playing my PS3 and Xbox 360 since they came out, and in all these years of playing I have had NO idea what resolution the games were running at.

Just figured this would be a good place to ask. :D


When the TV is being 'fed' a lower-than-HD resolution, you just notice. And my TV blinks whenever the resolution over HDMI changes. All of them report the resolution if you hit Info or whatever.

Now, when the console is outputting an HD resolution (720p, 1080i or 1080p) and the game is being upscaled from a lower resolution by the console, it's very difficult to know the actual resolution the game is being rendered at.

But when it comes to, let's say, video, on my full HD monitor I can definitely notice whether something is 1080i/p, 720p, or even less. On a TV it's a bit harder to notice the difference between 1080i and 720p (and almost impossible if the TV is not 1080p). And if the source is crisp enough, you may not be able to tell 720p from 576p (provided the colors are not washed out in 576p, which sometimes happens).

Thanks for your attention :p



Not sure that answers my question. That doesn't explain "you just notice". What did you notice? Is it putting more things on the screen? Would buildings on the left and right be cut off more in 720p vs. 1080p?

Hitting Info on your remote to see the game resolution? Is that what real gamers do? Play their games and hit "Info" throughout the game to see what the resolution is during that particular scene?

Seriously, I am just WAY confused.

My projector shows the resolution every time it changes. It will, however, not show it if the console upscales the game to 1080p.

That is fine and dandy, BUT, how do YOU know it's 1080p vs. 720p if the projector didn't tell you?


My eyes have been artificially adjusted to count the exact number of pixels that appear on my TV screen. It usually takes me about 5 seconds to count.


That is fine and dandy, BUT, how do YOU know it's 1080p vs. 720p if the projector didn't tell you?

My projector supports both 720p and 1080p and I have my consoles set up to switch to the native resolution of the game when available. That is how I know what the console outputs.

For exact details of the resolution of a game there is always this thread: http://forum.beyond3...ead.php?t=46241



You notice the amount of detail. On a 1080p screen, any lower resolution is scaled up. Don't think of comparing two pictures based on their size: a 1080p picture is supposed to cover the same 'physical' area with more pixels. On the same monitor and without scaling, the lower-resolution picture would just look 'smaller', but fullscreen, the lower-than-native resolution gets upscaled to fill the panel.

So you notice something like this:

http://plastik.hu/media/irobot-4fele.jpg
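To make that concrete, here is a minimal Pillow sketch of the same idea (the filename is just a placeholder; any 1920x1080 screenshot will do). It fakes what a console scaler does by shrinking a native 1080p image down to 1280x720 and stretching it back up, and the difference image shows where the softness creeps in.

```python
from PIL import Image, ImageChops

# Hypothetical filename - substitute any native 1920x1080 screenshot.
native = Image.open("screenshot_1080p.png").convert("RGB")

# Simulate a game rendering at 720p and the console scaler stretching the
# frame back out to fill a 1080p display.
sub_hd   = native.resize((1280, 720),  Image.BILINEAR)
upscaled = sub_hd.resize((1920, 1080), Image.BILINEAR)

# The difference image highlights where detail was lost - mostly fine texture
# and high-contrast edges, which is exactly where your eye notices the blur.
diff = ImageChops.difference(native, upscaled)
print("max per-channel error:", diff.getextrema())
upscaled.save("simulated_720p_upscale.png")
```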



Whoa... nice link. So that is a picture explanation then. I can see the difference there, no question. They all look the same zoomed out, but at full size each image is progressively more detailed.

So now I just need to train my eye to determine which is which, so I can definitively say "oh, this game is 720p, but here it's running at 576p, because the detail in the character is much worse here than it is there".

Really? A one star vote for this thread? I was raised to believe there are no stupid questions.


Tip: the back of almost every game box actually lists what resolution the game runs at. For example, when I look at the box of Uncharted 2, it clearly says it runs at 720p. Other games are not so clear, as they list 720p / 1080i / 1080p, and usually that means 720p is the highest it will actually run at and it upscales to the other resolutions.

Thing is, even when a game claims 720p, there are notorious instances where games were actually not even 720p but 640p and just being upscaled to 720p. Distinguishing that difference on an HDTV is not all that easy, but one could perhaps tell.

In regards to how one can just tell, that comes from having an extensive background in PC gaming, at least for myself. As a PC gamer, one tends to get obsessed with things such as resolution, and truth be told, what are considered "High Def" resolutions such as 720p are actually weak resolutions when it comes to PC gaming. I do not think I have ever gamed on my PC at 1280 x 720, to be honest. Even when I had a CRT, I am pretty sure the native resolution I set it to was 1600 x 1200. But because of PC gaming my eye has become trained; that is the simplest explanation. On my PC's monitor, I can without a doubt tell if a game is running at native resolution or if it is being upscaled. I can easily spot the difference between 1680 x 1050 and 1920 x 1200. Easily. How? I just can.

After years and years of playing with graphical settings, it just comes naturally. It gets much tougher to distinguish resolution on HDTVs because they are so large, and truth is, close up nothing really looks that good. But when a game is native 1080p, which most games are not, but when they are, I can indeed tell. How? Again, not sure, I just can.


1080p and 720p are simply different resolutions. It's just as if you were blowing up a 1024x768 wallpaper to fill a 21-inch monitor... it's going to look fuzzy and pixelated. Anything below 720p is going to look fuzzy on your TV (IF it's an HDTV! Perhaps it's your TV that's not letting you notice the difference).


1080p and 720p are simply differences in resolutions. It's just as if you were blowing up a 1024x768 wallpaper to put on your 21-inch monitor... it's going to look fuzzy and pixelated. Anything below 720p is going to look fuzzy on your TV (IF it's an HD TV! Perhaps it's your TV that's not letting you notice the difference).

720p = DVD quality

1080p = Blu-ray/HD Quality

So yes 720p should look fine... but 1080p is just going to be more crisp and smooth because it's not having to be upscaled.

Wrong.

DVD is 720x480.

720p is 1280x720.
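For what it's worth, here is the raw pixel arithmetic behind that correction, using the standard dimensions of each format mentioned in this thread:

```python
# Raw pixel counts for the formats being discussed - the gap between DVD
# (480) and 720p is what makes them very different beasts.
formats = {
    "DVD / 480p":  (720, 480),
    "PAL / 576p":  (720, 576),
    "720p":        (1280, 720),
    "1080i/1080p": (1920, 1080),
}
for name, (w, h) in formats.items():
    print(f"{name:>12}: {w}x{h} = {w*h:>9,} pixels")
```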


Thanks guys. I am familiar with the dimensions of the different resolutions, etc. But I was asking how everyone in the gaming forums on the interwebs can see these differences on their TV and can pinpoint the exact resolution, even without the TV flashing the signal change on the screen for them.

DirtyLarry seems to have stated exactly what I thought: those who have years and years of experience with different resolutions can spot them. Experience, training, and familiarity. That doesn't explain how every gamer on these forums seems to know what the resolution is by their naked eye. I was reading recently about Alan Wake and whether it runs at 720p or 576p. I imagine "most" people probably wouldn't see the difference when the res changes.

I have the same issue with people who seem to know what frame rate a game is running at, down to the exact fps. I see people say... oh yeah, I was playing game A, B, or C, and when this explosion happened, the game dropped from 60 fps to 28 fps. REALLY? You actually sat there and counted how many frames went by, in less than a second? Now I am not talking about some high-end PC rig where some crazy cat might have a UI element that shows the FPS on screen while they play (or the game res the whole time they play). I am just talking about natural play.

Everyone seems to have these superhuman eyes that can tell you exactly what resolution or fps a game is at while they play it, with no help from any tech information displayed on the screen.

I wish I could see all this stuff :(


I wish I could see all this stuff :(

Do you believe everything you read online? :whistle:

While I can notice when the frame rate drops (stuttering, slow action), I can't tell whether a game is running at 720p or 1080p. I've been playing games for 25 years, consoles and PC. If it looks good and runs smoothly, then who cares? That's my opinion; gameplay > graphics any day!


You'd need really good eyes and a knack for detail (as well as a reference image) to see if a game is running in 720p or 1080p without actually knowing it beforehand.


As far as I can tell, there's no way to tell.

Some TVs, like mine for example, have an "unscaled" mode so they don't perform any upscaling or downscaling. Using it along with "PC mode" (some TVs have this under a different name) truly makes a difference, even at the same resolution. With my Xbox 360, no upscaling is performed by the TV set, only by the console if the game is not 1080i/1080p, but I can't tell just by running and playing it.

For console games, there's no way to know the native resolution of a game if you don't know it beforehand.


I don't know why you would want to be able to tell when a game is at a sub-HD resolution and is just being upscaled, and I would personally call BS on most people being able to see it with their naked eyes, which is why there is that topic on some forum (I can't remember which now) where people have to take screenshots, zoom in, and count the pixels to see.

Plus I can only see it ruining your gaming experience in the end if you noticed every one of these things. :p

And ajua above me is wrong; you can work out the native resolution of a game on a console whether it is being upscaled or not.

If you want some info on how it was done back with Halo 3 I would look here. http://www.gamerawr.com/2007/09/28/halo-3-only-runs-at-640p/

As for changes in FPS, that's pretty noticeable to a certain degree, especially when a game drops just below 30 and starts to jitter slightly. If you want a good game to see it in, I would try BioShock 2 on either console; it has an option to unlock the frame rate so the game can push more when not much is going on, but then it can get bogged down slightly in a battle.

I would turn it off and run around a deserted area, and then turn it on; I immediately notice a difference in smoothness myself, and with it on, playing through the game, you can see it change while things are going on.
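Nobody is literally counting frames by eye, of course; what an FPS counter (or the engine itself) does is time each frame and invert the frame time. A rough Python sketch of the idea (report_fps and the sleep-based fake frame are purely illustrative):

```python
import time

# Conceptual frame-rate counter: time each frame and convert the frame time
# to frames per second. A drop from 60 fps to ~28 fps is really a jump in
# frame time from ~16.7 ms to ~35 ms, which is what you "feel" as stutter.
def report_fps(render_frame, frames=5):
    last = time.perf_counter()
    for _ in range(frames):
        render_frame()                      # stand-in for the game's per-frame work
        now = time.perf_counter()
        frame_time = now - last
        last = now
        print(f"{frame_time * 1000:6.1f} ms  ->  {1.0 / frame_time:5.1f} fps")

# Fake a frame that takes ~33 ms (roughly 30 fps) just to see the output.
report_fps(lambda: time.sleep(0.033))
```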


Any game that is processed by the console at less than 720p (e.g. MW2 is 600p, I believe) still has the console OUTPUT 720p... so the pixels are scaled by the system pre-output. It means the console has fewer pixels to process in real time and just has to scale up, which is far less intensive than actually drawing 720p worth of action on the machine.

Regarding the ability to tell... well... if you can't tell, you shouldn't care :) If you could tell, you wouldn't be asking about it at all. For your information, 99% of games on Xbox and PS3 are processed at exactly the same resolutions.
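The saving is easy to quantify. Taking the 600p figure above at face value and assuming a 1024x600 framebuffer purely for illustration, versus a true 1280x720 target:

```python
# Pixels shaded per frame at the assumed sub-HD framebuffer vs. a full 720p one.
sub_hd = 1024 * 600     # 614,400 pixels per frame
hd_720 = 1280 * 720     # 921,600 pixels per frame

saving = 1 - sub_hd / hd_720
print(f"{sub_hd:,} vs {hd_720:,} pixels -> about {saving:.0%} less shading work")
# Roughly a third fewer pixels to light, shade and post-process every frame,
# which is why developers accept the slightly softer upscaled image.
```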


I am honestly very surprised, especially coming from people who have been PC gamers for even just a few years, that people are saying they honestly cannot tell the difference between 1280 x 720 and 1920 x 1080. That is 720p vs. 1080p. It is a pretty big difference overall.

Yes, I will admit, and as I pointed out in my first post in this thread, it is much harder to tell the difference on an HDTV. Because of the sheer size of TV sets, everything looks pretty poor when your viewing distance to a TV is the same as to a monitor, and when you sit far back as you are supposed to, it can be a bit tricky to distinguish the two resolutions, but I still absolutely can.

However, there is no doubt in my mind that if you are looking at an LCD monitor attached to a computer from a normal computer-chair sitting distance, and the image is full screen and not windowed, most people here would be able to tell the difference instantly between 1280 x 720 and 1920 x 1080. It is a pretty big difference and not hard to distinguish at all, actually.

I don't know, perhaps it is because I am a graphic designer with a focus on multimedia and the Web, so I have always worked with, and have had to be very aware of, resolution on everything I have ever designed. Basically, whatever I design on screen is how others will also see it, so resolution has always been an incredibly important thing for me and my design. So while I said in my first post that I am not sure how I know, I just know, I realize I was wrong saying that: I know because I have worked with resolution throughout my career. I guess I just viewed it as something that I did not really learn, but apparently that is not the case, judging from all of the responses I am reading.


The difference is BIG on a monitor.

Not so much on a TV.

When I purchased my new full HD monitor, I played a DVD on it and man, it looked so ugly. Even 720p is not that sharp on it. The difference is actually pretty big in games if you stop moving for a second to look at details like pipes and floors and walls. You can actually point out the blurry lines.

TVs have huge pixels compared to monitors, so naturally it's a little blurry when you sit 2 to 3 feet away from one.
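That "huge pixels" point is easy to put numbers on. A quick sketch of the pixel-density arithmetic, using a 24" monitor and 37"/50" TVs purely as example sizes:

```python
import math

# Pixel density (pixels per inch) for the same 1920x1080 grid on different
# panel sizes - the TV's pixels are simply much bigger.
def ppi(diagonal_inches, width_px=1920, height_px=1080):
    return math.hypot(width_px, height_px) / diagonal_inches

print(f'24" 1080p monitor: {ppi(24):.0f} ppi')   # ~92 ppi
print(f'37" 1080p TV:      {ppi(37):.0f} ppi')   # ~60 ppi
print(f'50" 1080p TV:      {ppi(50):.0f} ppi')   # ~44 ppi
```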


Every Xbox 360 and PS3 game is rendered internally at a fixed resolution regardless of what the console is outputting. It's either downconverted for SD (480i, 480p) or upconverted to HD (720p, 1080i, 1080p). As far as I know, most games are rendered internally at a little less than 720p. Your TV would not display any changes in resolution, since the scaling is done within the console. Hopefully the next generation of consoles will render everything natively at 1080p. Most TVs should be 1080p by that point anyway, and 1080p will become the standard resolution, so we can forget about 480i, 480p, 720p, and 1080i.

As far as scaling is concerned, it's quite easy to tell when something is being scaled, at least to me. Play a game like Call of Duty on your PC and on your console next to each other and you'll instantly be able to see the scaling. There's a significant amount of scaling involved in getting that image up to 1080p. I think most console games look just fine, but scaling is very noticeable if you're looking for it or understand what scaling even is. Like I said, I hope the next generation of consoles renders games at 1080p; that alone should give the next generation of games a significant increase in noticeable image quality.


Yes, I will admit, and as I pointed out in my first post in this thread, it is much harder to tell the difference on an HDTV. Because of the sheer size of TV sets, everything looks pretty poor when your viewing distance to a TV is the same as to a monitor, and when you sit far back as you are supposed to, it can be a bit tricky to distinguish the two resolutions, but I still absolutely can.

I think that's the issue: most people who game on their PC sit about arm's length from the monitor, and yeah, you can really notice the difference there, but many modern monitors are designed for resolutions even higher than 1080p.

With HDTVs it's different: they are built and designed up to a max of 1080p, plus their size dictates the viewing distance. Sitting at arm's length from a 37" HDTV, everything looks disgusting! :p But sitting a few feet away, things look good. I can definitely tell the difference between SDTV and HDTV broadcasts and DVD vs. Blu-ray, but with games I don't notice whether it's 720p or 1080p unless I sit really close.

The real problem is, people will nitpick and analyse the pixels up close, but that's not what you do in reality. You sit back and enjoy the experience. As long as the game doesn't look fugly when sitting at a "normal" distance and doesn't drop in FPS while playing, then it doesn't matter if it's sub-HD, 720p or 1080p.
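That intuition can be backed with a bit of viewing-distance arithmetic. Assuming roughly one arcminute of visual acuity (the usual 20/20 rule of thumb) and using a 37" 16:9 set purely as an example, you can estimate the distance beyond which individual pixels blur together:

```python
import math

ARCMINUTE = math.radians(1 / 60)          # ~0.000291 rad, assumed acuity limit

def max_useful_distance_inches(diagonal_in, vertical_px, aspect=(16, 9)):
    # Height of the panel, then of one pixel, then the distance at which that
    # pixel subtends exactly one arcminute.
    w, h = aspect
    screen_height = diagonal_in * h / math.hypot(w, h)
    pixel_height = screen_height / vertical_px
    return pixel_height / math.tan(ARCMINUTE)

for res in (720, 1080):
    d = max_useful_distance_inches(37, res)
    print(f'37" {res}p: pixels blend together beyond ~{d / 12:.1f} ft')
# Prints roughly 7.2 ft for 720p and 4.8 ft for 1080p - so from the couch,
# the two resolutions can genuinely look alike.
```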


I can usually tell by looking at the edges of 3D geometry, for instance the edges of a character's arms or legs. At lower resolutions, those edges appear more jagged; at native resolution, diagonal edges like that appear much smoother.

Looking really closely, I can tell that Final Fantasy XIII on the PS3 is still slightly jagged, but less jagged than regular 720p models usually look on the TV. That is how I know that FF13 upscales.

On my PC monitor, the difference between 1280x720 and 1920x1080 is massive and easy to pick out. This is because PC monitors are 'pixel perfect': at the LCD's native resolution, every displayed pixel matches a physical pixel of the panel, so the picture is as sharp as can be, every square is crisp and there's no blur. This is especially noticeable on text, mouse cursors, and other items whose pixel makeup you are used to knowing. At any other resolution, the monitor stretches the picture to fit the screen, which means some display pixels span multiple physical pixels and get blurred by a filter to prevent the distorted look you would otherwise get. As such, the pixels will be noticeably off-kilter and no longer perfectly sharp.

I can even tell when an LCD screen is hooked up via VGA instead of DVI and has not been auto-adjusted, even at the native resolution, because on some columns or rows of pixels there will be an unusual blurring, once again easily noticed on mouse cursors or other commonly viewed objects.
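The 'pixel perfect' point is really just arithmetic: at anything other than native resolution, the scale factor is usually fractional, so source pixels straddle physical pixel boundaries and have to be smeared by the scaler. A tiny sketch:

```python
# Why a non-native resolution can never be pixel-perfect: 1280 source columns
# have to land on 1920 physical columns, so each source pixel covers 1.5
# physical pixels and most of them straddle a boundary, which the scaler hides
# with interpolation (the blur described above).
native_width, source_width = 1920, 1280
scale = native_width / source_width
print(f"scale factor: {scale}")                    # 1.5 - not a whole number

for src_x in range(4):
    start = src_x * scale
    end = (src_x + 1) * scale
    print(f"source column {src_x} -> physical columns {start:.1f} to {end:.1f}")
# source column 0 -> physical columns 0.0 to 1.5
# source column 1 -> physical columns 1.5 to 3.0  (splits a physical pixel)
```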


This topic is now closed to further replies.