1080p vs. 720p vs. 576p vs. 480p vs. Etc.



On a 1080p TV, it's easy to tell whether a game is running at 1080p or not. A game running at the native resolution will look pixel-perfect; any other resolution will not. There will be noticeable blur and approximation if you look closely. Usually this means the game is running at 720p, but it could be even lower than that.

If your computer monitor has a native resolution higher than 1280x720, try comparing its native resolution with 1280x720, and you'll have a good idea of the visual difference. You could also find a 1080p video on YouTube and check out the difference between the lower-resolution versions, but that's not a very fair comparison, since compression makes most of the difference there.
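To put a rough number on why anything other than native looks soft, here's a quick sketch in plain Python (the 1920x1080 panel is an assumption, not anything from this thread) of how many screen pixels each source pixel has to cover at the common resolutions. Anything fractional forces the scaler to interpolate, which is the blur you see up close.

```python
# Rough sketch: how a given framebuffer maps onto an assumed 1920x1080 panel.
# Fractional scale factors force interpolation, which shows up as blur.
NATIVE_W, NATIVE_H = 1920, 1080

for name, (w, h) in {
    "1080p": (1920, 1080),
    "720p": (1280, 720),
    "576p": (720, 576),
    "480p": (720, 480),
}.items():
    sx, sy = NATIVE_W / w, NATIVE_H / h
    clean = sx.is_integer() and sy.is_integer()
    print(f"{name:6s} -> each source pixel covers {sx:.2f} x {sy:.2f} screen pixels"
          + (" (clean integer scale)" if clean else " (fractional: interpolation/blur)"))
```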


I have the same issue with people who seem to know what frame rate a game is running at, especially down to the exact fps. I see people say... oh yeah, I was playing game A, B, or C, and when this explosion happened, the game dropped from 60 fps to 28 fps. REALLY? You actually sat there and counted how many frames went by, in less than a second? Now I am not talking about some high-end PC rig where some crazy cat might have a UI element that shows the FPS on their screen while they play (or the game res the whole time they play). I am just talking about natural play.

Everyone seems to have these superhuman eyes that can tell exactly what res or fps a game is running at while they play it, with no help from any tech information displayed on the screen.

I wish I could see all this stuff :(

I can easily tell the difference between a game running at 24 or 30 frames per second and a game running at 60 frames per second.


I can easily tell the difference between a game running at 24 or 30 frames per second and a game running at 60 frames per second.

So if a game is running at 60 fps, but drops to 24 fps, how did you come up with that 24fps figure?


So if a game is running at 60 fps, but drops to 24 fps, how did you come up with that 24fps figure?

It's not that you can exactly count that it's running at 24 fps. But there is a great difference between what 60 fps looks like and what 30 fps looks like. It is especially jarring if a game runs at a very fluid, steady 60 fps but then drops to 30 fps for a scene, or for part of a scene.

Here's a cool thing I found: http://www.boallen.com/fps-compare.html
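The difference is easier to reason about as frame times than frame rates; here's a trivial back-of-the-envelope sketch (plain Python, nothing taken from the linked page) of why a drop from 60 to 30 reads as jarring: the time each frame sits on screen doubles.

```python
# Frame interval at common frame rates: 60 -> 30 fps doubles how long each
# frame is held on screen, which is what the eye picks up as a sudden stutter.
for fps in (60, 30, 24, 20):
    print(f"{fps:2d} fps -> {1000 / fps:5.1f} ms per frame")
# 60 fps -> 16.7 ms, 30 fps -> 33.3 ms, 24 fps -> 41.7 ms, 20 fps -> 50.0 ms
```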


It's not that you can exactly count that it's running at 24 fps. But there is a great difference between what 60 fps looks like and what 30 fps looks like. It is especially jarring if a game runs at a very fluid, steady 60 fps but then drops to 30 fps for a scene, or for part of a scene.

Here's a cool thing I found: http://www.boallen.com/fps-compare.html

Right, I am aware of that. My question was: how do these people know the exact frame rate, such as 24 fps? I see people claiming they know this stuff.


Right, I am aware of that. My question was: how do these people know the exact frame rate, such as 24 fps? I see people claiming they know this stuff.

Honestly, I'm going to call BS on their part. Telling, visually only, whether a game is running at 24 fps vs. 30 fps is pretty much impossible. In a side-by-side comparison, maybe you could tell one is running at a slightly lower framerate, but even then I am not so sure.

Also, if the framerate isn't locked, it is very likely that the fps is constantly changing, so you can't just say "oh, it's running at __ fps" without a real counter displayed.

What's to say the game isn't running at 27 or 20 fps instead of 24? The only reason people spit out these numbers is that these framerates are commonly used by films and video games. 24 fps is used very commonly in movies to achieve the "cinematic" look, while 30 and 60 are other popular framerates.
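To illustrate the "constantly changing" point, here's a minimal sketch of what an fps counter actually measures: per-frame time, inverted. The fake_render function is a made-up stand-in for a game's frame, not anything real.

```python
import random
import time

def fake_render():
    # Made-up stand-in for a frame whose cost varies (e.g. an explosion).
    time.sleep(random.uniform(0.016, 0.045))

# Time each "frame" and report the instantaneous rate. The jitter from frame
# to frame is exactly why a single eyeballed number like "24 fps" is dubious.
prev = time.perf_counter()
for frame in range(10):
    fake_render()
    now = time.perf_counter()
    print(f"frame {frame:2d}: {1.0 / (now - prev):5.1f} fps")
    prev = now
```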


Honestly, I'm going to call BS on their part. Telling, visually only, whether a game is running at 24 fps vs. 30 fps is pretty much impossible. In a side-by-side comparison, maybe you could tell one is running at a slightly lower framerate, but even then I am not so sure.

Also, if the framerate isn't locked, it is very likely that the fps is constantly changing, so you can't just say "oh, it's running at __ fps" without a real counter displayed.

What's to say the game isn't running at 27 or 20 fps instead of 24? The only reason people spit out these numbers is that these framerates are commonly used by films and video games. 24 fps is used very commonly in movies to achieve the "cinematic" look, while 30 and 60 are other popular framerates.

Those are EXACTLY my thoughts as well.


Just because people say it, doesn't mean it's true.

Agreed and understood. But if it WAS true, I wanted to know how!


Agreed and understood. But if it WAS true, I wanted to know how!

because you choose to buy multiplatform games on your 360 :p


I guess some people in this thread have never heard of a program called FRAPS, which among other things displays the exact frames per second a game runs at.

I know at least for myself, when I first start playing a game on the PC, I leave FRAPS running for at least an hour or two until I make sure all the graphics settings are giving me optimal performance.

Not only that, I tend to leave FRAPS running since it allows me to take screenshots.

So it is not some magical ability, it is called running a program that shows you the frames per second. :rolleyes:
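For what it's worth, the useful part of leaving a counter running for an hour or two is the log, not the live number. Here's a hedged sketch of the kind of summary you might pull from a session's frame times afterwards; the file name and the one-millisecond-value-per-line format are assumptions for illustration, not FRAPS' actual output layout.

```python
import statistics

def summarise(path="frametimes.csv"):
    # Assumed format: one frame time in milliseconds per line.
    with open(path) as f:
        frame_ms = [float(line) for line in f if line.strip()]
    fps = [1000.0 / ms for ms in frame_ms]
    print(f"frames:  {len(fps)}")
    print(f"avg fps: {statistics.mean(fps):.1f}")
    print(f"min fps: {min(fps):.1f}")
    print(f"max fps: {max(fps):.1f}")

if __name__ == "__main__":
    summarise()
```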


I guess some people in this thread have never heard of a program called FRAPS, which among other things displays the exact frames per second a game runs at.

I know at least for myself, when I first start playing a game on the PC, I leave FRAPS running for at least an hour or two until I make sure all the graphics settings are giving me optimal performance.

Not only that, I tend to leave FRAPS running since it allows me to take screenshots.

So it is not some magical ability, it is called running a program that shows you the frames per second. :rolleyes:

Right on. But in my OP, I stated this is for consoles only (and NOT using any info that the TV gives).

because you choose to buy multiplatform games on your 360 :p

Lies, lies and more lies! I bought Rainbow Six Vegas 2 for my PS3 ;) One multiplatform title has to count for something, right? :D


Honestly, I'm going to call BS on their part. Telling, visually only, whether a game is running at 24 fps vs. 30 fps is pretty much impossible. In a side-by-side comparison, maybe you could tell one is running at a slightly lower framerate, but even then I am not so sure.

Also, if the framerate isn't locked, it is very likely that the fps is constantly changing, so you can't just say "oh, it's running at __ fps" without a real counter displayed.

What's to say the game isn't running at 27 or 20 fps instead of 24? The only reason people spit out these numbers is that these framerates are commonly used by films and video games. 24 fps is used very commonly in movies to achieve the "cinematic" look, while 30 and 60 are other popular framerates.

I'm not sure if I can tell between one scene and another, but I can definitely tell between 30 fps and 24 fps. Movies are played back at 24 fps (for a "cinematic experience"), and hence when you see those extra takes or whatever, the motion seems much faster. Soap operas and educational shows (Mythbusters) are shown at 30 fps. Flip between HBO and Discovery and watch the difference.


One of the questions I have always asked myself (related to consoles) is: how on EARTH do people know when a game is running at one of the above resolutions? I know that when my Xbox 360 and PS3 start up, my TV shows the resolution of the dashboard. But what about when you are in a game?

Say, as an example, they say a game running on the PS3 is at a sub-HD resolution (say 600p or 570p or whatever), and the Xbox 360 version is running at 720p. Or even if the PS3 was running 1080p vs. 720p on the 360. Or if you see the game run at 720p at some points, but drop below HD at other points.

How on earth are you seeing all this happen? My TV doesn't show the signal changes. Not to mention, maybe I am thinking about this wrong, but wouldn't it look dramatically different? I think about my desktop resolution: when I change my desktop to a higher resolution, my icons get smaller, I get more real estate, etc. But when I see websites doing side-by-side game comparisons, all of the images are the same size when they have the screenshots cut in half, with each respective version on each side. Say it's the back of a car:

[image: gg_gta4platforms.jpg]

And even then, how does one just look at the above photo and say "that is not 1080p"?

Is there some magic I am missing on how to see all this? I have been playing my PS3 and Xbox 360 since they came out, and when playing those games all these years, I have had NO idea what the game's resolution was.

Just figured this would be a good place to ask. :D

That picture you posted isn't about resolution comparison, it's an image quality comparison, to see if anything gets rendered differently between the 360 and PS3.

A telltale sign of resolution is shimmering on screen, which happens when there are details too fine to be shown with the available pixels... the image would suddenly get grainier or blurrier when the resolution changes, like you can't make out faraway things all of a sudden... people who spot these things usually aren't paying attention to where the action is happening; they've probably been through that part of the game before.
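You can reproduce that "too fine for the pixels" graininess outside of a game. Here's a rough sketch with Pillow (the stripe pattern and the sizes are arbitrary choices, not taken from any game): downsample fine detail without filtering and it aliases, which is exactly what shimmers once the scene moves; filter it and you get the softer blur instead.

```python
from PIL import Image, ImageDraw

# Draw 1-pixel vertical stripes, then shrink the image to a non-native size.
# NEAREST keeps no extra information and aliases (grainy, shimmery in motion);
# LANCZOS filters the detail away into a softer blur.
hi = Image.new("L", (1280, 720), 255)
draw = ImageDraw.Draw(hi)
for x in range(0, 1280, 2):
    draw.line([(x, 0), (x, 719)], fill=0)

hi.resize((880, 495), Image.NEAREST).save("aliased.png")
hi.resize((880, 495), Image.LANCZOS).save("filtered.png")
```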


Wrong.

DVD is 720x480.

720p is 1280x720

A little addition regarding DVD resolution: 720x480 is for NTSC and it's 720x576 for PAL territories.


Right, I am aware of that. My question was: how do these people know the exact frame rate, such as 24 fps? I see people claiming they know this stuff.

Most PC games have a console command to show the FPS.

I can tell the difference between 60 fps and 500 fps in WoW :p but I think after about 80 I don't notice any difference at all.


I don't know why you would want to be able to tell when a game is at a sub-HD resolution and is just being upscaled, and I would personally call BS on most people being able to see it with their naked eyes, which is why there's that topic on some forum (which I can't remember now) where people have to grab screenshots, zoom in, and count the pixels to find out.

Plus I can only see it ruining your gaming experience in the end if you noticed every one of these things. :p

And ajua above me is wrong, you can work out the native resolution of a game on a console whether it's being upscaled or not.

If you want some info on how it was done back with Halo 3, I would look here: http://www.gamerawr.com/2007/09/28/halo-3-only-runs-at-640p/
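As I understand it, the idea behind that kind of analysis is simple: an upscaled image can't contain more distinct stair-steps along an aliased edge than the framebuffer it was rendered in, so you count steps over a known span and scale. A back-of-the-envelope sketch; the numbers below are invented for illustration, not actual measurements from the Halo 3 article.

```python
# Pixel-counting sketch: along a near-horizontal aliased edge in a 720p
# capture, each stair-step can only fall on a framebuffer row, so
#   native_height ~ capture_height * steps_counted / capture_rows_spanned
capture_height = 720   # height of the screenshot (assumption)
rows_spanned = 45      # capture rows the measured edge crosses (made up)
steps_counted = 40     # distinct stair-steps along that edge (made up)

native_height = capture_height * steps_counted / rows_spanned
print(f"estimated native height: ~{native_height:.0f}p")  # ~640p
```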

As for FPS changes, those are pretty noticeable to certain degrees, especially when a game drops just below 30 and starts to jitter slightly. If you want a good game to see it in, I would try BioShock 2 on either console; it has an option to unlock the frame rate so the game can push higher when not much is going on, but then it can get slightly bogged down in a battle.

I would turn it off and run around a deserted area, and then turn it on; I immediately notice a difference in smoothness myself, and with it on, playing through the game you can see it change while things are going on.

I personally call BS on most not being able to see it with their naked eyes.

When a game is upscaled -- such as Halo 3 or Grand Theft Auto IV -- it's very noticeable to me. You can see obvious blurring. It's almost like seeing a game played on PC with 2x antialiasing (AA) enabled, and then seeing it with 4x AA or higher... it may look similar, but it's an obvious difference to the naked eye, IMO.

Try this: take a high-resolution image, scale it down in Paint.NET or some other program, then scale it back up from the downscaled version and compare it to the original. That's similar to what's happening here -- you're taking an image smaller than 720p and upscaling it to 720p, so the blurring is obvious, IMO.

Edit: As for the OP... how can you post a low-resolution image and ask someone to tell the difference? That's like lowering the resolution of both a Blu-ray and a DVD by a significant amount and asking to see the difference (not the best example in terms of quality, as the Blu-ray will still likely have better color even downscaled).
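If you'd rather script that experiment than do it by hand, here's a minimal sketch with Pillow; the file names and the 640p intermediate size are assumptions, not anything from this thread.

```python
from PIL import Image

# Squeeze a native-res capture down to a pretend sub-HD framebuffer, stretch
# it back up, and save the two side by side for comparison.
original = Image.open("screenshot_720p.png").convert("RGB")   # assumed 1280x720 capture
sub_hd = original.resize((1138, 640), Image.BILINEAR)         # pretend 640p framebuffer
upscaled = sub_hd.resize(original.size, Image.BILINEAR)       # back up to 720p

side_by_side = Image.new("RGB", (original.width * 2, original.height))
side_by_side.paste(original, (0, 0))
side_by_side.paste(upscaled, (original.width, 0))
side_by_side.save("native_vs_upscaled.png")
```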


You check your monitor/TV's reported resolution while in game. Also, the image should become clearer the closer it gets to your monitor/TV's native resolution.


A little addition regarding DVD resolution: 720x480 is for NTSC and it's 720x576 for PAL territories.

Nicely put, I was just about to say the same myself :)

