What is "HD"?



BTW, if you're doing native 1366x768 over VGA, it is not running at an official "HD" timing/standard.

But there is enough bandwidth over VGA, and you are displaying an HD image if running at this resolution. In fact, you are exceeding the standard 720p resolution of 1280x720.

So atkinsn2000 is right that you aren't outputting to the standards over VGA, but the bandwidth claim is incorrect, as you can output resolutions that match or exceed, say, 1280x720 or 1920x1080.

P.S.

I don't have a "direct" source for what I'm posting; this is general information I have accumulated over the years. The timings I discussed with a TEAC technician, and when you look at what I have posted, it all fits together.
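To put some rough numbers on the bandwidth point, here's a back-of-envelope sketch in Python. The ~25% blanking overhead is my assumption (real VESA timings vary a bit), but it's in the right ballpark:

```python
def pixel_clock_mhz(width, height, refresh_hz=60, blanking_overhead=0.25):
    """Approximate pixel clock (MHz) needed to drive a mode at a given refresh rate."""
    return width * height * refresh_hz * (1 + blanking_overhead) / 1e6

# Compare the modes being argued about in this thread
for w, h in [(640, 480), (1280, 720), (1366, 768), (1920, 1080)]:
    print(f"{w}x{h}@60: ~{pixel_clock_mhz(w, h):.0f} MHz")
```

The analog DACs on graphics cards of this era are commonly rated around 350-400 MHz, well above even the 1920x1080 figure, which is why HD resolutions over the VGA connector work in practice.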

Edited by _kane81

If you run HD content on, say, a 20" or 24" screen, will you notice a difference between 720p and 1080p at all? I've read you can't really tell.


VGA can handle HD resolutions such as 720p and 1080i

BUT

it is not HD, because it can't handle the bandwidth required to display a true HD image.

To be classed as a true HD image, it must be over HDMI or DVI.

Component is a version of analogue HD that can handle the bandwidth because of its separate cables, but it does not use a digital signal, so you do lose some quality.

I'm guessing you own a PS3 too?


No, you won't notice much difference ^^

Think of these resolutions like a photo taken with, say, 100 megapixels, while your display can only show 20 megapixels. You will not be able to see the extra pixels; in fact, the PC or scaler will throw pixels away so that you can see all of the picture.

If, however, you show a 2 megapixel image on a 20 megapixel display, the image will look all blocky, just like zooming in on a small picture... that is, of course, if you zoom to make it full screen. If you leave the source unchanged, it will just look like a small picture on a large screen.


^But the 24" is capable of full 1080 output; it's a 1920x1200 display. Still, people say a 720 video will show no noticeable difference from the 1080. Can anyone elaborate or give insights on this?


The 720 video will be zoomed in / scaled on that screen, but I don't think you will notice the difference on such a small screen.

1080p, to me anyway, is really appropriate for large screens.

Video may look better on a native 1366x768 screen than on a 1920x1200 screen; it depends on the video source and the video scaler built into the TV.

If you really want to know, go down to the store with a DVD and test the screen. Screen quality and processing technology can make a big difference to the output.
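Some rough numbers for the 24" case (a quick sketch; "noticeable" also depends on viewing distance and eyesight, which this ignores):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixel density (pixels per inch) from resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

print(f'24" 1920x1200 panel: ~{ppi(1920, 1200, 24):.0f} PPI')
# A 1280x720 video shown full-width on that panel is upscaled 1.5x in each
# direction (1920/1280), with black bars filling the extra vertical pixels.
print(f"upscale factor: {1920 / 1280:.2f}x")
```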


VGA can handle HD resolutions such as 720p and 1080i

BUT

it is not HD, because it can't handle the bandwidth required to display a true HD image.

To be classed as a true HD image, it must be over HDMI or DVI.

Component is a version of analogue HD that can handle the bandwidth because of its separate cables, but it does not use a digital signal, so you do lose some quality.

Seriously, just no. The simple fact is, with a simple adapter and nothing else, you can turn a DVI cable into a VGA cable. Now, if VGA were unable to carry the bandwidth you are trumpeting, then such a connection wouldn't work. Your information is bogus, at best. Component, VGA, DVI and HDMI are all fully capable of displaying HD content. Component and VGA are even capable of 1080p, easily, just like the other two.
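For what it's worth, the actual numbers back this up. These are the standard CEA-861 pixel clocks for the HD modes, checked against a typical analog RAMDAC rating (~400 MHz is a common spec on graphics cards of this era; treat that figure as approximate):

```python
# Pixel clocks for the standard CEA-861 HD timings, in MHz
CEA_PIXEL_CLOCKS = {"1280x720@60": 74.25, "1920x1080@60": 148.5}
RAMDAC_LIMIT_MHZ = 400  # typical analog output limit on a graphics card

for mode, clock in CEA_PIXEL_CLOCKS.items():
    verdict = "fits" if clock <= RAMDAC_LIMIT_MHZ else "exceeds"
    print(f"{mode}: {clock} MHz -> {verdict} a {RAMDAC_LIMIT_MHZ} MHz DAC")
```

Even 1080p60 needs barely a third of what the analog output can push, which is exactly why a passive DVI-to-VGA adapter works.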


Aren't there monitors and video cards that can display > 1920 horizontal resolutions over VGA?

Wouldn't that make them better than HD?

Exactly why I say he is wrong.


Anyone want to provide some proof about all this HD talk, rather than voicing an opinion and having someone else say something totally different?

If you want proof, do some of your own research; forums are always just people's opinions. If you want to know, go read the specs yourself.

Aren't there monitors and video cards that can display > 1280 horizontal resolutions over VGA?

Wouldn't that make them better than HD?

No, because that is no longer VGA; you are up to WXGA or WUXGA at that point.

VGA resolution is 640x480. That's all you can technically ever get on a VGA connection; if you go up to SVGA then it's 800x600, etc.

Look here, it will tell you all: http://en.wikipedia.org/wiki/Display_resolution

VGA can handle HD resolutions such as 720p and 1080i

BUT

it is not HD, because it can't handle the bandwidth required to display a true HD image.

To be classed as a true HD image, it must be over HDMI or DVI.

Component is a version of analogue HD that can handle the bandwidth because of its separate cables, but it does not use a digital signal, so you do lose some quality.

VGA = 640x480; it does not handle higher resolutions. It would be a different format then, i.e. SVGA, XGA, SXGA, UXGA, etc.

^

So where does it say VGA does not have enough bandwidth?

The specs for VGA pretty much say it loud and clear: 640x480.

BTW, if you're doing native 1366x768 over VGA, it is not running at an official "HD" timing/standard.

But there is enough bandwidth over VGA, and you are displaying an HD image if running at this resolution. In fact, you are exceeding the standard 720p resolution of 1280x720.

So atkinsn2000 is right that you aren't outputting to the standards over VGA, but the bandwidth claim is incorrect, as you can output resolutions that match or exceed, say, 1280x720 or 1920x1080.

P.S.

I don't have a "direct" source for what I'm posting; this is general information I have accumulated over the years. The timings I discussed with a TEAC technician, and when you look at what I have posted, it all fits together.

OK, the 640x480 thing again: VGA is a standard for display resolution, and that resolution is 640x480.
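For reference, the standard names being thrown around in this thread, as a quick lookup table (these are the commonly cited canonical resolutions; WXGA in particular varies between 1280x800 and 1366x768 depending on the source):

```python
# Canonical resolutions for common display-standard names
DISPLAY_STANDARDS = {
    "VGA":   (640, 480),
    "SVGA":  (800, 600),
    "XGA":   (1024, 768),
    "WXGA":  (1366, 768),
    "SXGA":  (1280, 1024),
    "UXGA":  (1600, 1200),
    "WUXGA": (1920, 1200),
}

for name, (w, h) in DISPLAY_STANDARDS.items():
    hd = "meets/exceeds 720p" if w >= 1280 and h >= 720 else "below 720p"
    print(f"{name}: {w}x{h} ({hd})")
```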


whocares78: You're overlooking the part where the term "VGA" is a commonly used and completely accepted term for the analog input on PC monitors. VGA resolution is 640*480, but the connection and the wires themselves can carry much more. No one buys a "WXGA cable". They buy a "VGA cable". I don't know how you completely missed out on the fact that it's so widely accepted.


whocares78: You're overlooking the part where the term "VGA" is a commonly used and completely accepted term for the analog input on PC monitors. VGA resolution is 640*480, but the connection and the wires themselves can carry much more. No one buys a "WXGA cable". They buy a "VGA cable". I don't know how you completely missed out on the fact that it's so widely accepted.

I know what you are saying and I agree, but I don't like it. It's terms like this that annoy the crap out of me. I am talking about terms that mean something specific, but people have begun using "VGA" as a generic term for all connections with a DB15 connector on them. This is why everyone gets so confused when the term comes up.

All I know is when I go and buy a cable, they generally have SVGA or XGA written on them, not VGA. And when I did actually buy a VGA cable, it was crap and the picture was screwed because it could not handle the resolution.


ANY COMPUTER MONITOR OUT THERE IS OUTPUTTING A BETTER PICTURE THAN AN HDTV EVER WILL.

Even my monochrome screens I have from like 20 years ago :)

P.S. My advice to you is never say never. HDTV will only get better, and when it comes down to it, it's the same technology as computer monitors.

Edited by whocares78

ANY COMPUTER MONITOR OUT THERE IS OUTPUTTING A BETTER PICTURE THAN AN HDTV EVER WILL.

Any? Odd, most computer monitors on the market have a lower resolution than a 1080p television. Television screens are also, obviously, much larger, allowing the 10-foot experience as opposed to the 3-foot, as MS would like to say.


I know what you are saying and I agree, but I don't like it. It's terms like this that annoy the crap out of me. I am talking about terms that mean something specific, but people have begun using "VGA" as a generic term for all connections with a DB15 connector on them. This is why everyone gets so confused when the term comes up.

All I know is when I go and buy a cable, they generally have SVGA or XGA written on them, not VGA. And when I did actually buy a VGA cable, it was crap and the picture was screwed because it could not handle the resolution.

I bet the term USB makes you want to commit suicide.

