What is "HD"?



DigitalE

I run my Xbox 360 into my computer monitor via a VGA connection, and it's displaying at 1280x720.

Now, my friend claims that since I'm using VGA, an analog signal, my display is not actually in "hi-definition." He says I have to be using DVI, a digital signal, to have real "HD."

I'm quite sure that it doesn't matter whether the signal is analog or digital; it's still HD if it's in the 720p resolution.

Could I have a little info from some experts here? :)

User6060

HD depends on the resolution, not on the means by which that resolution is transferred.

you are correct (Y)

For another example, component is totally analogue, but it's probably the most-used cable for connecting HD devices to older HD TVs.

sundayx

Though you are correct, I think going full digital is the only "full" HD experience.

LunarFalcon

I've noticed that most people who have a setup slightly better than someone else's tend to enjoy telling others that theirs isn't TRUE HD :p

Gangsta

You know, anything is "HD" if it has a resolution above [correct me if I'm wrong] 640x480. Hence Bungie's Halo 3 fiasco, in which people argued that 640p isn't real HD. By that definition, even 640p (or, for that matter, 481p, if it existed :p) is "HD", though not a *preferred* HD resolution. That said, digital transfer does make for better picture quality (believe me, I had component hooked up to my HD DVR, switched to HDMI, and what a difference!).

TheDreamX

HD, as you already know, means "high definition." While there is no true standard for where HD begins, most will say it begins at 720p: 1280 by 720 pixels, progressive.

When dealing with displays, you have either "interlaced" or "progressive." With an interlaced display, you see half of the visible lines (of pixels) in each refresh; every other field shows the opposite lines. Under the current American standard, NTSC, there are 29.97 (roughly 30) frames per second. With a progressive display, all lines are visible in each frame, which in turn gives a smoother, more fluid appearance. To be truly HD, as stated before, the signal should be progressive, or interlaced with a high enough resolution (e.g. 1920x1080), but that's another story.
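As a hedged illustration of the field/frame arithmetic above (my own sketch in Python, not part of the original post), you can count how much new picture information each refresh delivers:

```python
# Pixels of new picture information delivered per screen refresh.
# A progressive refresh is one full frame; an interlaced refresh is
# one field, i.e. every other line of the frame.

def pixels_per_refresh(width, height, interlaced):
    lines = height // 2 if interlaced else height
    return width * lines

print(pixels_per_refresh(1280, 720, interlaced=False))  # 720p frame: 921600
print(pixels_per_refresh(1920, 1080, interlaced=True))  # 1080i field: 1036800
```

Interestingly, a 1080i field carries slightly more pixels than a full 720p frame, which is part of why neither format strictly beats the other.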

While VGA does in fact carry an analog signal, that does not mean it is not progressive. Since the signal is analog, it must be converted twice before reaching your eyes: once in the output device (digital to analog) and again in the monitor/display (analog to digital). This means the information sent through the cable may degrade, causing slight colour loss or blurriness, but nothing to be worried about. With a DVI cable, meanwhile, the signal stays digital through the entire line.

Either way, 720p is 720p and is, in fact, real HD. I probably over-answered your question, but I'm bored and it's late.

  • 3 weeks later...
Jason S.

I've actually heard both definitions of HD, both mentioned here: anything higher than 480p, or 720p and up. Not sure which is correct, actually.

Galley

720p is the baseline for high-definition.

bangbang023

Your friend needs to do more research before speaking.

HD can be transmitted over digital or analog cables. VGA is analog, just like the component cables the 360 comes with, and both are fully capable of delivering HD.

sundayx

Would you recommend HD over analogue, though? It seems like a step back.

bangbang023
Would you recommend HD over analogue, though? It seems like a step back.

If you take the time to compare the two, you'll realize that unless you have a high-end set, you won't notice much of a difference, if any at all. It's not a step back at all.

goji
Would you recommend HD over analogue, though? It seems like a step back.

Why would it matter? It's the same as the arguments surrounding vinyl and CD: nothing more than a debate over the delivery format.

Consider that the majority of recorded film and music has been stored in a non-digital (non-10101) state. With the advent of CD, DVD, and modern cinema technology, the transfer of much of our data has been pushed in the digital direction for numerous reasons. Keep in mind that after a certain point (I'd say with the advent of CinemaScope et al.), most films were shot at native HD resolution, yet still stored in analogue. Your concern as a consumer should not be the format the media is stored in or on, but rather whether your equipment can properly display the signal at the best fidelity you can afford, and whether the transfer is of high enough quality for your enjoyment.

That's where true concern should lie. Though I will admit there may be other issues; consider this a very broad answer to your question.

xDayan

I've seen 480p labelled "HD" before; I remember seeing it on a game case and also on some old DVDs.

neufuse
I've seen 480p labelled "HD" before; I remember seeing it on a game case and also on some old DVDs.

I think they misused the term... When I saw that (on Xbox 1 games), it was just saying the game runs at 480p widescreen. It really should have said EDTV, not HDTV (ED is 480p).

whocares78
Would you recommend HD over analogue, though? It seems like a step back.

Huh, have you read any of the above posts?

It really has nothing to do with analogue or digital.

neufuse

So by your friend's logic, when my HD signal comes in through component cables, it's not HD anymore? Component is analog... HD just specifies the size and format, not the transmission method (analog or digital)...

HD can be analog or digital; to be HD, you just have to be 720p or higher.

SD - 480i

ED - 480p

HD - 720p

HD - 1080i

HD - 1080p

All of those can be transmitted through digital or analog and still be what they are classified as.
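The classification above can be sketched as a tiny lookup that deliberately ignores the cable type (a hypothetical helper of my own, in Python, not something from the thread):

```python
# Classify a signal purely by its format string; the transport
# (VGA, component, DVI, HDMI) never appears, which is the point
# of the post above.

def classify(fmt):
    table = {"480i": "SD", "480p": "ED",
             "720p": "HD", "1080i": "HD", "1080p": "HD"}
    return table.get(fmt, "unknown")

print(classify("720p"))   # HD, whether it arrived over VGA or HDMI
print(classify("480p"))   # ED
```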

Slimy

I always found it interesting that TVs kept getting bigger and bigger while resolution increased very, very slowly.

Malik05

Umm, I have a Dell E207WFP. It takes analog VGA and digital DVI; 1680x1050 is the current setting. If I were to have an HD TV tuner, would it really be viewed as HD?

atkinsn2000

VGA can handle HD resolutions such as 720p and 1080i

BUT

it is not HD, because it can't handle the bandwidth required to display a true HD image

to be classed as a true HD image, it must be HDMI or DVI

Component is a version of analogue HD that can handle the bandwidth because it uses separate cables, but it does not use a digital signal, so you do lose some quality

watkinsx2
VGA can handle HD resolutions such as 720p and 1080i

BUT

it is not HD, because it can't handle the bandwidth required to display a true HD image

to be classed as a true HD image, it must be HDMI or DVI

Component is a version of analogue HD that can handle the bandwidth because it uses separate cables, but it does not use a digital signal, so you do lose some quality

That's crap; VGA also uses separate wires and is of comparable quality to component.

atkinsn2000

It is more to do with the bandwidth the cables can carry, which VGA cables can't handle.

ctebah

Anyone want to provide some proof about all this HD talk, rather than voicing an opinion and having someone else say something totally different?

atkinsn2000

In order to be awarded the label "HD ready", a display device has to meet the following requirements:

1. Display, display engine

1.1 The minimum native resolution of the display (e.g. LCD, PDP) or display engine (e.g. DLP) is 720 physical lines in wide aspect ratio.

2. Video Interfaces

2.1 The display device accepts HD input via:

Analog YPbPr. "HD ready" displays support analog YPbPr as an HD input format to allow full compatibility with today's HD video sources in the market. Support of the YPbPr signal should be through common industry-standard connectors directly on the HD ready display or through an adaptor easily accessible to the consumer; and:

DVI or HDMI

2.2 HD-capable inputs accept the following HD video formats:

1280x720 @ 50 and 60Hz progressive scan ("720p"), and

1920x1080 @ 50 and 60Hz interlaced ("1080i")

2.3 The DVI or HDMI input supports copy protection (HDCP)

The following technical references apply to the above definitions:

DVI: DDWG, "Digital Visual Interface", rev 1.0, Apr 2, 1999, as further qualified in EIA-861B, "A DTV Profile for Uncompressed High Speed Digital Interfaces", May 2002, furthermore allowing both DVI-D and DVI-I connectors, requiring compliance to both 50 and 60Hz profiles, and requiring support for both 720p and 1080i formats.

HDMI: HDMI Licensing, LLC, "High-Definition Multimedia Interface", May 20, 2004

HDCP: Intel, "High-bandwidth Digital Content Protection System", rev 1.1, June 9, 2003 (NB: on DVI, rev 1.0 will apply)

YPbPr: EIA-770.3-A, March 2000, with the notice that the connectors required may be available only through an adaptor.

http://www.hdready.org.uk/
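Those requirements can be restated as a short checklist. A sketch of my own (hypothetical function and parameter names, not taken from the spec; Python):

```python
# Check a display's capabilities against the "HD ready" points above:
# at least 720 native lines in wide aspect, analog YPbPr plus a DVI or
# HDMI input, the four mandatory input formats, and HDCP on the
# digital input.

REQUIRED_FORMATS = {"720p50", "720p60", "1080i50", "1080i60"}

def is_hd_ready(native_lines, wide_aspect, inputs, formats, hdcp):
    return (native_lines >= 720
            and wide_aspect
            and "YPbPr" in inputs
            and ("DVI" in inputs or "HDMI" in inputs)
            and REQUIRED_FORMATS <= set(formats)
            and hdcp)

print(is_hd_ready(768, True, {"YPbPr", "HDMI"}, REQUIRED_FORMATS, True))  # True
```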

_kane81

^

So where does it say VGA does not have enough bandwidth?

VGA can handle HD resolutions such as 720p and 1080i

BUT

it is not HD, because it can't handle the bandwidth required to display a true HD image

to be classed as a true HD image, it must be HDMI or DVI

Component is a version of analogue HD that can handle the bandwidth because it uses separate cables, but it does not use a digital signal, so you do lose some quality

It is more to do with the bandwidth the cables can carry, which VGA cables can't handle.

As someone pointed out already, that is crap! Bandwidth is not an issue here. Who told you? Where is your source for this info?

It's like saying I need monster-thick audio cables that cost $100 per metre or else the sound won't come out.
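Some back-of-envelope arithmetic supports this point. A sketch of my own (Python; the total frame sizes, which include blanking, are the standard CEA/VESA timings as I recall them):

```python
# Pixel clock = total pixels per frame (including blanking) x refresh rate.

def pixel_clock_mhz(total_w, total_h, refresh_hz):
    return total_w * total_h * refresh_hz / 1e6

# 720p60 (CEA-861 timing: 1650 x 750 total)
print(pixel_clock_mhz(1650, 750, 60))   # 74.25 MHz

# 1600x1200 @ 60Hz, a common analog VGA mode (VESA timing: 2160 x 1250 total)
print(pixel_clock_mhz(2160, 1250, 60))  # 162.0 MHz
```

If a VGA link routinely carries a 162 MHz mode, a 74.25 MHz 720p signal is well within its reach, so bandwidth is not the limiting factor.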

snip - 'cause I'm getting carried away

SD - 480i/576i

ED - 480p/576p

HD - 720p

HD - 1080i

HD - 1080p

The last "HD" line could be considered incorrect in today's marketing terminology, where it is instead labelled:

FULL HD - 1080p

Just to confirm some small things: timings and resolutions are not necessarily equal. The i/p part indicates that we are talking about timings, not resolution.

Yes, you can have interlaced or progressive video sources, but you can send a progressive video source over an interlaced signal, if that makes sense, or an interlaced source over a progressive signal, though then you get the stripy lines you see when watching an interlaced video source on your PC.

So say you have a 1280x720 image/photo/movie sent to a 1366x768 LCD TV as a 720p signal: the image sent along the 720p timing is scaled by the LCD to its native 1366x768 resolution.

You could technically output a 1024x768 image to an LCD in a 720p timing, or a 1080i video source in a 720p timing, etc.

I have my LCD hooked up to my PC at the native resolution, bypassing the internal scaler. Some people, however, prefer using the scaler, as it can make the picture look better (things like Panasonic's 100Hz technology, etc.).
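The scaling step described above amounts to resampling by a pair of ratios. A rough sketch of my own (Python, not part of the original post):

```python
# Map a source pixel coordinate onto the panel's native grid when a
# 1280x720 signal is shown on a 1366x768 panel.

SRC_W, SRC_H = 1280, 720
DST_W, DST_H = 1366, 768

def to_panel(x, y):
    return x * DST_W / SRC_W, y * DST_H / SRC_H

print(round(DST_W / SRC_W, 4))  # ~1.0672 horizontal scale
print(round(DST_H / SRC_H, 4))  # ~1.0667 vertical scale
print(to_panel(1280, 720))      # the corner lands at (1366.0, 768.0)
```

Because the two ratios are not identical, a 720p signal on a 1366x768 panel is stretched very slightly more horizontally than vertically, one reason some people prefer bypassing the scaler at native resolution.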

Share on other sites