ITU approves 8K ultra HDTV specification

The International Telecommunication Union, a standards agency of the UN, has approved a new television format that boasts a resolution of roughly 33 megapixels, 16 times that of the current HDTV standard. The new ultra HDTV 8K standard, also known as Super Hi-Vision, was partially developed and demonstrated by the Japan Broadcasting Corporation, better known as NHK.

Super Hi-Vision offers a resolution of 7680 by 4320 pixels at up to 120 frames per second. Lower frame rates such as 25 fps are also included, but only progressive scanning is allowed. Just like the current HDTV standard, the specification also includes a smaller format with a 3840 by 2160 resolution. Which audio specification will accompany the ultra high definition material has not been decided yet, but NHK is backing 22.2-channel surround sound. There is also more work to be done on how to transmit and store 8K video.

According to documents published by the ITU, the standard was designed for use at home as well as in public spaces such as cinemas and big venues. At the 2012 Olympic Games, NHK showed off the technology, having developed three special cameras to shoot the 8K video. NHK plans to begin its first experimental broadcasts in the 8K format in 2020.

Several electronics makers already have 4K devices on the market. LG will soon start selling an 84-inch television for about $22,000, and Sony and JVC have 4K projectors on the market, which are used in digital cinema.

Source: Techworld | Images via NHK


56 Comments


"What kind of audio specification will be used together with the ultra high definition material has not been decided yet, but the NHK is supporting 22.2 surround sound"

Well, how about Dolby Atmos?

Sounds like a 20-year standard to me... then again, you never know. Personally, 4K is all I'd ever need; I'm perfectly happy with 1080 most of the time on my 42", but at 55" or more the extra pixels are appreciated. 8K? That's going to require retooling the industry a bit as well.

How big a resolution does a proper Hollywood video camera shoot?

And at how large a resolution would one scan negatives before it would be pointless?

It's funny how we always used the vertical dimension to judge TV resolution; now we suddenly shift to the horizontal because it's a bigger number?

fmanchu said,
I can't wait for the 3d tvs that support this resolution! /s

Not sure what the sarcasm is about. 3D has improved and it is only going to get better. It still has some faults, but your sarcastic comment really doesn't make any sense.

Several electronics makers already have 4K devices on the market. LG will soon start selling an 84-inch television for about $22,000

Who the hell buys an 84" TV for their lounge? Shoot, I'd have to sit outside to watch the damn thing; otherwise I'd be constantly moving my eyes around to see it all and probably miss half of what was going on.

Athlonite said,
Several electronics makers already have 4K devices on the market. LG will soon start selling an 84-inch television for about $22,000

Who the hell buys an 84" TV for their lounge? Shoot, I'd have to sit outside to watch the damn thing; otherwise I'd be constantly moving my eyes around to see it all and probably miss half of what was going on.

I have a home theater projector with a 100" screen. I think our normal viewing distance is about 12 ft or so, so it's not exactly a huge room. Honestly, anything smaller just wouldn't feel right; we use it for watching football and it's one of the best buys I've made. It sounds crazy, but that size at the distance we watch it is perfect.

Also, DirecTV has these "mix" channels that show 8 games on the screen at the same time, just like split screen in a video game. On a normal screen something like that is just stupid and you can't watch anything, but on a 100" screen each of the split screens is like 32", so it works really well.

I am Reid said,

I have a home theater projector with a 100" screen. I think our normal viewing distance is about 12ft or so [...] on a 100" screen all of the split screens are like 32" so it works really well.

Did it cost you $22K or more?

togermano said,
I am still using a CRT TV in my bedroom... I don't care.

Wow, I got rid of anything CRT at the earliest opportunity... those things are painful to look at. I suffered migraines throughout their usage; ever since I moved over to flat panel displays the migraines went away.

Uplift said,

Wow, I got rid of anything CRT at the earliest opportunity... those things are painful to look at. I suffered migraines throughout their usage; ever since I moved over to flat panel displays the migraines went away.

And I suppose it was set to 60Hz, silly boy. You should have bought a real CRT that does 100Hz @ 1024x768; even the 19" CRT I had did 1600x1200 @ 75Hz, and that was flicker-free too.

Uplift said,

Wow, I got rid of anything CRT at the earliest opportunity... those things are painful to look at. I suffered migraines throughout their usage; ever since I moved over to flat panel displays the migraines went away.

They run at a 60 Hz refresh rate, making the flicker unbearable, especially on still images. Throw in the really low interlaced resolution and I don't know how we ever tolerated them.

Skyfrog said,

They run at a 60 Hz refresh rate, making the flicker unbearable, especially on still images. Throw in the really low interlaced resolution and I don't know how we ever tolerated them.

You didn't tolerate it... you loved it. And your parents saw much worse. And your grandparents may have been around when TV wasn't even invented, and the first TVs they saw were 7 inches...

Will you benefit from it over a 1080p set? Let's calculate.

Research done by Japan's NHK shows that healthy observers can notice image quality improvements up to the point where the spacing between separate pixels is ~0.005 degrees of arc [of the viewer's field of view, regardless of viewing distance]. 0.005 arc-degrees is an angular resolution of 200 pixels per degree.

So, for example, if you have two displays, one with an angular resolution of 100 and the other with 200 pixels per degree, you'd be able to notice a difference in quality between the two. But if you had a display with an angular resolution of 300 pixels per degree next to one with 200 ppd, there would be no perceivable difference.

Let's find out at which point 1080p meets this requirement, for the usual viewing distance of 9 feet (2.7432 meters).

At 9 ft, pixel spacing must be 0.0094247 inches or less. For a 1080p set, this equals a width of 18.09'' or less. With the Pythagorean theorem we can find the diagonal measurement: it is ~21''.

So, at a 9 ft viewing distance, a 1080p TV set meets the visual limitations of healthy observers when its diagonal is 21 inches or less. That means you'll benefit from 4K (2160p, to be precise) if your TV is bigger than 21 inches.

Let's see when a 4K TV set meets this requirement, again at 9 ft.

At 9 ft, pixel spacing must be 0.0094247 inches or less. For a 4K (2160p) set, this equals a width of 36.19'' or less. With the Pythagorean theorem we can find the diagonal measurement: it is ~42''.

So, at a 9 ft viewing distance, a 4K (2160p) TV set meets the visual limitations of healthy observers when its diagonal is 42 inches or less. That means you'll benefit from 8K (4320p, to be precise) only if your TV is bigger than 42 inches.

Any questions?
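(The numbers in this comment can be checked with a short script. This is a sketch, not part of the original comment, assuming the 200 ppd threshold and a 16:9 aspect ratio stated above; the function name is just for illustration.)

```python
import math

def max_diagonal_inches(h_pixels, distance_ft, ppd=200, aspect=16 / 9):
    """Largest 16:9 diagonal (inches) at which each pixel still subtends
    at most 1/ppd degrees at the given viewing distance."""
    distance_in = distance_ft * 12
    # Widest a single pixel may be while subtending <= 1/ppd degrees:
    pitch = distance_in * math.tan(math.radians(1 / ppd))
    width = h_pixels * pitch          # maximum screen width in inches
    height = width / aspect
    return math.hypot(width, height)  # Pythagorean theorem for the diagonal

print(round(max_diagonal_inches(1920, 9)))  # 1080p at 9 ft -> 21
print(round(max_diagonal_inches(3840, 9)))  # 4K at 9 ft -> 42
```

Both results agree with the ~21'' and ~42'' figures derived above.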

RetinaMath said,
Will you benefit from it over 1080p set? Let's calculate. [...] Any questions?

I like it!

RetinaMath said,
Will you benefit from it over 1080p set? Let'

Eye resolution is about 8000x8000px (higher for black/white, lower for color). Human eye field of view (FOV):

40° FOV - focus area (highest density of photo receptor cells in the retina)
140° FOV - binocular/depth/3D vision (overlapping images from eyes)
160° FOV - peripheral vision
180° FOV - peripheral vision and free eye movement in its socket. (extra 1000 pixels)

Basically, a 3D screen with a resolution of 9000x9000 and a 180° FOV would immerse you in a virtual reality where you wouldn't be able to tell whether it's real or not. The ITU standard must take this into account for the future.

Head-mounted displays have a huge advantage: the GPU doesn't have to work so hard, because only the 60° (40° + 20° eye movement) focus area has to be rendered at the highest quality (video encoders tailored to the human eye could be more efficient by taking this into account: improved quality in the center at the expense of quality in the corners).

Now, about TV. An ideal TV would cover a 50° field of view: the perfect distance, because everything important is in your view (the 40° focus area) for the brain to handle, with some extra space for eye movements. So perfect_distance = screen_size / 2 / tan(50°/2) * 2.54

Screen size (FOV° covered by screen); perfect distance from the screen.
24" (50°); 65cm
27" (50°); 74cm
32" (50°); 87cm
40" (50°); 109cm

Perfect resolution: 50°/160° * 8000pixels = 2500x2500 pixels.

Conclusion:

A 3840x2160 screen on your desk is more than enough for entertainment (in most cases, moving images with only one important scene at a time), because your eyes and brain won't be able to handle more information efficiently.
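(The distance formula in this comment can be tabulated directly. A minimal sketch, not part of the original comment, using the same simplification as the table above, where screen_size is the diagonal in inches; the function name is just for illustration.)

```python
import math

def perfect_distance_cm(size_in, fov_deg=50):
    """Viewing distance (cm) at which a screen of size_in inches spans
    fov_deg of the viewer's FOV: size / 2 / tan(fov/2), converted to cm."""
    return size_in / 2 / math.tan(math.radians(fov_deg / 2)) * 2.54

for size in (24, 27, 32, 40):
    print(f'{size}"  ->  {perfect_distance_cm(size):.0f} cm')
```

The printed distances (65, 74, 87, 109 cm) match the table above.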

Edited by EJocys, Aug 25 2012, 2:36pm :

EJocys said,

Eye resolution is about 8000x8000px black & white (lower for color). Human eye Field-of-view (FOV)...


Sorry, you're oversimplifying it way too much. Plus, the foveola has the highest cone density, and that's what counts. Theoretically, you'd derive the highest human eye resolution from the smallest cone spacing on the foveola. However, the real "resolution" is somewhat lower, and the only way to determine it is by testing: exactly what NHK has done.

If we're talking about flat (not curved) displays, you can't take human eye resolution expressed in pixels per degree (ppd) and multiply it by the number of degrees the display occupies. I hope your logic tells you why.

RetinaMath said,

Sorry, you're oversimplifying it way too much.

I understand the complexity, but I have to simplify and round, because screens are flat, the distance between pixels is uniform, and there will be no curved displays with different resolutions in different areas depending on the type of receptors in the eye. I guess it would be cheaper to send signals straight to the brain than to create a display with a 36000x36000 resolution in order to make sure the image projected on the foveola has the highest quality all the time. Maybe a small, light, high-resolution screen (with the highest resolution at the center) inside an HMD, moving in front of the foveola, would be a cheaper option; it would look like a moving piece of glass with the highest clarity in the center, in front of the picture.

Edited by EJocys, Aug 25 2012, 3:41pm :

theyarecomingforyou said,
It's called 8K (that is, 8000) but falls short of actually achieving that, at 7680 pixels wide. They're trying to cheat consumers out of 320 pixels!

1080p is 2K... which robs you of 80 pixels (1920 instead of 2000); since 8K has 4x the pixels per horizontal line, that works out to 320 short.

So it adds up to normal specs actually...
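(The arithmetic in this comment is easy to sanity-check; a trivial sketch, not part of the original comment:)

```python
# 1080p ("2K") shortfall vs a literal 2000-pixel line, scaled up to 8K
shortfall_2k = 2000 - 1920
print(shortfall_2k)      # 80 pixels short at 2K
print(shortfall_2k * 4)  # 320 when scaled 4x
print(8000 - 7680)       # actual 8K shortfall: also 320
```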

Shane Nokes said,

1080p is 2K... which robs you of 80 pixels (1920 instead of 2000); since 8K has 4x the pixels per horizontal line, that works out to 320 short.
So it adds up to normal specs actually...

That's right, it has to keep the aspect ratio as well!!

Shane Nokes said,
1080p is 2K... which robs you of 80 pixels (1920 instead of 2000); since 8K has 4x the pixels per horizontal line, that works out to 320 short.

But 1920x1080 was called 1080p; it was never marketed as 2K. Marketing 7680 as 8K is misleading. It's similar to what hard-drive manufacturers did, and as drives continue to increase in size the consumer gets cheated out of ever more space.

The_Decryptor said,
It's not really similar, HDD makers are using the correct units, the OS is reporting it wrongly.

They both report the size correctly, they just use different scales. However, binary scales are most common with computers - the decimal system used by hard-drive manufacturers is the exception, not the rule.

theyarecomingforyou said,

But 1920x1080 was called 1080p; it was never marketed as 2K. Marketing 7680 as 8K is misleading. It's similar to what hard-drive manufacturers did, and as drives continue to increase in size the consumer gets cheated out of ever more space.

LOL

The_Decryptor said,
It's not really similar, HDD makers are using the correct units, the OS is reporting it wrongly.

Really? And since when did DOS or Windows use base 10 as a capacity measurement? In fact, which OSes use base 10 at all? Now tell me who's telling porkies.

HDDs are used primarily in the computing environment for storage. All computers today work in base 2, or binary, so shouldn't HDD manufacturers use the same measurement standard for capacity as an OS? Short answer: YES, they should.

My so-called WD Elements 1TB HDD, once formatted, is only 931GB, so really it's a 931-gigabyte HDD, not the 1000-gigabyte drive WD would have you believe. Oh no, they tell you its capacity once formatted is 1TB; what they don't tell you is that that's its capacity only if your OS measures capacity in base 10, and seeing as no OS today does, they are lying.
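(The 1 TB vs. 931 GB discrepancy in this thread is two unit scales for the same byte count, not missing space. A quick check, assuming the drive holds exactly 10^12 bytes:)

```python
# 1 TB as drive makers label it: exactly 10^12 bytes (base 10)
bytes_total = 1_000_000_000_000
# The same byte count in binary gigabytes (GiB), which Windows labels "GB"
gib = bytes_total / 2**30
print(f"{gib:.2f}")  # -> 931.32
```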

The_Decryptor said,
It's not really similar, HDD makers are using the correct units, the OS is reporting it wrongly.

You probably don't have any idea of how electronics work, because there is a reason it's 1024 rather than 1000; this applies mostly to digital circuits. An HDD is a physical medium, but it is nonetheless administered by a chip. There are actually unused bits in the digital part of the hard drive, all because the HDD makers decided to cheat the user, but it's not extraordinarily important.

theyarecomingforyou said,

But 1920x1080 was called 1080p - it was never marketed as 2K. Marketing 7780 as 8K is misleading. It's similar to what hard-drive manufacturers did and as they continue to increase in size the consumer gets cheated out of ever more space.

WHO CARES?! 8K... 7680... SEVEN SIX EIGHT ZERO. Do you even realise how insanely high quality that is? And you're going to complain about 320-odd pixels? lol... Some people just HAVE to complain about stuff. It's 8K because that sounds better. That's all it is. Nothing else. 7680 is a crap name. 8K works fine.

theyarecomingforyou said,
It's called 8K (that is, 8000) but falls short of actually achieving that, at 7680 pixels wide. They're trying to cheat consumers out of 320 pixels!

Ah crap, my 1 TB is actually 931.32 GB.
Ever heard of rounding off? 8K is more marketable than 7680 and more consumer friendly, just like 1 TB is better than 931.32 GB.

Spirit Dave said,
WHO CARES?! 8k ... 7680 ... SEVEN SIX EIGHT ZERO. Do you even realise how insanely high quality that is? And you're going to complain about 300 odd pixels?

Firstly, the sarcasm was obvious - it's not my fault that you missed it. Secondly, I never made any comment on the quality of the image - just how misleading the name is.

Athlonite said,

really and since when did DOS or Windows use base 10 as an capacity measurement [...] seeing as there is no OS today that does they are lying

Doesn't OSX report hard drive sizes in base 10 these days?

Athlonite said,

really and since when did DOS or Windows use base 10 as an capacity measurement [...] seeing as there is no OS today that does they are lying

Most Linux users would disagree, as I've actually seen 1TB hard drives show up as such, instead of the 931GB you see in Windows.

I just noticed that the Nokia PureView 808 is theoretically capable of doing 8K, with a resolution of exactly 7728 x 4354, a tiny bit more than the 8K proposal. That's images only, though; but what stops Nokia from releasing one that can do video? Nothing =)

RommelS said,

Ah crap, my 1 TB is actually 931.32 GB.
Ever heard of rounding off? 8K is more marketable than 7680 and consumer friendly, just like 1 TB is better than 931.32 GB.

Ever hear how rounding errors resulted in billions of dollars lost at NASA? The larger the HDD, the larger the rounding error.

Think of trying to use the imperial measurement system when cooking from a recipe book that gives you the metric system; the wrong tools for the job will cause you to screw up your food.

theyarecomingforyou said,

But 1920x1080 was called 1080p; it was never marketed as 2K. Marketing 7680 as 8K is misleading. It's similar to what hard-drive manufacturers did, and as drives continue to increase in size the consumer gets cheated out of ever more space.

I'm not sure you understand how this works. 1080p refers to television standards; 2K refers to film industry standards, where it is actually listed as 2K. There is an apparent movement to unify the naming on both under a single standard.

theyarecomingforyou said,

Firstly, the sarcasm was obvious - it's not my fault that you missed it. Secondly, I never made any comment on the quality of the image - just how misleading the name is.

Oh man, shut up; there wasn't any sarcasm there. Obvious sarcasm, in text, on a website, in a standard sentence commenting on a simple fact? You are backtracking. People pick holes in everything they can here. You did it, and if you were being sarcastic in your post, you failed badly at showing it.