TechSpot Op-ed: Are we ready for 4K?

The rest of the industry is finally catching up to what smartphone and tablet owners already knew: people want high-resolution, high-pixel-density displays. It’s ridiculous that even in 2014, the majority of mainstream laptops being sold feature a display resolution that’s not only lower than that of mid-to-high-end smartphones with screens a fraction of the size, but is paired with vastly inferior panel technology to boot. Thus, the great white hope is 4K.

The term “4K” is something of a misnomer, since it rounds up the horizontal resolution; the actual display resolution is typically 3840x2160. It maintains the 16:9 aspect ratio that a lot of us still chafe under, but quadruples the pixel count of a 1080p display, doubling it in each dimension.
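
To put quick numbers on that claim, here is a minimal sketch of the pixel math (nothing specific to any particular panel):

```python
# Quick pixel math for UHD "4K" (3840x2160) vs 1080p (1920x1080).
pixels_1080p = 1920 * 1080      # 2,073,600 pixels
pixels_uhd   = 3840 * 2160      # 8,294,400 pixels

print(3840 / 1920, 2160 / 1080)     # 2.0 2.0 -> doubled in each dimension
print(pixels_uhd / pixels_1080p)    # 4.0     -> four times the total pixels
```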

For now we can ignore manufacturing costs for these new, higher-resolution panels, since those will eventually come down along with the overall price to the end user as economies of scale kick in. Yet there are still a few complicating factors that surface when we talk about making the move to a 4K display, especially if you’re a gamer, and those factors are the kinds of things that keep me using conventional 1080p/1200p displays at home.

New technology often materializes before the infrastructure is fully in place. Intel launched their Ultrabook initiative with Sandy Bridge processors that weren’t really designed to hit those low voltages. Each transition to a new memory technology was very gradual. 4K displays have a similar chicken and egg problem, and there are three key points where we’re just not quite ready.

Read: Op-ed: Are we ready for 4K?

This article is brought to you in partnership with TechSpot

35 Comments

I think 4K will come into its own once H.265 is adopted.

As yet, AFAIK, there are no hardware H.265 decoders, and only very limited H.265 encoders.

I have not seen 4K yet, but is there really THAT much of a difference between 1080p and 4K on, say, a 50" (or so) TV?

I have a feeling there won't be as much of an improvement going from 1080p to 4K as there is from SD to 1080p; the SD-to-1080p jump will remain the more noticeable one.

My guess is this 4K stuff won't take off like the current HD did, and even if it does, it will be many years before it starts to get somewhat common.

The difference is marked when comparing screens of a similar physical size side by side in a store. I mean, in tennis action shots in 4K you can fully make out the cellulite on a player.

My only concern is: when will 4K-capable HARDWARE be viable for the everyman? 4K won't be adopted until 4K content forces demand. Cable TV stations need to offer 4K content, and PC hardware needs to catch up to the point that the "everyman" can afford a 4K gaming rig.

The difference is pretty amazing. I'll admit it's not the same as going from SD to HD, but how could it be? We're talking way diff tech there.

Go to your local store that has a demo running. It's pretty amazing. Also, the small amount of content on my desktop at home looks fantastic on my crappy 39" Seiki!

@Jason S.: so in other words, the difference is noticeable, but it's not as profound as SD to HD. This is kind of what I figured.

There are still many Xbox One owners who can't tell the difference between 720p and 1080p, and if gamers can't tell the difference, then who knows if it really will catch on.

4K will never be ubiquitous - ever. Smaller screens (such as tablets and smartphones) cost less to manufacture than larger ones with the same pixel density, and increased pixel density DOES increase cost per pixel, for the same reason that increased transistor density increases cost in integrated circuitry (ALL integrated circuitry): manufacture of both is based on lithography technologies. Some litho techs are more advanced (and therefore costlier) than others, and some are simply a bad mix with certain substrates. (That is one big reason why Intel invests a ton in R&D - lithography/substrate research.)

4K content - why? Let's be honest with ourselves: even 1080p content struggles to find a marketplace outside of niches, and that is even with 1080p display gear practically ubiquitous. In the United States alone, the highest-end broadcast standard is 1080i; however, there is very little perceived value even in that. (The only reason that 4K is in Japan is governmental interventionism and mandates - which was ALSO the case with 1080i/1080p in Japan, Europe, and even the US. All of that is still being resisted in parts of all three areas, despite the mandates.) Sometimes, the resisters need to be whacked with the Clue-by-Four to perceive the value - however, sometimes even THAT is not enough.

Costs are pretty high (though they're getting lower).
Until manufacturers can get the price down to the level of HDTVs (which is probably 5-10 years out, after the technology becomes cheaper), I could probably consider, at the very least, the thought of getting a 4K TV.

MAKE PRICE LOW!

5-10 years is way too long. I believe we'll have affordable TVs within 2-3. Look at companies like Seiki that are paving the way for low-cost TVs. And let's be honest, most people ARE buying low-cost TVs like Insignia and Vizio.

I bought my Seiki 39" for $400 in January. I literally just saw their 65" TV for $1,065 yesterday. While they might not be high-quality sets, they are UHD. (In fact, the 39" UHD version is $390 shipped on Amazon right now, today.)

Also, remember when HDTVs were finally going mainstream, they were still $2,000 for a 50" or so. I've seen high-quality 55" 4K TVs for $2,400 already.

Update: Amazon has a 49" LG UHD set for $1,459, the 55" version for $1,550, and a Samsung 50" for $2K. I'd call any of those affordable...

Nope, we are not ready at all.

No content
No disc delivery
Stream delivery is a problem because of ISP data caps

Films will have to be re-scanned and remastered for 4K treatment. Digitally shot movies/TV shows may be stuck at lower than 4K resolution due to having used early HD digital cameras.

It's going to be a mess, and in the end probably only 25% of Blu-ray content will be able to benefit from 4K.

I want higher resolution and all the new gadgets, etc. But realistically, 4K has a long way to go before it will be useful.

Also, 8K is around the corner as well, and it has the same issue: probably only 25% of the content that is 4K-capable will be able to be released in 8K.

Each resolution jump shuts HUGE percentages of existing content out of any benefit.

As far as gaming goes, PC gaming is niche, PC gaming capable of playable 4K resolutions is an even smaller niche, and console games aren't going to be able to touch 4K for another 6-8 years, when the next-gen consoles get released.

As much as I'd love a 4K or 8K world, there is nothing showing me it is going to happen soon.

I'm not sure that we are.

Even the highest-end current video cards struggle with 4K when going beyond normal settings, and when you factor in that a huge majority of people use mid-range and low-end cards, you get the picture.

Personally, I'm happy enough with an HD TV and monitor. I don't feel the need for a 4K monitor. One thing the article doesn't mention is space. To really notice the improvements with 4K, you'll probably need a big TV (I'm thinking 44" minimum, but probably higher). Not everyone will have the space in their workplaces or living rooms for huge TVs and monitors. Consequently, I think 4K will always be a niche thing, because you simply won't notice the difference on smaller monitors.

My TV is a 32" 1080p Sony so 4K would be wasted on me. I would rather higher bitrate 1080p streams than a 4K stream.

For the desktop? Hmm, maybe, but Windows is still dodgy with HiDPI support even in Windows 8.1. I know most of the issues are due to devs not supporting it (OS X devs were much better at updating apps for Retina screens than Windows devs are), but MS have been slow on the HiDPI front too. I remember being promised perfect HiDPI support in Longhorn/Vista, but here we are almost 10 years later with ###### support still :(

Actually, the pixel density of a 32" 1080p TV is almost the same as what the bigger (55"+) 4K TVs offer. Good 1080p on a 32" TV will look about as sharp as 4K does on bigger TVs. There's still the difference in actual screen size, but pixel density is what matters for how real the picture looks.

It's also why I don't see the point in large 1080p TVs.
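
For anyone who wants to sanity-check that pixel-density comparison, here is a rough calculation; the screen sizes are just illustrative examples:

```python
import math

def ppi(width_px, height_px, diagonal_inches):
    """Pixels per inch given a display's resolution and diagonal size."""
    diagonal_px = math.sqrt(width_px ** 2 + height_px ** 2)
    return diagonal_px / diagonal_inches

print(round(ppi(1920, 1080, 32), 1))   # ~68.8 PPI: 32" 1080p set
print(round(ppi(3840, 2160, 55), 1))   # ~80.1 PPI: 55" 4K set
print(round(ppi(1920, 1080, 55), 1))   # ~40.1 PPI: 55" 1080p set, hence the complaint
```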

Yeah, I'm not really planning on 4K gaming on my PC for a while, but I am getting a new TV in a few months and have been debating between getting a 60" instead of the 50" in my living room (moving the 50" to the bedroom) or upgrading to a 4K 50" for the same price as the 60" I'm looking at. I've really been leaning towards the 60" though, and going 4K later, because I don't think there's any 4K content to be had right now.

The major disadvantage of 4K is cable/satellite. The pipes are only so big. Regular HD cable is already compressed all to hell, and any action scene (or scene with a lot of changes, like a strobe-light or confetti filtering down) will already degrade into a blocky mess as compression artifacts blow out the image. It'd be even WORSE on 4K. My HD channel selection is already limited because it takes 3-4 SD "channels" worth of bandwidth to fit just 1 HD channel... imagine a 4K channel taking 2-4 times more bandwidth than an HD channel. It's just not going to fly.
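
A rough back-of-envelope on that bandwidth squeeze; the per-channel bitrates below are assumed, ballpark figures rather than any specific provider's numbers:

```python
# How many channels fit in one ~38.8 Mbps QAM-256 cable slot, using assumed bitrates.
SLOT_CAPACITY_MBPS = 38.8   # typical usable payload of a single QAM-256 slot

assumed_bitrates_mbps = {
    "SD (MPEG-2)":             3.5,
    "HD 1080i (MPEG-2)":       13.0,
    "4K (HEVC, hypothetical)": 30.0,
}

for name, rate in assumed_bitrates_mbps.items():
    print(f"{name}: ~{rate} Mbps -> about {int(SLOT_CAPACITY_MBPS // rate)} per slot")
# SD fits ~11 per slot, HD only 2, and a 4K channel would take a slot (or more) on its own.
```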

It has nothing to do with the provider... and I don't have satellite anyway, I have cable. It's true of ALL cable/satellite providers. It's a fact of life. A law of physics. Your comment is just completely ignorant.

Except it's not. It depends on the satellite they use, and whether they rent space or have their own.

Satellite providers over here have excellent high-bitrate 1080 broadcasts. Same for the cable companies, which use the same signals or, in some cases, fiber.

Either way, your claim is BS. In fact, Canal Digital over here wins awards for best image and audio quality.

If it's a rented satellite with limited rented bandwidth, then yes. Also, cable has more than enough bandwidth for many high-quality full HD broadcasts.

The only tech that has bandwidth issues is terrestrial, which can only send 720 and only very few channels, so they have to use IPTV to stream any new channels they add, and they only send a few in HD and most in SD.

The claim isn't BS. You clearly don't know what the heck you're talking about. ALL video streams are compressed when sent over cable. ALL OF THEM. It's about bitrates. An HD stream will NEVER look as good as a Blu-ray of the exact same movie, because Blu-ray has a WAY higher bitrate.

AND I'M TALKING ABOUT CABLE, NOT SATELLITE (for the *third* time), though Satellite has the same issue. All video streams are encoded and compressed, and all compression is "lossy", and thus results in "compression artifacts"... just like jpeg compression. The higher you crank it up, the "blockier" the image gets.

This is physics. This is fact. This is math. You can't deny it exists. That's beyond ignorant. YOU DO NOT KNOW WHAT YOU ARE TALKING ABOUT.

pmbAustin said,
.....

IPTV has it even worse.
Currently my provider broadcasts 1080i at a bitrate of 8 Mbps.
4K will not be possible until the next H.26x spec is ratified and all customers have fiber to the home. Currently my provider is only offering fiber to ~5% of customers, as network upgrades take time.

pmbAustin said,
.....

Of course they're compressed; so is a Blu-ray.

You seem to be the only one who doesn't know what he's talking about here. Satellite and cable use encryption and a CI+ card against piracy. And no, you said both cable and satellite before. And besides, Canal Digital here uses the EXACT same high-quality broadcast on both cable and satellite.

Actually, when compressed in HEVC, you can get 4K content at the same file size as 1080p X264 content with no loss of quality.

The problem is, the new compression scheme isn't implemented "everywhere" like current H.264 compression is... which is done in hardware in most microprocessors/chipsets, and present in TVs, DVRs, DVD Players, computers, phones, etc. It'll be a while before that new compression goes wide enough to matter.
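
As a rough, hedged back-of-envelope on how HEVC's efficiency interacts with the jump to four times the pixels: the 50% saving, the 8 Mbps baseline, and the 2-4x resolution scaling below are all assumptions for illustration, not measured figures.

```python
# Back-of-envelope: estimated 4K HEVC bitrate vs a 1080p H.264 stream.
# Every number here is an assumption used purely for illustration.
H264_1080P_MBPS = 8.0        # assumed "decent quality" 1080p H.264 bitrate
HEVC_SAVING     = 0.5        # assume HEVC needs ~half the bitrate for equal quality
RES_FACTOR_LOW, RES_FACTOR_HIGH = 2.0, 4.0   # assumed bitrate scaling for 4x the pixels

low  = H264_1080P_MBPS * RES_FACTOR_LOW  * HEVC_SAVING    # 8.0 Mbps
high = H264_1080P_MBPS * RES_FACTOR_HIGH * HEVC_SAVING    # 16.0 Mbps
print(f"4K HEVC estimate: {low:.0f}-{high:.0f} Mbps vs {H264_1080P_MBPS:.0f} Mbps for 1080p H.264")
# Under these assumptions the 4K stream lands somewhere between 1x and 2x the 1080p size.
```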

Javik said,
Actually, when compressed in HEVC, you can get 4K content at the same file size as 1080p X264 content with no loss of quality.

You're missing his point.

He's saying that current bandwidth limits ensure that 4K compression will be too aggressive.
Maybe your eye doesn't notice the difference, but 1080p over satellite/cable can already look pretty shoddy in scenes with a lot of colour and movement. 4K will be even worse, and it will be a long time till new compression methods are standard and widespread.

With current technology, 4K is only useful for gaming IMO and is overkill for movies anyway.

The advantage I'd imagine for 4K is that there's less need for anti-aliasing, so perhaps some graphics card performance could be saved there.

bigmehdi said,
The advantage I'd imagine for 4K is that there's less need for anti-aliasing, so perhaps some graphics card performance could be saved there.

Most current FSAA techniques don't quite carry the 2x/4x/8x fixed cost of MSAA, I believe.

I'm personally not planning on going 4K until 2016 or later.

bigmehdi said,
The advantage I'd imagine for 4K is that there's less need for anti-aliasing, so perhaps some graphics card performance could be saved there.

Except for the tiny problem of everything else being processed at twice the resolution in each dimension...
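
Rough sample-count math on that trade-off; this is a simplified sketch, since real GPU cost depends heavily on the renderer and the AA technique used:

```python
# Shaded-pixel and sample counts: 1080p with 4x MSAA vs native 4K with no AA.
pixels_1080p = 1920 * 1080          # ~2.07M pixels actually shaded
pixels_4k    = 3840 * 2160          # ~8.29M pixels actually shaded

msaa4x_samples_1080p = pixels_1080p * 4   # ~8.29M coverage samples at 4x MSAA

print(pixels_4k / pixels_1080p)            # 4.0 -> roughly 4x the shading work at native 4K
print(msaa4x_samples_1080p == pixels_4k)   # True -> same raw sample count, but MSAA
                                           # doesn't shade every sample, so it stays cheaper
```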