Saturday, 1 July 2017

Difference Between 4K and UHD, and Quantum Dot vs OLED vs QLED Television Displays

Now that 4K is becoming more mainstream, with HDTVs and computer monitors both approaching normal pricing, let's look at two terms that have become increasingly conflated with one another: 4K and UHD, or Ultra HD. TV makers, broadcasters, and tech blogs use them interchangeably, but they didn't start as the same thing, and technically still aren't. From a viewer's standpoint there isn't a huge difference, and the short answer is that 4K is sticking while UHD isn't. But there's a little more to the story.

What is 4K?

With regard to consumer displays, 4K generally equates to a 3840x2160-resolution panel. This means the typical 4K screen offers 3,840 horizontal pixels and 2,160 vertical pixels. Multiply those numbers together and you get a panel with more than 8 million pixels, four times the pixel count of a traditional 1080p HD panel. See the math below:

3840x2160 = 8,294,400 pixels
1920x1080x4 = 8,294,400 pixels
The term 4K derives from the film industry's 4096x2160 resolution standard, which has a roughly 1.9:1 aspect ratio. Since most home monitors and TVs use a 16:9 ratio, the resolution was scaled down to 3840x2160, which is also commonly referred to as Ultra HD.
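
As a quick sanity check, here's a minimal Python sketch that reproduces the arithmetic above, printing the pixel counts and aspect ratios for the three resolutions this article keeps coming back to:

    # Resolutions discussed in this article: name -> (width, height)
    RESOLUTIONS = {
        "1080p (Full HD)": (1920, 1080),
        "UHD (consumer '4K')": (3840, 2160),
        "DCI 4K (cinema)": (4096, 2160),
    }

    for name, (w, h) in RESOLUTIONS.items():
        print(f"{name}: {w}x{h} = {w * h:,} pixels, aspect ratio {w / h:.2f}:1")

    # UHD really is exactly four 1080p frames' worth of pixels:
    assert 3840 * 2160 == 4 * (1920 * 1080)

Running it confirms the 8,294,400 figure for UHD and shows the 1.90:1 (cinema) vs 1.78:1 (16:9) aspect ratios at a glance.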



Why should I care about 4K?

Industry forecasts suggest that by 2020, the majority of US homes will have a 4K-capable UHD TV.

In terms of picture quality, the increased resolution provides improved image clarity and sharpness. It also allows for larger panels, since a 1080p image stretched across a big screen starts to look blurry due to its low pixel density.

Do I need a 4K player to take advantage of 4K?

Yes. There are standalone 4K players, and gaming PCs with modern graphics cards can output at 4K as well. The Xbox One S can play 4K movies, but it doesn't have enough power to render 4K games. While the PlayStation 4 Pro is capable of running some games natively at 4K, many games use a checkerboard rendering shortcut. Microsoft asserts that its upcoming Project Scorpio console will be able to game at 4K.

What are the issues facing 4K?

Aside from the relatively small library of 4K content at the moment, you'll need a good Internet connection if you intend to stream 2160p video. In addition, because 4K pushes four times as many pixels as 1080p, it's far more graphically demanding for gaming.
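
To put "good Internet connection" into rough numbers: streaming services have typically recommended something in the region of 25Mbps for 2160p streams (treat that as an assumed ballpark; the exact figure varies by service and codec). A quick Python sketch of what it means per hour of viewing:

    bitrate_mbps = 25        # assumed 4K streaming bitrate; varies by service
    seconds_per_hour = 3600

    # Megabits to gigabytes: divide by 8 (bits to bytes), then by 1000.
    gb_per_hour = bitrate_mbps * seconds_per_hour / 8 / 1000
    print(f"~{gb_per_hour:.1f} GB per hour of 2160p streaming")  # ~11.2 GB

At that rate, a single two-hour film can chew through more than 20GB of data.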

High Dynamic Range (HDR)

What is HDR?

HDR stands for high dynamic range. You might have heard the term as it pertains to cameras, but when it comes to display technology it means something rather different: instead of merging multiple exposures the way a camera does, an HDR display reproduces a much wider range between the darkest and brightest parts of an image, along with a broader range of colours.


4K vs. UHD

The simplest way of defining the difference between 4K and UHD is this: 4K is a professional production and cinema standard, while UHD is a consumer display and broadcast standard. To see how the two became so entangled, let's look at the history of each term.
The term “4K” originally derives from the Digital Cinema Initiatives (DCI), a consortium of motion picture studios that standardized a spec for the production and digital projection of 4K content. In this case, 4K is 4,096 by 2,160, and is exactly four times the previous standard for digital editing and projection (2K, or 2,048 by 1,080). 4K refers to the fact that the horizontal pixel count (4,096) is roughly four thousand. The 4K standard is not just a resolution, either: It also defines how 4K content is encoded. A DCI 4K stream is compressed using JPEG2000, can have a bitrate of up to 250Mbps, and employs 12-bit 4:4:4 color depth. 
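
Those DCI figures invite a quick back-of-the-envelope check: how hard does JPEG2000 have to squeeze the picture to fit under that 250Mbps ceiling? A sketch, assuming the usual 24fps cinema frame rate (the frame rate is an assumption; everything else comes from the numbers above):

    width, height = 4096, 2160
    bits_per_channel = 12     # 12-bit colour, per the DCI spec
    channels = 3              # 4:4:4 means no chroma subsampling
    fps = 24                  # assumed cinema frame rate

    raw_bps = width * height * bits_per_channel * channels * fps
    print(f"Uncompressed: {raw_bps / 1e9:.2f} Gbps")        # ~7.64 Gbps
    print(f"Compression needed: ~{raw_bps / 250e6:.0f}:1")  # ~31:1

In other words, even a maximum-bitrate DCI stream is compressed by a factor of roughly 30.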


Ultra High Definition, or UHD for short, is the next step up from what's called Full HD, the official name for the 1,920-by-1,080 display resolution. UHD quadruples that resolution to 3,840 by 2,160. It's not the same as the DCI 4K resolution defined above — and yet almost every TV or monitor you see advertised as 4K is actually UHD. Sure, there are some panels out there that are 4,096 by 2,160, which works out to an aspect ratio of 1.9:1. But the vast majority are 3,840 by 2,160, for a 1.78:1 aspect ratio.

[Diagram: the relative image sizes of 4K vs. 1080p — though, strictly speaking, the frame labelled 4K here is really UHD, or 2160p.]

Why not 2160p?

Now, it's not as if TV manufacturers aren't aware of the difference between 4K and UHD. But presumably for marketing reasons, they seem to be sticking with 4K. To avoid clashing with the DCI's actual 4K standard, some TV makers use the phrase "4K UHD", though some just use "4K".


To make matters more confusing, UHD is actually split in two — there’s 3,840 by 2,160, and then there’s a big step up, to 7,680 by 4,320, which is also called UHD. It’s reasonable to refer to these two UHD variants as 4K UHD and 8K UHD — but, to be more precise, the 8K UHD spec should probably be renamed QUHD (Quad Ultra HD). (Read: 8K UHDTV: How do you send a 48Gbps TV signal over terrestrial airwaves?)

The real solution would have been to abandon the 4K moniker entirely and instead use the designation 2160p. Display and broadcast formats have always been named for their number of horizontal lines, with the letters "i" and "p" denoting interlaced scanning, which skips every other line, and progressive scan, which doesn't: 576i (PAL), 480i (NTSC), 576p (DVD), 720p, 1080i, 1080p, and so on.
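
For reference, here's a small sketch mapping those names to pixel dimensions and scan types. The PAL/NTSC widths assume the common 720-pixel digital sampling of those analogue formats, which is a simplification:

    # name -> (width, height, scan); "i" = interlaced, "p" = progressive
    BROADCAST_FORMATS = {
        "576i (PAL)":  (720, 576, "interlaced"),
        "480i (NTSC)": (720, 480, "interlaced"),
        "576p (DVD)":  (720, 576, "progressive"),
        "720p":        (1280, 720, "progressive"),
        "1080i":       (1920, 1080, "interlaced"),
        "1080p":       (1920, 1080, "progressive"),
        "2160p (UHD)": (3840, 2160, "progressive"),
    }

    for name, (w, h, scan) in BROADCAST_FORMATS.items():
        print(f"{name}: {w}x{h}, {scan}")

Seen as a list like this, it's obvious why "2160p" would have been the consistent next name.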

Now that there are 4K TVs everywhere, it would take a concerted effort from at least one big TV manufacturer to right the ship and abandon use of 4K in favor of UHD and 2160p. In all honesty, though, it’s too late. That said, the more important problem isn’t really the name; it’s where in the heck we can all get some real 4K content to watch. So far, it’s appearing in dribs and drabs on services like Netflix, Amazon Instant Video, and some proprietary hardware and software products from Sony. That’s not yet enough for 4K to really take off.



What is a QLED or QDLED?

QD-LED, or QLED, is considered a next-generation display technology, following on from OLED displays.
QLED stands for quantum-dot light-emitting diode: a light-emitting technology built around nano-scale semiconductor crystals that offers an alternative for applications such as displays. The structure of a QLED is very similar to that of an OLED, but the light-emitting centres are cadmium selenide (CdSe) nanocrystals, or quantum dots. A layer of cadmium-selenide quantum dots is sandwiched between layers of electron-transporting and hole-transporting organic materials. An applied electric field causes electrons and holes to move into the quantum-dot layer, where they are captured by a quantum dot and recombine, emitting photons. The resulting emission spectrum is narrow, characterised by its full width at half maximum (FWHM).
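
Since so much of the quantum-dot pitch rests on that narrow spectrum, here's a minimal sketch that models an emission peak as a Gaussian and reads off its full width at half maximum. The centre wavelength and width are made-up illustrative values, not measured data (real CdSe dots emit with FWHMs on the order of tens of nanometres):

    import numpy as np

    # Illustrative red emission peak: centre 620nm, sigma 13nm (assumed values).
    wavelengths = np.linspace(550, 700, 3001)  # nm
    centre, sigma = 620.0, 13.0
    intensity = np.exp(-0.5 * ((wavelengths - centre) / sigma) ** 2)

    # FWHM: width of the region where intensity is at least half the peak.
    above_half = wavelengths[intensity >= 0.5 * intensity.max()]
    fwhm = above_half.max() - above_half.min()
    print(f"FWHM ~ {fwhm:.1f} nm")  # analytically 2*sqrt(2*ln2)*sigma, ~30.6nm

The narrower that width, the purer the colour, which is exactly why quantum dots are prized for displays.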



There are two major fabrication techniques for QD-LEDs, called phase separation and contact printing.
QLEDs offer a reliable, energy-efficient, colour-tunable solution for display and lighting applications that reduces manufacturing costs while employing ultra-thin, transparent, or flexible materials.

QDLED doesn’t sound so good, though, so many have taken to dropping the ‘D’. Hence, QLED.

Name-changing aside, QLED is essentially a new variation on the quantum-dot TV technology that headlined Samsung’s 2016 SUHD (Super UHD) range. 

We’ll explain what’s actually new when it comes to QLED in a moment, but first, let’s take a quick look at why quantum dots are such a prized asset for televisions. 



QLED advantages:

◾Pure color — Quantum dots deliver a 30-40% luminance efficiency advantage over organic light-emitting diodes (OLEDs) at the same color point.
◾Low power consumption — QLEDs have the potential to be more than twice as power-efficient as OLEDs at the same color purity (see the rough sketch after this list).
◾Low-cost manufacture — The ability to print large-area QLEDs on ultra-thin flexible substrates will reduce manufacturing costs.
◾Ultrathin, transparent, flexible form factors — QLEDs will enable designers to develop new display and lighting forms not possible with existing technologies.
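
As a rough illustration of the power-consumption claim above (the 120W baseline is a hypothetical example figure, not a measurement of any real panel):

    oled_panel_watts = 120.0   # hypothetical OLED panel power draw
    efficiency_gain = 2.0      # "more than twice as power efficient"
    qled_panel_watts = oled_panel_watts / efficiency_gain
    print(f"Same brightness at ~{qled_panel_watts:.0f}W instead of {oled_panel_watts:.0f}W")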



The advantage of quantum dots is that they're capable of emitting brighter, more vibrant, and more diverse colours – the sort of colours that really make HDR content shine, thanks to the high peak brightness that can be achieved.

Quantum-dot TVs are also considered to be more cost-effective to manufacture than OLED ones, meaning consumers can enjoy comparable or even better picture quality at a lower price. That’s never a bad thing. 


A potential sacrifice when it comes to quantum dots vs OLED is contrast ratio, with OLED displays generally thought to produce the deepest blacks – another holy grail for the 2017 AV enthusiast.

But what if someone found a way to combine the best of both worlds…


Quantum dot vs OLED vs QLED

So, we now know that quantum dots are great for brightness, and OLED typically comes up trumps for truly black blacks. Where exactly does QLED fit in?

QLED technology swaps the photoluminescent quantum dots found in current quantum-dot TVs for electroluminescent nanoparticles. That means light can be supplied directly by the display itself rather than via an LED backlight, which can distort the purity of colours – specifically the blacks we're constantly harping on about.

If that sounds a lot like the way light is handled in OLED TVs, that's because it is. Rather than requiring a separate backlight for illumination, a QLED TV natively controls the light emitted by individual pixels, so similarly impressive contrast ratios should be possible.


QLED, therefore, theoretically combines the best of quantum dot and OLED technology – the clarity and deep blacks of OLED, the superior brightness and colour gamut of quantum dots – and results in a package that could boast a 30-40% luminance efficiency advantage, as well as helping lower power consumption.


We must stress, however, that QLED isn’t just OLED TV with quantum dots – it’s an entirely new technology that promises the benefits of both.
