Overblown quantum dot conspiracy theories make important points about QLED TVs

equals42

Ars Scholae Palatinae
1,216
Subscriptor++
Heck go back even further. Remember the whole 5G branding debacle?
That’s what it reminded me of. It’s happened with every “G” since at least 4G/LTE, when telecoms that are slower to adopt new networks dilute the terms. My 3G UMTS iPad suddenly became “4G” back in the day, whereas my LTE phone on a different carrier was actually 4G with LTE. Maybe my moderately priced patio TV can suddenly become “Q”?
 
Upvote
9 (9 / 0)
Not all QDs are the same. The color purity is a function of the manufacturing tolerance, because the color they emit is related to the size of the dots. That is why you see variations in color gamut. QDs also only down-convert from a shorter wavelength to a longer wavelength, so combining phosphor and QDs doesn’t make any sense… except to marketers.
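The size-to-color relationship that comment describes can be sketched with the simplest form of the Brus confinement model. This is an illustrative toy calculation, not anything from the comment; the CdSe material constants are rough literature values and the Coulomb correction is ignored:

```python
# Illustrative sketch: the Brus model approximates how a quantum dot's
# emission energy, and hence its color, depends on dot radius. Constants
# below are approximate values for CdSe, assumed for illustration only.
import math

HBAR = 1.0546e-34      # reduced Planck constant, J*s
M0 = 9.109e-31         # electron rest mass, kg
EV = 1.602e-19         # joules per electron-volt
HC_EV_NM = 1240.0      # photon energy <-> wavelength conversion, eV*nm

E_BULK_CDSE = 1.74     # bulk CdSe band gap, eV (approximate)
ME, MH = 0.13, 0.45    # effective electron/hole masses, in units of M0

def emission_nm(radius_nm: float) -> float:
    """Approximate emission wavelength of a CdSe dot of the given radius.

    Uses only the particle-in-a-sphere confinement term of the Brus
    equation; Coulomb and polarization corrections are ignored.
    """
    r = radius_nm * 1e-9
    confinement_j = (HBAR**2 * math.pi**2) / (2 * r**2) * (1/(ME*M0) + 1/(MH*M0))
    total_ev = E_BULK_CDSE + confinement_j / EV
    return HC_EV_NM / total_ev

# Bigger dots -> smaller confinement energy -> longer (redder) wavelength,
# which is why tight size tolerances translate directly to color purity.
for r in (2.0, 3.0, 4.0):
    print(f"radius {r:.1f} nm -> ~{emission_nm(r):.0f} nm")
```

Even this crude model shows why a sloppy size distribution smears the emission peak: a 1 nm change in radius shifts the output by roughly 100 nm of wavelength.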

It’s very easy to measure a TV; in fact, they need to do that to calibrate it. So they could just print out the measurement.

Buy a TV that has some sort of certification that requires a minimum level of performance and independent testing to ensure compliance; for example, one with a UHD Alliance certification. Buying a TV that has a certified Filmmaker Mode is also a good idea, because that means you get at least one well-calibrated setting. Even if you don’t like that setting, it should at least make it easier to calibrate the other settings to be more accurate.
 
Upvote
13 (13 / 0)

sunnysocal

Ars Praetorian
472
Subscriptor
I am so fearful of the day I need to replace my nearly 15-year-old Panasonic 1080p plasma. It’s got a huge bezel and only partially supports CEC, but it’s dumb as a box of rocks and is ‘just a TV’. Whatever replaces it will just have an Apple TV plugged into it anyway, but I already hate having to think about OLED, QLED, LED, mini-LED, quantum dots, nits, and especially whether I’ll have to worry about it always pulling updates, displaying ads, collecting data, etc.

There’s a non-zero chance we just keep a small, dumb existing projector instead of wading into this mess. 😂
I moved from Panasonic plasma to LG OLED maybe five years ago... I'd had the plasma display for years, and it had performed well, but the new OLED took my breath away. And it still looks great today.

Use an Apple TV, don't connect whatever you buy directly to the Internet, and you will be very happy with a modern display regardless of what acronym the manufacturer slaps on it.
 
Upvote
10 (10 / 0)

Deleted member 567875

Guest
I assumed quantum dots were marketing speak for leds and ignored it.
While it is definitely taken advantage of in marketing, in general it should mean a TV with a high level of BT.2020 coverage that is more power efficient than a non-QD set of similar brightness, due to less light from the backlight being lost during the conversion to white.
 
Upvote
3 (3 / 0)

demonbug

Ars Scholae Palatinae
802
Subscriptor
It's about fucking time. Confusing marketing bullshit has been a staple of the TV industry for far too long:
  • and going back all the way to "flat screen TVs" being chonky CRTs with a flat front (instead of the thin LCD and plasma screens people actually wanted.)
Mostly agree with your point, but flat-screen CRTs were a major, meaningful improvement before LCDs and plasma screens were around. Cheaper CRT TVs, particularly as you went up in size, had heavily curved screens that had all sorts of distortion, especially when viewing from off-axis. Something like the Sony Trinitron flat-screen CRTs were much nicer and a significant step forward (at the cost of additional weight from the thick glass, and two faint lines from the reinforcing wires needed to make a flat front on a vacuum tube).
 
Upvote
18 (18 / 0)

StikyPad

Ars Scholae Palatinae
702
This is just another example of why I would never buy something based on marketing alone (if at all). Sure, maybe you can join a class action and, after 5 years, get a 25% coupon off your next purchase from the company that defrauded you, but it's so much easier to just consult independent reviews to start with; preferably several.
 
Upvote
4 (4 / 0)

FishInABarrel

Ars Praetorian
408
Subscriptor
TVs have been "good enough" for the vast majority of customers for a long time; they're a commodity product now. Since they pretty much all look great in stores, the brands are desperately resorting to marketing bullshit to pull in buyers.

That should mean it's a great time to buy a TV. They're pretty much all great and cheap. But the line must go up, so now they're enshittifying the product with ads inserted everywhere.
 
Upvote
5 (7 / -2)

Got Nate?

Ars Scholae Palatinae
1,376
From 2003 to 2009, we were investigating the use of quantum dots for protein tagging and other diagnostic medical applications. It got nowhere, because the promises and capabilities of quantum dots were oversold and ultimately disappointing. That was for the medical industry, which is absolutely huge $$. I question the consumer electronics industry’s use of quantum dots and their capabilities, as much research was abandoned back in the day. It’s always possible that they got “good enough” for use in televisions with some minor improvements.
 
Upvote
3 (3 / 0)

benwaggoner

Ars Praefectus
4,102
Subscriptor
The name QLED itself seems to me like an attempt to piggyback on the popularity of OLED. It's easy for people to get two so similar looking acronyms confused. Again an example of dishonest marketing practices.
That's really an oversimplification. QLEDs can have performance in the ballpark of OLED. The color fidelity/gamut can be the same. And LEDs can run brighter than OLED, so in a brighter room a QLED can provide better fidelity with bright content after ambient light compensation than an OLED can. The tradeoff is that local contrast is lower, as you don't have the sub-pixel-level brightness control of OLED. Mini LED is getting ever closer, though. For most content that doesn't have really sharp edges between bright and dark pixels, either is great. But OLED performs better for things like really bright single-pixel stars on a deep black sky.

Put another way, QLED is great for Marvel movies and Oppenheimer, while OLED shines with The Expanse and Interstellar (if watching in a darker room).

QLED makes sense as a variation of LED; QD-LED is awkward, as is QD-OLED, which is why the shorter versions tend to be used.

Now, calling something QLED that doesn't actually leverage quantum dots in its image and only provides non-QD performance is another thing entirely. I can't speak to what any manufacturer is or isn't doing.

As the article points out repeatedly, the real question is the display's performance, not how they get there, and RTings is a great source for that kind of analysis. And I question targeting just P3 instead of the broader Rec. 2020 color gamut, as we have displays and content that push significantly beyond what P3 does. (SDR uses Rec. 709/sRGB, digital cinema uses P3, and home video HDR mainly uses 2020).
 
Upvote
3 (4 / -1)
I am firmly of the opinion that while you may see a difference in picture quality in the store with two TVs side by side, once you get the thing home and mounted on the wall, unless you are an eagle-eyed videophile it really makes no difference at all.

I'll be buying an 85" TV for my new house sometime soon, and chances are I will buy whatever name brand is on sale for the best price in the $800-$1000 range. I know too many people who have bought the really cheap TCL/Hisense/etc. TVs and had them die shortly after the warranty period to go that cheap. My Samsung, Sony, and LG TVs have lasted forever - my mother is still using the original Samsung 55" LED TV that I bought the best part of 15 years ago when that tech first came out. That's more important to me than the last nth of picture quality anyway.
Most display TVs in stores are seriously cranked up on all settings to make them pop to your eyes. The color accuracy is nowhere near correct. This is not a new practice.
 
Upvote
16 (16 / 0)
QLED is so close to OLED that it confuses casual shoppers, or grandpa who just wants a new TV set for the living room.

The "quantum" part is what annoys the heck out of me. Marketers think slapping "quantum" onto anything makes the price double. The problem is that, from a physics point of view, they're correct: quantum dots use quantum properties of certain nanoparticles to absorb and re-emit light at the desired wavelength.
You think that's bad. Wait until we have AI QD's that will work better than before.
 
Upvote
2 (2 / 0)

SraCet

Ars Legatus Legionis
16,817
TVs that use QDs are supposed to offer wider color gamuts ... over their QD-less LCD-LED counterparts.

Okay... but, are they? As in, are they supposed to do that?

Before quantum dots, we had white LED backlights and a color filter to ensure that the correct color of light was being shown through the appropriate LCD pixel.

With quantum dots, we have blue LED backlights and the color "filter" layer is made of quantum dots that change the color of the blue light so that it's the color it's supposed to be for any particular pixel.

Either way, you get the correct color of light being shown behind an LCD pixel.

The efficiency gain from quantum dots is obvious. Instead of light being blocked, it's being emitted. Big efficiency win.
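A back-of-the-envelope sketch of that efficiency argument; the percentages here are illustrative assumptions, not measured figures for any real panel:

```python
# Rough comparison (illustrative numbers only): a white backlight with RGB
# color filters throws away roughly two-thirds of the light at each subpixel,
# since each absorptive filter passes only its own band. A QD layer instead
# converts blue light to red/green, losing mainly conversion inefficiency
# plus the energy given up in the blue-to-red/green Stokes shift.

def filter_efficiency(pass_fraction: float = 1/3) -> float:
    """Fraction of backlight output surviving an absorptive color filter."""
    return pass_fraction

def qd_efficiency(conversion_yield: float = 0.9, stokes_loss: float = 0.15) -> float:
    """Fraction surviving QD down-conversion: quantum yield times the
    energy retained after the Stokes shift."""
    return conversion_yield * (1 - stokes_loss)

print(f"filter path: ~{filter_efficiency():.0%} of light reaches the pixel")
print(f"QD path:     ~{qd_efficiency():.0%} of light reaches the pixel")
```

Under these assumed numbers the QD path keeps more than twice the light, which is the "instead of blocked, it's emitted" point in a nutshell.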

But what's the reason to believe that QD displays would offer any wider color gamut?

Note that plenty of professional-grade computer monitors offer excellent color gamut and volumes without quantum dots.
 
Upvote
2 (3 / -1)

SraCet

Ars Legatus Legionis
16,817
Most display TVs in stores are seriously cranked up on all settings to make them pop to your eyes. The color accuracy is nowhere near correct. This is not a new practice.
Not sure if this is true anymore. Modern TVs tend to all have excellent color accuracy. The trend now is to display looped demo video in stores that is designed to make the TV look impressive rather than inaccurate.
 
Upvote
-3 (1 / -4)

benwaggoner

Ars Praefectus
4,102
Subscriptor
It's about fucking time. Confusing marketing bullshit has been a staple of the TV industry for far too long:
  • QD displays not actually containing QDs,
That sounds like false/misleading advertising, not just confusing terminology. QDs are real and are used to great effect in lots of great TVs. We should be talking about them when they meaningfully contribute to picture quality.

  • "mini-LEDs" being LCDs with a handful of LEDs in the backlight (chosen to be confused with micro-LED),
Mini LEDs are really one LED per zone; they're generally a lot closer together than the old "zone lit" or "full array local dimming" TVs were. All of which are better than the cheap edge-lit and global-backlight TVs. Those worked okay for SDR but can't do good HDR. Among other things, as they draw power and produce heat proportional to the brightest pixel on the screen at the time, they can't get nearly as bright in a portion of the screen as OLED or mini LED. Otherwise they'd blow the power budget or cook themselves to death. A good QD-OLED can make 1% of the screen ~10x as bright as a global-backlight TV. Which is really all you need; those bright pixels are best used for specular highlights and other small details; a full screen of 4,000 nits is like staring at the inside of a tanning bed.

I'd argue it's really impossible to do HDR worthy of the name with a global-backlight TV. You can get the more saturated colors, but very little of the greater contrast and dynamic range.

  • "QLED" being LCDs with some QDs somewhere (chosen to be only a tiny dash away from OLED),
QLED is pretty much always at least full array backlight plus quantum dots.
  • "LED" being LCD with LED instead of CFL tube backlight (also to be confused with OLED),
LED was the original term; OLED came later (organic light emitting diode). No intentional confusion there. And in modern practice, the practical difference is that OLEDs are smaller but can't get as bright.
  • "HD TV" being the lower res 720p standard (to be confused with actually for real "Full HD"),
720p IS HD, going back to the launch of ATSC 1.0. And it was the better HD for fast-moving content like sports, as ATSC could do 720p60 or 1080i30. And that's what lots of streaming content and TVs were. We don't really talk about HD versus Full HD anymore, as there aren't many 720p-only TVs left.

  • and going back all the way to "flat screen TVs" being chonky CRTs with a flat front (instead of the thin LCD and plasma screens people actually wanted.)
You've got your history backwards there. The flat (and "flatter") screen CRTs were available and desired for many years before LCD or plasma. Compare a mid-'50s and a mid-'80s CRT; screens got much flatter. This was especially important for computer monitors, so you didn't get distortion at the edges.

Bullshit is hardwired in the collective marketing brain, and it's time for a lobotomy.
I don't think there is nearly as much bullshit versus "trying to communicate complex technical features as something that can be printed on a cardboard box" going on. The major TV companies ship devices with the features they say they have.

Sheesh, trying to communicate a critical feature like "psychovisual accuracy of tone mapped color volume across common ambient light conditions" to a customer is daunting, and that's what really makes a TV look good in a living room. Even explaining it to people who work in the business is incredibly challenging unless they're an actual image scientist. It has, literally, taken me three whiteboards, four dry erase marker colors, some YouTube videos, and interpretive dance to do even a mediocre job of it.
 
Upvote
12 (13 / -1)

benwaggoner

Ars Praefectus
4,102
Subscriptor
Realistically speaking the vast majority of folks buying TCL screens (probably most screens?) are maximizing inches per $ and picture quality beyond a minimum doesn't factor into the decision (see: every AirBnB I visit). Folks on Ars actually paying attention to the underlying tech are in the tiny, tiny minority.
As someone who just spent 10 days in hotels and hospital rooms, having all the TVs locked into Vivid Mode almost makes the panel quality irrelevant. On the occasions I happened to be traveling with a factory remote and could fix it, I could watch hotel TV. Otherwise I just use my MacBook Pro.

They somehow even disable Filmmaker Mode on hotel TVs.
 
Upvote
3 (4 / -1)

Granadico

Ars Scholae Palatinae
1,161
With the value proposition of "best screen size per dollar in an aisle display at Walmart," brands like TCL became dominant. They've even invested the engineering R&D to compete at the mid-to-high end; even in this article there are examples of models with image quality nearly as good as the top brands, but at half the price.

And TBH even with shitty deceptive marketing practices, with the advances in technologies most people buying the Walmart Black Friday Special are getting image quality as good or better than the 20-year old 32" LCD or 30-year old CRT they're replacing. So they're happy and they keep buying.
This is the really depressing part. I get that not everyone is a huge film nerd and cares about IQ and motion blur and stuff like that, but the amount of people I know who have like 80" TVs that look godawful and have never changed the settings and just think that the size of the TV is the only thing that matters is depressing. Basically everyone I've told how I paid $1K for my 42" OLED thinks I'm crazy because they can find a 42" TV at Walmart for like $200 or something. It probably isn't readily apparent to people how vast the quality difference is unless the screens are side by side, or again they just don't care because they're half on their phone watching the same movie for the past 20 years.
 
Upvote
9 (9 / 0)

benwaggoner

Ars Praefectus
4,102
Subscriptor
Backlights and heat are my main concern for a TV (rarely used for watching TV these days). Picture is now in second place; I mean, sure, a nice clear video is great, of course.
Backlights are really part of picture quality; you can't really separate them.

My Samsung's backlight (probably edge-lit) has been bad for a few years: two bright spots in the upper left. A few consumer product review organizations point out that backlight style and heat are big causes of failure, and recommend limiting brightness in the TV settings to slow down light failure or heat damage.
How old is the TV? Two bright spots on the upper left could be a zone failure as well.

But 100% about heat being a big issue. A key reason why OLED and mini LED backlights are able to deliver so much local brightness at the same power levels is because it can spend power and make heat only where it is needed. An edge lit display can't, so the power and heat produced is that required by the single brightest pixel on screen. Thus that pixel can't get anywhere near as bright.

And the smaller the illumination zones, the finer that control gets.

Local heat can be an issue too; plenty of TVs won't sustain peak local brightness for more than 30 seconds or so due to heat reasons.
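The power argument above can be sketched numerically. This is a toy model with made-up zone counts and brightness levels, purely to show why per-zone dimming frees up the power budget:

```python
# Toy model of backlight power for one frame. A global backlight must run
# the whole panel at the frame's single brightest level; a zoned backlight
# only drives each zone as hard as that zone's brightest pixel requires.
# Zone counts and levels are illustrative, not from any real TV.

def global_backlight_power(zone_peaks, zones_total):
    """Global backlight: every zone runs at the overall peak level."""
    return max(zone_peaks) * zones_total

def zoned_backlight_power(zone_peaks):
    """Local dimming: each zone runs only as bright as it needs to."""
    return sum(zone_peaks)

# A mostly dark frame with one small specular highlight
# (per-zone peak levels on a 0..1 scale, 100 zones):
frame = [0.05] * 99 + [1.0]

print("global:", global_backlight_power(frame, len(frame)))  # 100.0
print("zoned: ", zoned_backlight_power(frame))
```

With these assumed numbers, the zoned design spends roughly 6% of the global design's power on the same frame, which is exactly the headroom that lets it push a small highlight far brighter without blowing the power or thermal budget.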

And for those worried about TV lifespan and OLED burn-in, controlling light, heat, and color is key. I've been using an LG C1 as my daily-driver PC/Mac work monitor for getting on four years, and it still looks perfect. I run it in Filmmaker Mode (which makes it act like a monitor, just showing the pixels the GPU makes without motion smoothing, extra sharpening, etcetera). It also uses the proper 6500K color temperature, while many TVs default to the old Japanese broadcast standard of 9300K, which is quite a bit bluer/cooler. Since it's the blue OLEDs that age the fastest, correct color calibration extends their lifespan. And I use ambient light compensation, so the TV brightness goes down when my room is darker, and I keep my lights dim if I'm mainly using the monitor. That also puts much less wear and tear on the panel (and on my eyes!) than just leaving it at default brightness all the time.
 
Upvote
7 (7 / 0)

Fatesrider

Ars Legatus Legionis
24,977
Subscriptor
Quantum enshittification.

Unless there are hard, enforced standards, marketing folks will always rob technology-brand credibility. If ABC expensive-TV corp has spent billions branding a technology, a marketing manager at XYZ discount TV corp is all too willing to take advantage. They're compensated based on the sales they drive to a given unit ... why would they care if the value consumers ascribe to the technology diminishes? That's someone else's problem when said marketing manager has cashed a few years of bonuses and moved on to their next job.

See also: "AI" (even before the genAI craze, folks were still slapping 'AI' on things that were effectively a few sets of if-then logic)
This doesn't help, either:
“These quantum dots are nano-sized molecules that emit a distinct colored light of their own when exposed to a light source,” TCL says.
What the fuck are "nano-sized molecules"? That's marketing, not technology. Empty contemporary, techy-sounding buzz-words that describe nothing.

I mean, I get that marketing is all about breathless obfuscation and elevating excitement for a product no one would buy should any acceptable amount of truth about it actually be included. It's all built on begging the question and argument by gibberish logical fallacies - such as the above cited quote.

The whole fucking "quantum" LEDs thing is just an overly hyped way of saying something exciting without actually relaying any information about it. You could call them "dancing angels LEDs" and MEAN the same thing, but that's not techy enough to create much engagement. And it'd undermine the mystery of what the tech is.

Bullshit tiny things that are so expensive, it's more profitable to put in a lot fewer of them (if any) and still tack on that name, because the name is just a name and describes nothing.

So just more Madison Avenue misdirection.

Then again, by this time, if anyone actually believes advertising, then it's on them when, not if, that advertising misleads them. Those who think truth in advertising is a thing don't get how far the rules can stretch to accommodate selling something. It seems that as long as no one is killed, nothing will happen to companies that stretch the truth more than a singularity stretches incoming matter.

And even then, sometimes nothing continues to happen for an unacceptably long period of time.

In the end, they just want a screen that lasts long enough in a showroom to be practical as a comparison for consumers and that's about it. Anything more than that is wasted profit.
 
Upvote
0 (3 / -3)

ruet

Ars Praefectus
3,285
Subscriptor
Some of them certainly are. If you do the slightest bit of research about features and image quality, you can pick out a TCL model that's 90%-99% as good as a mid-upper range Samsung, LG, Sony etc. model for substantially cheaper.

But you can also get total crap models from TCL and other discount brands. The Black Friday Specials and supermarket-only models will be substantially worse than anything bought in the last 10-20 years by someone who's even slightly picky about image quality.
I did my research on the 50" and it performed as expected. I bought the 65", in the same model line, based on my experience with the 50". It worked out. I bought the 40" because it was the cheapest 40" I could get my hands on. It's... fine. 40" is, or was when I made my purchase, a strange product segment for TVs.
 
Upvote
2 (2 / 0)

benwaggoner

Ars Praefectus
4,102
Subscriptor
What I gathered was that presence of quantum dots is largely irrelevant and we should be judging purely on color reproduction.
Yes, we should be judging based on accuracy of the picture (which is both brightness and color, which interact in some non-obvious ways).

Quantum dots, when actually used, are a powerful tool for producing highly saturated colors. Good TVs use them as advertised. The issue here is whether some manufacturers are labeling TVs as QD when they don't contain enough QDs to actually deliver any benefit.

But a good picture quality test will catch those issues.

But it won't catch everything important about a TV. Often reviews only capture how accurate the TV is in reference dim ambient light. The biggest advantage of LED is that it can remain psychovisually accurate in brighter environments than a similarly priced OLED. Ambient light compensation is really important, but rarely quantified.

Local contrast (being able to have a very bright and a very dark pixel right next to each other) isn't always evaluated either, and that's a key advantage of OLED. Getting ever-smaller zones has been a key area of innovation in LED TVs (with one LED per pixel as the ultimate implementation), and I've not seen much quantification of how different zone sizes and light-bleed mitigation technologies affect the picture.
 
Upvote
3 (3 / 0)

benwaggoner

Ars Praefectus
4,102
Subscriptor
lol, and there I thought QD stood for "quad definition," aka better than typical HD resolution. And yeah, whatever lawsuit Samsung throws at TCL will not make their overpriced TVs worth buying, especially when you can barely see any difference in the store except the 4-5x price tag.
Thinking of Quad HD? (W)QHD is 2560x1440 at 16:9. Or perhaps quarter HD? qHD is 960x540. Need to keep your capitalization straight!

If people think TV tech nomenclature is confusing, enjoy VESA naming. Quick, what's the difference between QWXGA, WQUXGA, and QSXGA+?
 
Upvote
3 (3 / 0)
I am so fearful of the day I need to replace my nearly 15-year-old Panasonic 1080p plasma. It’s got a huge bezel and only partially supports CEC, but it’s dumb as a box of rocks and is ‘just a TV’. Whatever replaces it will just have an Apple TV plugged into it anyway, but I already hate having to think about OLED, QLED, LED, mini-LED, quantum dots, nits, and especially whether I’ll have to worry about it always pulling updates, displaying ads, collecting data, etc.

There’s a non-zero chance we just keep a small, dumb existing projector instead of wading into this mess. 😂
Bought an LG OLED G series a few years ago as an upgrade to the Panasonic. It is definitely noticeable as a better TV in a lot of ways. It is not connected to a network, just various inputs: a BD player and an Apple TV, primarily.
 
Upvote
1 (1 / 0)

SraCet

Ars Legatus Legionis
16,817
...
  • "mini-LEDs" being LCDs with a handful of LEDs in the backlight (chosen to be confused with micro-LED),
I think maybe you're the only person who has been confused by this. No "normal" people know what micro-LED displays are.

Mini-LEDs are physically much smaller than the LEDs that are typically used in non-mini-LED backlights so I'm not sure what else they could be called other than "mini-LED."
 
Upvote
0 (0 / 0)

SraCet

Ars Legatus Legionis
16,817
...
Local contrast (being able to have a very bright and very dark pixel right next to each other) isn't always evaluated either, and that's a key advantage of OLED. Getting ever smaller zones has been a key area of innovation in LED TVs (with the single-pixel miniLED the ultimate implementation), and I've not seen much quantification of the impact of different zone size and light bleed mitigation technology impact.
Rtings.com is what you're looking for.
 
Upvote
2 (2 / 0)
Remember the time when processors could just be evaluated by their speed? And marketing decided to make different brands of the same thing to improve the targeting of their price gouging.
No, and neither do you.

It's never been possible to compare the performance of two systems strictly by the speed of the central processor unless you knew that all other relevant technical standards had been held constant.

The old "MHz myth" -- the idea that the Pentium 4 was drastically faster than lower-clocked CPUs made by competitors (or Intel's own Pentium III) purely on the basis of clock -- was recognized as a false marketing claim 21 years ago, but the issue far predates the Pentium 4. Other factors, including bus width, bus clock, RAM latency, memory bandwidth, and whether or not a system used features like DMA to avoid loading the CPU, have always had significant impacts on performance, over and above clock speed.

A 486 equipped with a math co-processor could run rings around an 80386 that lacked one, even if the 386 was running at a higher clock speed. The first Pentium (without MMX) was slower than the later Pentiums (with MMX) at the same clock speed, because Intel used the MMX extension launch to improve other aspects of the P5 microarchitecture.

CPU performance has never been something you could evaluate strictly with reference to clock speed. Not in the 1980s and not today.
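That point reduces to the textbook relation that throughput is roughly clock speed times instructions per cycle (IPC). A tiny sketch with made-up figures, not real CPU specs, shows a lower-clocked chip winning:

```python
# Toy comparison (illustrative figures only): effective throughput is clock
# speed times IPC, so a higher clock by itself proves nothing about speed.

def throughput_mips(clock_mhz: float, ipc: float) -> float:
    """Millions of instructions retired per second."""
    return clock_mhz * ipc

# A long-pipeline, high-clock design vs. a shorter-pipeline, higher-IPC one:
deep_pipeline = throughput_mips(clock_mhz=3000, ipc=0.8)
wide_core     = throughput_mips(clock_mhz=2000, ipc=1.5)

print(deep_pipeline < wide_core)  # True: the lower-clocked chip is faster
```

This is exactly the Pentium 4 situation: its long pipeline bought clock headroom at the cost of IPC, so the bigger MHz number didn't translate into proportionally more work done.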
 
Upvote
12 (12 / 0)

SraCet

Ars Legatus Legionis
16,817
No, and neither do you.

It's never been possible to compare the performance of two systems strictly by the speed of the central processor unless you knew that all other relevant technical standards had been held constant.

The old "MHz myth" ...
I don't know if the OP was referring to MHz when he said "speed."

Intel has been differentiating its i3/i5/i7 products along several axes other than the pure CPU performance customers will actually experience: virtualization features most customers won't care about, efficiency features (Turbo Boost, SpeedStep) that are largely irrelevant to desktop users, and so on.
 
Upvote
0 (0 / 0)