
TV brightness wars: how bright does your TV need to be?

If you’re really into the latest TV tech, the fact that Samsung, LG, Sony, Philips, TCL, Hisense, and Vizio are all caught up in a TV brightness battle isn’t exactly news. But if you’re just getting into some TV research and you’re starting to read and hear about nits, and how this TV is brighter than that TV, and you’re getting this feeling that brightness is some sort of yardstick for how good a TV is, then you might rightly wonder: How bright is bright enough? How many nits do I need?

So, let’s talk about TV brightness – why it’s important, how important it really is, and when enough is enough – or if it ever will be.

The nitty-gritty

Nits are a measurement of brightness; one nit is one candela per square meter. When I measure a TV’s light output using a spectrophotometer and/or colorimeter (I personally use an X-Rite i1Publish Pro 2 and a Portrait Displays C6), the result I get is expressed in nits. So that’s the word I use whenever I talk about how bright a TV can get. You may hear that one TV’s peak brightness is 1,000 nits, but another can push up to 1,500 nits. Without knowing your nits from a 9-inch nail, you can probably surmise that the higher number is more desirable. And, on balance, it is. But I’ll dig into that some more in a moment.


Now let’s talk about why brightness is valuable. What does brightness do for an image that makes us say, “That’s good! My eyes love to see it!”?

A man measures TV brightness using a spectrophotometer. Measuring brightness is more exciting than it looks. I promise.

Brightness helps us in two meaningful ways. First, it makes images generally easier to see — a bright TV image is easier to see in an already bright room. Also, brightness is on the positive end of contrast, and contrast is the most easily recognizable aspect of picture quality to the human eye. Your eyes are way better at perceiving contrast than they are at perceiving, say, color fidelity. The untrained eye is far less likely to spot that a shade of red doesn’t look quite right than it is to notice whether the image really pops off the screen.

When you can have bright, sparkling elements on a screen commingling with dimmer or darker elements, you get contrast. And high-contrast images are generally very pleasing. Exciting, even!

SDR — the old standard

Brightness in today’s TVs means something very different than it did just nine or 10 years ago. That’s because in 2014 we started getting HDR — or high dynamic range — in both our TVs and the video itself, which dramatically changed how TVs use brightness.

Before we had HDR, we made do with standard dynamic range. We were quite happy with it, but I’d argue that’s because we didn’t know what we were missing. The SDR standard was developed around the aging cathode ray tube – that’s the CRT in CRT TV.

The SDR standard allowed picture information to be sent to your TV in a range from 0.1 candelas per square meter (aka nits) — that’s your blackest black — to a brightest, whitest white of 100 nits.

A close-up of a flower’s center on a Samsung PN60F8500 plasma TV.

That doesn’t mean more modern SDR-era TVs, like plasmas and LCDs, could only put out 100 nits of brightness. It just means that in SDR, TVs never get brightness information above 100 nits. From there, it’s all about what the TV does with that information.

And because we like bright images, what most TVs have done is just sort of raise the roof. So they would map out all the tones from 0.1 to 100 across a much broader range, let’s say, from 0.1 all the way up to 700. This meant that the brightest elements of the picture would be put out at 700 nits, while something that was coded for 80 nits might come in at 450, and so on. They just moved the scale.

And this allowed us to have what we call high APL, or average picture level, even though SDR’s maximum information for brightness was pretty low.
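As a rough illustration of that sliding scale, here’s a minimal Python sketch of a purely linear stretch from SDR’s 0.1-to-100-nit coding range onto a hypothetical 700-nit panel. The numbers it prints won’t match the illustrative figures above exactly, because real TVs apply nonlinear gamma curves and their own picture processing on top of any simple stretch like this.

```python
# A minimal sketch of the "raise the roof" idea described above.
# The 700-nit ceiling and the purely linear stretch are illustrative
# assumptions; real TVs apply gamma curves and their own processing.

SDR_BLACK = 0.1    # nits, the darkest level SDR signals describe
SDR_WHITE = 100.0  # nits, the brightest level SDR signals describe

def stretch_sdr(signal_nits, panel_max=700.0, panel_min=0.1):
    """Map an SDR-coded brightness value onto a brighter panel's range."""
    fraction = (signal_nits - SDR_BLACK) / (SDR_WHITE - SDR_BLACK)
    return panel_min + fraction * (panel_max - panel_min)

for coded in (0.1, 25, 50, 80, 100):
    print(f"{coded:>5} nits in the signal -> {stretch_sdr(coded):.0f} nits on screen")
```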

The heyday of HDR

Let’s now zoom forward to the current HDR days. Now we can include information in a video signal that goes from 0.1 nits all the way up to 1,000, or even 4,000. And if Dolby Vision had its way, it would go all the way up to 10,000 nits. This seems insane, and it kind of is. But there’s some merit to the idea. I’ll get to that. My point is that HDR kind of flipped the script on what brightness means for a TV and its viewer.

In the early days of HDR, most TVs could not get as bright as the information in the video signal, so instead of tone-mapping everything up, they had to tone it down to the operable range of the TV. So, say we now have a TV with a peak brightness of 700 nits. The HDR video you watch was probably mastered up to 1,000 nits. So what does a TV do with 800-, 900-, or 1,000-nit video signal info if it can’t go that high? It tones things down. Again, a sliding scale. The TV just maps it out so that you still get to see everything; the 1,000-nit signal comes out at 700 nits, the 900-nit signal comes out at 625 nits, and so on.

By the way, those tone-mapping numbers may not be a reflection of reality — they are just to help illustrate.
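To make that downward mapping a little more concrete, here’s a minimal Python sketch of one possible approach: pass everything through untouched up to a knee point, then compress whatever sits above it into the headroom the panel has left. The knee point and the curve shape are assumptions for illustration only; every manufacturer uses its own, far more sophisticated tone-mapping curve.

```python
# A minimal sketch of tone-mapping HDR highlights down to a dimmer panel.
# The knee point and the simple compression curve are made-up assumptions;
# real TVs use their own (usually much fancier) tone-mapping logic.

def tone_map_down(signal_nits, panel_max=700.0, source_max=1000.0, knee=0.75):
    """Compress signal levels above a knee so nothing exceeds the panel's peak."""
    knee_nits = panel_max * knee  # below this, pass the signal straight through
    if signal_nits <= knee_nits:
        return signal_nits
    # Squeeze everything between the knee and the source's max into the
    # headroom left between the knee and the panel's max.
    fraction = (signal_nits - knee_nits) / (source_max - knee_nits)
    return knee_nits + fraction * (panel_max - knee_nits)

for coded in (400, 700, 800, 900, 1000):
    print(f"{coded:>5}-nit signal -> {tone_map_down(coded):.0f} nits on a 700-nit TV")
```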

As you can imagine, people wanted TVs that could do at least 1,000 nits. If the video signal has it, they wanted to see it on their TVs. And eventually, we got 1,000-nit TVs. And then 1,500-nit TVs, and 2,000 … and 3,000. Now, there are some TVs that can get even brighter, but those are few and far between. The point is that most HDR is mastered up to 1,000 nits — sometimes up to 4,000, but that is rare. In many cases, HDR TVs can put out more brightness than the video signal calls for. But why would we want that?

Let’s go back to what I was saying before. The brighter the room, the more brightness you want from your TV so that the picture can still look like it has great contrast. This brings back the APL, or average picture level, that I was talking about earlier. The TV can just make everything it displays brighter for you so it is easier to see.

But, really, for the best experience, you would be viewing in a light-controlled room. And all of that brightness power should — no, needs to — be reserved for what we call specular highlights, or just highlights.

Imagine you’re looking at a scene of a shiny car with the sun reflecting off the chrome. You want the sun’s reflection to be very bright. But the rest of the image doesn’t need to be insanely bright. In fact, it’s better if it isn’t because it makes that bright reflection have more impact. Because? You got it. Contrast.

The more separation there is between a bright object and the surrounding objects, the more contrast, and therefore the higher the visual impact.

So, in a brighter room, if you want that bright highlight to have a lot of impact, it’s going to need to be insanely bright in order for it to stand out against the high average picture level, or just generally bright screen image. If your screen’s average brightness level is already punching up into the 700-nit territory, then you’ll need some serious power to make the highlight look bright against that.
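Here’s a quick back-of-the-envelope sketch of that idea in Python. The nit values are assumptions picked for illustration, not measurements, but they show how the same highlight loses its punch as the average picture level climbs.

```python
# A back-of-the-envelope look at why bright rooms demand brighter highlights.
# These nit values are illustrative assumptions, not measurements.

def contrast_ratio(highlight_nits, surround_nits):
    """Simple luminance ratio between a highlight and the image around it."""
    return highlight_nits / surround_nits

# Dim room: modest average picture level, so a 1,000-nit glint stands way out.
print(contrast_ratio(1000, 100))   # 10:1 separation

# Bright room: the whole image is pushed toward 700 nits, so the same glint
# barely separates from its surroundings and needs far more peak power.
print(contrast_ratio(1000, 700))   # roughly 1.4:1 separation
```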

In a darker room or even one that isn’t just soaked with sun, the need for brightness power doesn’t really change. You want that power on tap. But how that power is used? That is everything.

Not just brighter, smarter

With great power comes great responsibility. You don’t want to be watching a movie, all snuggled up with your snacks and your drinks, vibing on the couch, feeling pretty chill, when all of a sudden — BAM! That sun in the sky behind Maverick’s plane is so bright you are basically blinded. That’s not a good experience!

And because many of you have had that exact kind of experience, I know you might wonder why in the world you could ever want a TV that can get any brighter than the one you already own.


Well, it comes down to that responsibility piece I just mentioned. It’s fine if a TV can get super bright. It’s just really important that the TV be smart enough to know where to route the brightness. A really smart TV, one with great image processing, will know that if an object is relatively tiny, it can be juiced way up so that it sparkles and has a high impact on-screen, but if another, larger object gets too bright, it could really damage the picture. Or someone’s eyeballs. So let’s not make that part of the screen too bright.

That’s my version of the TV processor’s internal monologue. It may also lament not having access to four full-bandwidth HDMI 2.1 ports, but I digress.
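If you were to put that internal monologue into very rough code, it might look something like the Python sketch below. The window sizes and nit caps are invented purely for illustration; real TV processors use their own, far more nuanced logic and analyze the actual content frame by frame.

```python
# A toy version of that internal monologue: cap peak brightness based on how
# much of the screen the bright object covers. Every number here is an
# invented assumption for illustration, not any manufacturer's real tuning.

PEAK_BY_WINDOW = [
    (0.02, 2000.0),  # a tiny glint (up to 2% of the screen) can be pushed hard
    (0.10, 1200.0),  # a small bright patch gets less headroom
    (0.50, 700.0),   # a large bright area is reined in
    (1.00, 500.0),   # a full-screen white field is capped hardest
]

def allowed_peak(window_fraction):
    """Return the brightest the TV will let an object get, given its size."""
    for max_fraction, cap_nits in PEAK_BY_WINDOW:
        if window_fraction <= max_fraction:
            return cap_nits
    return PEAK_BY_WINDOW[-1][1]

print(allowed_peak(0.01))  # sun glinting off chrome -> 2000.0
print(allowed_peak(0.80))  # a bright sky filling the frame -> 500.0
```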

How much is too much?

So, we come back around to the question I posed at the beginning of this article: When is enough, enough? How much brightness do we really need?

Well, for me, the answer to how much brightness we really need is actually less important than asking the question: Does the TV do a good job with all the brightness it has on tap?

I’m fine if a TV can punch up to 4,000 nits if that TV can keep that power reined in and deploy it only when it is needed. That way, it can be a ridiculously awesome bright-room TV or even an outdoor TV. But it could also work well in the evenings, or in darker rooms, and save that peak brightness power for when it will have maximum impact when watching HDR content.

Otherwise, you have a TV you can’t comfortably watch at night, and that’s no good.

And to be clear, this is about bright colors as well as just pure bright white light.

A wall of QD-OLED TVs on display at CES 2023.

Now, OLED TVs are unlikely to ever be able to punch much above, say, 2,500 nits or so. The newest MLA OLEDs and QD-OLEDs that were just announced can punch into 2,000-nit territory — which, by the way, is a serious feat of engineering — and since those TVs also have perfect black levels, the images that leap off those screens are liable to be epic. What I’ve seen so far was indeed epic.

And because LED/LCD TVs struggle to achieve perfect black levels, they have to muscle their way to contrast with high brightness. But even though we’ve already discussed some limits on how much brightness they can really use, one thing we haven’t talked about yet is whether they can even exert that kind of control.

Unless they have tens of thousands of mini-LEDs in hundreds or thousands of zones so that almost every individual backlight is addressable — can be turned on, off, or dimmed independently — then the feat of lighting up only the super-bright object without lighting up the stuff around it is going to be hard to achieve. Also, the brighter those peak highlights get, the more risk there is of halo effect and/or blooming.
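To see why zone count matters, here’s a rough Python sketch. It assumes a small highlight has to drag its entire backlight zone up with it, so every other pixel in that zone gets brightened along for the ride; that’s the halo. The zone counts and highlight size are assumptions for illustration only.

```python
# A rough sketch of why zone count matters. A backlight zone can only be
# raised or dimmed as a unit, so a tiny highlight drags its whole zone up
# with it and brightens the dark pixels around it (the halo/blooming).
# All numbers below are illustrative assumptions.

SCREEN_PIXELS = 3840 * 2160   # a 4K panel

def halo_pixels(zone_count, highlight_pixels):
    """Pixels lit up around a highlight because the whole zone must switch on."""
    pixels_per_zone = SCREEN_PIXELS / zone_count
    # Assume the highlight fits inside one zone; everything else in that zone
    # gets raised along with it.
    return max(pixels_per_zone - highlight_pixels, 0)

tiny_highlight = 5000  # a small specular glint, roughly 0.06% of the screen

for zones in (100, 1000, 20000):
    print(f"{zones:>6} zones -> ~{halo_pixels(zones, tiny_highlight):,.0f} stray pixels lit")
```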

Anything you can do, I can do brighter

Anyway. We’ve gotten this far and I just realized that we never really got into one of the other reasons this brightness war started in the first place. It’s just plain one-upmanship. OLED TVs started raking in accolades, and so QLED TV makers started saying, yeah, but your OLEDs can’t get THIS bright, can they? So OLED TV makers said, well, we don’t really need to, but we’re gonna get brighter anyway for the sake of better HDR. And so they did. And so on and so forth, and honestly, I don’t see this ending soon.

The one-upmanship will continue, the hurling around of huge nit numbers will go on, and hopefully, the picture quality will actually benefit, rather than this devolving into a petty metrics war. Because as soon as the value to the consumer begins to dim, that’s when I will start to balk at all this brightness business.
