If you’ve been TV shopping recently, we’re betting you’ve had all kinds of terminology thrown at you — from HDR to Dolby to QLED, OLED, and 4K. Then there’s 1080p and 1080i. While you won’t typically see 1080i in sales copy, it’s something you’ll often see when you’re using your TV (depending on the type of content you’re watching). But what does it mean, and how does it compare to 1080p? Here’s a quick comparison of 1080p vs. 1080i.
The difference between 1080p and 1080i
We’ll start with the abbreviations: 1080i is short for 1080 interlaced scan, while 1080p is short for 1080 progressive scan. The difference between the two formats is how the image is drawn on your screen. In an interlaced scan (1080i), the image is displayed by illuminating the odd and even rows of pixels in an alternating fashion. Your TV does this so rapidly (each field refreshes 30 times per second, for 60 fields per second in total) that your eyes can’t detect the switching, so at any given moment you see what appears to be a fully assembled picture.
Progressive scan (1080p), on the other hand, draws every row of pixels in sequence, refreshing every row on the screen 60 times per second. Technologically speaking, this is harder to pull off, but it’s generally agreed that progressive scan produces superior images to those produced via interlaced scan. This is why you’ll often hear 1080p touted as “true” or “full” HD by people hoping to differentiate it from 1080i or 720p. The benefits of progressive scan become especially apparent during scenes with lots of motion; just take a look at the pictures below and note the stark differences in clarity and sharpness.
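If you’re curious how the two scanning methods differ in practice, here’s a minimal Python sketch (using toy row data, not real video processing) of the basic idea: an interlaced signal splits each frame into two alternating fields, while a progressive signal sends every row at once.

```python
# Toy illustration of interlaced vs. progressive scanning.
# A real 1080-line frame is stood in for by a 6-row list.

def interlaced_fields(frame):
    """Split a full frame into two fields: even rows, then odd rows.
    A 1080i signal transmits these fields alternately."""
    even = frame[0::2]  # rows 0, 2, 4, ...
    odd = frame[1::2]   # rows 1, 3, 5, ...
    return even, odd

def progressive_frame(frame):
    """A 1080p signal transmits every row of the frame at once."""
    return frame

frame = [f"row {r}" for r in range(6)]

even, odd = interlaced_fields(frame)
print("field 1 (even rows):", even)  # ['row 0', 'row 2', 'row 4']
print("field 2 (odd rows): ", odd)   # ['row 1', 'row 3', 'row 5']
print("progressive frame:  ", progressive_frame(frame))
```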
Generally speaking, you need a TV bigger than 42 inches to discern 1080i from 1080p, and even then, it depends on how far away you’re sitting. For fast-moving images, 1080p offers superior image quality, avoiding the jagged, comb-like “tearing” that can appear on screen with 1080i.
Another thing to consider is that nearly all new HDTVs you can buy today are capable of de-interlacing 1080i video signals so they look just like 1080p, which makes it even harder to notice a difference.
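To give a rough idea of what that de-interlacing step involves, here’s a simple Python sketch of “weave” de-interlacing, one of the most basic approaches. Real TVs use far more sophisticated, motion-aware methods; this toy version assumes nothing moves between fields.

```python
# Toy "weave" de-interlacer: stitch two consecutive fields back into one
# full frame. Assumes a static scene; real de-interlacers must also
# compensate for motion between fields to avoid combing artifacts.

def weave(even_field, odd_field):
    """Interleave even and odd field rows into a single full frame."""
    frame = []
    for even_row, odd_row in zip(even_field, odd_field):
        frame.append(even_row)  # rows 0, 2, 4, ...
        frame.append(odd_row)   # rows 1, 3, 5, ...
    return frame

even = ["row 0", "row 2", "row 4"]
odd = ["row 1", "row 3", "row 5"]
print(weave(even, odd))  # ['row 0', 'row 1', ..., 'row 5']
```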
How cable service reduces your picture from 1080p to 1080i
If you’ve noticed that the HD content you watch on your cable or satellite box pales in comparison to the picture quality you get from your Blu-ray player, or you get frustrated when your TV’s info bar shows that you’re watching 1080i even though you have a 1080p TV, you’re not alone in your disappointment. There is a reason for this, however.
The only way cable and satellite companies can deliver 3,000 channels (ok, maybe we’re exaggerating a little), many of them in HD, is by compressing their video signals in an effort to squeeze more information into a crowded pipeline. This compression robs the signal of its pristine clarity and sharpness and can introduce blocky color gradations into the picture. For a highly revealing example of the difference, try tuning in to one of your locally broadcast HD stations on both your cable/sat box and through your TV’s tuner (you will need an HD antenna for this). Now, switch back and forth between the two and note the difference in quality.
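To see why that compression is unavoidable, here’s some back-of-the-envelope math in Python. The figures are rough illustrations, not any particular provider’s numbers.

```python
# Rough math showing why cable/satellite HD signals must be compressed.

width, height = 1920, 1080      # pixels per frame
bits_per_pixel = 24             # 8 bits each for red, green, blue
frames_per_second = 60

raw_bps = width * height * bits_per_pixel * frames_per_second
print(f"Uncompressed 1080p60: {raw_bps / 1e9:.2f} Gbit/s")  # ~2.99 Gbit/s

# A typical compressed HD cable channel runs on the order of 10-15 Mbit/s,
# a reduction of roughly 250:1, which is why artifacts can become visible.
typical_channel_bps = 12e6
print(f"Compression ratio: ~{raw_bps / typical_channel_bps:.0f}:1")
```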
4K vs. 1080p and 1080i
These days, 1080p and 1080i are old hat compared to the much more publicized 4K format available on most new HDTVs (often classed as UHD TVs). With 4K resolution, picture clarity is sharper and more colorful than ever. Consumers can also sit quite a bit closer to their living room TV without noticing any distortion in the image. This is because a 4K TV displays four times the number of pixels of a standard 1080p set (3,840 x 2,160 versus 1,920 x 1,080). Simply put, the more pixels on display, the better the picture quality. Better yet, most UHD sets will also upconvert a standard HD image, making your regular HD sources look closer to actual 4K.
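That “four times the pixels” claim is easy to verify with quick arithmetic:

```python
# Pixel counts behind the 4K vs. 1080p comparison.

full_hd = 1920 * 1080   # 1080p: 2,073,600 pixels
uhd_4k = 3840 * 2160    # UHD "4K": 8,294,400 pixels

print(f"1080p pixels:  {full_hd:,}")
print(f"4K UHD pixels: {uhd_4k:,}")
print(f"Ratio: {uhd_4k / full_hd:.0f}x")  # exactly 4x
```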
In terms of content, it’s still a tough bet when it comes to cable and satellite boxes. With many providers just catching up to 1080p broadcasting, full 4K from your cable company is likely still a ways off.
Frequently Asked Questions
Is 1080i or 1080p better for PS4 games?
If you want maximum crispness from your PS4 games, opt for 1080p over 1080i. That’s because 1080p (progressive scan) produces sharper, superior images compared to those produced via interlaced scan (1080i). The benefits of 1080p become especially apparent during scenes with lots of motion, which describes most PS4 gaming sessions.
Is 1080i or 1080p better for movies?
It’s generally agreed that 1080p is superior to 1080i for movies, especially those with fast-moving images. It’s during scenes with lots of motion that the advantage of 1080p over 1080i becomes especially pronounced: 1080p scans every row of pixels progressively, refreshing every row on the screen 60 times per second, whereas 1080i illuminates odd and even rows of pixels in alternating fields, each refreshing 30 times per second.
What are the benefits of 1080p over 1080i?
The general consensus is that 1080p is noticeably superior to 1080i, since its image is crisper and more realistic. This makes 1080p an especially good option for movies with lots of motion, as well as video games. However, it’s important to note that if you’re watching movies or TV over cable, your picture may be automatically downgraded from 1080p to 1080i. Cable companies compress video signals so they can deliver more channels, which robs them of their pristine clarity.