Intel’s new $249 GPU brings 1440p gaming to the masses

An exploded view of Intel's Arc B580 GPU.
Intel

Intel is trying to redefine what a “budget GPU” really means in 2024, and it’s doing so with the new Arc B580 GPU. In what Intel itself described as its “worst-kept secret,” the B580 is the debut graphics card in Intel’s new Battlemage range of discrete GPUs, and it’s arriving at just $249. That’s a price point that has been relegated to 1080p gaming for years, but Intel says the B580 will change that dynamic.

It’s a 1440p GPU, at least by Intel’s definition. That’s despite the fact that Intel is comparing the card to GPUs like the RTX 4060 and RX 7600, both of which are more expensive than the B580 and squarely target 1080p. Intel says it can deliver higher performance than these two GPUs while undercutting the price, all in an attempt to capitalize on 1440p gamers. “1440p is becoming 1080p,” as Intel’s Tom Petersen put it in a pre-briefing with the press.

Performance for the Intel Arc B580 compared to the Nvidia RTX 4060.
Intel

How much faster? By Intel’s metrics, the B580 is 10% faster than the RTX 4060 on average — Intel didn’t provide any performance comparisons to the RX 7600. That may not sound huge, but there are a handful of games where the B580 looks very impressive. In Cyberpunk 2077, for example, the B580 is 43% faster, and in Resident Evil 4, it’s 32% faster.

It shouldn’t come as much of a surprise that the B580 shines bright in games like these. Unlike the RTX 4060, which has 8GB of VRAM, the B580 comes with 12GB. At a higher 1440p resolution with maxed-out graphics settings, the additional VRAM is put to good use in games constrained by memory, such as Resident Evil 4. In less VRAM-constrained games, the performance delta is unsurprisingly smaller.

Intel's Arc B580 compared to the A750 GPU.
Intel

For gen-on-gen improvements, Intel says the B580 is 24% faster on average than the last-gen A750. Once again, the A750 is restricted to 8GB of VRAM, so you see larger improvements in memory-hungry games like The Last of Us Part One.

Intel made several comparisons to the A750, from performance per watt to performance per dollar. It’s an interesting choice, considering the B580’s namesake predecessor is the Arc A580 — you can read about it in our Intel Arc A580 review. Benchmarking against the faster A750 instead certainly suggests that Intel has higher-end ambitions with Battlemage, which has been rumored for some time.

For now, however, we only know about two GPUs. The B580 is the card Intel is pushing right now, which arrives on December 13 for $249. Like the Arc A750 and A770, it will have a Limited Edition version created by Intel, as well as several board partner designs from brands like ASRock, Acer, and Sparkle.

Specs for the Intel Arc B580 and B570.
Intel

The B570 is a cut-down version of this card — you can see the spec comparison above — that will launch on January 16 for $219. Intel didn’t provide any performance metrics for the B570, though based on the differences in specs, it shouldn’t be too far behind the B580 in games.

Right now, we’re simply going based on the performance Intel has shared. As usual, we’ll need to wait until the B580 is here to validate Intel’s performance claims and see how the card stacks up against a wider swath of GPUs — particularly those targeting 1440p like the RTX 4060 Ti and RX 7700 XT.

The back shroud of the Intel Arc B580 GPU.
Intel

Regardless, $249 is a very attractive price. Neither Nvidia nor AMD has gone that low this generation; the least expensive options are the RTX 4060 at $299 from Nvidia and the RX 7600 at $269 from AMD. Intel seems intent on keeping the B580 at that price, too. For the Limited Edition design, at least, Intel confirmed that the cards are being made in Vietnam, which will hopefully sidestep the proposed tariffs on China and Mexico that could go into effect next year.

Jacob Roach
Lead Reporter, PC Hardware