The complete Intel Arc GPU lineup has just been revealed

Intel Arc Alchemist has just been benchmarked by SiSoftware in a series of OpenCL tests. Although the benchmarks are unofficial and don’t give a full overview of the card’s performance, they can serve as an indication of how well the entry-level Intel Arc A380 will match up against AMD and Nvidia.

SiSoftware has also detailed the higher-end Arc Alchemist models. It looks like Intel will release its new discrete gaming GPUs in three tiers, dubbed the A300, A500, and A700 series.

A render of an Intel Arc Alchemist graphics card. Image credit: Wccftech

The benchmarked Arc GPU in question is the entry-level Intel Arc A380. SiSoftware used its own Sandra tool for testing, which is why these results should be taken with a dose of skepticism; they are more an insight into how the card may perform than a proper benchmark. SiSoftware also gives us more information about the GPU’s exact specs, but the listing doesn’t divulge anything else about the system the card was tested in. All we know is that SiSoftware was using the 64-bit version of Windows 10 and the latest Intel graphics drivers, and that turbo mode was enabled in every test.
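
The device properties quoted in listings like this one (compute units, memory size, driver version) are exposed through the standard OpenCL API, which any OpenCL benchmark suite queries before running its tests. Below is a minimal sketch of that generic query path using the pyopencl bindings; it is not SiSoftware Sandra’s code, just an illustration of where such device details come from.

```python
# Minimal OpenCL device enumeration (assumes the pyopencl package and an
# installed OpenCL runtime/driver). A generic sketch, not SiSoftware's tooling.
import pyopencl as cl

for platform in cl.get_platforms():
    for device in platform.get_devices(device_type=cl.device_type.GPU):
        print(f"Platform:       {platform.name}")
        print(f"Device:         {device.name}")
        print(f"Compute units:  {device.max_compute_units}")
        print(f"Global memory:  {device.global_mem_size / 1024**3:.1f} GiB")
        print(f"Driver version: {device.driver_version}")
```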

The A380 was put through a series of tests, and the results aren’t exactly surprising. This is going to be a budget GPU, which SiSoftware speculates will be priced at around $199, so its performance is in line with similar offerings from Nvidia and AMD. In some OpenCL tests, the Intel Arc A380 posted numbers similar to those of the Nvidia GeForce RTX 3050 and GTX 1660 Ti, as well as the AMD Radeon RX 6500 XT.

According to SiSoftware, the A380 runs on Intel’s DG2-128 GPU. The graphics card features 128 execution units (EUs), 1,024 shader processors (SPs), and 16 tensor cores, and has 6GB of memory running at 14Gbps across a 96-bit memory bus. It also seems to have the lowest power requirements of its benchmarked rivals, needing only 75 watts versus the 130W TDP of Nvidia’s RTX 3050.

Intel Arc A380 performance. Image source: SiSoftware

SiSoftware has also divulged more information about the other Intel Arc Alchemist GPUs. The A300 lineup represents the entry-level segment, and all of its models run on the Intel DG2-128 GPU. The midrange A500 and higher-end A700 series graphics cards will all use the DG2-512 GPU.

Starting with the middle-of-the-pack A500 series: these GPUs will feature 384 EUs and 3,072 SPs, and will come with a sizable memory increase over the entry-level card, boosting capacity to 12GB across a 192-bit bus clocked at 16Gbps.

The A700 series will, once again, feature a performance jump that is rumored to bring it up to par with the Nvidia GeForce RTX 3070 Ti. The GPU will come with 512 execution units, 4,096 shader processors, and 16GB of GDDR6 memory with a 256-bit bus. The entire Arc Alchemist lineup has been confirmed by Intel to use TSMC’s N6 process node.
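
As a rough sanity check on those memory specs, peak memory bandwidth follows directly from bus width and data rate: divide the bus width in bits by 8 and multiply by the per-pin rate in Gbps. Here is a quick sketch using the figures quoted above; note that the listing does not specify a data rate for the A700 series, so the 16Gbps used for it below is an assumption.

```python
# Peak memory bandwidth in GB/s = (bus width in bits / 8) * per-pin data rate in Gbps.
def peak_bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits / 8 * data_rate_gbps

print(peak_bandwidth_gb_s(96, 14))   # A380:  168.0 GB/s
print(peak_bandwidth_gb_s(192, 16))  # A500:  384.0 GB/s
print(peak_bandwidth_gb_s(256, 16))  # A700:  512.0 GB/s (16Gbps rate assumed)
```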

While this benchmark may not give us a true-to-life overview of the performance offered by this entry-level GPU, we may soon start seeing more accurate tests as we get closer to the launch date. Intel has recently teased that the desktop version of Arc Alchemist will be released in the second quarter of 2022.

Monica J. White