
Playing games in your browser is about to get a lot better

Google has just unveiled a huge improvement for browser games — WebGPU. The new API might revolutionize the idea of playing games in the browser, and it won’t be limited to just Google Chrome.

WebGPU will give web apps more access to the graphics card, enabling new levels of performance. The API is already out, and Google seems to have big plans for it going forward.


The idea of playing AAA games in the browser is still far off, but Google's new tech might make browser gaming less of a strictly casual affair. The company just announced WebGPU, now available in beta in Chrome 113. Once it's out of beta, the API will be enabled by default in Chrome, by far the most popular browser, and from there it will reach the wider web.


Google lists the benefits of the new API by saying: “WebGPU is a new web graphics API that offers significant benefits such as greatly reduced JavaScript workload for the same graphics and more than three times improvements in machine learning model inferences.”


WebGPU is said to make it easier for developers to implement GPU-based solutions in web apps. This doesn't have to be limited to games; as Google mentions, machine learning apps can also benefit. While large language models like ChatGPT, Bing Chat, and Google's own Bard run their inference on remote servers and don't need the user's GPU, other apps can take advantage of it locally, such as deepfake software.
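For a sense of what developers actually call, here is a minimal sketch of how a web app might feature-detect WebGPU and request a GPU device. The `navigator.gpu`, `requestAdapter`, and `requestDevice` calls are the real WebGPU entry points; the status strings and the `initWebGPU` function name are illustrative, not from Google's announcement:

```javascript
// Sketch: feature-detect WebGPU and set up a logical GPU device.
// navigator.gpu is the WebGPU entry point in supporting browsers;
// outside the browser (or in browsers without WebGPU) it is absent.
async function initWebGPU() {
  const gpu = (typeof navigator !== "undefined" && navigator.gpu) || null;
  if (!gpu) {
    // No WebGPU here -- a real app would fall back to WebGL.
    return "webgpu-unavailable";
  }
  // Ask the browser for a physical GPU adapter...
  const adapter = await gpu.requestAdapter();
  if (!adapter) return "no-adapter";
  // ...then for a logical device used to create buffers, pipelines, etc.
  const device = await adapter.requestDevice();
  return device ? "webgpu-ready" : "no-device";
}

initWebGPU().then((status) => console.log(status));
```

In a browser with WebGPU enabled, the `device` object is what games and ML libraries would use to create buffers, shaders, and compute pipelines; everywhere else, the sketch degrades gracefully to the unavailable branch.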

Gaming stands to benefit hugely, though. We might see far more complex games running entirely in the browser, or browser ports of existing titles. This won't happen overnight, however, and without engagement from developers, the tech might go unused.

Google recognizes that this update is only a first step. It promises that future releases will bring further enhancements, including improved access to shader cores. For now, WebGPU is just taking its first steps in public, although Google has been working on it since 2017.

WebGPU certainly sounds interesting, but it's too early to say how big an impact it will have, as Google hasn't shown a demo yet. For the time being, it's available on Windows, ChromeOS, and macOS. Google plans to bring it to Safari and Firefox in the future.

Monica J. White
Monica is a computing writer at Digital Trends, focusing on PC hardware. Since joining the team in 2021, Monica has written…