
A.I. and Google News: The push to broaden perspectives


There is no editorial team at Google News. There is no building filled with hundreds of moderators monitoring the thousands of stories hitting the web every second, making sure the full story is presented. Instead, Google relies on artificial intelligence algorithms, along with partnerships with fact-checking organizations, to surface headlines from credible, authoritative sources.

“Humans are generating the content,” Trystan Upstill, Google News engineering and product lead, told Digital Trends. “We think of the whole app as a way to use artificial intelligence to bring forward the best in human intelligence. In a way, the A.I. is controlling this fire hose of human stuff going on.”


A.I. is a big part of the redesigned Google News app, which was recently announced at the annual Google I/O developer conference in Mountain View, California. The algorithms detect and demote stories that spread misinformation, and they match terms and fragments of text circulating through the news cycle against fact checks from partner organizations.

But one of the A.I.’s main tasks is to provide a full picture of major, nuanced stories through a feature called “Full Coverage.” It’s a small button you can press on stories, which will lead you to similar articles from a variety of publications — including ones you do not follow or may not like. The main section of Google News shows content tailored to you, but “Full Coverage” does not respect your likes and dislikes — everyone sees the same information pulled together by the A.I.

The feature includes modules for fact checks, frequently asked questions, a timeline of events, international coverage, and even tweets from primary sources. Showing everyone the same picture, Upstill said, is crucial.

“The core premise we have is that in order to have a productive conversation about something, everyone basically needs to be able to see the same thing,” he said.

While the breadth of data the algorithms pull is impressive, it’s entirely on the user to click on the small “Full Coverage” button to read more perspectives on the topic at hand. It’s why the button features Google’s red, green, blue, and yellow colors — it stands out from a page that’s mostly black and white.

“Fundamentally, we’re trying to build tools that are easy, that people can use to develop their understanding,” Upstill said. “A part of the challenge for people to break out of their bubbles and echo chambers is that it’s just hard; it’s hard work, and we set out to make that easy.”

Pulling together a variety of sources has been part of Google News from the start. The desktop service grew out of the 9/11 attacks in 2001, when people were scrambling to find as much information as they could about the tragic event.

“It came to the table with this idea that in terms of understanding a story, you shouldn’t read a single article,” Upstill said. “You should read a set of articles around that story to really position what you’re reading. That is a key message that resonates with people even today, in this age of people having increasingly polarized views.”


Google has been criticized for helping people stay in their bubbles. Search results are personalized based on location and previous searches, and people end up seeing what they want to see rather than the full picture. Upstill said Google isn’t in the business of censorship, and “in Search, if you come in and say ‘give me the fake news publication’ or type ‘fakenews.com,’” it will show up. But with Google News, Upstill said you shouldn’t find disreputable sources.

The new Google News app is currently rolling out on both Android and iOS, and the desktop redesign will go live early next week. Both will share the same features, but the desktop version will have a different format.


Julian Chokkattu
Former Digital Trends Contributor
Julian is the mobile and wearables editor at Digital Trends, covering smartphones, fitness trackers, smartwatches, and more…