
What’s that liquid? IBM’s flavor-identifying ‘e-tongue’ will tell you

IBM Hypertaste: An AI-assisted e-tongue for fast and portable fingerprinting of complex liquids

With its Watson technology, IBM has helped create a pretty convincing artificial brain. But now it’s seemingly ready to move on to other body parts as well — and it’s settled on the tongue as a next step. Developed by computer scientists at IBM Research, the A.I.-assisted e-tongue is a portable device equipped with special sensors that allow it to taste and identify different liquids.


“We’re very good as humans at being able to recognize different liquids,” Patrick Ruch, one of the researchers working on the e-tongue project, told Digital Trends. “While we can’t necessarily work out the exact quantities of components within liquids, we can do things like recognize the same liquid over and over again. That’s something we set out to replicate with this project.”


The handheld tongue (which isn’t quite as gross as it sounds) takes the form of a sensor array that can be dipped into different liquids to sample their taste. Using pattern-matching technology augmented by machine learning, it works out the composition of the liquids it tastes and matches them against liquids already in its data set.
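In broad strokes, that matching step resembles a standard supervised-classification problem: each liquid produces a “fingerprint” of sensor readings, and a new reading is matched to the closest known liquid. The sketch below is purely illustrative — the sensor values, liquid labels, and nearest-neighbor model are assumptions for demonstration, not IBM’s actual Hypertaste implementation.

```python
# Illustrative sketch only: match a new sensor-array reading to the most
# similar known liquid using a nearest-neighbor classifier.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

# Hypothetical training data: readings from a 4-electrode array for liquids
# already in the data set (values are made up).
fingerprints = np.array([
    [0.82, 0.11, 0.43, 0.27],   # bottled water, brand A
    [0.79, 0.14, 0.40, 0.31],   # bottled water, brand A (repeat measurement)
    [0.55, 0.38, 0.61, 0.09],   # bottled water, brand B
    [0.52, 0.41, 0.58, 0.12],   # bottled water, brand B (repeat measurement)
    [0.20, 0.72, 0.15, 0.66],   # orange juice
])
labels = ["brand A", "brand A", "brand B", "brand B", "orange juice"]

model = KNeighborsClassifier(n_neighbors=1)
model.fit(fingerprints, labels)

# A new, unlabeled reading from the sensor array.
unknown_sample = np.array([[0.80, 0.12, 0.42, 0.29]])
print(model.predict(unknown_sample))   # -> ['brand A']
```

In practice, a real system would use many more sensor channels, repeated measurements, and a more robust classifier, but the principle — recognizing the same liquid by the closeness of its chemical fingerprint — is the same.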

In a demonstration this week at the 11th World Conference of Science Journalists in Lausanne, Switzerland, the e-tongue distinguished between different brands of local bottled water, and it did so both accurately and consistently.


“What we’ve developed with this project is an end-to-end solution to convert chemical fingerprints into digital fingerprints,” Ruch continued.

One obvious application for this technology would be in the culinary industry, he said, where it could be used for tasks like identifying the vintage of different red wines. However, its classification use cases go far beyond that.

“It could be very useful for any scenario in which you want to check the composition of a particular liquid very quickly,” Ruch noted. “For instance, if you want to check that a particular food has come from the producer it says on the label, this could be used. You can also imagine it in the case of non-foodstuffs, where an industrial supplier is supplying you with raw materials and you want to be sure that it’s always coming from the same place. You can easily re-label liquid, but you can’t change the chemical identity of the liquid without changing its function.”

He suggested that it could additionally be used for things like sampling different biofluids, such as urine, to make health-related diagnoses.

“The goal is definitely to grow the database of liquids,” Ruch said. “We’ve shown off the platform as a proof-of-principle, so the next step would be to create modifications depending on the use case.”

Luke Dormehl
Former Digital Trends Contributor