DeepSqueak is a machine learning A.I. that reveals what rats are chatting about

DeepSqueak helps researchers decode rodent chatter

Just as Aquaman can talk to fish, so too can researchers at the University of Washington chat with rats. Or, at least, they can better understand what rats are squeaking about thanks to a deep learning artificial intelligence (A.I.) system with the ingenious name “DeepSqueak.”

Since rats are frequently used in medical experiments, the technology makes it easier for researchers to monitor their stress levels and other metrics during testing. While it’s not exactly a Google Translate for rodent-to-human conversation, it could be useful for decoding the natural patterns in the vocalizations rats make as they communicate, which in turn makes it easier to understand their responses to stimuli.

“Rats and mice express themselves through ultrasonic vocalization at frequencies that are too high for humans to hear,” John Neumaier, a professor in the Department of Psychiatry and Behavioral Sciences, told Digital Trends. “In the past, researchers have recorded these to gain better insights into the emotional state of an animal during behavior testing. The problem was that manual analysis of these recordings could take 10 times longer to listen to when slowed down to frequencies that humans can hear. This made the workload exhaustive and discouraged researchers from using this natural readout about animals’ emotional states.”
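To see why manual review takes so long, consider what “slowing down” involves: rat ultrasonic calls cluster around 22 kHz and 50 kHz, at or above the roughly 20 kHz ceiling of human hearing, so bringing them into audible range means playing the recording back several times slower, which stretches its duration by the same factor. Here is a minimal Python sketch of that playback trick, assuming the soundfile library and an invented file name and sample rate (none of these details come from the study):

```python
import soundfile as sf  # assumed dependency for reading/writing WAV files

# Hypothetical ultrasonic recording sampled at 250 kHz; rat calls cluster
# around 22 kHz and 50 kHz, at or above the ~20 kHz limit of human hearing.
audio, rate = sf.read("rat_usv_recording.wav")

# Writing the same samples with 1/10th the sample rate plays them back
# 10x slower, shifting a 50 kHz call down to an audible 5 kHz -- but a
# 1-minute recording now takes 10 minutes to listen to.
slowdown = 10
sf.write("rat_usv_slowed.wav", audio, rate // slowdown)
```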

DeepSqueak was developed by Kevin Coffey and Russell Marx, two researchers in Neumaier’s lab. “An undergraduate student isolated and annotated hundreds of calls by hand,” Coffey told us. “Those calls were used to train a rudimentary network that could be used to find thousands of new calls. This larger data set was cleaned by hand, and some calls were randomly augmented, stretched and squished to improve generalizability.”
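As a rough illustration of the “stretched and squished” augmentation step Coffey describes, here is a minimal Python sketch that randomly time-stretches a call’s spectrogram. Everything here is an assumption made for illustration: this is not DeepSqueak’s actual implementation, and the 0.8x to 1.2x stretch range is an invented choice, not a value from the paper.

```python
import numpy as np

def augment_call(call: np.ndarray, rng: np.random.Generator) -> np.ndarray:
    """Randomly stretch or squish a call along the time axis.

    `call` is a (freq_bins, time_bins) spectrogram. The 0.8-1.2x range
    is an assumed choice, not a value from the paper.
    """
    factor = rng.uniform(0.8, 1.2)
    old_t = call.shape[1]
    new_t = max(1, int(round(old_t * factor)))
    # Linearly interpolate each frequency row onto the new time grid.
    old_grid = np.linspace(0.0, 1.0, old_t)
    new_grid = np.linspace(0.0, 1.0, new_t)
    return np.stack([np.interp(new_grid, old_grid, row) for row in call])

rng = np.random.default_rng(0)
spectrogram = rng.random((64, 100))         # fake 64-bin, 100-frame call
augmented = augment_call(spectrogram, rng)  # randomly stretched or squished
```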

Previous attempts to carry out this analysis with software were less reliable: earlier tools were easily tricked by background noise and were generally worse at categorizing sounds. Those studies also focused only on simple aspects of vocalizations, such as their frequency range, which can correlate with happy or unhappy emotional states. With DeepSqueak, however, it is possible to look at more subtle features, such as the order of calls and their temporal association with actions and events.
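To make the “order of calls” idea concrete, one simple analysis is to count how often each call type follows another once a detector has labeled every call in a session. A minimal Python sketch, with invented call-type labels (the categories and data are illustrative, not the paper’s taxonomy):

```python
from collections import Counter

# Hypothetical output of a call detector: one label per detected call,
# in the order the calls occurred. The category names are made up.
calls = ["trill", "flat", "flat", "trill", "complex", "trill", "flat"]

# Count how often each call type is immediately followed by each other type.
transitions = Counter(zip(calls, calls[1:]))
for (prev, nxt), n in transitions.most_common():
    print(f"{prev} -> {nxt}: {n}")
```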

Right now, not enough is known about exactly what all rat squeaks refer to. But tools like this provide a promising platform to build on. As biologists compile more calls over time, it hopefully won’t be long before a rodent Rosetta Stone becomes a reality.

A paper describing the work was recently published in the journal Neuropsychopharmacology.

Luke Dormehl
Former Digital Trends Contributor