Stop Alexa with a wave of your hand, thanks to Elliptic Labs’ ultrasound technology

Elliptic Labs has been showing off a new application for its ultrasound-based gesture technology at MWC in Barcelona, and we caught up with the company to get a demo. The idea is that smart speakers with ultrasound virtual sensor technology inside can detect the presence of people and respond to a range of gestures.

Using a prototype consisting of a speaker with Amazon’s Alexa onboard and a Raspberry Pi, Elliptic Labs showed us how you can trigger Alexa with a double-tap palm gesture or cut it off mid-flow with a single palm tap. The gestures work from some distance away, letting you control your smart speaker without touching it or uttering a word.
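To illustrate the kind of mapping the prototype performs, here is a minimal sketch of a gesture-to-action handler. This is purely hypothetical: the event names, the `handle_gesture` function, and the returned action labels are all assumptions for illustration, not Elliptic Labs’ actual SDK.

```python
# Hypothetical sketch of the demo's control loop. The event names and
# action labels are invented; Elliptic Labs' real API is not public here.

def handle_gesture(event: str, state: dict) -> str:
    """Map an ultrasound gesture event to a smart-speaker action."""
    if event == "double_tap":
        state["listening"] = True
        return "wake"    # trigger Alexa, as in the MWC demo
    if event == "single_tap" and state.get("listening"):
        state["listening"] = False
        return "stop"    # cut Alexa off mid-flow
    return "ignore"      # any other gesture is a no-op in this sketch

state = {}
print(handle_gesture("double_tap", state))  # wake
print(handle_gesture("single_tap", state))  # stop
```

The point is simply that the sensing layer reduces to discrete events, which a small controller (here, the Raspberry Pi) can translate into speaker commands.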

If you’re unfamiliar with Elliptic Labs, we met up with the company a couple of years back, when it first began rolling its ultrasound gestures out to phones. The hope was that ultrasound might replace proximity sensors in phones, and the technology was subsequently integrated into Xiaomi’s Mi Mix handsets, allowing the manufacturer to shrink the bezels right down. The ultrasound sensor can detect when your hand or face is near and turn the screen on or off accordingly. Specific gestures can also be used to scroll around, snap selfies, or even play games.

With more microphones, Elliptic Labs tech can detect more specific gestures or positioning. In a phone with two microphones, this might allow you to wave your hand to turn the volume up or down. Most smart speakers have several microphones now, so there’s a great deal of potential for more gesture controls, or even for triggering specific actions when someone enters or leaves a room.
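One way to see why extra microphones help: with two of them, the difference in echo arrival time hints at which side your hand is on. The sketch below is illustrative only; real ultrasound tracking involves far more sophisticated echo processing, and the threshold value and action names are invented for the example.

```python
# Illustrative two-microphone direction estimate. The threshold and the
# returned action names are assumptions, not Elliptic Labs' algorithm.

def wave_direction(t_left: float, t_right: float,
                   threshold: float = 1e-4) -> str:
    """Infer which side a hand is on from echo arrival times (seconds)
    at two microphones; the earlier arrival is the nearer mic."""
    delta = t_right - t_left
    if delta > threshold:    # echo reached the left mic first
        return "volume_down"
    if delta < -threshold:   # echo reached the right mic first
        return "volume_up"
    return "no_change"       # roughly centered, ignore
```

More microphones extend this same idea to richer position estimates, which is why multi-mic smart speakers are a natural fit.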

Elliptic Labs sees ultrasound as free spectrum that isn’t being exploited right now, and the company is very optimistic about the potential applications.

“Any space where there are humans is fair game,” Guenael Strutt, Elliptic Labs’ VP of Product Development, told Digital Trends. “The possibilities are infinite.”

In the second demonstration we saw at MWC, the smart speaker was hooked up to a light. By placing your hand on one side of the speaker and holding it there, you could turn up the light level, while holding your hand at the other side dimmed the bulb. It’s easy to imagine how this same gesture could work to tweak volume levels.
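The hold-to-dim behavior amounts to ramping a value for as long as the hand stays put. A minimal sketch, assuming a 0–100 brightness scale, a fixed step per tick, and side labels that are all invented for illustration:

```python
# Hypothetical hold-to-dim ramp: while a hand is detected at one side of
# the speaker, brightness ramps by a fixed step per tick. Scale, step,
# and side names are assumptions, not the demo's real parameters.

def adjust_brightness(level: int, side: str, held_ticks: int,
                      step: int = 5) -> int:
    """Return the new brightness after holding a hand for held_ticks."""
    delta = step * held_ticks if side == "right" else -step * held_ticks
    return max(0, min(100, level + delta))  # clamp to the 0-100 range

print(adjust_brightness(50, "right", 3))  # 65: held on the bright side
print(adjust_brightness(50, "left", 3))   # 35: held on the dim side
```

Swapping the brightness value for a volume level gives exactly the volume-tweaking variant the article imagines.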

We tested out both prototypes for ourselves and found them very easy and intuitive to use. The technology doesn’t require direct line of sight, because the sound can bounce off a wall, so even if your speaker is tucked behind a lamp or the arm of the couch, you can still use these gestures to control it. We think the stop gesture is the most potentially useful, because it can be tricky to use voice commands to stop Alexa when it starts speaking or plays the wrong song.

There’s no official support for ultrasound tech in smart speakers just yet, but Elliptic Labs has been talking to all the major players: Amazon, Google, and Apple. The company has also been working with chip manufacturers like Qualcomm, and with suppliers further up the smart speaker manufacturing chain, to try to integrate the technology into the chipsets and components that go into smartphones and smart speakers.

Having tried it out, we expect more manufacturers to adopt the technology in the near future. Smart speakers may prove an easier sell than smartphones, at least until Elliptic Labs can get ultrasound support into the chipsets that manufacturers buy.

One of the key challenges for smartphones is reducing the power draw of the ultrasound sensor and working out clever ways to determine when it should be listening. Advances in machine learning and processor speed could make an important difference here, and Elliptic Labs has been working to determine the optimal model for gesture detection.

We’re excited to see what these ultrasound pioneers come up with next.

Simon Hill
Former Digital Trends Contributor
Simon Hill is an experienced technology journalist and editor who loves all things tech. He is currently the Associate Mobile…