
The creator of Internet Explorer wants to read your mind with a bracelet


Forget Amazon Echo-style voice controls or touchscreen gestures; controlling a computer with your thoughts is the ultimate way to interface with machines. There’s just one problem: for the technology to work as well as it could, it might be necessary to drill a hole in your head and insert a chip in your brain. That, as they say in the trade, is a dealbreaker.


A New York startup called CTRL-Labs has a different idea, though. Founded by Thomas Reardon, the creator of Microsoft Internet Explorer, it describes itself as an “applied research neuroscience company” with designs on decoding your neural activity. But unlike many of its rivals in this space, it won’t actually venture too close to your cranium to do it. And it certainly won’t be brandishing any drills or other cutting implements.

Carry out feats like typing 200 words per minute without physically touching a keyboard.

Instead, CTRL-Labs has developed an electronic wristband that promises to make possible non-invasive mental control of computers, smart prosthetics, and a range of other devices. This brain-computer interface uses surface electromyography: it reads the voltage bursts that result from contractions in the arm’s muscle fibers.

By analyzing these signals, the device can transform slight body movements into computer inputs. Better still, as the video below makes clear, even the intention to move can be read as movement.
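To make the idea concrete, the basic loop of reading muscle-voltage signals and mapping them to inputs can be sketched in miniature. Everything below is an illustrative assumption on our part, not CTRL-Labs’ actual pipeline: the two-channel layout, the root-mean-square amplitude feature, the nearest-centroid matching, and the gesture labels are all toy placeholders.

```python
import math

def rms(samples):
    """Root-mean-square amplitude of one channel's sample window."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

# Hypothetical per-gesture feature centroids for a two-channel band,
# in arbitrary units. A real system would learn these from data.
CENTROIDS = {
    "rest":  (0.05, 0.05),
    "fist":  (0.80, 0.75),  # strong activity on both channels
    "point": (0.60, 0.10),  # mostly one muscle group firing
}

def classify(window):
    """Label a dict of channel name -> samples with the nearest gesture."""
    features = tuple(rms(window[ch]) for ch in ("ch0", "ch1"))
    return min(
        CENTROIDS,
        key=lambda g: sum((f - c) ** 2 for f, c in zip(features, CENTROIDS[g])),
    )

# Strong co-contraction on both channels classifies as a fist.
window = {"ch0": [0.7, -0.9, 0.8, -0.7], "ch1": [0.8, -0.7, 0.7, -0.8]}
print(classify(window))  # -> "fist"
```

The interesting part, and the hard part CTRL-Labs is tackling, is that the same signal chain can fire on neural commands sent down the arm even when the resulting movement is barely visible.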

Admittedly, this isn’t a brain-computer interface in the way that we might think of one. The team describes it this way because it may be the fastest means yet of turning the brain’s conscious instructions into useful actions. In this case, the technology becomes a natural extension of thought and movement. Using CTRL-Labs’ prototype device — which looks a bit like the studded armband a 1990s superhero drawn by Rob Liefeld might have sported — users can carry out feats like typing 200 words per minute without having to physically touch a keyboard.

“We think about your arm as a pipe of information from your brain to the world,” Adam Berenzweig, director of R&D at CTRL-Labs, told Digital Trends. “We don’t have too many other ways of getting the information out of our brain, except through control of muscles.”

A new universal input device?

There is, of course, voice. Using our voices, thoughts can be communicated as verbal instructions near-instantaneously. Thanks to advances in artificial intelligence, voice control is now a viable technology for the first time in history. But as Berenzweig pointed out, voice control isn’t perfect for every scenario. “Voice is great for some things, but it’s not ideal for all circumstances,” he said. “There are privacy issues, loud environments, and other times when it’s simply not convenient.”

“I could see a future where people are wearing this device all day.”

A subtle gesture, on the other hand, can be carried out in virtually any context. It’s also a very versatile control method. Our hands turn out to be pretty handy hunks of meat. We can type with them. We can hold pens with them. We can drum our fingers. We can ball our fingers up and form fists.

The ability to recognize any of these gestures using one ultra-sensitive wearable could lead to the biggest leap forward in computer interfaces since the invention of the mouse. It could be even more versatile, in fact, since the mouse is really only an analog for pointing, transforming that one universal human gesture into a computational metaphor.

“I could certainly see a future where people are wearing this device all day, and it’s the thing that is used to interact with people’s phones, the lights in their house, and the radio in their car,” Berenzweig continued. “After people are used to it, it’s easy to imagine that people will [wonder why they need a keyboard or mouse at all] when they’re sitting at their computer.”


He suggested that it could prove to be a generational thing, in which our CTRL-Labs armband-wearing kids view today’s input devices the same way they skeptically look at bits of retrograde tech detritus like VHS tapes and Game Boys. “Did you guys seriously used to use those?” they’ll ask us, one hand subtly contorting as they simultaneously gesture out a quick IM to their school friends.

Coming soon to an arm near you

“A really big use case for this is going to be virtual reality and augmented reality,” Berenzweig said. “Right now, VR can offer really amazing, immersive experiences visually. But then to control them you’ve just got these sticks where your hands should be. It really limits an experience, which is very much defined by what kind of control you have.”

“What we’re putting out this year is not mass consumer-ready, but the technology works.”

Using CTRL-Labs’ technology, the idea of controller-free VR suddenly becomes a whole lot more possible. As the world becomes increasingly “smart,” with connected Internet of Things devices all around us, it’s equally easy to imagine how technology such as this could be used to let us interact with everything from our smart thermostats to our smart locks. What self-respecting geek hasn’t, at some point, wished that they could control the world around them with a simple Jedi Knight-style wave of the hand? Such things may not remain science fiction for too much longer.

The bigger question, however, is how this will translate to other interface methods. Some gestures are natural to us, like pointing at an object to indicate interest. Others, like American Sign Language, have more of a learning curve. We can train ourselves to use them as second nature — much like a pianist learns to turn the music in their brain into finger movements on a keyboard — but this requires effort.

In an age of intuitive, effortless interfaces like voice and smartphone swipes, will we be willing to put in the work? And, if so, how many of us? If this tech is going to become the universal interface Berenzweig believes it can be, the answer had better be “lots and lots.”


We’ll get the chance to find out soon.

“We’re committed to shipping something this year,” he said. “It will be a smaller rollout to developers [initially]. We’ve currently got a signup sheet for people to register their interest. It’s still early in the productization. What we’re putting out this year is not mass consumer-ready, but the technology works. Our goal now is to get it in the hands of developers so they can start exploring exactly what is possible with it.”

We await the verdict with bated breath, arm muscle fibers twitching with excitement.

Luke Dormehl
Former Digital Trends Contributor
I'm a UK-based tech writer covering Cool Tech at Digital Trends. I've also written for Fast Company, Wired, the Guardian…