
The creator of Internet Explorer wants to read your mind with a bracelet

CTRL-Labs O'Reilly demo

Forget Amazon Echo-style voice controls or touchscreen gestures; controlling a computer with your thoughts is the ultimate way to interface with machines. There’s just one problem: for the technology to work as well as it could, it might be necessary to drill a hole in your head and insert a chip in your brain. That, as they say in the trade, is a dealbreaker.


A New York startup called CTRL-Labs has a different idea, though. Founded by Thomas Reardon, the creator of Microsoft Internet Explorer, it describes itself as an “applied research neuroscience company” with designs on decoding your neural activity. But unlike many of its rivals in this space, it won’t actually venture too close to your cranium to do it. And it certainly won’t be brandishing any drills or other cutting implements.

Carry out feats like typing 200 words per minute without physically touching a keyboard.

Instead, CTRL-Labs has developed an electronic wristband that promises to make possible non-invasive mental control of computers, smart prosthetics, and a range of other devices. This brain-computer interface works by reading the voltage bursts produced when the muscle fibers in the arm contract.

By analyzing these signals, the system can transform slight body movements into computer inputs. Better still, as the company’s demo video makes clear, even the intention of a movement can be read as movement.
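
The article doesn’t detail CTRL-Labs’ actual decoding pipeline, but the general shape of the problem (windows of voltage readings from electrodes on the forearm, mapped to discrete input events) can be sketched in a few lines of Python. The snippet below is purely illustrative: the sample rate, channel count, threshold, and gesture labels are all assumptions made for the sake of the example, not anything CTRL-Labs has published.

```python
# Purely illustrative sketch (not CTRL-Labs' pipeline): map a window of
# voltage readings from forearm electrodes to a discrete input event.
# Sample rate, channel count, threshold, and gesture labels are made up.
import numpy as np

SAMPLE_RATE_HZ = 1000   # assumed sampling rate
WINDOW_MS = 200         # length of each analysis window
N_CHANNELS = 8          # assumed number of electrodes around the wrist

def rms_per_channel(window: np.ndarray) -> np.ndarray:
    """Root-mean-square voltage of each channel over the window."""
    return np.sqrt(np.mean(window ** 2, axis=0))

def classify(window: np.ndarray) -> str:
    """Turn a (samples, channels) window of voltages into a gesture label."""
    energy = rms_per_channel(window)
    if energy.max() < 0.05:   # hypothetical rest threshold: barely any activity
        return "rest"
    # Toy rule: the most active electrode names the gesture.
    gestures = ["pinch", "point", "fist", "swipe"]
    return gestures[int(np.argmax(energy)) % len(gestures)]

# Stand-in for real electrode readings: random noise shaped like one window.
rng = np.random.default_rng(0)
fake_window = rng.normal(0.0, 0.1, size=(SAMPLE_RATE_HZ * WINDOW_MS // 1000, N_CHANNELS))
print(classify(fake_window))   # prints one of the toy gesture labels
```

A real system chasing 200-words-per-minute typing would presumably swap the toy threshold rule for a trained model, but the framing stays the same: a window of muscle signals in, an input event out.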

Admittedly, this isn’t a brain-computer interface in the way that we might think of one. The team describes it this way because it may be the fastest means yet of turning the brain’s conscious instructions into useful actions. In this case, the technology becomes a natural extension of thought and movement. Using CTRL-Labs’ prototype device — which looks a bit like the studded armband a 1990s superhero drawn by Rob Liefeld might have sported — users can carry out feats like typing 200 words per minute without having to physically touch a keyboard.

“We think about your arm as a pipe of information from your brain to the world,” Adam Berenzweig, director of R&D at CTRL-Labs, told Digital Trends. “We don’t have too many other ways of getting the information out of our brain, except through control of muscles.”

A new universal input device?

There is, of course, voice. Using our voices, thoughts can be communicated as verbal instructions near-instantaneously. Thanks to advances in artificial intelligence, voice control is now a viable technology for the first time in history. But as Berenzweig pointed out, voice control isn’t perfect for every scenario. “Voice is great for some things, but it’s not ideal for all circumstances,” he said. “There are privacy issues, loud environments, and other times when it’s simply not convenient.”

“I could see a future where people are wearing this device all day.”

A subtle gesture, on the other hand, can be carried out in virtually any context. It’s also a very versatile control method. Our hands turn out to be pretty handy hunks of meat. We can type with them. We can hold pens with them. We can drum our fingers. We can ball our fingers up and form fists.

The ability to recognize any of these gestures using one ultra-sensitive wearable could lead to the biggest leap forward in computer interfaces since the invention of the mouse. It could be even more versatile, in fact, since the mouse is really only an analog for pointing, transforming that one universal human gesture into a computational metaphor.

“I could certainly see a future where people are wearing this device all day, and it’s the thing that is used to interact with people’s phones, the lights in their house, and the radio in their car,” Berenzweig continued. “After people are used to it, it’s easy to imagine that people will [wonder why they need a keyboard or mouse at all] when they’re sitting at their computer.”


He suggested that it could prove to be a generational thing, in which our CTRL-Labs armband-wearing kids view today’s input devices the same way they skeptically look at bits of retrograde tech detritus like VHS tapes and Game Boys. “Did you guys seriously used to use those?” they’ll ask us, one hand subtly contorting as they simultaneously gesture out a quick IM to their school friends.

Coming soon to an arm near you

“A really big use case for this is going to be virtual reality and augmented reality,” Berenzweig said. “Right now, VR can offer really amazing, immersive experiences visually. But then to control them you’ve just got these sticks where your hands should be. It really limits an experience, which is very much defined by what kind of control you have.”

“What we’re putting out this year is not mass consumer-ready, but the technology works.”

With CTRL-Labs’ technology, the idea of controller-free VR suddenly becomes a whole lot more plausible. As the world becomes increasingly “smart,” with connected Internet of Things devices all around us, it’s equally easy to imagine how technology such as this could be used to let us interact with everything from our smart thermostats to our smart locks. What self-respecting geek hasn’t, at some point, wished that they could control the world around them with a simple Jedi Knight-style wave of the hand? Such things may not remain science fiction for too much longer.

The bigger question, however, is how this will translate to other interface methods. Some gestures are natural to us, like pointing at an object to indicate interest. Others, like American Sign Language, have more of a learning curve. We can train ourselves to use them as second nature — much like a pianist learns to turn the music in their brain into finger movements on a keyboard — but this requires effort.

In an age of intuitive, effortless interfaces like voice and smartphone swipes, will we be willing to put in the work? And, if so, how many of us? If this tech is going to become the universal interface Berenzweig believes it can be, the answer had better be “lots and lots.”


We’ll get the chance to find out soon.

“We’re committed to shipping something this year,” he said. “It will be a smaller rollout to developers [initially]. We’ve currently got a signup sheet for people to register their interest. It’s still early in the productization. What we’re putting out this year is not mass consumer-ready, but the technology works. Our goal now is to get it in the hands of developers so they can start exploring exactly what is possible with it.”

We await the verdict with bated breath, arm muscle fibers twitching with excitement.

Luke Dormehl
Former Digital Trends Contributor