
A.I. can tell if you’re a good surgeon just by scanning your brain

Could a brain scan be the best way to spot a top-notch surgeon? Well, kind of. Researchers at Rensselaer Polytechnic Institute and the University at Buffalo have developed Brain-NET, a deep learning A.I. tool that can accurately predict a surgeon’s certification scores based on their neuroimaging data.


This certification score, part of the Fundamentals of Laparoscopic Surgery (FLS) program, is currently calculated manually using a formula that is extremely time- and labor-intensive. The idea behind it is to give an objective assessment of surgical skill, thereby demonstrating effective training.
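As a rough illustration of the kind of metric described above, the sketch below combines task completion time with an error penalty. This is a hypothetical stand-in, not the official FLS formula: the cutoff, weights, and penalty values are assumptions made for illustration.

```python
# Hypothetical sketch of an FLS-style scoring formula. Assumption: the real
# metric combines task execution time with error estimation, as the article
# states; the exact cutoff and weighting here are illustrative only.

def fls_style_score(completion_time_s: float, error_penalty: float,
                    time_cutoff_s: float = 300.0, max_score: float = 100.0) -> float:
    """Score a single laparoscopic task: faster, cleaner runs score higher."""
    # Time component: full credit at 0 s, no credit at or beyond the cutoff.
    time_score = max(0.0, 1.0 - completion_time_s / time_cutoff_s) * max_score
    # Subtract penalties for observed errors (e.g. dropped objects, torn tissue).
    return max(0.0, time_score - error_penalty)

print(fls_style_score(120.0, 10.0))  # fast run with a small error penalty
print(fls_style_score(600.0, 0.0))   # too slow: scores zero even error-free
```

Scoring one number per task is what makes the manual process slow: a proctor has to time the task and tally errors by hand before the formula can even be applied, which is the bottleneck the neuroimaging approach aims to bypass.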


“The Fundamental of Laparoscopic Surgery program has been adopted nationally for surgical residents, fellows and practicing physicians to learn and practice laparoscopic skills to have the opportunity to definitely measure and document those skills,” Xavier Intes, a professor of biomedical engineering at Rensselaer, told Digital Trends. “One key aspect of such [a] program is a scoring metric that is computed based on the time of the surgical task execution, as well as error estimation.”

The team of researchers on this project wanted to see if they could predict the FLS score of surgeons by using optical brain imaging. Thanks to a convolutional neural network, they demonstrated that they were able to do this with a high level of accuracy. This work is based on previous research in which functional near-infrared spectroscopy (fNIRS) was shown to be effective at classifying different motor task types, thereby providing a potential means of assessing manual skill performance levels. In this latest project, the researchers used the same fNIRS data to predict the ultimate performance scores used in surgical certification.
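The core idea, mapping multichannel fNIRS time series to a single continuous skill score, can be sketched in miniature. This is not Brain-NET's architecture: the real model, preprocessing, and data are far richer. The sketch below uses synthetic data, one shared smoothing convolution per channel, and a least-squares linear head purely to show the signal-to-score regression setup.

```python
import numpy as np

# Minimal sketch of regressing a skill score from fNIRS-like signals.
# Everything here is an assumption for illustration: synthetic data, a
# single 1-D convolution as the feature extractor, and a linear head.

rng = np.random.default_rng(0)

def conv_features(signal: np.ndarray, kernel: np.ndarray) -> np.ndarray:
    """Convolve each channel with a shared kernel, then average over time."""
    feats = [np.convolve(ch, kernel, mode="valid").mean() for ch in signal]
    return np.array(feats)

# Synthetic data: 50 "surgeons", 8 fNIRS channels, 200 time samples each.
X = rng.normal(size=(50, 8, 200))
kernel = np.ones(5) / 5.0                             # simple smoothing kernel
F = np.stack([conv_features(x, kernel) for x in X])   # (50, 8) feature matrix

# Synthetic "certification scores" generated from a hidden channel weighting.
true_w = rng.normal(size=8)
y = F @ true_w + rng.normal(scale=0.01, size=50)

# Fit the linear head by least squares and check the training fit.
w_hat, *_ = np.linalg.lstsq(F, y, rcond=None)
pred = F @ w_hat
print(f"max abs training error: {np.max(np.abs(pred - y)):.4f}")
```

A deep model like Brain-NET replaces the hand-picked kernel and linear head with learned convolutional layers, but the regression target, one continuous performance score per recording, is the same.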

“These results are a stepping stone toward leveraging neuroimaging and deep learning for neurofeedback to improve surgical skill acquisition, retention, and the certification process,” Intes continued. “The advantage of these approaches is that they should enable a more personalized training regimen with bedside feedback for optimal skill acquisition. Current approaches are singularly focusing on task repetition without potential for fast and objective feedback.”

This work is part of a continuous effort to enhance the way that surgical skills are taught and assessed. On its own, this latest piece of research is not going to fundamentally change that. However, going forward it could lay the groundwork for new ways of improving surgical task execution — and personalized approaches to training — by using neuroimaging assessment.

“We are currently using the FLS score as the means to assess surgical skills,” Intes said. “We hope that, with further studies, we will be able also to go beyond this metric and discover [a] new set of neurobiomarkers that will provide finer insight on surgical skill learning and execution.”

A paper describing the research is available to read in the journal IEEE Transactions on Biomedical Engineering.

Luke Dormehl
Former Digital Trends Contributor