School shooters leave clues. Could A.I. spot the next one before it’s too late?

In light of recent deadly school shootings in the United States, educators, parents, and security experts are looking to technology to help solve the problem. At the forefront is the use of artificial intelligence.

“Our goal is to make sure a kid never wants to bring a gun to school,” said Suzy Loughlin, co-founder and chief counsel of Firestorm, a crisis management firm. Toward that end, in partnership with the University of Alabama School of Continuing Education, the company has developed a prevention program that looks for early warning signs in kids who may be at risk of committing future violent acts.

Dubbed BERTHA, for Behavioral Risk Threat Assessment Program, the idea grew out of the 2007 mass shooting at Virginia Tech, in which 32 people were murdered, one of the deadliest shootings in U.S. history. The February shooting at Marjory Stoneman Douglas High School in Parkland, Florida, that killed 17 people brought more attention to the issue, underscored again in May by the Santa Fe High School shooting in Texas, where 10 students and teachers were killed.

School shooting victims since 1989 (Source: Mother Jones)

| Incident | Fatalities | Injured | Total victims | Year |
| --- | --- | --- | --- | --- |
| Santa Fe High School shooting (Santa Fe, Texas) | 10 | 13 | 23 | 2018 |
| Marjory Stoneman Douglas High School shooting (Parkland, Florida) | 17 | 14 | 31 | 2018 |
| Umpqua Community College shooting (Roseburg, Oregon) | 9 | 9 | 18 | 2015 |
| Marysville-Pilchuck High School shooting (Marysville, Washington) | 5 | 1 | 6 | 2014 |
| Isla Vista mass murder (Santa Barbara, California) | 6 | 13 | 19 | 2014 |
| Sandy Hook Elementary massacre (Newtown, Connecticut) | 27 | 2 | 29 | 2012 |
| Oikos University killings (Oakland, California) | 7 | 3 | 10 | 2012 |
| Northern Illinois University shooting (DeKalb, Illinois) | 5 | 21 | 26 | 2008 |
| Virginia Tech massacre (Blacksburg, Virginia) | 32 | 23 | 55 | 2007 |
| Amish school shooting (Lancaster County, Pennsylvania) | 6 | 5 | 11 | 2006 |
| Red Lake massacre (Red Lake, Minnesota) | 10 | 5 | 15 | 2005 |
| Columbine High School massacre (Littleton, Colorado) | 13 | 24 | 37 | 1999 |
| Thurston High School shooting (Springfield, Oregon) | 4 | 25 | 29 | 1998 |
| Westside Middle School killings (Jonesboro, Arkansas) | 5 | 10 | 15 | 1998 |
| Lindhurst High School shooting (Olivehurst, California) | 4 | 10 | 14 | 1992 |
| University of Iowa shooting (Iowa City, Iowa) | 6 | 1 | 7 | 1991 |
| Stockton schoolyard shooting (Stockton, California) | 6 | 29 | 35 | 1989 |

The risk assessment program is conceived as a safety net to catch children who may need help and intervention before they become suicidal or violent. After each previous incident, administrators, parents, and students have wondered why early warning signs, like cyberbullying, allusions to guns, and references to the 1999 Columbine High School shooting in Colorado, weren't noticed earlier.

Using AI to search for clues

The challenge has been sifting through the mountains of data generated in forums and on social media to find the few needles that might alert a school counselor or psychologist that a child is in trouble. To surface such clues online, administrators are enlisting artificial intelligence tools.

“We’re the AI component,” explained Mina Lux, the founder and CEO of New York-based Meelo Logic. Her company is working on the BERTHA program with Firestorm to perform the initial heavy lifting of sorting through what has become known as big data. “Our focus is knowledge automation to understand the context.”

Meelo’s software can trace comments and postings back to their original source. The company refers to the process as causal reasoning, but it’s more analogous to finding patient zero, the original individual about whom someone else may have expressed concern.
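
To make the "patient zero" idea concrete, here is a minimal, hypothetical sketch of tracing a reshared post back to its original author. The `Post` structure, field names, and data are illustrative assumptions, not Meelo's actual data model or method.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Post:
    post_id: str
    author: str
    text: str
    shared_from: Optional[str] = None  # id of the post this one reshared, if any

def find_original(post_id: str, posts: dict[str, Post]) -> Post:
    """Walk the reshare chain backwards until a post with no parent is found."""
    seen = set()
    current = posts[post_id]
    while current.shared_from is not None and current.shared_from not in seen:
        seen.add(current.post_id)
        current = posts[current.shared_from]
    return current

# Example: a concerning comment reshared twice still resolves to its source.
posts = {
    "a": Post("a", "student_1", "original post"),
    "b": Post("b", "student_2", "reshare", shared_from="a"),
    "c": Post("c", "student_3", "reshare of a reshare", shared_from="b"),
}
print(find_original("c", posts).author)  # -> student_1
```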

“Usually, there’s an initial outburst online, and they are purposely making that public — it may be a call for help,” Hart Brown, the COO of Firestorm, explained. “And in 80 percent of the cases, at least one other person knows, so even if the first post is private, someone else is likely to make it public.”

The AI program provides the initial screening, based on slang terms used, context, location, and related links. Then, Firestorm’s BERTHA dashboard flags activity for possible intervention. That’s where people — counselors, teachers, psychologists — step in to assess whether there’s a real threat, whether a child needs extra attention because they’re exhibiting anger or suicidal tendencies, or if the activity is benign.
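
As a rough illustration of that division of labor, the sketch below shows a deliberately simple rule-based first pass that scores public posts and queues some for human review. The watch terms, weights, and threshold are hypothetical stand-ins, not BERTHA's actual screening logic, which also weighs context, location, and related links.

```python
# Hypothetical first-pass screen: score a post and flag it for a human reviewer.
WATCH_TERMS = {"hurt myself": 3, "bring a gun": 5, "columbine": 4, "payback": 2}
FLAG_THRESHOLD = 4

def screen_post(text: str) -> dict:
    """Return a score and whether the post should be queued for human review."""
    lowered = text.lower()
    score = sum(weight for term, weight in WATCH_TERMS.items() if term in lowered)
    return {"score": score, "flag_for_review": score >= FLAG_THRESHOLD}

# A flagged post goes to counselors and teachers; the software makes no decision.
print(screen_post("nobody listens, maybe it's time for payback"))
```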

“But no one person is responsible for making the decision,” said Brenda Truelove, a program administrator at the University of Alabama who has been working with Firestorm on the program and an associated e-learning program for educators nationwide. “One person might miss something, so it’s a team of people who decide what to do.”

Truelove noted that the program draws on the experience of teachers, forensic psychologists, and other experts to create a formula for dealing with potential crises.

Does increased safety mean less privacy?

While the potential of AI to prevent future school shootings may be promising, such tracking and data analysis raise inevitable concerns about privacy and accuracy, and about whether safety should override them.

Bryce Albert, a ninth-grade student at Marjory Stoneman Douglas High School, was in the third-floor hallway when the shooter started firing at students. When Albert saw the killer coming down the hall, a teacher let him into a classroom, and he survived. Since that experience, his thinking about privacy has changed.

“Before, I was like, don’t go into my stuff,” Albert told Digital Trends, about authorities tracking his social media usage. “But now, I’ve totally changed my mind.”

Meelo’s Lux emphasized that the AI programs do not access any private accounts; all the information is public. Firestorm’s Loughlin underscored the fact that they do not collect or store the data themselves. It’s maintained by the individual schools, which already have experience keeping student records. (Firestorm charges a license fee of $2,500 per school, while the University of Alabama offers a free online training course for Alabama K-12 educators. Schools can also work on their own early warning projects for free by using Firestorm’s basic nine-step formula for establishing such programs.)

Lux acknowledged that subtleties of language, such as sarcasm, can prove challenging for any AI analysis. Meelo focuses on textual analysis, rather than the sort of image analysis that other AI companies, like Cortica, study. Still, there's room for misinterpretation, even for humans.
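
To see why that nuance matters, here is a deliberately naive, hypothetical keyword check: it flags a throwaway joke and a genuinely angry post identically, which is exactly the kind of ambiguity that gets handed off to human reviewers.

```python
# Toy example only: a bare keyword match cannot separate venting from a warning sign.
ALARMING_TERMS = {"blow up"}

def naive_flag(text: str) -> bool:
    return any(term in text.lower() for term in ALARMING_TERMS)

print(naive_flag("ugh, my phone is about to blow up with homework reminders"))  # True, but harmless
print(naive_flag("i'm going to blow up at whoever keeps posting about me"))     # True, possibly real distress
```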

“It’s hard to get emotion through texting,” Albert acknowledged.

On the other hand, a dispassionate program doesn’t play favorites or ignore the kinds of emotional or behavioral changes that might indicate that trouble is ahead.

“Ultimately, it can be more accurate by eliminating as much bias as possible,” Firestorm’s Brown said. An HR person or counselor might minimize a person’s emotional outbursts, for example, dismissing them as something that happens all the time. The unblinking computer makes no such excuses.

“But it still requires a human touch to follow through, to interview that person,” Brown said. “The computer won’t determine if that person needs to be expelled or needs counseling.”

AI is a tool, not a solution

All of the experts Digital Trends spoke with for this story emphasized that AI is still only an initial filter or tool for stopping future school shootings. It can generate alerts about children at risk, but it cannot tell educators how, when, or even whether they should intervene. Schools still need their own teams of experts, from teachers who know the students to psychologists, and will continue to need them.

“Statistics show that with every school shooting that happens, there’s a higher probability of another school shooting happening,” Lux said.

Ultimately, the human element is the most important factor. “Talk to people and ask for help,” Albert said. “Don’t be afraid to ask for help.”

John R. Quain