How AI Can Collect Clues to Prevent Future School Shootings – Digital Trends


We can all agree that preventing an act of violence from striking our schools is vital. The goal of Firestorm is to mitigate crises, and we do this by intervening before a weapon ever reaches the scene.

School shooters leave clues, and using artificial intelligence (AI) as a tool can help identify early warning signs before an act of violence is committed.

Related: #ShareTheFormula to prevent incidents of violence

How AI Can Collect Clues to Prevent Future School Shootings

From Digital Trends, authored by John Quain, July 9, 2018 – read the full article here

In light of recent deadly school shootings in the United States, educators, parents, and security experts are looking to technology to help solve the problem. At the forefront is the use of artificial intelligence.

“Our goal is to make sure a kid never wants to bring a gun to school,” Suzy Loughlin, co-founder and chief counsel of Firestorm, a crisis management firm, said. Toward that end, in partnership with the University of Alabama School of Continuing Education, the company has developed a prevention program that looks for early warning signs in kids who may be at risk of committing future violent acts.

Dubbed BERTHA, for Behavioral Risk Threat Assessment Program, the idea grew out of the 2007 mass shooting at Virginia Tech, in which 32 people were murdered, one of the deadliest shootings in U.S. history. The February shooting at Marjory Stoneman Douglas High School in Parkland, Florida, which killed 17 people, brought more attention to the issue, underscored again in May by the Santa Fe High School shooting in Texas, where 10 students and teachers were killed.

The risk assessment program is conceived as a safety net to catch children who may need help and intervention before they become suicidal or violent. As demonstrated after each previous incident, administrators, parents, and students wonder why early warning signs, like cyberbullying, allusions to guns, and references to the 1999 Columbine High School shooting in Colorado, weren't noticed earlier.

USING AI TO SEARCH FOR CLUES
The challenge has been the difficulty of sifting through the mountains of data generated in forums and social media accounts to find the few needles that might alert a school counselor or psychologist that a child is in trouble. So, to filter out such clues online, administrators are enlisting artificial intelligence tools.
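To make that filtering step concrete, here is a minimal, hypothetical Python sketch that scores posts against a small watch list of risk terms and flags those that cross a threshold for human review. The terms, threshold, and function names are illustrative assumptions, not part of BERTHA or Meelo's actual system, which relies on far richer language analysis and context.

```python
# Hypothetical illustration of keyword-based risk flagging (not the actual
# BERTHA/Meelo pipeline). Posts that mention enough watch-list terms are
# routed to a human counselor for review: software narrows the haystack,
# a person makes the judgment call.

WATCH_TERMS = {"gun", "columbine", "kill", "nobody would miss me"}  # illustrative only
REVIEW_THRESHOLD = 2  # flag posts matching at least this many terms

def score_post(text: str) -> int:
    """Count how many watch-list terms appear in a post."""
    lowered = text.lower()
    return sum(term in lowered for term in WATCH_TERMS)

def flag_for_review(posts: list[str]) -> list[str]:
    """Return only the posts a counselor or psychologist should look at."""
    return [p for p in posts if score_post(p) >= REVIEW_THRESHOLD]

if __name__ == "__main__":
    sample = [
        "Had a great day at practice!",
        "I keep thinking about Columbine. Maybe I should bring a gun.",
    ]
    print(flag_for_review(sample))  # only the second post is flagged
```

The point of the sketch is the division of labor the article describes: automation surfaces the few posts worth attention, and trained adults decide what they mean.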

“Our goal is to make sure a kid never wants to bring a gun to school.”

“We’re the AI component,” explained Mina Lux, the founder and CEO of New York-based Meelo Logic. Her company is working on the BERTHA program with Firestorm to perform the initial heavy lifting of sorting through what has become known as big data. “Our focus is knowledge automation to understand the context.”

Meelo’s software can trace comments and postings back to their original source. The company refers to the process as causal reasoning, but it’s more analogous to finding patient zero, the original individual about whom someone else may have expressed concern.
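For readers curious what finding "patient zero" can look like in practice, the sketch below is a minimal, hypothetical illustration, assuming each repost records which post it was shared from. The data structure and names are invented for this example; Meelo's causal-reasoning pipeline is far more sophisticated than a simple walk up a repost chain.

```python
# Hypothetical sketch of the "patient zero" idea: if each repost records which
# post it was shared from, walking that chain backwards recovers the original
# posting. Data here is made up for illustration.

# Map each post ID to the ID of the post it was shared from (None = original).
SHARED_FROM = {
    "post_d": "post_c",
    "post_c": "post_b",
    "post_b": "post_a",
    "post_a": None,  # patient zero: the original posting
}

def trace_to_origin(post_id: str) -> str:
    """Follow the repost chain back to the original post."""
    seen = set()
    while SHARED_FROM.get(post_id) is not None:
        if post_id in seen:  # guard against cyclic or bad data
            break
        seen.add(post_id)
        post_id = SHARED_FROM[post_id]
    return post_id

print(trace_to_origin("post_d"))  # -> "post_a"
```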

“Usually, there’s an initial outburst online, and they are purposely making that public — it may be a call for help,” Hart Brown, the COO of Firestorm, explained. “And in 80 percent of the cases, at least one other person knows, so even if the first post is private, someone else is likely to make it public.”

Read the full story at Digital Trends.

The images below are from the May 10, 2018, event at the National Press Club in Washington, D.C. Firestorm experts, partners, and the Albert family from Parkland, Florida discussed ways we can prevent incidents of violence in our nation's schools.

Learn more about how you can #ShareTheFormula to prevent future episodes of violence.
