Insight: Researching School Shootings: AI's Capabilities and Limitations in Prevention
The rapidly increasing occurrence of gun violence in educational institutions has left parents, educators, and school officials searching for answers.
In response to this escalating issue, many schools are investing in advanced AI and tech products, which are being promoted to districts in search of assistance for detecting potential shooters on campus. The pressure on school officials to safeguard students has transformed the field of campus security from a niche sector into a multibillion-dollar industry.
Due to limited funds, equipment, and personnel, public schools often struggle to monitor every security camera and thoroughly inspect each student's backpack. Thus, AI presents an enticing option that can quickly identify threats that might evade human surveillance.
I have compiled data on approximately 2,700 instances of school shootings since 1966, as well as other security challenges such as hoaxes, online threats, botched plots, near misses, and cases of students being caught with guns.
Solving the array of school security-related problems is complicated, as schools are vast public institutions that serve as hubs for various activities beyond traditional school hours. On a typical weeknight, a high school may host events like basketball games, drama club meetings, adult English language classes, and church gatherings in its cafeteria – potentially creating security loopholes amid the hustle and bustle.
Currently, two popular AI applications, computer vision and pattern analysis using large language models, offer promising prospects for monitoring campuses in ways that humans cannot.
AI is employed in schools to analyze the signals from metal detectors, examine objects visible on CCTV, identify the sound of gunshots, monitor doors and gates, scan social media for threats, search for red flags in student records, and recognize students' faces to identify potential intruders.
These AI systems function best when addressing well-defined, easily categorized problems, such as identifying a weapon or an intruder. If everything works seamlessly, an AI-equipped camera that sees a stranger holding a gun would flag the face of an unauthorized adult, object classification would identify the weapon, and other AI systems would then lock the doors, call 911, and send text-message alerts.
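The chain of events described above can be sketched in a few lines of code. This is purely an illustration: every class, function, and name below is hypothetical, not a real vendor API, and real systems are vastly more complex.

```python
# Hypothetical sketch of the automated detect-and-respond pipeline.
# All names here are illustrative, not a real security vendor's API.

class Campus:
    """Stand-in for the building systems an alert would trigger."""
    def __init__(self):
        self.actions = []
    def lock_doors(self):
        self.actions.append("doors_locked")
    def call_911(self):
        self.actions.append("911_called")
    def send_text_alerts(self, msg):
        self.actions.append(f"alert_sent: {msg}")

def analyze_frame(faces_seen, objects_seen, authorized_faces):
    """Combine face recognition and object classification into alert flags."""
    alerts = set()
    if any(face not in authorized_faces for face in faces_seen):
        alerts.add("unauthorized_person")
    if "gun" in objects_seen:
        alerts.add("weapon_detected")
    return alerts

def respond(alerts, campus):
    """Chain the automated responses when both signals co-occur."""
    if {"weapon_detected", "unauthorized_person"} <= alerts:
        campus.lock_doors()
        campus.call_911()
        campus.send_text_alerts("Armed intruder on campus")

campus = Campus()
alerts = analyze_frame(["stranger_01"], ["gun"], authorized_faces={"staff_07"})
respond(alerts, campus)
print(campus.actions)  # → ['doors_locked', '911_called', 'alert_sent: Armed intruder on campus']
```

Note that the whole chain hinges on the two upstream classifiers being right, which is exactly where, as the next section argues, these systems fall short.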
AI's Limitations
However, AI models can be unreliable. When an AI model is trained to identify pictures of weapons, the algorithm compares new images against the patterns of weapons in its training data. The AI doesn't perceive a weapon; a computer program doesn't understand anything. Shown millions of images of guns, the model learns to spot the shapes and patterns that indicate a firearm. The challenge lies in setting the threshold that separates "gun" from "not a gun."
Is it better to tolerate a false alarm for every umbrella, or to risk missing a concealed handgun?
AI software mistook a dark spot in this CCTV image from Brazoswood High School in Clute, Texas, for a gun, triggering a lockdown and prompting police to rush to the scene. The spot was actually a shadow on a drainage ditch that happened to align with a person walking.
Cameras can capture poor-quality images in various conditions, such as low light, bright light, rain, snow, and fog. Should schools rely on AI to make life-or-death decisions based on low-resolution footage that the algorithm may not be able to process accurately? The same AI vendor used by Brazoswood had its contract canceled by a large transit system in Pennsylvania, which claimed the software was unreliable in detecting guns.
It is vital for schools to grasp the boundaries of AI's capabilities and limitations.
Implementing AI software doesn't fundamentally change what cameras or other hardware can do. If a gun and a metal water bottle produce identical signals, adding AI to a metal detector doesn't change how it operates. This is why one such AI screening vendor is currently under investigation by the FTC and SEC over alleged misleading marketing claims made to schools across the country.
Weighing the Costs
The primary expense in school security is the physical equipment (cameras, doors, scanners) and the employees who operate it. Layering AI software onto an outdated security camera lets a security vendor generate new revenue without new equipment or new spending by the school. Saving money sounds appealing, but it becomes a problem when the AI misinterprets a harmless object as a threat, triggering a police response to what is, in fact, a false alarm.
Instead of individual schools selecting the best suitable solutions based on merit, vendors lobby for the allocation of local, state, and federal funding into specific products, compelling schools to purchase from one company. Given the rapidly evolving AI landscape, schools should have the freedom to choose the most effective product available, rather than being pressured into contracting with a single vendor.
Schools represent distinct settings and necessitate security solutions – both hardware and software – tailored specifically for them. This means companies must analyze and comprehend the nature of gun violence on campuses before developing an AI product. For example, a scanner designed for stadiums that only allows fans to carry in a limited number of items would be ineffective in a school where students carry backpacks, binders, pens, tablets, cell phones, and water bottles daily.
To make AI technology effective and successful in schools, businesses need to tackle the most pressing security problems on campuses. Through my research on numerous shooting incidents, I've noticed a common scenario: a student who routinely carries a gun in their backpack and opens fire during a dispute. Searching every student and backpack isn't feasible, as students would spend countless hours in security lines rather than in class. Inspecting bags is no simple task, and shootings still occur in schools equipped with metal detectors.
Traditional methods like image classification from CCTV or retrofitted metal detectors don't tackle the root cause of teenagers freely carrying firearms at school every day. Addressing this issue necessitates more sophisticated AI with enhanced sensors than any current product provides.
Unfortunately, contemporary school security is rooted in the past rather than envisioning a better future. Medieval fortresses were a failed experiment that concentrated risk rather than reducing it. We are now hardening school architecture without understanding why European empires abandoned castle-building centuries ago.
The future of AI security in schools could help us create open campuses with seamless, unobtrusive layers of protection. When an incident does happen, open spaces give people the most options for finding cover. Children should never again be trapped in a classroom, as they were in Uvalde, Texas, in 2022, when a gunman killed 19 students and two teachers.
Schools sit at the border between a troubled past and a safer future. AI has the potential either to stall or to accelerate our progress toward that future. It's up to us to decide.