New Tech Detects Guns Before School Shootings?
Schools are looking to new AI weapon-detection technology in hopes of stopping mass shootings before they begin.
After the deadly school shooting in Uvalde, Texas, many are searching for answers. Some want to know how an 18-year-old with little to no known firearms experience acquired such expensive equipment; others want to know why police stood down and allowed the shooter to remain in the building with children for so long. Nearly everyone, though, is looking for a way to prevent another attack like this. Despite calls for more gun control, many people, knowing that law enforcement did nothing while 19 children and 2 teachers were gunned down, have little confidence in further disarming law-abiding citizens. Some are instead suggesting a closer look at advanced weapon-detection technology that could provide sound security and identify shooters long before they reach a classroom.
AI weapon detection is one option now under consideration. Rather than installing metal detectors or arming teachers, some security companies are developing AI systems designed to recognize potential shooters and their weapons earlier. One such company, Deep North, has developed facial recognition technology that interprets human behavior and flags signs of aggression as well as weapons.
While still relatively new, AI is already being used by scientists to predict health problems and even by politicians to help draft laws. At its core, AI is nothing more than a network of algorithms; in theory, the more data fed into these systems, the more accurate they become. For detecting weapons, the technology is reasonably sound, but scanning people to predict their actions is a practice whose legality has yet to be settled. It veers into “pre-crime” ideology, which may interfere with individuals’ right to due process.
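In broad strokes, these systems run each camera frame through a trained detection model and raise an alert only when the model’s confidence in a weapon sighting clears a preset cutoff. The short Python sketch below is illustrative only, with made-up class names and a made-up threshold rather than any vendor’s actual code, but it shows where the accuracy problem lives: the cutoff trades false alarms against missed weapons.

```python
# Minimal illustrative sketch of threshold-based weapon flagging.
# The labels, scores, and 0.85 cutoff are assumptions for illustration,
# not taken from any real product.

from dataclasses import dataclass
from typing import List


@dataclass
class Detection:
    label: str    # class name predicted by a detector, e.g. "handgun"
    score: float  # model confidence between 0.0 and 1.0


# Hypothetical classes a weapon-detection model might be trained to recognize.
WEAPON_LABELS = {"handgun", "rifle", "knife"}


def flag_weapons(detections: List[Detection], threshold: float = 0.85) -> List[Detection]:
    """Return only weapon-class detections above the confidence cutoff.

    The cutoff is the key trade-off: set it low and the system raises
    false alarms on phones, umbrellas, or toys; set it high and it may
    miss a real weapon. No single value eliminates both kinds of error.
    """
    return [d for d in detections if d.label in WEAPON_LABELS and d.score >= threshold]


if __name__ == "__main__":
    # Example output a detector might produce for one video frame.
    frame_detections = [
        Detection("backpack", 0.97),
        Detection("handgun", 0.62),  # low confidence: ignored at 0.85
        Detection("rifle", 0.91),    # high confidence: flagged
    ]
    for alert in flag_weapons(frame_detections):
        print(f"ALERT: possible {alert.label} (confidence {alert.score:.0%})")
```

Even in this stripped-down form, the design choice is plain: someone has to pick the threshold, and whoever does is deciding how many innocent people get flagged versus how many weapons slip through.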
While AI systems are making headway in weapon detection, directing them to infer a person’s intent and predict how they will act is far more troubling. Algorithms have a history of being wrong, and of being disruptive even when they are right. They have also discriminated against minorities, because they learn from data supplied by a justice system that has itself been known to make mistakes.
Data collection is another concern. Weapons can be found with simple metal detectors, and armed teachers can stop a shooter without falsely accusing a peaceful person of displaying supposedly dangerous traits. AI systems, by contrast, can follow data patterns that lead to incorrect assessments, and they are vulnerable to hacking. Since the pandemic, numerous schools have had their computer systems breached and their data exposed, a serious worry at a time when data harvesting is a growing form of criminal activity.
While some AI weapon-detection systems can catch more weapons and harmful substances than metal detectors, the more advanced technologies that try to predict crime are likely to subject innocent students to serious police investigations. Students who already fear a heightened security state would find it even harder to learn. School security is a serious issue that must be addressed, but turning schools into prison-like surveillance states with pre-crime capabilities is a path many students, teachers, and parents are wary of taking.