After a gunman killed 21 people, including 19 children, in the Robb Elementary School massacre in Uvalde, Texas last week, the United States is once again facing the devastating impact of gun violence. While lawmakers have so far failed to pass meaningful reform, schools are looking for ways to prevent a similar tragedy on their own campuses. Recent history, as well as government spending records, indicate that one of the most common responses from education officials is to invest in more surveillance technology.
In recent years, schools have installed everything from facial recognition software to AI-based technologies, including programs meant to detect when someone is brandishing a weapon and online screening tools that scan student communications for mentions of potential violence. Startups selling this technology have claimed that these systems can help school officials intervene before a crisis occurs, or respond more quickly when one does. Pro-gun politicians have also advocated for this kind of technology, arguing that if schools deploy enough surveillance, they could prevent mass shootings.
The problem is that there is very little evidence that surveillance technology effectively stops these kinds of tragedies. Experts warn that these systems can instead create a culture of surveillance in schools that harms students. In many schools, camera networks running AI-based software would join other forms of monitoring already in place, such as metal detectors and on-campus police.
“In an attempt to stop, say, a shooter like what happened in Uvalde, these schools have actually extended a cost to the students who attend them,” Odis Johnson Jr, executive director of the Johns Hopkins Center for Safe and Healthy Schools, told Recode. “There are other things we now need to consider when looking to fortify our schools, which make them feel like prisons and the students themselves feel suspicious.”
Yet schools and other venues often turn to surveillance technology in the wake of gun violence. The year after the 2018 mass shooting at Marjory Stoneman Douglas High School, the local Broward County school district installed analytics software from Avigilon, a company that offers AI-based surveillance that can track students by their appearance. After the mass shooting at Oxford High School in Michigan in 2021, the local school district announced it would test a gun detection system sold by ZeroEyes, one of several startups whose software scours security camera feeds for images of weapons. Similarly, New York City Mayor Eric Adams said he would look into weapons detection technology from a company called Evolv following a mass shooting in the city’s subway.
Various government agencies have helped schools purchase this type of technology. Education officials have applied for funding from the Department of Justice’s School Violence Prevention Program for a variety of products, including monitoring systems that look for “warning signs of…aggressive behaviors,” according to a 2019 document Recode obtained via a public records request. And generally speaking, surveillance technology has become even more entrenched in schools during the pandemic, as some districts used Covid-19 relief funds to purchase software designed to ensure students were social distancing and wearing masks.
Even before the mass shooting in Uvalde, many schools in Texas had already installed some form of surveillance technology. In 2019, the state passed legislation to “harden” schools, and Texas now has more contracts with digital surveillance companies than any other state, according to an analysis of government spending data conducted by the Dallas Morning News. State investment in “security and surveillance” services has risen from $68 per student to $113 per student over the past decade, according to Chelsea Barabas, an MIT researcher who studies security systems deployed in Texas schools. Spending on social work services, by contrast, grew only from $25 per student to $32 per student over the same period. The gap between these two areas of spending is greatest in the state’s most racially diverse school districts.
The Uvalde school district had already acquired various forms of security technology. One such monitoring tool is a visitor management service sold by a company called Raptor Technologies. Another is a social media monitoring tool called Social Sentinel, which is supposed to “identify all possible threats that could be made against students and/or school district personnel,” according to a document from the 2019-2020 school year.
It is not yet known exactly what surveillance tools may have been in use at Robb Elementary School during the mass shooting. JP Guilbault, CEO of Social Sentinel’s parent company, Navigate360, told Recode that the tool plays “an important role as an early warning system beyond shootings.” He claimed that Social Sentinel can detect “suicidal, murderous, bullying and other harmful language that is public and linked to names identified by the district, school or staff, as well as social media identifiers and hashtags associated with pages identified by the school.”
“We are not currently aware of any specific links linking the shooter to the Uvalde Consolidated Independent School District or Robb Elementary on any public social media sites,” Guilbault added. The Uvalde shooter posted disturbing photos of two guns on his Instagram account before the shooting, but there is no evidence that he publicly threatened any of the schools in the district. He privately messaged a girl he didn’t know to say that he was planning to shoot an elementary school.
Even more advanced forms of surveillance technology tend to miss the warning signs. So-called weapons detection technology has accuracy issues and can flag all sorts of items that aren’t weapons, like walkie-talkies, laptops, umbrellas, and eyeglass cases. Because this technology is designed to work with security cameras, it will not necessarily detect hidden or covered weapons. And as critical studies by researchers like Joy Buolamwini, Timnit Gebru, and Deborah Raji have demonstrated, racism and sexism can be inadvertently embedded in facial recognition software. One company, SN Technologies, offered a New York school district a facial recognition algorithm that was 16 times more likely to misidentify Black women than white men, according to an analysis conducted by the National Institute of Standards and Technology. There is also some evidence that facial recognition technology identifies children’s faces less accurately than those of adults.
Even when this technology works as advertised, officials still have to be prepared to act on its alerts in time to prevent violence from occurring. While it remains unclear what happened during the recent mass shooting in Uvalde — in part because local law enforcement has shared conflicting accounts of its response — it is clear that a lack of time to respond was not the problem. Students called 911 multiple times, and law enforcement waited over an hour before confronting and killing the shooter.
Meanwhile, in the absence of violence, surveillance makes schools worse for students. Research by Johnson, the Johns Hopkins professor, and Jason Jabbari, a research professor at Washington University in St. Louis, found that a wide range of surveillance tools and practices, including measures such as security cameras and dress codes, were detrimental to the academic performance of students in schools that used them. That’s partly because deploying surveillance measures — which, again, rarely stop mass shooters — tends to increase the likelihood that school officials or school law enforcement will punish or suspend students.
“Given the rarity of school shootings, digital surveillance is more likely to be used to address minor disciplinary issues,” explained Barabas, the MIT researcher. “The expanded use of school surveillance is likely to amplify these trends in ways that disproportionately impact students of color, who are frequently disciplined for offenses that are both less serious and more discretionary than white students.”
All of this is a reminder that schools often don’t use this technology the way it’s marketed. When one school rolled out Avigilon’s software, administrators used it to track when a girl went to the bathroom during lunch, ostensibly to address bullying. A facial recognition company executive told Recode in 2019 that its technology was sometimes used to track the faces of parents who had been barred by court order from contacting their children. Some schools have even used surveillance software to track and monitor protesters.
These are all consequences of the fact that schools feel they must go to extreme lengths to keep students safe in a country rife with guns. Because those guns remain a pervasive part of daily life in the United States, schools are trying to adapt. This often means that students must adapt to surveillance, too — including surveillance that shows limited evidence of working and may actually hurt them.