Most “School Shootings” Didn’t Really Happen

Are school shootings in the USA really as bad as we think? There’s no denying that any violence in schools is unacceptable, but it turns out the numbers many of us rely on may paint a false picture of the current state of affairs. The shocking truth, as NPR recently reported, is that we really have no idea how many times per year a gun goes off in an American school.

And that, of course, leaves the perfect opening for anti-gun rights lobbyists to take advantage of mistakes and misinterpretations to convince us that we are in much more danger keeping our arms than giving them up.

Key Facts

• The U.S. Education Department’s 2016 report claimed that “nearly 240 schools” had reported at least one shooting in the previous year. Technically, that would mean there were nearly 240 shooting incidents in the space of a single year, which seems like an absurdly high number.

• NPR, which originally reported on this issue, reached out to every one of the 240 schools where shootings had supposedly occurred. Here’s what’s interesting about the data: many of the schools stated the shootings never happened.

• NPR, in tandem with the research organization Child Trends, analyzed the data and cross-referenced it with the government’s Civil Rights Data Collection reports. What they found was striking: not half, not even a third, of the events were confirmed by the schools that reported them. In fact, just 11 of the 240 events were confirmed as real.

• As for the rest, in many cases they either didn’t happen at all or didn’t involve a gun. More than 100 events simply never happened; another few dozen were completely misinterpreted.

• A small number of the “reported shootings” didn’t meet the government’s classification for a shooting. In these cases an event happened, and it involved a gun, but it wasn’t severe enough to meet the reporting criteria. It seems these incidents were reported anyway.

• Results were also muddied by the fact that nearly 25 percent of schools never responded to requests for information at all, either declining outright or simply ignoring NPR.

• How does a major problem like this happen? It seems at least part of the fault lies with the language of the previous year’s Civil Rights Data Collection survey, which asked, “Has there been at least one incident at your school that involved a shooting (regardless of whether anyone was hurt)?”

• The problem with this question is that the government defines an incident as “any discharge of a weapon at school-sponsored events or on school buses.” Ergo, if students were on a field trip, at a game at another school, or even at a local car wash, and an officer fired a gun, it would technically be an incident.

• It also seems there was confusion about changes to the survey from the previous release. The system California uses to categorize this information contains two codes: Education Code 48915(c)(2), used to identify non-gun incidents (such as threats with scissors), and 48915(c)(1), used to identify gun incidents.

• With those codes just a single digit apart, a simple typo can turn a knife incident into a reported shooting. Some also suspect confusion over how administrators were trained to keep up with guidelines and definitions that change from year to year.

• In Cleveland, the previous survey categorized incidents as scenarios involving “possession of a knife or a firearm.” However, this was updated to reflect the new guidelines for the 2015-16 school year. The change led to some 37 cases being filed as shooting events when they involved only knives.

• At least one of the incidents, from Reda Middle School, turned out to be a child firing a cap gun on a school bus. No real gun was ever identified, and no one was harmed.

• Other inconsistencies appear as well, such as the wrong school name in the database or the wrong date for an event. At least one listed event happened after the survey’s collection window had already closed.

• Schools have also complained about the survey itself, citing a “lack of clarity in the definitions of key terms.” One internal report notes that school boards “indicated dissatisfaction with the categories provided, specifically that the CRDC categories did not align with the categories used in state reporting, other federal reporting, and/or their own district databases.”

• Child Trends administrator Deborah Temkin explained why lumping possession of a knife and possession of a gun into the same question can lead to inflated numbers that aren’t as severe as they seem. “Best practices in data collection are not to include double-barreled items,” she said. Why? Because there’s no way to tell which item is actually being referred to. An “explosive device,” for example, could be a large-scale bomb or a small firecracker.

• There are also issues with reporting harmless, consensual events, such as paintball fights or school-sponsored gun safety seminars. While these involve possession of a gun, potentially on school property, schools may be unsure how to report them depending on the language used.

• As for making corrections, the CRDC does accept them, but only for a limited window. The department responded in July by stating, “The CRDC accepts correction requests for up to one year from the moment the submission period opens. For the 2015-16 collection, the corrections period closed on June 30, 2018.” It refused to correct the data even after NPR presented the facts, but agreed to “put a note on the file.”
