About Those School-to-Prison Pipeline Numbers…

Gerard Lawson

Two years ago the Center for Public Integrity (CPI) released a study reporting that Virginia led the nation in sending students from schools to police or the courts: 16 out of every 1,000. That finding fueled demands to overhaul K-12 disciplinary practices to reduce the so-called “school-to-prison pipeline.”

Well, it turns out that those numbers were wildly inflated.

“When we look at the official Juvenile Justice records to see who actually went to court from the schools, the number is actually 2.4 per 1,000,” says Gerard Lawson, an associate professor in Virginia Tech’s School of Education.

Lawson and his colleagues conducted the research to find what factors led to student involvement in the pipeline and how those factors could be mitigated, according to a Virginia Tech news story. Here’s what they found:

The researchers launched the study in January 2016, drawing data from several state agencies, including the departments of Juvenile Justice, Education, and Criminal Justice Services.

“At the very outset we realized the numbers weren’t matching up,” said Lawson, who is also president-elect of the American Counseling Association. “We scoured the data between the Departments of Education and Juvenile Justice, matching localities, dates of birth, dates of offenses, types of offenses, and we realized that the number of students actually ending up in court was much lower than that first impression.”

A checkbox on a Department of Education tool for gathering data about suspensions and expulsions asked, “Was this reported to law enforcement?”

“In most cases, when the box was checked, it appears that it represented an informal report to law enforcement — an administrator running into the school resource officer in the hallway, for example, and mentioning that a student had been suspended,” Lawson said. “The ‘report’ may have gone no further than the officer responding, ‘Thanks—good to know.’ With a bit of semantic imprecision, that checkbox elevated Virginia’s numbers dramatically.”

However, Lawson’s study did confirm two trends highlighted by the CPI study: students with disabilities were more likely to be suspended, and African-Americans, representing 23% of the student population in Virginia, accounted for 49.4% of the court referrals.

“We need to rethink discipline,” says Lawson. “Should a middle-schooler get arrested for flipping the bird at a teacher? The stakes are too high. A single suspension makes it less likely for a student to graduate from high school, and involvement with the court system makes that less likely still. The repercussions can be lifelong.”

Bacon’s bottom line: Lawson should be commended for debunking the misinformation that Virginia is an outlier in the realm of school discipline. I always wondered about that claim — I never heard a logical explanation of why Virginia school officials supposedly referred so many more kids to law enforcement than their peers in other states. But when I wrote about the CPI research, I never thought to dispute it. Now we know why the numbers were so high.

However, I have to question one of Lawson’s insinuations. Middle-schoolers have been arrested for flipping the bird to their teachers? Really? If true, such actions are absolutely outrageous, and disciplinary procedures do need reform. But my “spidey sense” makes me suspicious. It would take a judge about two nanoseconds’ reflection to throw that out of court. I find it hard to believe that such a thing has ever happened. Perhaps Lawson was just using hyperbolic rhetoric, not to be taken literally. Or perhaps I’m just naive.

Returning to the main storyline… Let’s play a little parlor game, shall we? How much media attention will Lawson’s story get compared to the Center for Public Integrity’s flawed report? Will the Center for Public Integrity ever correct its findings?

Update: The editors of the Center for Public Integrity offer an extended rebuttal of Lawson’s findings (and Bacon’s Rebellion’s reporting of those findings). You can read their comment here.

Update: Gerard Lawson has responded to my question about “flipping the bird,” defends his contention that it is very difficult to rank the states for law-enforcement referrals, and offers other observations. Read his comment here.
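For readers who want to check the arithmetic behind the dueling figures, a per-1,000 rate is simply total referrals divided by enrollment, scaled to 1,000 students. Here is a minimal sketch; the referral and enrollment counts below are made up for illustration and are not the actual CPI or Virginia Tech data:

```python
# Per-1,000 rate calculation of the kind both studies report.
# The counts below are hypothetical, chosen only to illustrate
# how a headline figure like "16 per 1,000" is produced.

def rate_per_1000(referrals: int, enrollment: int) -> float:
    """Referrals per 1,000 enrolled students."""
    return referrals * 1000 / enrollment

# Hypothetical statewide totals (NOT real data):
print(rate_per_1000(20_000, 1_250_000))  # 16.0
print(rate_per_1000(3_000, 1_250_000))   # 2.4
```

The gap between 16 and 2.4 per 1,000 is thus entirely a question of what goes in the numerator: all “referrals to law enforcement,” however informal, or only cases that actually reached a court intake officer.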


19 responses to “About Those School-to-Prison Pipeline Numbers…”

  1. “Returning to the main storyline… Let’s play a little parlor game, shall we? How much media attention will Lawson’s story get compared to the Center for Public Integrity’s flawed report? Will the Center for Public Integrity ever correct its findings?”

    You can start calling for me to wear a tinfoil hat. Our governor is a deep state actor with national political ambitions and a Republican General Assembly. In other words, he needed to make his mark as governor in order to maximize his potential in the expected Clinton Administration. A social justice warrior could have taken that “mistake” a long way. Maybe Hillary took him off the VP list before he could capitalize on the “mistake”.

    Think I am paranoid about the deep state? Maybe. Keep a close eye on the USS Fitzgerald investigation. I come down the Chesapeake Bay at night after fishing north of the Bay Bridge. There are always container ships lined up to get into Baltimore. I can see them with my own eyes, and they are very visible on my $2,000 radar set. Are we really meant to believe that the Aegis destroyer with all systems operational and a staffed CIC “failed to detect” a container ship as large as a stadium? Then we’re told they couldn’t call for help because the ship’s radio was damaged? Those destroyers usually have 2 helicopters on board with radios. The pilots’ life vests have separate radios. Something very odd with this one.

    • I was astounded by the Fitzgerald accident as well. We’ve learned that the Japanese ship was on auto-pilot, which explains why the Japanese did nothing to avert the accident. But it is incredible that the Fitzgerald did not see the ship coming and/or failed to change course. That does not augur well for a possible war against China with its fleet of quiet diesel submarines.

      • Maybe the reporting is just off. Early reports said the destroyer “failed to detect” the cargo ship. That just doesn’t seem possible unless the cargo ship was running without any lights and the destroyer’s radars were disabled.

  2. The Center for Public Integrity has contacted me to express its displeasure with the blog post. The center stands by its work and says it will point out flaws in the Virginia Tech study.

    • I would honestly be happy to hear that the data was right and the Virginia Tech study was wrong. Not because it benefits Virginia (it would not) but because the organization which sponsored The Buying of the President, despite being very left leaning, has been effective in decrying the influence of money in politics. To think it could have bungled a published study (or worse) would have been a shame.

  3. https://www.usni.org/magazines/proceedings/2017-06/fitzgerald-when-big-ocean-gets-small?utm_source=U.S.+Naval+Institute&utm_campaign=640be4981b-EMAIL_CAMPAIGN_2017_05_10&utm_medium=email&utm_term=0_adee2c2162-640be4981b-222758357&mc_cid=640be4981b&mc

    A bit of background on how that accident might have happened. The bottom line is that the captain is probably going to be cashiered. Everything is always the skipper’s fault.

    • Thanks for posting the article. It was a very sad event taking the lives of those sailors. I still think there’s quite a bit that’s unclear. If there are standing orders to wake up the captain when any ship gets within 5 miles and a cargo vessel travels at 15 knots, then the captain should have been awakened 20 minutes before the collision. Arleigh Burke class destroyers can “get out of the hole” pretty well for a ship their size. Looking at the damage to the Fitzgerald it’s hard to imagine they were taking evasive action. The bow of the container ship appears to have hit the destroyer at an acute angle. Even if they somehow failed to act as the huge container ship closed on them, I would have expected the officer of the deck to have turned hard to port at flank speed once the collision seemed imminent. That would have resulted in more of a scraping type damage. It may not have made any difference regarding the lives of the sailors. The damage looks like the destroyer took a direct shot amidships … meaning they were not taking evasive action. Either there was an unbelievable level of confusion and miscommunication on that destroyer or something strange was occurring at the time of the collision. I guess we’ll see as the investigation unfolds.

      • I agree.

        My first visceral reaction on reading the first news accounts was that this likely had to be a terrorist attack. How else could it be explained, this total lack of situational awareness and any apparent countermeasures undertaken whatsoever before and/or during and/or even after the event, absent a well planned neutralization of the ship, rendering it unable to respond even to the most obvious risk imaginable under perfect sailing conditions despite heavy local traffic?

        How could there be in such a case so many failures of so many systems and established protocols over so long a period of time during which the most advanced and agile of US naval warships acted as if comatose, irrevocably brain dead?

        If there was no terrorist act, these events should trigger an investigation throughout all US Navy fleets and commands, searching for major long-term systemic dysfunctions and patterns of incompetence, such as had developed in major naval commands in the lead-up to the 1941 Pearl Harbor debacle, an event that remains whitewashed to this very day.

  4. The Center for Public Integrity stands by our reporting 100 percent. Your criticism of our work is erroneous, as was a substantial portion of Virginia Tech’s original description of its professors’ research. The Center could have explained this to you had you bothered to call us. By not giving us a chance to respond before you posted your piece, you have violated a basic tenet of journalistic fairness.

    In May, we contacted Virginia Tech’s internal news division to ask for a correction in a write-up that mischaracterized our work. Virginia Tech also did not bother to contact us before posting its write-up about the professor’s research, which questioned our findings. Virginia Tech corrected its piece after we contacted Virginia Tech and Professor Lawson. The professor even emailed us after we explained to him his erroneous claims concerning our findings, saying: “I have provided some suggestions about how to contextualize the Virginia Tech study, including removing the references to refuting the CPI study.”

    Here is what our report said: We said that based on U.S. Department of Education national data, Virginia’s “referrals” of students to law enforcement were the highest of any state, collectively. The rate of referral worked out to be nearly 16 students per 1,000. A “referral” to law enforcement is a broad category. It could be that an incident was “referred” to a school police officer on site, or to an officer who does not work on site. It could be a referral to a court, such as a filing for truancy directly with a court. A referral could result in an arrest, or a court appearance or both, or it could result in neither of those outcomes. We explain this in our story.

    Yet the notion that our findings were “wildly inflated” has been repeated in this blog. Professor Lawson says that his research found that 2.4 students per 1,000 students were sent to court from schools. We have not reviewed his research. But again, we did not make the claim that all referrals to law enforcement reported in the US Department of Education database resulted in an arrest or a mandatory court appearance. By definition, a referral meant a report to a police officer—or court authorities—or both. The reasons for referrals were not in the database.

    Here is some of that language in our story that is helpful: “The Education Department didn’t require that schools explain why, during the 2011-12 school year, they referred students to law enforcement. And a referral did not necessarily have to end in an arrest or charges filed, at least not immediately. But by definition, it did mean that students’ behavior was reported to police or courts.”

    In a note accompanying our chart ranking states’ rates of referrals, we also explained this:

    “The Center analyzed discipline and enrollment data from the 2011-12 U.S. Department of Education Civil Rights Data Collection. The data was self-reported by school districts or state education agencies and more than 98 percent of school districts are included. Schools didn’t have to explain why they referred a student and a referral didn’t have to end in an arrest. But it did mean that students were reported to police or courts, or both, in response to an incident. Hawaii failed to report a single referral, an unexplained error. The Center combined data from individual schools and then calculated the rate of referrals in each state per 1,000 students to account for differences in population.”

    In addition, we object to claims that the Virginia Tech research means that Virginia should be downgraded compared to other states’ ranking—or that the research “debunks” our findings because Lawson said referrals to law enforcement in Virginia included “informal” interactions with school police officers. He offered the example of a school administrator mentioning to a police officer that a student had been suspended—and suggested that this kind of low-level interaction was included as a “referral.”

    But the Virginia Tech researchers have not studied how all school districts in the country decide what should be counted as a “referral.” They seem to assume Virginia only includes “informal” interactions. They cannot possibly shift the ranking that we found for Virginia, using the U.S. Department of Education database, without conducting the same research nationally that they did in Virginia. It’s possible that other districts also include interactions that did not result in an arrest or court referral in their counts of “referrals to law enforcement.”

    In its original post on Lawson’s research, Virginia Tech tried to claim that our ranking was inflated because schools counted informal interactions. Virginia Tech had to withdraw that claim after we spoke with Lawson. Moreover, we do not understand how Lawson can possibly decide that “most” of these interactions counted as referrals are similar to simply mentioning to a police officer at school that a student had been suspended.

    In fact, we included, as an example of what some might consider an “informal” referral, the case of Elijah Coles, a fifth grader in Henrico County who was placed in a room and threatened by a school police officer with arrest in the future. Elijah’s mother told us that her son was singled out among a group of children involved in horse play. She arrived to school to find her son in a room alone, with an officer who she said told her he would arrest her son if he thought the boy’s behavior in the future merited his arrest.

    Elijah’s mother complained and she showed us a response to her from school officials that said: “Henrico County Public Schools does not direct the decisions of Henrico law enforcement officials, including decisions regarding charges or potential charges.” The school would not discuss the case. The interaction between the officer and Elijah could be considered a referral to law enforcement, but also “informal.” Public defenders and other children’s lawyers told us they are concerned about multiple student-police interactions because officers remember them, and they can lead toward an eventual citation to court or an arrest.

    As part of our due diligence after crunching the U.S. Department of Education data, we sought statewide records from Virginia juvenile courts to attempt to measure how many students ended up in court as a result of a referral from police at schools, or from police called to schools. We were told tracking was not reliable because referrals from school police or from those who were summoned to schools might be classified as “police” rather than “school.” We chose to take the state officials’ advice and not rely on those records although we did review them.

    We did obtain a sampling of local police records to try to trace what happens with some students. We chose areas where parents had complained about excessive use of police.

    In Chesterfield County, for example, local police sent us records showing that 3,538 criminal complaints had been filed in juvenile court against students over three years. Nearly half of the complaints involved children 14 or younger and more than half were for simple assault or disorderly conduct.

    This data established that referrals to law enforcement agencies in some jurisdictions go beyond the informal—even if they do not go beyond mandatory court “intake” interaction. We reported this: “Police spokeswoman Elizabeth Caroon said not all complaints included an arrest, and not every complaint led to a hearing in court. In an email, she said that some students are ‘diverted’ to counseling or other programs by juvenile court intake officials empowered to decide which go to court.”

    In our subsequent reporting, police chiefs in some jurisdictions told us that they did not want to get pulled into discipline matters as often as they were. School administrators also informed advocates working on legislation that Virginia law contained strict language that made them feel compelled to inform police officers whenever a child’s behavior could be interpreted as criminal. Some jurisdictions have taken steps to clarify the role of school police in recent years.

    The Editors
    Center for Public Integrity

  4. Instead of going back and forth, if both would simply share their data, methodologies, and tools, we could get to the bottom of this very quickly. Without it, it’s all speculation and he-said/she-said.
    I do give CPI +1 for posting some form of the data, although nothing usable.
    I award VT +.5 for this gem
    “Because every school, district, and state may interpret what is reported to law enforcement differently, it’s difficult to know exactly where Virginia would rank among states in terms of sending its students into the criminal justice system.”
    Highlighting the importance of data standards, and how even if we had both datasets, we still really do not know, because what is “a” in Norfolk, is “x” in Richmond, and “plaid” in Roanoke.
    Criminal justice data in the commonwealth is the exact same, particularly around law enforcement organizations. What Virginia Beach considers worthy of publishing is different from Arlington, is different from Danville, etc. Putting it in maps/charts/etc. doesn’t make any sense because nothing lines up.
    My point is you cannot get a clear picture, let alone a good picture.
    Trust me when I say that it is very similar across the board for local and state governments in Virginia. There are a few bright spots, but it most certainly looks bad all around.

    • Re the wish to share data, we can point you to the database we used. You can go to the US Department of Education’s website to peruse the ‘referrals to law enforcement’ data by individual school or district using the system’s own sorting devices. Or you can also request a disc containing data submitted nationwide that you can also sort yourself using computer-assisted tools. https://ocrdata.ed.gov/DataAnalysisTools#
      That’s what we did with the 2011-2012 data schools submitted. There is now data available for the 2013 year that you can sort.

      • Hi Susan,
        Thanks for the response, it’s helpful, but generally I was already aware of where it is housed. Sharing the data in this context includes the data that you specifically used, as well as pointing to the canonical source. Mistakes happen, things get lost in translation, etc., so sharing what you used provides another layer here.

  6. “The lady doth protest too much, methinks” — and after reading what she wrote on behalf of CPI (longer than the post she was commenting on) a second time, I fail to see anything there that actually contradicted Jim’s factual description of the situation, only his conclusions, which are the blogger’s prerogative.

    Now, about that destroyer and the container ship . . . . SH’s link is right on target. This occurred in one of the busiest places for ship traffic in the world! I’ve been through the comparable Straits of Gibraltar on a Navy ship multiple times and the ship was on high alert; the Captain was right there on the bridge each time, his career on the line, checking the radar and CIC tracking updates personally. That the Fitzgerald’s captain wasn’t, that the OOD didn’t record the strange maneuvers of the Japanese ship (which would have taken many minutes) in the ship’s log, that no alarm was sounded waking the Captain and others prior to the collision, and that there could be any confusion whatsoever about the exact time of these events, each strikes me as highly unusual. Of course we don’t have all the facts yet, but what a mess!

  7. “The lady doth protest too much, methinks.”

    Yes, and by a Country Mile. Indeed, after the lady’s explanation, how could anyone anywhere at any time rely on anything “The Center for Public Integrity” says about any subject large or small. What a joke.

    • CPI does good work for the greater good. You may be opposed to them naturally because of the political party that you side with, but like Judicial Watch, they try to give it to you unbiased.
      Which is incredibly hard…. in my opinion CPI does a good job.

      Judging an organization by one statement you found to be long-winded is your prerogative.

      As far as I know, they are the only organization that has provided political TV spending data retroactively, free for the public to consume.
      That is an asset to the U.S. that benefits all, except for politicians and lobbyists.

  8. Lots of arm-waving and empty noise. And as Walter Mondale would say “Where’s the beef?” I don’t see any information that supports the criticism of VT. VT might well have made some mistakes, but let’s see where.

    Counting every referral to the police as a bad mark doesn’t make sense. I’m working with the Fairfax County Police Department on a community issue. Every time the Department receives a call from the public, answers an alarm or an officer has an interaction with an individual, the Department records the activity as a Service Call. But only when there is evidence that a crime has been committed or a crash, red light run, speeding, etc., does the Department also count the contact as a crime or traffic incident. A referral of a student by a teacher or administrator to the school police officer would count as a service call, but not as crime report. Similarly, every time an officer pulls over a driver to remind him/her that a taillight is out is not a traffic incident. Sounds to me that, like many others, CPI was engaged in results-oriented research.

    • When it comes to groups like this I want to know more about them in terms of what else they’re involved in and who funds them. I’m not at all impressed by their chosen name.

    • I agree that CPI deserves credit for being transparent about its research methods and for explaining its methods to Bacon’s Rebellion readers. Few authors of other studies we have cited on this blog have done the same.

  9. Posted on behalf of Gerard Lawson:

    Thanks for the attention you have given to the STPP issues in Virginia. I wanted to respond to the piece you pondered near the end of your post. A student getting arrested for “flipping the bird” may be a bit hyperbolic, but we do see that there is a substantial number of kids who are arrested for “disorderly conduct.” Anecdotally, that can be (but is not always) the placeholder for an administrator or an SRO who has run out of patience with a student.

    Incidentally, I have tried to be careful about not saying we have refuted the CPI study, because they were looking at referrals to “police or courts” and we only looked at who actually ended up before an intake officer. Because each school, district, and state reports those numbers differently, a ranking like that is challenging at best. We have no way to know if the data are reported consistently or reliably, unless you use a specific event in time that is consistently measured. A petition for appearance in court is consistent. What it means to be “reported” to the police varies considerably in every school.

    You may also be interested to know that, of the students who appeared before an Intake Officer for a school based offense, 48% of the time a petition was issued, and 10.1% of the time a petition and a detention order were issued (the student was held in custody). So the remaining roughly 42% of the time, the Intake Officer handles these issues informally through programs, community service, or other diversion activities.

    We do use too much exclusionary discipline in Virginia, and there is an issue, too, with too many kids being arrested (especially minority and disabled students), but it is not nearly the scope or scale that was implied.
