Your editor usually doesn’t write immediate follow-ups on commentaries. But yesterday’s Dropout Nation takedown of the use of faulty data by Manhattan Institute pundit Max Eden and other opponents of reforming school discipline generated plenty of discussion, both on social media and in e-mails. Thanks to those discussions, the flaws in the studies used by Eden and his counterparts, most notably Michael Petrilli of the Thomas B. Fordham Institute and Jason Riley of the Wall Street Journal, have been exposed.
As you would expect — and as has become his wont — Eden dodged the report and the questions raised by other reformers and education policy scholars. Save for arguing that Oakland Unified School District, whose ban on suspensions for disruptive behavior and other minor infractions was mentioned in his piece, supposedly fell behind academically because of that effort, Eden offered little defense of either his US News & World Report op-ed or his overall arguments.
But while Eden said little, what he did say revealed even more sloppiness in his arguments and thinking. Which, given that he and other foes of school discipline reform are helping the Trump Administration and U.S. Secretary of Education Betsy DeVos justify their plans to ditch the federal government’s obligation to protect the civil rights of poor and minority children, is worrisome.
In the case of Oakland, Eden declared that research from Stanford University’s Sean Reardon, which showed the district’s 4.3 years of academic growth over a five-year period trailing the overall state average, made his “case.” The problem? For one, Reardon’s research, which focused solely on how districts improve academic progress for children from third grade to the end of middle school (as well as how poverty affects achievement), never looked at the impact of school discipline policy (or even overuse of suspensions) on achievement. Put simply, there’s no way that Eden can use Reardon’s data to reach or support his conclusions.
It gets worse. As it turns out, Eden probably didn’t mean to cite Reardon’s study, but rather Boston University grad student Dominic Zarecki’s study of Los Angeles Unified School District’s implementation of a ban on suspensions for minor infractions, the white paper at the heart of Eden’s US News op-ed. That study mentions an analysis of Oakland Unified’s academic achievement after it implemented its own school discipline reform effort, done to compare its results with those of L.A. Unified. Zarecki notes that Oakland Unified trailed the rest of the state in improving student achievement by the 2015-2016 school year, arguing that this proves his study’s declaration that suspension bans damage achievement.
But Zarecki admits that “we cannot conduct a full difference-in-difference analysis for Oakland because we lack data to measure the change in academic growth.” He also concedes that Oakland would likely have “had a relatively low growth rate even without the suspension ban,” which, given its decades-long struggles on the education front, goes without saying. As Brian Stanley, executive director of the Oakland Education Fund, noted yesterday, the district “has had fairly low academic growth for a long time.” [Stanley, by the way, offers a rather insightful and data-driven account of Oakland’s school discipline reform efforts that opponents and supporters of school discipline reform alike should check out.]
This oversight could be excused if Zarecki had provided his analysis of Oakland Unified (which is likely based on two years of school-level data instead of at least four years of student-level data) in an appendix to the main study. He did not, which means there is no real way to understand how Zarecki reached this particular conclusion.
Of course, this is just one of the many flaws Dropout Nation and others have identified. Another is that Zarecki’s study focuses not on increases and decreases in actual achievement and in out-of-school suspensions for minor infractions, but on differences in differences, essentially looking at growth over the short time frames being measured. The problem with so-called difference-in-differences research designs is that they can inflate what would otherwise be minor increases and decreases in standard deviations during the periods measured, especially when the comparison covers two-year periods instead of four years or more (which would tell far more about the success or failure of any implementation or program).
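For readers unfamiliar with the method, here is a minimal sketch of how a two-group, two-period difference-in-differences estimate works, and why a short window can turn small movements into a seemingly meaningful effect. The numbers are invented purely for illustration; they are not drawn from Zarecki’s paper or from any district’s actual results.

```python
# A minimal sketch of a two-group, two-period difference-in-differences
# (DiD) estimate. All numbers are made up purely for illustration; none
# of them come from Zarecki's paper or from actual district data.

def did_estimate(treat_pre, treat_post, control_pre, control_post):
    """Classic DiD: the treated group's change minus the control group's change."""
    return (treat_post - treat_pre) - (control_post - control_pre)

# Hypothetical average scores (in standard-deviation units) for a district
# that banned suspensions ("treated") and for comparison schools.
treated_pre, treated_post = 0.10, 0.08   # a small two-year dip
control_pre, control_post = 0.10, 0.15   # a small two-year rise

effect = did_estimate(treated_pre, treated_post, control_pre, control_post)
print(f"DiD estimate over a two-year window: {effect:+.2f} SD")
# Two modest movements, either of which could be ordinary noise, combine
# into a -0.07 SD "effect" that a longer panel of student-level data
# might wash out entirely.
```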
Put simply, Zarecki’s study, already flawed because of its focus on school-level data, its lack of granularity, and other issues, likely yielded inflated results. Zarecki himself admits this when he notes that the two additional analyses he used to check his work didn’t yield similar conclusions.
Given that Zarecki’s study is really more of a class paper that hasn’t been peer-reviewed and probably hasn’t been looked over by his doctoral advisor, you can somewhat excuse those flaws. [The fact that his career has been in education research, including time as research director for the California Charter Schools Association, makes this excuse rather weak.] But Eden, a longtime education policy wonk who spent time working for Rick Hess at the American Enterprise Institute before landing at Manhattan Institute (and who still co-writes pieces with Hess on occasion), has no excuse for running with this shoddy work. If your editor can sniff out the weaknesses in Zarecki’s study, then Eden can do so, too.
The fact that Eden ran with Zarecki’s study and conclusions despite all of these flaws isn’t shocking. As seen earlier with his mistaken citation of Reardon’s study, Eden is sloppy, both in his research and in his thinking. This becomes even clearer when you look at his claim to fame: a report released last year by the Manhattan Institute on school climate throughout the city and on the school discipline reform efforts undertaken by the New York City Department of Education under former Mayor Michael Bloomberg and his successor, Bill de Blasio.
In that report, Eden concludes that the school discipline reform efforts by Bloomberg, de Blasio, and their respective chancellors have led to traditional district schools in the Big Apple becoming less safe for teachers and children. How? By comparing responses of teachers and children in the traditional district to those of peers in charters on the city’s annual school climate survey. As any researcher can immediately note, such surveys have little usefulness as objective evidence: they are based on subjective opinions that can change based on who is working in classrooms, survey designs can be flawed, with leading questions yielding results favorable to the pollster, and survey designs can change drastically from year to year. Eden himself admits this in the study when he notes that he could only measure results on five questions from the city’s school climate survey because only their wording had remained consistent over time.
What makes Eden’s results even less reliable is that he didn’t simply measure the raw results from the surveys over the five-year period being analyzed (2011-2012 to 2015-2016), which is the most reliable way of analyzing what is already unreliable data. Instead, Eden cobbled together a “distribution-of-differences” analysis in which any change of 15 percentage points on any of the questions represented “a substantial shift” in attitudes on school safety at a given school in the district. How did he arrive at 15 percentage points instead of, say, 20 or 10 or even five? Eden doesn’t explain. That gamesmanship, along with the lack of explanation, makes Eden’s analysis even less reliable than it already is.
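To see how much rides on that unexplained cutoff, consider a minimal sketch, using invented per-school shifts rather than actual New York City survey results, of how the count of schools showing “a substantial shift” moves with the threshold chosen.

```python
# A minimal sketch of why the cutoff matters in a "distribution-of-differences"
# style analysis. The per-school changes below are invented for illustration;
# they are not New York City survey results.

changes = [-22, -17, -14, -9, -6, -3, 0, 2, 4, 7, 11, 16]  # percentage-point shifts

for cutoff in (5, 10, 15, 20):
    flagged = [c for c in changes if abs(c) >= cutoff]
    print(f"cutoff of {cutoff:2d} points: {len(flagged)} of {len(changes)} "
          f"schools show a 'substantial shift'")
# The same underlying numbers yield anywhere from 1 to 8 "substantial"
# shifts depending entirely on an unexplained threshold choice.
```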
If Eden were being intellectually honest and had simply compared the raw numbers themselves, he would have reached different conclusions. Between 2011-2012 and 2015-2016, the percentage of teachers citywide (including those in charter schools) agreeing or strongly agreeing that “my school maintains order and discipline” remained unchanged at 80 percent. Exclude charter results from the survey, and the percentage of teachers just within the New York City district agreeing or strongly agreeing that “my school maintains order and discipline” increased from 77 percent to 78 percent over that period, according to a Dropout Nation analysis of the city’s survey data. This happened even as the number of out-of-school suspensions meted out by principals in district schools declined.
Even when using subjective data, Eden’s arguments don’t stand up to scrutiny, a point made by Daniel Losen of the Civil Rights Project at UCLA during testimony at a December hearing held by the U.S. Commission on Civil Rights at which Eden also testified. They don’t even stand up to the brief on overuse of suspensions in Big Apple schools released today by the Center for American Progress, which uses objective data to look at the number of days children lose when they are kept out of school.
Again, this isn’t a surprise. In a report on school safety released last October, Eden reached the conclusion that New York City’s charter schools were “safer” than their traditional district counterparts not by comparing raw data from the Big Apple’s school climate survey, or even by using more objective data such as incident reports over a period of several years. Instead, he cobbled together an index that gave scores to each of the questions on the survey, then crafted a secondary index in which a charter that scored five or more percentage points higher on that first index than a traditional district school would be rated safer. This approach to analysis is amateur hour at its worst.
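As a rough illustration of how sensitive that kind of two-step index is to the margin chosen, here is a sketch built on invented survey percentages; it reconstructs the general approach described above, not Eden’s actual scoring method or data.

```python
# A rough sketch of the kind of two-step index construction described above,
# built on invented survey percentages. This reconstructs the general
# approach, not Eden's actual scoring code or data.

# Hypothetical share of respondents answering favorably on each question,
# for one charter school and one nearby district school.
charter = {"order": 84, "safety": 79, "respect": 88}
district = {"order": 80, "safety": 76, "respect": 85}

def index_score(responses):
    """Step one: collapse the per-question percentages into a single score."""
    return sum(responses.values()) / len(responses)

def rated_safer(charter_idx, district_idx, margin=5):
    """Step two: call the charter 'safer' only if it clears an arbitrary margin."""
    return charter_idx - district_idx >= margin

c_idx, d_idx = index_score(charter), index_score(district)
print(f"charter index {c_idx:.1f} vs. district index {d_idx:.1f}; "
      f"rated safer? {rated_safer(c_idx, d_idx)}")
# With a 5-point margin this pairing is a wash (a 3.3-point gap), yet a
# 3-point margin would flip the verdict. The rating hinges on the margin
# chosen, not on any objective measure of safety.
```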
The thing is that Eden’s shoddy work product could easily be ignored if not for the fact that he, along with Fordham’s Petrilli, is a leader in the effort to convince the Trump Administration and DeVos to reverse the Obama Administration-era Dear Colleague guidance pushing districts to end overuse of suspensions and other forms of harsh school discipline against poor and minority children. The four-year-old guidance, a keystone of federal efforts to spur school discipline reform, has long been the bête noire of so-called conservative reformers everywhere.
Because Eden, along with Petrilli and even Riley’s Wall Street Journal, likely has the ear of DeVos’ appointees (including Kenneth Marcus, the former George W. Bush appointee who will likely end up overseeing the agency’s Office for Civil Rights), the shoddiness of his data and that of his allies matters more now than ever. Bad policy backed by slipshod data equals damage to children, especially those from Black, Latino, and American Indian and Alaska Native households most likely to be suspended, expelled, and sent into juvenile justice systems (the school-to-prison pipeline) as a result of districts and other school operators overusing the most punitive forms of school discipline.
Which is why shoddy polemicism by the likes of Eden and other opponents of school discipline reform deserves to be exposed and debunked. School reformers know better than to use bad studies to champion worse policies.