Tag: Robert Balfanz

Banishing Bad Anti-School Discipline Reform Reports

Last week’s Dropout Nation exposés of the use of shoddy data and analysis by anti-school discipline reform types such as Max Eden of the Manhattan Institute and Thomas B. Fordham Institute President Michael Petrilli generated a lot of discussion, both on social media and within education policy circles. This is good. Exposing intellectual sophistry, especially the kinds of data manipulation and trumpeting of poorly-constructed research done by Eden, Petrilli and their ilk (along with their willful disregard of high-quality studies based on longitudinal student data), is critical to honest policy and practice in the overhaul of American public education.

Yet we must continually remember that the problem with bad studies based on shoddy data is that they don’t disappear. If anything, they are recycled over and over again, both by advocates who deliberately engage in sophistry in order to further their cause (and influence policymakers who want to agree) and by well-meaning pundits who read the executive summaries, less-than-thorough news reports and little else.

Two analysts at the D.C. Policy Center, Chelsea Coffin and Kathryn Zickuhr, made this mistake earlier this month when they cited several low-quality anti-school discipline reform studies in their otherwise-interesting policy paper advising the District of Columbia’s city council to provide adequate support for implementing a proposed ban on meting out suspensions for minor infractions. As some of you may know, the Nation’s Capital is considering a proposal from Councilmember David Grosso (who chairs the council’s education oversight panel) that addresses concerns raised by families, traditionalists and some reformers over both the overuse of harsh discipline by D.C. Public Schools and charter school operators and revelations that some operators have been understating their out-of-school suspension levels.

One mistake made by Coffin and Zickuhr? Citing the Thomas B. Fordham Institute’s latest study of Philadelphia’s school discipline reform efforts. As Dropout Nation pointed out last week, the report’s assertion that reducing suspensions for non-violent offenses has little effect on achievement is based on two years of school-level data that doesn’t actually measure how the reforms affect the learning of individual students or subgroups. It also doesn’t consider how well individual schools implemented the reforms in that period, a matter discussed by the University of Pennsylvania’s Consortium for Policy Research in Education in a similar study also released last month. [By the way: D.C. Policy Center doesn’t even link to University of Pennsylvania’s findings.] As a team of researchers led by Karega Rausch, a leading expert on school discipline who now heads research for the National Association of Charter School Authorizers, pointed out last year in a report for the Center on Reinventing Public Education, longitudinal student data, which shows how children are affected by changes in discipline policies, is the best measure, one that Fordham’s researchers could have accessed had they worked with the City of Brotherly Love’s traditional district.

Another problem with Coffin’s and Zickuhr’s report? It also links to Eden’s ‘study’ on school climate throughout New York City and the school discipline reform efforts undertaken by the New York City Department of Education under former Mayor Michael Bloomberg and his successor, Bill de Blasio. As your editor also noted last week, that report is too flawed to be taken seriously. One reason why? Eden didn’t simply measure the raw results from the Big Apple’s school climate surveys over the five-year period being measured (2011-2012 to 2015-2016), which is the most-reliable way of analyzing what is already unreliable subjective data. Instead, Eden cobbled together a “distribution-of-differences” analysis in which any change of 15 percentage points on any of the questions represented “a substantial shift” in attitudes on school safety for each school in the district. How did he arrive at 15 percentage points instead of, say, 20 or 10 or even five? Eden doesn’t explain. The data alchemy, along with the substandard nature of the underlying survey data, makes Eden’s report even less-reliable than it already appears.
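To see why an unexplained cutoff matters, consider a minimal sketch of that kind of thresholding. This is not Eden’s actual code, data or exact method; the school-level numbers below are invented purely to illustrate how the count of “substantial shifts” swings with the threshold chosen.

```python
# Illustrative sketch only: hypothetical year-over-year changes (in percentage
# points) in the share of respondents agreeing with a climate-survey question
# at 10 made-up schools. The point is that the number of schools flagged as
# showing a "substantial shift" depends heavily on an arbitrary cutoff.
def count_substantial_shifts(changes, threshold):
    """Count schools whose absolute change meets or exceeds the threshold."""
    return sum(1 for change in changes if abs(change) >= threshold)

changes = [3, -7, 12, 18, -14, 5, 22, -9, 16, 1]  # hypothetical data

for threshold in (5, 10, 15, 20):
    n = count_substantial_shifts(changes, threshold)
    print(f"threshold={threshold:>2}pp -> {n} of {len(changes)} schools flagged")
```

Run on these made-up numbers, a five-point cutoff flags eight schools, a 15-point cutoff flags three, and a 20-point cutoff flags one, which is why an unexplained choice of 15 points is hard to defend.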

Your editor can’t totally blame Coffin and Zickuhr for relying on shoddy research. As with everything in education policy, it takes years for high-quality research to be released. In the case of the school discipline reforms currently being undertaken in places such as Philadelphia, the need for four-to-eight years of longitudinal student data to get a good handle on what is happening will make life more-difficult for pundits and wonks who care a lot about policy wins and making big splashes. Which means it will be tempting to base opinions and recommendations on shoddier work product, especially from big-name think tanks willing to shovel out flimsy white papers instead of doing solid work.

That said, Coffin and Zickuhr could have easily looked at University of Pennsylvania’s report, whose interviews provide much-stronger insights on the challenges districts can face during the first two years of implementing a discipline reform (as well as how schools are implementing them at the beginning), or gained access to a study of Minneapolis Public Schools’ pilot program to use restorative justice for children facing expulsion for violent infractions (which gives an idea of possible benefits as well as issues in implementation at scale). Both have limitations, but can add some color to the discussion if those limits are made clear. [Happily, Coffin and Zickuhr do cite one of University of Chicago Consortium on School Research’s two reports on school discipline reform efforts in Chicago, which, unfortunately, don’t provide longitudinal student achievement results.] A call to school and community leaders on the ground working on this issue would have also helped. This includes Oakland Education Fund Executive Director Brian Stanley, who helped implement the Bay Area district’s ban on suspensions for minor infractions.

As for other wonks and polemicists (as well as traditional news reporters) looking to write more-thoughtful pieces on school discipline reform? Your editor offers some advice. The first? Always read beyond the executive summaries. This includes reading the list of cited references and sources usually found in the back of a report or study. Put this way: If the study’s citation and reference lists include the likes of Eden and his Manhattan Institute colleague, Heather Mac Donald (the latter of whom focuses on law enforcement and immigration, and tends to dismiss any discussion about racial disparities), ignore it.

Also, if it doesn’t mention work by respected researchers on school discipline such as Russell Skiba of Indiana University, Johns Hopkins University’s Robert Balfanz, John Wallace of the University of Pittsburgh or Rausch (all of whom use longitudinal student data in their research), then it deserves no consideration at all. Therefore, ignore this anti-school discipline white paper on Wisconsin’s efforts making the rounds this week — unless you want to give your child paper for cutting and origami. [Which is what happens to a lot of white papers coming to my office.]

Another alarm bell: When the report or study makes assertions that it later admits cannot be supported either by the underlying data or after going through additional analyses, including stress tests to verify results. In the case of discipline studies using school-level data championed by the anti-school discipline reform crowd, the results are often not going to be “granular enough” (or offer enough detail on how individual or groups of students are impacted by a reform or intervention) to support anything more than the most-tepid assertions.

Additionally, if the study doesn’t admit that other research and data bear out sensible reasons for embarking on school discipline reform, then it shouldn’t be taken seriously. Why? Because the failure to admit this is evidence that the study is little better than the kind of white papers you would expect out of Forrester Research and other market insight firms whose predictions, as legendary former Forbes Editor William Baldwin would say, won’t come within a country mile of being realized. This is why a study by Boston University grad student Dominic Zarecki, which was used by Eden in an op-ed last week, has little value to anyone seriously discussing school discipline reform.

Finally, school reformers, most-notably those who are champions of discipline reform, must challenge, call out and dismiss shoddy data, especially when it is used by allies opposed to overhauling how children are corrected in schools. Researchers such as Daniel Losen of the Civil Rights Project at UCLA, along with advocates on the ground, already do this. There’s no reason why colleagues should be allowed to engage in patently dishonest data usage, especially when they chant the mantra of using high-quality data when addressing other issues.

Max Eden (and other School Discipline Reform Foes) Use Bad Data

There are some amazing things about the internecine battle within the school reform movement over efforts to end overuse of out-of-school suspensions and other forms of harsh traditional school discipline, and the effort by so-called conservative reformers to overturn the U.S. Department of Education’s Obama-era guidance to districts on school discipline reform. One is the unwillingness of opponents of school discipline reform, especially Michael Petrilli of the Thomas B. Fordham Institute and Max Eden of the Manhattan Institute, to actually engage the three decades of high-quality research showing that far too many children, especially Black and American Indian kids, are suspended far too often. The other? That those very opponents use low-quality research that doesn’t actually support their defense of such practices, often in order to ignore the volumes of evidence standing against them.

These two matters become especially clear this morning in an op-ed by Eden in US News & World Report that declares that reducing the overuse of suspensions — especially restrictions on using suspensions for minor infractions such as disruptive behavior that can be addressed through other means — is somehow causing “substantial academic damage” to children in classrooms. Primarily citing a study by Boston University graduate student Dominic Zarecki on Los Angeles Unified School District’s move five years ago to stop suspending children for acting out in class, Eden argues that “suspension bans hurt kids”, hinder the efforts of teachers to manage their classrooms and lead to lower student achievement.

Yet contrary to Eden’s assertions, the study itself doesn’t offer much in the way of hard conclusions. One reason? Because the study doesn’t use student-level academic data. As conceded by Zarecki (who, for some odd reason, goes unnamed by Eden in his op-ed), the study is based on school-level data that doesn’t follow an actual cohort of L.A. Unified students over a period of time. The other problem: That it doesn’t track impact over a period longer than two years. This is a problem especially because the long-term effects of a reform can take years (including adjustments in implementation such as improved teacher training) to manifest. Since the study doesn’t actually look at student performance over time, or even account for matters such as student migration, it “lacks the data granularity” needed to look at how reducing suspensions impacts individual students or even particular groups, much less offer any conclusions worth considering. Even Zarecki concedes that, based on additional analysis, L.A. Unified’s ban “may have had no causal effect” on achievement.

Certainly a study using longitudinal student-level data would be hard to conduct, in part because of the efforts by California Gov. Jerry Brown to kibosh more-robust school data systems. But it wouldn’t be impossible. After all, the Los Angeles Times did exactly that in 2010 with its value-added analysis of teacher performance within the district, gaining access to the data after a Freedom of Information request to the school system. Researchers tend to have an easier time obtaining data, especially since they are willing to safeguard privacy and, in many cases, even withhold the name of the district itself (though there are often enough details to figure out which school operator was the subject). Zarecki, who also works for California-based charter school operator Fortune Schools, could easily have gotten in touch with L.A. Unified’s data department had he chosen to do so. There is no justifiable reason why the data couldn’t have been obtained for this study.

Put simply, this study is of low quality, especially when compared to the research on school discipline conducted over the past decade alone. This includes the 2012 study conducted by a team led by Johns Hopkins University scholar Robert Balfanz that used eight years of student-level longitudinal data to determine that overuse of out-of-school suspensions in ninth grade was positively correlated with the likelihood of dropping out of high school, as well as Balfanz’s renowned 2007 study on developing early warning systems with Lisa Herzog of the Philadelphia Education Fund (which also used eight years of student data, this time from the City of Brotherly Love’s traditional district) to reach the same conclusions.

This lack of high-quality data, along with the short time span being measured, is a problem shared by other studies promoted by other opponents of reforming school discipline. Take the study released last month by Petrilli’s Thomas B. Fordham Institute on Philadelphia’s school discipline reform efforts, which Eden also cites in his piece. The study’s main conclusions — including the assertion that reducing suspensions for non-violent offenses has little effect on achievement — are also based on two years of school-level data that doesn’t actually measure how the reforms impact student achievement. [It also doesn’t take into consideration how well individual schools implemented the reforms, a matter that is discussed by the University of Pennsylvania’s Consortium for Policy Research in Education in a similar study also released last month.] That the study uses school-level data instead of student-level data also means that its conclusions have little value.

Manhattan Institute’s Max Eden, along with other opponents of school discipline reform, has a tendency to misuse and overstate data.

In fact, the only useful study the Petrilli-Eden crowd have at their disposal is one conducted last year by a University of Arkansas team led by Gary Ritter. The study, which is based on six years of student-level data, concludes that out-of-school suspensions on their own don’t have a negative impact on student achievement and may lead to “slight” improvement in standardized test performance. But even the Ritter study is of little use to them. One reason: Because the study itself doesn’t look at the impact of any particular school discipline reform (it merely looks at the possible impact of suspensions on achievement), it isn’t useful in any argument against those efforts. Another is the fact that the study doesn’t measure the impact of suspensions based on the number of days kids are kept out of school; in the case of Arkansas, a suspension of more than 10 days is considered an expulsion, which means that thousands of children and their achievement data have likely been excluded from the study, a limitation conceded by Ritter and his team. [Others have expressed their own concerns about the study.]

Meanwhile Ritter and his team honestly concede that decades of research show that overuse of suspensions damages children when you look at graduation rates and other data. In fact, they concede that school leaders and policymakers can have justifiable reasons for reforming school discipline. Ritter himself has publicly stated that his study doesn’t argue for halting school discipline reforms and shouldn’t be used as justification for ending the Obama Administration’s guidance, the bête noire of the anti-school discipline reform crowd.

Despite these caveats, opponents of school discipline reform have insisted on using the study to bolster their case. Eden, in particular, mentioned the Ritter study as a supporting example last month in his testimony to the U.S. Commission on Civil Rights during one of its hearings. But this isn’t shocking. Eden also mentioned a 2014 study by Russell Skiba of Indiana University, the leading scholar on school discipline reform, to support his argument that racial bias wasn’t a factor in why Black, Latino, and American Indian children were suspended at far higher levels than White peers. Eden did this even though Skiba’s study actually focused on student misbehavior and concluded that minority children weren’t worse-behaved than White counterparts, meaning that misbehavior doesn’t explain why those kids were suspended at higher rates than White children in the first place.

But again, Eden’s seemingly deliberate sloppiness in handling data and evidence, along with that of his allies, is not shocking at all. Eden was called out by Daniel Losen of the Civil Rights Project at UCLA during the Commission on Civil Rights’ hearing for making arguments not borne out by his own data. Meanwhile Fordham and Petrilli, who work alongside Eden in opposing school discipline reform efforts, have been called out several times by Dropout Nation and other researchers for other incidents of reaching conclusions unsupported by data. This includes misusing data from NWEA to claim in a 2011 op-ed that focusing on achievement gaps harmed high-achieving students (as well as a study published months earlier that attempted to do the same).

What does become clear is that Eden, Petrilli and company do all they can to dance around what decades of data has proven beyond dispute: That far too many kids are suspended and expelled from school. That those practices do little to improve student achievement, enhance school cultures, or make kids safer. That children from minority households are more likely to be suspended, expelled, arrested and even sent to juvenile justice systems than White peers, even when they are referred to dean’s offices for the same infractions. That suspensions are also far more-likely to be meted out over minor matters such as disruptive behavior and attendance than for violent behavior and drug activity. That soft and hard bigotries among White teachers toward poor and minority children are underlying reasons why those kids end up being suspended more-often than White counterparts. And that teachers and school leaders often use suspensions and expulsions to let themselves off the hook for the failure to address the illiteracy that is usually at the heart of child misbehavior.

Given all the facts, it becomes clear that Eden, Petrilli and their allies have little interest in dealing honestly with data and evidence on the damage of overusing harsh school discipline. Which makes them untrustworthy when it comes to the mission of the school reform movement to help all children succeed in school and in life.

Featured photo courtesy of the New York Times.

This is Dropout Nation: America’s Truancy Problem: The L.A. County Example


In L.A. County's San Gabriel Unified, students stay out more than they check in. (Photo courtesy of the San Gabriel Unified School District.)

Two hundred seventy-two thousand Los Angeles County students were truant during the 2008-2009 school year. Let that sink in. Two hundred seventy-two thousand kids. That is 16 percent of all the students attending schools in the heart of Southern California, or 1,509 students skipping school without an excuse every school day.

We know where many of these kids will end up: They will become high school dropouts. What is astounding is that thanks to California education officials and the state legislature, we even know the truancy rate at all. Most states are ignoring the importance of reporting credible, honest truancy numbers, leaving unaddressed a critical symptom of the nation’s dropout crisis.

Within the past five years, researchers such as Robert Balfanz have proven that truancy is one of the foremost symptoms of America’s educational crisis and a primary indicator of whether a student will drop out or graduate from school. As Balfanz, Lisa Herzog and Douglas Mac Iver pointed out in a 2007 study, a sixth-grader missing a fifth of the school year has just a 13 percent chance of graduating six years later. In elementary school, truancy is a sign of parenting issues. In later grades, truancy is an indicator that a child has given up on learning after years of poor teaching, lousy curricula and lack of engagement (and caring) by teachers and principals.

Yet, as with graduation rates a decade ago, states and school districts do an abysmal job of tracking truancy (and school attendance overall) and offer misleading statistics on the true size of the problem. California offers a decent start on how to solve the latter. But it will require better data standards and data systems to make real progress.

The problem starts with the statistics themselves. Most states calculate attendance by dividing the total number of days missed by students by the total number of days they are supposed to attend (usually 180 days multiplied by enrollment). This metric, used largely for school funding, is great for district coffers. But it’s terrible for addressing truancy. Why? It hides the levels of truancy plaguing a school because it pools all unexcused absences together, rather than flagging students who hit the set number of days at which a student is considered by law to be truant. Add in the fact that tardiness (or excessive lateness by a student) is folded into the attendance rate and one doesn’t get the full sense of a truancy problem. After all, one reacts differently to a 93 percent attendance rate (which makes it seem as if most kids are attending school) than to a rate showing that 16 percent of students are truant (which is more-accurate and distressing).
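A minimal sketch of the arithmetic, using hypothetical numbers rather than actual district data, shows how an aggregate attendance rate can look healthy even while a large share of students cross a legal truancy threshold:

```python
# Hypothetical example: the funding-style attendance rate divides total days
# missed by total days expected, while a truancy rate counts the share of
# students who cross a legal threshold of unexcused absences.
students = 1000        # hypothetical enrollment
school_days = 180      # typical school year
truants = 160          # students with 12 unexcused absences each (past a 10-day threshold)
others = students - truants  # everyone else misses 2 days each

days_missed = truants * 12 + others * 2
attendance_rate = 1 - days_missed / (students * school_days)
truancy_rate = truants / students

print(f"Attendance rate: {attendance_rate:.1%}")  # 98.0% -- looks healthy
print(f"Truancy rate:    {truancy_rate:.1%}")     # 16.0% of students are truant
```

The same school year produces a 98 percent attendance rate and a 16 percent truancy rate; only the second number tells parents and principals anything useful.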

What principals, teachers, district officials and parents need is the percentage of students reaching the state definition of truancy (in many states, 10 or more days of unexcused absences) — in order to identify clusters of truancy — and the chronic truants themselves (so they can be targeted for additional help). A group of teachers at New York City’s High School for Telecommunications – frustrated with the district’s poor attendance tracking — are among those developing technologies to improve how attendance is calculated. The technological solutions, however, are meaningless without developing actual calculations that plainly break down what is happening and making the data public for all to see.

California is one of two states (out of 10) surveyed by Dropout Nation that have gone this far in providing truancy data. (Indiana, the epicenter for a 2007 editorial series Dropout Nation’s editor wrote on truancy for The Indianapolis Star, is the other.) Unlike other states, California’s Department of Education publishes an actual truancy rate, which shows the percentage of students missing three or more days of school unexcused. Even better, its data system actually shows the number of truant students in any given county, district or school. For a researcher or truancy prevention advocate, this is a much-better first step in determining the extent of truancy than the traditional attendance rates reported by other states.

What one learns, particularly about truancy in districts in Los Angeles County, is distressing. Fifty-seven of L.A. County’s 88 school districts (including the county department of education) had truancy rates greater than 10 percent. Within the county’s largest district, Los Angeles Unified, 77 of its 658 schools were plagued with truancy rates greater than 10 percent. While high schools were plagued with double-digit truancy rates, so were middle schools such as Charles Drew in the city’s Florence-Graham neighborhood; there, 54 percent of the student population was chronically truant. The truancy rate for L.A. Unified overall was 5.4 percent; but that number leaves out truancy levels at the elementary school level (where as many as one in ten kindergarten and first grade students miss a month of school). (A full list for L.A. County is available here.)

A PORTRAIT OF TRUANCY: SAN GABRIEL UNIFIED

School | Enrollment* | Students with Unexcused Absence or Tardy on 3 or More Days (Truants) | Truancy Rate
Coolidge Elementary | 385 | 197 | 51.17%
Del Mar High | 69 | 102 | 147.83%
Gabrielino High | 1,794 | 1,535 | 85.56%
Jefferson Middle | 1,239 | 691 | 55.77%
Mckinley Elementary | 712 | 210 | 29.49%
Roosevelt Elementary | 415 | 203 | 48.92%
Washington Elementary | 458 | 241 | 52.62%
Wilson Elementary | 367 | 161 | 43.87%
San Gabriel Unified District | 5,439 | 3,340 | 61.41%
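The rates in the table are simply truants divided by enrollment. A short sketch of that calculation, using figures copied from the table, also illustrates why a rate can exceed 100 percent, as at Del Mar High: it likely reflects a cumulative truant count set against a point-in-time enrollment snapshot (the enrollment figures are as the state reported them; that interpretation is an assumption, not something the data file states).

```python
# Recomputing truancy rates from the table above: rate = truants / enrollment.
# A rate over 100% (Del Mar High) suggests cumulative truant counts measured
# against a snapshot enrollment figure -- an assumption, not a stated fact.
schools = {
    "Coolidge Elementary": (385, 197),
    "Del Mar High": (69, 102),
    "Gabrielino High": (1794, 1535),
    "San Gabriel Unified District": (5439, 3340),
}

for name, (enrollment, truants) in schools.items():
    rate = truants / enrollment
    print(f"{name:<30} {rate:.2%}")
```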

For all of its dysfunction, L.A. Unified doesn’t have the highest truancy rate in the county. That distinction belongs to the nearby San Gabriel Unified School District, where 61 percent of students were chronically truant. The level of unexplained absences starts early; 51 percent of students at Coolidge Elementary School were truant, while at Gabrielino High, the truancy rate was 86 percent. Another high-truancy district is Lynwood Unified, whose truancy rate of 56 percent was just below that of San Gabriel. Almost every one of the 3,152 students at Lynwood High School had missed three or more days of school without any explanation, while 81 percent of students at Cesar Chavez Middle School were truant.

A PROFILE OF TRUANCY: LYNWOOD UNIFIED

School | Enrollment* | Students with Unexcused Absence or Tardy on 3 or More Days (Truants) | Truancy Rate
Cesar Chavez Middle | 976 | 791 | 81.05%
Helen Keller Elementary | 621 | 249 | 40.1%
Hosler Middle | 1,159 | 1,011 | 87.23%
Janie P. Abbott Elementary | 676 | 247 | 36.54%
Lincoln Elementary | 644 | 176 | 27.33%
Lindbergh Elementary | 784 | 179 | 22.83%
Lugo Elementary | 492 | 218 | 44.31%
Lynwood High | 3,152 | 3,137 | 99.52%
Lynwood Middle | 1,648 | 1,450 | 87.99%
Marco Antonio Firebaugh High | 1,875 | 863 | 46.03%
Mark Twain Elementary | 616 | 197 | 31.98%
Pathway Independent Study | 84 | 10 | 11.9%
Roosevelt Elementary | 540 | 196 | 36.3%
Rosa Parks Elementary | 626 | 99 | 15.81%
Thurgood Marshall Elementary | 673 | 260 | 38.63%
Vista High (Continuation) | 314 | 101 | 32.17%
Washington Elementary | 786 | 198 | 25.19%
Will Rogers Elementary | 769 | 190 | 24.71%
Wilson Elementary | 586 | 102 | 17.41%
Lynwood Unified District | 17,021 | 9,674 | 56.84%

The data isn’t perfect. Tardiness is incorporated into the numbers, which could skew the count of actual absentees. One could also argue that three days of unexcused absence may be too strict a threshold. But at least California has made a first step toward reporting realistic attendance data — and school districts have information they can use to address the underlying causes of truancy.

So far, that isn’t happening. School districts in Los Angeles County haven’t exactly done a great job of addressing truancy. Despite high-profile sweeps, anti-truancy ordinances and other efforts by districts in the county, the countywide truancy rate barely budged between 2004-2005 and 2008-2009. L.A. Unified, which even took the media-grabbing step of having its outgoing superintendent, Ramon Cortines, and school board members go door to door to round up truants, is the only district that can report a decline, with a 34 percent decrease in truancy over that time. But even those efforts are only band-aids; more importantly, since the sweeps tend to happen during periods when districts must count students in order to gain funding, the moves can be viewed cynically as just ways to keep the money flowing without actually doing anything to address the underlying causes of truancy. School district officials and charter school operators in L.A. County must do a better job of addressing the underlying issues — as must their counterparts throughout the nation.

But at least California (along with Indiana) has taken a step that most other states — especially Virginia and Tennessee, two of the other states surveyed by Dropout Nation — refuse to take. Accurate, honest, publicly-reported data is the critical first step to making the technological and academic changes needed to stop truancy in its tracks — and keep every kid on the path to economic, social and personal success.

The Dropout Nation Podcast: The Dropout Crisis Beyond Cities


On this week’s Dropout Nation Podcast, I explain why one must stop thinking about big cities as being the only epicenters of the dropout crisis and the nation’s crisis of low educational achievement. While Detroit and other major urban areas are often associated with systemic academic failure, small cities such as Hammond, Ind., and Alexandria, Va., along with rural communities, suburbs and exurbs also struggle with dropouts.

You can listen to the Podcast at RiShawn Biddle’s radio page or download it directly to your iPod, Zune, MP3 player or smartphone. Also, subscribe to the podcast series; it is available on iTunes, Blubrry, Podcast Alley, the Education Podcast Network, Zune Marketplace and PodBean. You can also get the podcast on Viigo if you have a BlackBerry, iPhone or Android phone.

Watch: Robert Balfanz on Fixing America’s Dropout Factories


As the pioneering researcher who developed the Promoting Power Index for measuring high school academic failure and the man who coined the term “dropout factories”, Robert Balfanz knows all too well how the nation’s worst public schools retard the educational destinies of the children who attend them. From his perspective, it is as critical to turn around these schools as it is to deal with other causes of racial-, ethnic- and gender-based achievement gaps.

Watch this video of Balfanz’s testimony before the House Education and Labor Committee earlier this year on the need to reform the worst high schools (you can watch other HEL videos at the majority leadership’s YouTube page). And pay attention to his most-important point of all: Improving the quality of American public education — and helping all children achieve their fullest potential — isn’t insurmountable. It can be done — and it should.

You can also watch other Dropout Nation videos on YouTube.

Evan Bayh’s School Reform Legacy: His Name is Stan Jones


Given the array of plays on the Indiana U.S. Senator’s name — including some of my own reports — I’ll shy away from the pile-on amid his decision to end his re-election bid. But Bayh’s exit does prompt reflection on the role he has played, not only in American politics (and especially in the Hoosier State), but in helping to re-shape how the nation measures academic performance and emphasizes rigor and data over guesswork and academic failure.

For the most part, Bayh’s role in this was incidental. Save for championing the odd policy or two, he treated education as an afterthought. The earliest school reform efforts came before Bayh’s tenure as Indiana Governor began in the late 1980s, thanks to a group that included then-state superintendent H. Dean Evans and future state House Republican leader Brian Bosma. The most direct impact he had on education wasn’t even on policy itself, but on a move back in the mid-1990s to address the state’s perpetual deficit in its teachers pension. Although Bayh and his main successor, Frank O’Bannon, decided to use funds from the Hoosier Lottery to pay down those deficits and fully fund the pension, it didn’t work. Indiana’s teachers pension is currently $10 billion under water.

One indirect legacy lies not with Bayh himself, but with his onetime chief of staff, Bart Peterson. After becoming Indianapolis’ first Democrat mayor in four decades, Peterson struck a blow for school reform and school choice when he successfully battled his fellow Democrats in Indiana’s statehouse to become the first mayor in the nation to authorize charter schools. Whatever Peterson’s other flaws as a politician (namely a lack of focus on quality-of-life issues), he remains a pathbreaker in education reform through his founding of the Mind Trust, one of the leading incubators of education reform solutions in the nation.

Bayh’s most-important school reform legacy was also rather incidental. It came during his last two years in the governor’s office, when he appointed one of his aides, a former state legislator (and onetime candidate for state schools superintendent) by the name of Stan Jones, to the state’s Commission for Higher Education. At the time, the agency did little more than serve as a sounding board for the state’s higher ed policymaking and present budgets to the legislature. What Jones managed to do over the next 13 years set the path for how education policymakers — both in the Hoosier State and throughout the nation — should approach systemic reform.

Even before the passage of the No Child Left Behind Act in 2001, Jones was among the first to call for reform of the state’s high school graduation rate calculation, which had been so inaccurate for so long that perpetually failing school districts such as Indianapolis Public Schools were allowed to post graduation rates of 95 percent and higher (even when it was more likely that they were graduating a mere 50 percent of freshmen in four years). Not only did Jones call for replacing the old graduation rate calculation with a new one; with the help of one editorial board (on which I served) and a smattering of state leaders, he spent much of his tenure battling school districts, his fellow Democrats and even the state’s longtime education superintendent (and longtime foe) Suellen Reed to make it happen.

More importantly, along with the state’s Chamber of Commerce and Derek Redelman (a once-and-future Chamber executive who, oddly enough, once helped Reed beat Jones for the superintendent’s job), Jones began rallying state officials — including Bayh’s successors, Frank O’Bannon, Joe Kernan and Mitch Daniels — and business leaders to begin addressing Indiana’s most-pressing educational issues. He helped transform a politically-driven state college into a network of community colleges where high school graduates who weren’t ready for the rigors of Indiana University and Purdue could get prepared. He began addressing the reality that the Hoosier State — home to the university that hosts the nation’s second-largest foreign student population (and another whose international tentacles extend into Asia) — couldn’t even assure that more than a quarter of its high schoolers were attending college.

These days, Jones is working with the Bill & Melinda Gates Foundation to address the nation’s problems of low college attendance and completion. But his past work has an impact in Indiana and far beyond. Today, state schools superintendent Tony Bennett — who may be the most-successful state schools chief executive in the nation — has Jones to thank for paving the way for his own efforts to address teacher quality and end social promotion. Outside of Indiana, the work on graduation rates — along with the pioneering research of Jay P. Greene, Robert Balfanz and Christopher Swanson — is the underlying reason why President Barack Obama’s Race to the Top effort is gaining traction.

Bayh hasn’t exactly done much since on education policy. He hasn’t even been much of a presence in the debate over No Child or Race to the Top. But let’s give him credit for picking the men who cared about school reform and improving the lives of America’s children.
