American public education has long been one of the most expert-driven fields in either the public or private sectors. Teachers, school leaders, and ed school professors, especially of the traditionalist mode, tend to dismiss ideas and solutions from reformers (especially when those solutions counter their worldview) largely because those folks supposedly haven’t spent enough time in classrooms to know better. It is why families are often treated with condescension and disdain when they push to take lead roles in shaping education for their children. Yet, more often than not, the “experts” really don’t know what they are doing themselves. In fact, it is the very practices championed by traditionalists — including those that create cultures of low expectations for poor and minority kids — that are the underlying reasons why traditional district schools fail to improve student achievement. And when experts dismiss the power of objective data, they do a disservice to our children and their profession.
In this Voices of the Dropout Nation, Charter School Growth Fund partner Alex Hernandez, whose Think Schools blog and Twitter feed explore the future of American public education, further explains why expertise in education is overrated — and why teachers will now have to become sophisticated in using objective student data in order to do good and great work with our kids. Read, consider, and take action.
My wife tells me it’s not healthy to think about education 24 hours a day so I picked up Thinking, Fast and Slow, a book on judgment and decision-making by Nobel Prize-winning psychologist Daniel Kahneman. The book distills a lifetime of work by one of the most provocative, decorated scholars of the last century. Read it.
On my wife’s instructions, I’m edifying myself, finding new things to talk about at dinner parties. And then, midway through the book, all kinds of edu-hell break loose. For three chapters of Thinking, Fast and Slow, Kahneman argues that professional expertise is largely an illusion:
- Expert judgment is notoriously unreliable.
- Statistical predictions are consistently more accurate than expert ones in a wide variety of fields. Professionals with strong short-term pattern recognition rarely recognize the limits of their own long-term predictions.
- Expert judgment can only be acquired in stable, predictable environments where professionals learn patterns from prolonged practice and ample feedback.
- Expert judgment can add value but only when combined with objective, rigorous evidence.
Okay! Okay. My turn. You don’t have a crystal ball, you can’t look at a kid and predict his future any more than I can. I’ve sat at those kitchen tables with you and listened to you tell those parents ‘When I know, I know! And when it comes to your son, I know’. And you don’t. You don’t!
Billy Beane in Moneyball
Kahneman deluges his readers with fascinating studies across many different industries supporting these conclusions. I won’t spoil it for you. But Kahneman’s points have some striking implications for American public education:
For one, teachers, school leaders, and other educators (myself included) are probably not good at making long-term predictions about what students can do. This includes determining appropriate levels of academic rigor and judging students’ long-term prospects for success.
The second: Educators should use a lot more student data. We can disagree about what types of assessments are valuable, but there is no utopia where trained professionals “just know”.
Another implication: There are probably thousands of educators who work in schools where they cannot acquire expertise because the environments are not stable and there is not enough feedback to support professional learning. This precludes educators from getting the prolonged practice necessary to master their craft.
And finally, blended learning will play an important role in building algorithms that help us see patterns in student learning.
Kahneman notes that his research is met with hostility and disbelief by professionals — and then usually ignored. Understandable. I’ll be honest, Kahneman’s book still has me on edge. I’ve invested a lot in my career and I don’t enjoy being told that my expertise is not as reliable as I like to think. But if we are serious about building better learning environments, we should examine how the ‘illusion of expertise’ manifests itself in education.
Read Education Week or The New York Times and you will see opinion piece after opinion piece about how assessments are an attack on teachers and an insult to their professional judgment. Or articles about how mindless, heartless software may one day replace teachers. Yet if education is anything like other highly-trained professions, data and statistics will only help ‘professionalize’ instruction.
I’ve been a fan of algorithms and standardized tests since one of my sons was administered the Apgar test at one minute and five minutes old. His declining score helped flag that his lungs were filling with fluid and that he could not breathe on his own. Thousands of lives have been saved by this simple assessment because, in the absence of a standardized procedure, doctors and nurses focused on different cues and danger signs were often missed. No one would say that these doctors and nurses did not care about newborns; it is just that professional judgment can be inconsistent, with real consequences for children.
What most unnerved me in Thinking, Fast and Slow was the implication that educators may not be good at determining appropriate levels of rigor for our students, because rigor forces us to make predictions about what students can do in the future. In her new book, Mission Possible, Eva Moskowitz shares some of the struggles Success Academies in New York City has had in identifying appropriate rigor for its students. By the way, Success Academies runs some of the most rigorous public schools in the country and Eva has more courage than you can shake a stick at. And even then, they struggle with rigor. (Disclosure: Charter School Growth Fund, for which Hernandez is a partner, is a philanthropic investor in Success Academies.)
In response to teacher concerns over a difficult new math assessment, Eva wrote: “We are pioneering… because schools have not found what kids are capable of mathematically. Rather, a ridiculous number of assumptions have been made. Kids in Japan and Germany and Singapore are being asked to do much more rigorous math and at much earlier ages. The expected pace of learning outstrips what American educators are expecting… Here in Harlem and the South Bronx we are doing what India did a decade ago. We are experimenting. We are trying to find the rigor bar. We are trying to develop a shared culture around struggle and challenge, both for the grown-ups and the kids.”
As an educator, I’m often haunted by one of my own students, who went from failing every core high school class to making the honor roll in a year and a half. I worked in a high-expectations school where we all took pride in helping students reach their potential. But this student was dragging every risk factor in the book to school each day. In my deepest, darkest moments, I wasn’t sure this student would graduate from high school. I finally concluded that, in the absence of knowing any better, we just have to treat each student like they are going to Harvard. If we are going to be wrong, let’s at least do no harm.
In the next decade, we have a chance to build better schools using new approaches and new models. Once we let ourselves see beyond our own expertise, I suspect we’ll find that children are capable of more than we ever thought possible. Let’s see what the evidence says.