Some two decades into the grand national experiment with charter schools, how much do we really know about them? Not all that much. And not nearly as much as we easily could, say researchers from the University of California, San Diego Division of Social Sciences.
Writing in the journal Science, UC San Diego educational economist Julian Betts and Richard Atkinson, president emeritus of the University of California and former director of the National Science Foundation, find that most studies of charter schools “use unsophisticated methods that tell us little about causal effects.”
The Clinton, Bush and Obama administrations, as well as many members of the general public (not to mention the makers of the popular 2010 education documentary “Waiting for Superman”) have embraced charter schools as the saviors of a broken educational system. But does going to a charter school improve student outcomes? We don’t really know, argue Atkinson and Betts. Which charter schools, or even types of charter schools, are more effective than others? We don’t really know.
Ideally, charter schools – which are funded publicly but are granted charters by school districts or other authorizing bodies to operate outside many of the strictures of regular neighborhood schools – would be hotbeds of innovation. They could try out different curricula, different teaching methods, different training or reward systems for the teachers. The best and most effective schools would inspire imitation. The worst would have their charters revoked and would go away.
“But most policymakers don’t have sufficient data on charter schools to decide whether they’re successes to be replicated or disasters to be shut down,” said Betts, who, in addition to being an economics professor at UC San Diego, is also executive director of The San Diego Education Research Alliance at the university, a Bren Fellow at the Public Policy Institute of California and a research associate at the National Bureau of Economic Research.
Most studies take a simple snapshot of achievement at a charter school (say, spring reading and math scores) and compare it to scores at a nearby traditional public school. A study of this sort, Betts said, is “naïve and essentially meaningless.”
Self-selection is the problem. A snapshot study might give you a picture of the students who selected a particular charter but says little about that school’s effectiveness. In a recent meta-analysis of the available literature on charter schools, coauthored by Betts (with UC San Diego economist Emily Tang) and published by the University of Washington’s National Charter School Research Project, 75 percent of studies were discarded because they failed to account for differences in the backgrounds and academic histories of traditional public-school students and those who chose to go to a charter.
Some of the best, most rigorous studies are based on analyses of charter-school lottery winners and losers, write Betts and Atkinson. Charter schools that are popular enough to be oversubscribed are usually compelled by law to hold lotteries. As dramatized by the documentary “Waiting for Superman,” lottery winners differ from losers only by the luck of the draw. That is to say, students who lose a lottery are an ideal control group, and comparing outcomes for lottery winners and losers is the closest we can get to a randomized controlled experiment.
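The logic of a lottery-based comparison can be sketched in a few lines of Python. This is a hypothetical illustration, not the authors' actual analysis: the data are invented, and the comparison shown is a simple difference in mean outcomes between lottery winners and losers, which estimates the causal effect of being offered a charter seat precisely because the lottery assigns winners and losers at random.

```python
# Hedged sketch: because lottery assignment is random, the difference in
# mean outcomes between winners and losers estimates the causal effect of
# winning a charter seat. All numbers below are invented for illustration.
import random
import statistics

random.seed(0)

# Simulated test-score gains for lottery applicants (hypothetical values).
winners = [random.gauss(5.0, 2.0) for _ in range(200)]  # offered a seat
losers = [random.gauss(3.0, 2.0) for _ in range(200)]   # not offered a seat

effect = statistics.mean(winners) - statistics.mean(losers)
print(f"Estimated effect of winning the lottery: {effect:.2f} points")
```

In a real study, researchers would also adjust for the fact that not every lottery winner actually enrolls, but the core idea is this simple comparison of randomly formed groups.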
Lottery-based studies suggest that charters do as well as or better than traditional public schools. But, write Betts and Atkinson, these studies have, to date, only examined about 90 charter schools – or just 2 percent of charter schools nationally.
As good as lottery-based studies are, they face some major limitations. One is that most charter schools don’t hold lotteries because they’re not oversubscribed. A recent nationwide U.S. Department of Education study of charter middle schools found that only 130 of 492 held lotteries. (And of those 130, only 77 agreed to share their lottery data with researchers.)
It is also very likely, Betts said, according to evidence from Texas and elsewhere, that the oversubscribed, lottery-holding schools are better than average to begin with. Put another way, parents are smart and there’s a reason some charter schools are popular.
The lottery-based studies, Betts and Atkinson conclude, are not very representative.
So how to study all the other charter schools, the majority, that don’t hold lotteries? Atkinson and Betts propose that “value-added” research – research that follows individual students’ trajectories, comparing how they test before and after entering or leaving a charter – is a close second-best to lottery-based research. For that, though, researchers need access to individual student test-score data over time. This access is not easy to come by and is often fiercely contested.
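The value-added idea can also be sketched in Python. Again, this is a hypothetical illustration with invented student records, not the authors' method: each student is followed over time, and the crude per-student estimate here is simply the average score after entering a charter minus the average score before.

```python
# Hedged sketch of the value-added idea: follow each student over time and
# compare their test-score trajectory before and after entering a charter.
# Student IDs and scores below are invented for illustration.

# Each record: (student_id, scores before entry, scores after entry)
students = [
    ("s1", [48, 50, 51], [55, 58]),
    ("s2", [60, 61, 62], [62, 63]),
    ("s3", [40, 42, 43], [47, 50]),
]

def value_added(before, after):
    """Crude per-student estimate: mean score after entry minus
    mean score before entry."""
    return sum(after) / len(after) - sum(before) / len(before)

gains = [value_added(before, after) for _, before, after in students]
avg_gain = sum(gains) / len(gains)
print(f"Average within-student gain after entry: {avg_gain:.2f} points")
```

Real value-added models are considerably more careful, adjusting for student background and prior achievement trends, but the underlying requirement is the same one the article stresses: longitudinal test-score data linked to individual students.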
Betts and Atkinson add that, for the fullest account of educational impacts, it would also be important to supplement testing data with measures of higher-order learning and behavior, as well as longer-term outcomes such as graduation rates and college attendance.
The way forward is clear, Betts said, but whether there’s political will is another matter: “We need more lottery-based studies and we need to be able to do longitudinal work,” he said.
Researchers need routine access to individual student data. And charter laws should be overhauled so that charter schools have to share their lottery data with authorizing bodies and with state departments of education. (To do good research, researchers would also need to know how wait lists are administered.) This last wouldn’t be costly, Betts said, and could be accomplished by simple fiat.
It might be a tall order for all 50 states and the District of Columbia to implement the suggested reforms at once, the researchers write, but federal initiatives like the No Child Left Behind Act and the Race to the Top fund could make financial support to schools contingent on these requirements.
“Taking these steps,” Betts said, “would improve research, not only on charter schools but on all public education.”
Inga Kiderra, 858-822-0661 or firstname.lastname@example.org