Charter Schools Show Steeper Upward Trend in Student Achievement than District Schools

The number of charter schools grew rapidly for a quarter-century after the first charter opened its doors in 1992. But since 2016, the rate of increase has slowed. Is this slowdown related to a decline in charter effectiveness?

To find out, we track changes in student performance at charter and district schools on the National Assessment of Educational Progress, which tests reading and math skills of a nationally representative sample of students every other year. We focus on trends in student performance from 2005 through 2017 to get a sense of the direction in which the district and charter sectors are heading. We also control for differences in students’ background characteristics. This is the first study to use this information to compare trend lines. Most prior research has compared the relative effectiveness of the charter and district sectors at a single point in time.

Our analysis shows that student cohorts in the charter sector made greater gains from 2005 to 2017 than did cohorts in the district sector. The difference in the trends in the two sectors amounts to nearly an additional half-year’s worth of learning. The biggest gains are for African Americans and for students of low socioeconomic status attending charter schools. When we adjust for changes in student background characteristics, we find that two-thirds of the relative gain in the charter sector cannot be explained by demography. In other words, the pace of change is more rapid either because the charter sector, relative to the district sector, is attracting a more proficient set of students in ways that cannot be detected by demographic characteristics, or because charter schools and their teachers are doing a better job of teaching students.

Three Decades of Growth

The nation’s first charter school opened in Minnesota in 1992, under a 1991 state law that established a new type of publicly funded, independently operated school. School systems in 43 states and the District of Columbia now include charter schools, and in states like California, Arizona, Florida, and Louisiana, more than one in 10 public-school students attend them. In some big cities, those numbers are even larger: 45 percent in Washington, D.C., 37 percent in Philadelphia, and 15 percent in Los Angeles.

Nationwide, charter enrollment tripled between 2005 and 2017, with the number of charter students growing from 2 percent to 6 percent of all public-school students. But the rate of growth slowed after 2016 (see “Why Is Charter Growth Slowing? Lessons from the Bay Area,” research, Summer 2018). There are several possible reasons for this. The rate of states passing charter laws declined after 1999, and many of the laws passed since 2000 have included provisions that can stymie growth: caps on the number of schools allowed, arcane application requirements, and land-use and other regulations. In addition, a political backlash is slowing charter expansion in some states.

Researchers who have looked at the academic performance of students in charter and district schools at a single point in time have generally found it to be quite similar. For example, the 2019 “School Choice in the United States” report by the National Center for Education Statistics looked at students’ reading and math test scores in 2017 and found “no measurable differences” between the sectors. Also, multi-state studies by the Center for Research on Education Outcomes, or CREDO, at Stanford University have found only small differences in achievement at charter and district schools.

Analyses that summarize findings from multiple studies also report little difference on average between the two sectors, though they do identify specific situations in which charter schools excel. In a comprehensive review published in 2018, Sarah Cohodes wrote that, while the evidence on the whole shows “on average, no difference” between the two sectors, “urban charter schools serving minority and low-income students that use a ‘no excuses’ curriculum” have “significant positive impacts.” In a 2019 meta-analysis of 47 charter studies, Julian Betts and Y. Emily Tang found overall only a small predicted gain from attending a charter of between one-half and one percentile point. And in a 2020 paper, Anna Egalite reported little difference, on average, between the two sectors but wrote that charters in some locales reveal “statistically significant, large, and educationally meaningful achievement gains” for low-income students, students of color, and English language learners.

However, no study has used nationally representative data with controls for background characteristics to estimate trends in student performance over a twelve-year period. That is our goal here.

Data and Method

Our data come from the National Assessment of Educational Progress. NAEP is a low-stakes test that does not identify the performance of any student, teacher, school, or school district. Rather, it is used to assess the overall proficiency of the nation’s public-school students in various subjects at the state and national levels. A nationally representative sample of students in grades 4, 8, and 12 takes the reading and math tests every other year. We do not report results for 12th-grade students because the number of test observations in the charter sector is too small to allow for precise estimation.

Between 2005 and 2017, more than four million tests were administered to district students, and nearly 140,000 tests were given to charter students, with data available on each student’s ethnicity, gender, eligibility for free or reduced-price lunch, and, for 8th-grade students only, the level of parental education, number of books in the home, and availability of a computer in the home. We do not include in our main analysis controls for participation in the federally funded special education and English language learner programs, because schools in the two sectors may define eligibility differently. However, we confirm that our results do not change in any material way when controls for these two variables are introduced.

We report trends in standard deviations, a conventional way of describing performance differences on standardized tests. Because NAEP tests are linked by subsets of questions asked in both grade 4 and grade 8, we can use this metric to estimate the difference in the average performance of students in those grades. We then create an estimate of a year’s worth of learning based on the average difference in student performance between those grades.

We compare the performance of student cohorts on those tests in 2005 and 2017 and find that, on average, students in 8th grade performed 1.23 standard deviations higher than students in 4th grade. Because four years of schooling separate those grades, this implies that students learn enough each year to raise their reading and math test scores by approximately 0.31 standard deviations (1.23 ÷ 4). Accordingly, we interpret a test-score improvement of 0.31 standard deviations as equivalent to roughly one year’s worth of learning.
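The benchmark above reduces to a single division; a quick illustrative calculation (not the authors' code) makes the arithmetic explicit:

```python
# Deriving the "one year's worth of learning" benchmark described above.
grade_gap_sd = 1.23   # avg. 8th-grade score minus avg. 4th-grade score, in SD
years_between = 4     # four school years separate grades 4 and 8

sd_per_year = grade_gap_sd / years_between
print(round(sd_per_year, 2))  # 0.31 SD of test-score growth per year
```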

Trends in performance are based on the distance between the charter and district school scores on NAEP tests in 2007, 2009, 2011, 2013, 2015, and 2017 and their average scores in 2005, which are set to zero. We report these differences in standard deviations. We apply the survey weights provided by NAEP to obtain representative results.
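The trend construction described above can be sketched as follows. The score values here are hypothetical placeholders on a NAEP-like scale, not actual results, and the assumed test standard deviation of 35 points is illustrative only:

```python
# Sketch of the baseline-at-zero trend construction: each sector's mean
# score in later test years is expressed as the distance, in standard
# deviations, from its own 2005 mean, which is anchored at zero.
years = [2005, 2007, 2009, 2011, 2013, 2015, 2017]

def trend_in_sd(mean_scores, test_sd):
    """Return each year's gain over the 2005 baseline, in SD units."""
    baseline = mean_scores[0]  # 2005 mean, set to zero by construction
    return [(score - baseline) / test_sd for score in mean_scores]

# Hypothetical sector means for the seven test years
charter_means = [232, 233, 234, 235, 236, 236, 237]
trend = trend_in_sd(charter_means, test_sd=35.0)
print(dict(zip(years, [round(t, 3) for t in trend])))
```

In the actual analysis these sector means would be computed with the NAEP survey weights; the sketch omits weighting for brevity.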

Figure 1 - Charters Catch Up to District Schools on National Tests

Investigating Differences by School Type

We first look at differences in average scores on the 2005 and 2017 tests. On average, district schools outperformed charter schools in 2005 in both the 4th and 8th grades—particularly in math. For 4th-grade students, the average math score at district schools was 237 points compared to 232 at charter schools, a difference of 0.15 standard deviations. In reading, the district school average was 217 compared to 216 at charters. For 8th-grade students, the average math score at district schools was 278 compared to 268 at charters, a difference of about 0.28 standard deviations. In reading, the district school average was 260 compared to 255 at charters.

By 2017, most of these differences had disappeared, or nearly so (see Figure 1). In 4th grade, charters still trailed districts by 3 points in math, with an average score of 236 compared to 239. In reading, however, the average charter score was one point higher, at 266 compared to 265 for district schools. On 8th-grade tests, the sectors had the same average math score of 282 and virtually the same reading scores, at 266 for charters and 265 for district schools. None of these 2017 differences were large enough to be statistically significant.

Looking at performance trends across all seven NAEP math and reading tests administered from 2005 through 2017, we find a larger increase in student achievement for students at charter schools than for students at district schools (see Figure 2). On average across grades and subjects, test scores at charter schools improved by 0.24 standard deviations over this period, compared to 0.10 standard deviations at district schools.

Changes in the demographic composition of students who were enrolled at district and charter schools during those years may have differed, so we perform additional analyses that adjust for students’ background characteristics. After that adjustment, the test scores for students at charter schools improved by 0.09 standard deviations more than scores for students at district schools, which is equivalent to a little less than one-third of a year’s worth of learning. The differences are larger for 8th-grade students, at 0.12 standard deviations, than for 4th-grade students, at 0.06 standard deviations.
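The years-of-learning figures reported above follow directly from the 0.31-SD-per-year benchmark; an illustrative check (not the authors' code):

```python
# Converting the charter-district trend gaps, in SD, into years of learning,
# using the benchmark of roughly 0.31 SD per school year (1.23 SD / 4 years).
sd_per_year = 1.23 / 4

unadjusted_gap = 0.24 - 0.10  # raw trend difference between sectors, in SD
adjusted_gap = 0.09           # after adjusting for student background, in SD

print(round(unadjusted_gap / sd_per_year, 2))  # ≈ 0.46: nearly half a year
print(round(adjusted_gap / sd_per_year, 2))    # ≈ 0.29: just under a third
```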

In other words, a considerable difference in the trends in student performance between charters and district schools cannot be explained by demographics. Either there are unobserved changes in student characteristics related to performance in the two sectors or charter schools, relative to district schools, are providing an increasingly effective learning environment.
