We’ve seen that division SOL pass rates fall with increasing economic disadvantage. Those data also suggest that Richmond’s gross underperformance is not explained by the economic disadvantage of the Richmond students.
Drilling further into the relationship between academic performance and economic disadvantage (ED for short): the reading pass rates of Richmond’s elementary schools show a moderate correlation with ED and the mathematics pass rates a weak one, but our middle and high schools show considerably more robust correlations:
Here are the SOL/ED data:
Note: Franklin has both middle and high school grades; I omit it from the graphs because it does not directly compare to either kind of school.
The other thing to notice about the middle schools is the very low pass rates. Here, for reference, are the average pass rates by grade. The horizontal lines are the reading and math “benchmarks” for accreditation.
Why do the middle schools get much lower SOL pass rates with mostly the same kids as the elementary schools? Let’s infer that the middle schools are doing a much worse job. See below.
In any case, the R2s imply that the SOL is affected, especially in the middle and high schools, by economic condition or something related to it.
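Those R² values come from simple linear fits of pass rate against %ED: square the Pearson correlation between the two columns. Here is a minimal sketch of that calculation; the five school-level data points are made up for illustration, not Richmond’s actual figures.

```python
# R^2 for pass rate vs. percent economically disadvantaged:
# the squared Pearson correlation, computed by hand.
def r_squared(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return (sxy * sxy) / (sxx * syy)

# Hypothetical %ED and reading pass rates for five schools:
pct_ed = [10, 22, 55, 70, 85]
pass_rate = [88, 80, 62, 55, 50]
print(f"R^2 = {r_squared(pct_ed, pass_rate):.2f}")
```

A high R² like the one this toy data produces says only that pass rate tracks %ED; it does not, by itself, say which way the causation runs.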
The Student Growth Percentile (SGP) was supposed to remove that correlation so I turned to the latest available data, the 2014 data by school in the 2d download from VDOE in response to Brian Davison’s suit.
There are no high school reading or mathematics data for Richmond in that dataset (EOC Algebra I only) but the elementary and middle school results are compelling.
Here we see our elementary schools performing at about the 50th percentile on math and a notch lower on reading. Those performances were mostly uncorrelated with ED (reading R2 of 1%; math, 3%). The Good News: These learning measures, especially the reading, are a bit better than the SOL pass rates might suggest.
The school with a reading SGP of 71 (!) is Carver; the 63 is Jones. As to math, we have six schools above the 60th percentile (Ginter Park at 70; Fisher, 67; Carver, 66; Jones, 65; Munford, 64; and Greene, 62), with Reid in the basement at 32. That collection of reading SGPs just under 40 is not encouraging.
Caveat: These data use the whole school %ED from the Fall census. The VDOE data would allow calculation for only the SGP grades, 4 & 5, except that their data suppression rules give blank ED values for Munford and Henry by suppressing the fifth grade data (fewer than ten kids reported). The totals are larger than the sums for the individual grades and presumably include all the ED students so I’ll stick with the (presumably undoctored) total data.
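The choice described in that caveat reduces to a simple rule: if any grade-level ED count is suppressed (blank), the grade-level sum is incomplete, so fall back on the whole-school figure. A sketch of that rule, with hypothetical counts (the names and numbers are mine, not VDOE’s):

```python
# Choosing the %ED denominator when small cells are suppressed.
# A grade-level ED count of None stands for "suppressed (< 10 students)";
# in that case we fall back on the whole-school totals.
def pct_ed(grade_ed, grade_total, school_ed, school_total):
    if any(v is None for v in grade_ed):
        # A suppressed grade makes the grade-level sum incomplete,
        # so use the (presumably undoctored) school totals instead.
        return 100 * school_ed / school_total
    return 100 * sum(grade_ed) / sum(grade_total)

# Hypothetical school where the grade 5 ED count is suppressed:
print(pct_ed(grade_ed=[40, None], grade_total=[90, 85],
             school_ed=120, school_total=600))
```

For Munford and Henry this fallback is the only option; for the other schools it trades a slightly mismatched denominator (all grades rather than just the SGP grades 4 and 5) for consistency.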
Here are the data:
The two very low ED schools are Munford at 10%, performing well above the 50th percentile, and Fox at 22% ED scoring at the 50th percentile in reading but only the 44th in math. This makes it look like those nice SOLs at Fox are the result of smart kids who are scoring well but not improving as much as the smart kids in other schools.
The 24th percentile score in math is Reid.
The conclusion: On the 2014 data, our elementary schools are doing an average job, on average. There’s work to be done at Reid and some others but, all in all, the SGPs report more learning than the SOLs might suggest.
And how much the kids learned was generally unrelated to economic disadvantage.
The middle schools were an unhappier story:
The database let me pull the 6th, 7th, and 8th grade data so I’ve included Franklin.
Note the low average performance and the modest correlation of the math scores. Also notice the absence of schools with low ED populations.
As to that last point, these data raise the question whether those low-ED kids from Munford and Fox have dropped out, gone to the Counties or to private schools for middle school, or simply disappeared into the average.
To that issue here, first, are the totals:
And here are the details:
Or, relative to the 9th grade memberships:
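The “relative to the 9th grade” view is just each grade’s membership divided by the grade 9 count. A sketch, with hypothetical membership numbers standing in for the actual data:

```python
# Normalize each grade's membership by the grade 9 count,
# the comparison used above. The counts here are hypothetical.
memberships = {9: 2400, 10: 1900, 11: 1600, 12: 1450}
relative = {grade: n / memberships[9] for grade, n in memberships.items()}
for grade, frac in relative.items():
    print(f"grade {grade}: {frac:.0%}")
```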
VDOE publishes no data on kids who drop out before entering middle school. The data they do share indicate zero dropouts from grades 7 or 8 in 2014. That seems unlikely but it’s all the information we have.
We are left with the possibility that the middle school drop in membership and rise in %ED reflects some of the more affluent kids fleeing to private schools and to the Counties. The precipitous drops in both total and ED membership after the 9th grade surely come from dropouts.
But to revisit the major point: The low correlations with ED tell us that the low middle school SGPs can’t be caused by the increased economic disadvantage; the leading candidates for those lousy SGPs, then, are lousy teaching and/or lousy administrators who fail to control the middle schools.
The other point here: The State Department of Data Suppression has stopped calculating SGPs, which leaves us with the manifestly flawed SOL data to assess school (and teacher) quality. It seems we’ll have to wait until late summer to see whether they are going to release or suppress their new progress (aka “value”) tables that measure academic progress (but mostly ignore the lack of it).