As I discuss at length elsewhere, VDOE has a secret and byzantine process for transforming pass rates into accreditation scores that are divorced from actual performance.
One has to wonder why they bother: They write the tests and they can make them as hard or easy as they wish. Nonetheless, they “adjust” the scores.
They briefly outline the process here. There’s no way to infer the algorithm from that, but we can compare the “adjusted” scores to the actual scores to get a feel for what’s going on.
Well, to a degree: For English, there are two tests but only one reported “adjusted” score, so there is no straightforward way to compare the actual numbers to the “adjusted” ones. For math, however, they report a single pass rate for each school, so the comparison is direct.
2013 was the second year of the new, tougher math test, so the three-year running average included an old, generally higher, number that boosted many scores.
This year, all three years fell under the new test, and the generally increasing math scores mostly eliminated the running-average boost. Thus the “adjustments” were smaller, albeit with the same effect: increasing the number of schools that reached the 70% “adjusted” pass rate.
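To see why the 2013 running average flattered so many schools, here is a minimal sketch with invented pass rates for one hypothetical school (these numbers are illustration only, not VDOE data):

```python
# Hypothetical pass rates for one school: old, easier test in 2011;
# new, tougher test from 2012 on. Invented numbers, not VDOE data.
rates = {2011: 85, 2012: 60, 2013: 65, 2014: 68}

# 2013 three-year average still includes the old, higher 2011 score.
avg_2013 = sum(rates[y] for y in (2011, 2012, 2013)) / 3

# By 2014, all three years are under the new test and the boost fades.
avg_2014 = sum(rates[y] for y in (2012, 2013, 2014)) / 3

print(avg_2013)  # 70.0 -- dragged over the 70% bar by the 2011 score
print(avg_2014)  # about 64.3 -- the old score has rolled off
```

The single stale score is doing all the work: drop it and the same school falls well short of 70%.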
The average boost was 2.8%; the maximum, 29%.
Notice how the number of schools with scores below the magic 70% decreased while the number with scores at or above 70% increased:
Indeed, 1245 schools scored 70% or more but, after the “adjustments,” 1372 schools made the 70% cutoff, an increase of 127.
Caveat: VDOE reported math pass rates at fifteen schools for which they did not report “adjusted” scores; they reported “adjusted” scores for fifty schools for which they reported no pass rate. I deleted all sixty-five of those schools from the analysis above.
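The comparison above can be sketched in a few lines. The records below are invented for illustration (the real data come from VDOE's downloads); the `None` entries stand in for the sixty-five schools that had only one of the two numbers and were dropped:

```python
# Invented records: (school, actual math pass rate, "adjusted" score).
# None marks a missing value, as with the 65 schools dropped above.
records = [
    ("A", 68, 71),    # crosses the 70% bar only after "adjustment"
    ("B", 72, 74),
    ("C", 55, 58),
    ("D", 66, None),  # pass rate but no "adjusted" score: dropped
    ("E", None, 73),  # "adjusted" score but no pass rate: dropped
]

# Keep only schools reporting both numbers.
both = [(s, a, adj) for s, a, adj in records
        if a is not None and adj is not None]

boosts = [adj - a for _, a, adj in both]
avg_boost = sum(boosts) / len(boosts)
made_cutoff_actual = sum(1 for _, a, _ in both if a >= 70)
made_cutoff_adjusted = sum(1 for _, _, adj in both if adj >= 70)

print(avg_boost, made_cutoff_actual, made_cutoff_adjusted)
```

With the toy data, one school clears 70% on its actual score but two do after “adjustment,” which is the pattern in the real numbers, just at larger scale.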
This post already is too long. I’ll do the Richmond scores tomorrow.