If you wanted to boost the pass rates of the SOLs, you’d have three choices (aside from the one perfected at Carver): Improve teaching, make the tests easier, or relax the scoring.
On the 2019 revision of the math tests, the Board of “Education” chose the last option: In five of six cases, it adopted cut scores lower than the level necessary to retain the rigor of the earlier tests. The results were predictable (and, of course, fed the false notion that student performance was improving).
The Board now has jiggered the English tests to the same end. The recommendation (teacher-driven; no pretense here of objectivity) was for every cut score to be lower (easier) than the level necessary to maintain the rigor of the tests.
The Board rejected the Superintendent’s slightly higher recommendations and adopted the committee’s numbers (video at 1:44:55; minutes are not yet available). This grade inflation will have the happy result of making the Board and the Superintendents and the English teachers and the students all look better, all without anybody having to break a sweat.
It will also make it impossible to measure the effect of the coronavirus on English performance.
This is not an anomaly, but rather part of an ongoing campaign to camouflage the fading performance of the Virginia public school system. Unfortunately for the self-serving mendacity of the “education” establishment, however, the NAEP data for fourth grade reading and eighth grade reading give away the game.
Your tax dollars at “work.”
Richmond: The Excellent, the OK, and the Awful
While we wait to see how far the Board of Education will punt on the 2021 SOL testing, let’s look in some detail at the 2019 performance of Richmond’s schools (there having been no testing in 2020).
But first, some important background: Statewide, “economically disadvantaged” (here, “ED”) students underperform their more affluent peers (“Not ED”) by some 17 to 22 points, depending on the subject. Thus the school and division average pass rates depend both on student performance and the relative numbers of ED and Not ED students. This punishes the schools and divisions with large ED enrollments. We’ll avoid that issue here by looking at the rates for both groups.
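A quick sketch makes the composition effect concrete. The numbers below are invented for illustration; only the roughly 20-point gap mirrors the statewide figures cited above.

```python
# Hypothetical illustration: two divisions whose ED and Not ED students
# pass at identical rates, differing only in enrollment mix.
ED_RATE, NOT_ED_RATE = 60, 80  # assumed pass rates within each group

def division_average(ed_pct):
    """Overall pass rate as an enrollment-weighted average."""
    return (ed_pct * ED_RATE + (100 - ed_pct) * NOT_ED_RATE) / 100

print(division_average(30))  # 74.0 -- affluent division looks better
print(division_average(70))  # 66.0 -- same teaching, lower average
```

Identical teaching, an eight-point spread in the headline number: that is why the analysis here looks at the two groups separately.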
To start, here are the ED pass rate distributions of Virginia and Richmond schools on the reading tests.
The blue bars are the counts of Virginia schools with the indicated (rounded) pass rates. The red bars, with open middles so the state data can show through, are Richmond; the Richmond scale is on the right-hand axis.
The state data here (and even more in the next chart) are skewed toward the low end. That renders the usual measures of a distribution, the mean and standard deviation, less useful. The measure reported here is the median.
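The point about skew can be seen with a toy distribution (the values below are invented, chosen only to have the low-end tail described above):

```python
import statistics

# Hypothetical pass rates with a low-end tail, like the state data
rates = [15, 35, 55, 70, 75, 78, 80, 82, 85, 88]

print(statistics.mean(rates))    # 66.3 -- dragged down by the tail
print(statistics.median(rates))  # 76.5 -- resistant to the outliers
```

The two low scores pull the mean ten points below the median; the median better describes the typical school.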
The two Richmond schools that aced the reading tests are Open and Community. The next entry, at 86%, is the other selective school, Franklin. The best of the mainstream schools is Marshall at 71%. The only other school to beat the state median was Cary at 68%. The eight Richmond schools in the cellar are, from the bottom, Alternative, Fairfield Court, MLK, Carver, Woodville, Chimborazo, and Mason.
The Not ED data portray another disaster.
Community and Open again aced the tests. They are followed by Munford, Hill, Franklin, Fox, Alternative and, barely above the state median, Patrick Henry. At the other end, the largest failures are, from the left, Greene, MLK, Boushall, Woodville, Elkhardt-Thompson, and Henderson. Fairfield Court would surely be in that latter list but for the suppression rule (<10 Not ED students).
Turning to the math tests, the Richmond pass rates are even less encouraging:
The schools that beat the state median are, from the top, Open, Community, Cary, Franklin, and Redd. At the other end, the basement dwellers are, from the bottom, Alternative, MLK, Fairfield Court, Carver, Boushall, Wythe, and Henderson.
As to Not ED, Open, Community, Munford, Fox, and Ginter Park beat the state median. Boushall, MLK, Wythe, Greene, Henderson, Elkhardt-Thompson, and Blackwell all scored below 50%.
These data emphasize the huge spreads between Richmond’s best and worst schools as well as the stunning under-performance of flocks of Richmond’s students.
For the record, here are the data, sorted by decreasing averages of the four data points. The “#DIV/0!” entries are for cases where the student count was zero or, more likely, suppressed by VDOE because it was <10.
Region 7 Addendum
We have seen that the divisions in SW Virginia (“Region 7” in the VDOE system) formed their own organization, the Comprehensive Instructional Program (“CIP”), that brought nice improvements in student performance.
While we wait to see whether the Board of “Education” will punt on the 2021 SOL testing, I’ve been looking over the 2019 data (there being no tests in 2020). The data for Region 7 paint a lovely picture.
You may recall that, since undertaking the CIP, Region 7 has seen major improvements in the pass rates of its economically disadvantaged (“ED”) students.
They accomplished this with a large and increasing ED population.
To put the 2019 results in a more nuanced context, let’s start with the school average reading pass rates for the Not ED students.
The blue bars are the counts of Virginia schools with the indicated 2019 pass rates of Not ED students (rounded to the nearest whole numbers). Thus, one school (Fairfax County Adult High) turned in a 13% pass rate(!) and 102 schools had 88% rates. The red-bounded bars are Region 7, left open to allow the state numbers to show through. The Region 7 scale is on the right vertical axis. The lowest school there turned in a 69% while 11 schools had 91% rates. (Excel reports “multiple items” when you tell it to report data for more than one division; please read that term as “Region 7.”)
The usual statistical measures, mean and standard deviation, are of limited use with skewed distributions, so I show the medians here. Of course, as a distribution approaches “normal,” the median approaches the mean. In any case, these are medians of the school averages, not division medians.
If you think the Not ED pass rates for Region 7 schools are a pleasant bit of news, take a look at the ED numbers:
Here, the Region 7 median is ten points higher than the state’s. Or you might prefer to ignore those stats and just look at the lovely picture.
The math data similarly testify to the success of the CIP.
It is instructive to compare the (manifestly sensible) techniques used by the CIP with the resolutely ineffective bureaucratic nonsense imposed by the “education” establishment.
- Identify the good teachers,
- Share their materials and techniques,
- Measure what works,
- Focus on core skills,
- Set high expectations,
- Bond with the students, and
- Use the feckless VDOE only for what it actually can do well: crunch numbers.
The state’s approach, in contrast, here the Petersburg Corrective Action Plan (for a division that the state has been attempting to repair, without success, since 2004):
I think it is past time to redirect the education bureaucracy to what it can do well, crunch numbers, and give the rest of its budget to the CIP.
More Money and Less Education in Richmond
On the subject of spending for schools, the VDOE Web Site has 2019 data for division income by source and fall enrollments of both economically disadvantaged (“ED”) students and their more affluent peers (“Not ED”).
Division income per student, plotted against the % of ED enrollment, looks like this:
Richmond is the enlarged, yellow point. The red points are, from the left, the peer cities Newport News, Hampton, and Norfolk.
The fitted line suggests that per-student division income increases by $68 for each 10-point increase in the ED percentage, but the near-zero R-squared value tells us the variables are essentially uncorrelated.
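For readers who want the mechanics of that claim, here is a minimal least-squares sketch. The income figures below are invented; only the shape (a small positive slope with a tiny R-squared) mirrors the chart being discussed.

```python
# Hypothetical data: % ED enrollment vs. division income per student
pct_ed = [25, 40, 55, 60, 70, 80]
income = [13500, 12200, 14100, 12800, 13900, 13100]

n = len(pct_ed)
mx, my = sum(pct_ed) / n, sum(income) / n
sxx = sum((x - mx) ** 2 for x in pct_ed)
syy = sum((y - my) ** 2 for y in income)
sxy = sum((x - mx) * (y - my) for x, y in zip(pct_ed, income))

slope = sxy / sxx                   # dollars per percentage point of ED
r_squared = sxy ** 2 / (sxx * syy)  # fraction of variance explained
print(round(slope, 1), round(r_squared, 3))  # 6.0 0.028
```

A slope of $6 per point ($60 per 10 points) sounds meaningful until the R-squared of about 3% shows the line explains almost none of the spread. That is the situation in the chart above.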
Richmond is in 15th place in this find-the-money derby.
In terms of local funding, Richmond again is above average but down in 27th place.
The R-squared value of 7% suggests a slight negative correlation: local funding per student decreases as % ED increases.
The other funding sources present a more interesting picture.
State funding shows a modest correlation, R-squared = 22%, while the federal data exhibit a more robust R-squared of 42%. Funding in both categories increases with increasing % ED. Richmond is well below the fitted curve for state funding, with part of that gap closed by Uncle Sam.
The Sales Tax funding is essentially flat.
Looking again at just the Big Winners:
Here we see that larger than average local taxes support the effort in every case while the State accounts for some of the excess in Highland and Sussex and the federales do so in Surry, Sussex, and Richmond.
Of course, once that money comes in the divisions spend it. We earlier saw the expenditure data juxtaposed with Richmond’s lousy performance:
If only Richmond would stop whinging about money and start educating its schoolchildren.