How RPS Cheated Its Way to the Bottom
After John Butcher and I pointed out the cancerous growth of the VGLA (Virginia Grade Level Alternative) testing and its use to cheat on the SOL testing, especially in Richmond, the Virginia Department of Education (VDOE) set out to eliminate the VGLA. The new math test in 2012 mostly eliminated the VGLA as to math. What follows is the first of a series of blogposts analyzing how RPS cheated its way to the bottom. First up: John Butcher's number-crunching and analysis, with plenty of charts that show how Richmond compares with various localities.
As expected, the tougher new test produced a statewide drop in the scores in 2012. This year, the scores (pass rates) bounced back statewide. Richmond is another story.
The division scores minus the State average give a measure of each division's performance in relative terms.
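A minimal sketch of that computation (the pass rates below are made-up placeholders, not the actual VDOE figures):

```python
# Relative performance: division pass rate minus the state average.
# Numbers below are illustrative placeholders, not real VDOE data.
state_avg = {2011: 77, 2012: 68, 2013: 74}
richmond = {2011: 60, 2012: 47, 2013: 45}

relative = {year: richmond[year] - state_avg[year] for year in state_avg}
print(relative)  # negative values mean the division trails the state
```

A division tracking the state would plot near zero; a falling line means it is losing ground even in relative terms.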
As you can see, Richmond, Petersburg, and Charles City did not share in the general improvement in 2013; they continued to decline vs. the state average. Note, however, that lowly Petersburg beat Richmond in both 2012 and 2013. Richmond was fourth from the lowest pass rate in the state in 2013.
As with the math test a year earlier, the new reading test dropped scores statewide in 2013.
Versus the state, Richmond again lost ground: it had the lowest pass rate in the state.
The scores dropped statewide in 2011 (and more in Richmond) because HB304 made it tougher to use the VGLA to cheat. In 2012, the statewide reading scores again dropped slightly. The Richmond reading score dropped even more. The new reading test in 2013 produced a huge drop in the Richmond scores, almost certainly because it deprived RPS of the VGLA cheat.
The new front end to the SOL database makes it much easier to extract data.
Those data cast a clear light on the effect of Richmond's VGLA cheating. We start with the math scores by year. Here are the Richmond and state scores for students with and without disabilities; the second graph shows the Richmond scores minus the state scores, both with and without disabilities.
Notice that Richmond performed well below the state average, but its students with disabilities performed well above the state average for students with disabilities until 2012.
Further notice the drop in scores for students with disabilities in 2011, perhaps related to the State Superintendent's belated "concern" about abuse of the VGLA.
To the point here, Richmond's kids without disabilities underperformed the state average while the kids with disabilities outperformed the statewide average for students with disabilities. Until 2012, that is, when the new test stopped Richmond's abuse of the VGLA and took the Richmond scores for students with disabilities below the state scores.
The new reading test in 2013, with no VGLA (except for LEP students), confirms this picture. Again we see Richmond's students before 2013 generally underperforming the state average but its students with disabilities outperforming the state average for students with disabilities. But the 2013 plunge in Richmond's scores, led by the students with disabilities, gives away the game.
"Plunge" understates the reality: Richmond's students with disabilities went from 3 points above the state average for students with disabilities to 15.2 points below from 2012 to 2013. Looking at these scores in context with the math test leaves only one explanation: the Richmond scores for students with disabilities were inflated until deflated by the new, VGLA-free math and reading tests.
The drops in scores for students without disabilities, in 2013 for the reading test and in 2012-13 for the math test, are consistent with the poor preparation, mentioned above, for the new tests.
Whether you credit that analysis or not, explain this: In 2013, Richmond had the lowest pass rate in the state in reading and the fourth lowest in math. Here are the bottom ten on each test:
|Division||Pass Rate|
|Prince Edward County||60.6%|
|Charles City County||62.2%|
|Buena Vista City||45.2%|
|Charles City County||46.0%|
I don't think the (ex)Superintendent should merely have been fired; I think she should be sued for deliberately harming these kids.
Don't Blame the Kids
We hear that the Richmond student population is particularly difficult because the kids are [pick your excuse]. The only excuse that I might credit is socioeconomic: Poorer kids don't perform as well in school as their better-off peers. We'll investigate that below; in the meantime, nothing about the Richmond student population explains the recent plunges in the reading and math scores (while a lousy Superintendent explains the situation fully).
A few years ago, the Gates Foundation sponsored a huge data-gathering effort by S&P. Their Web site showed, with some clarity, that school performance tracks socioeconomic status: Poorer kids don't do as well.
The conventional proxy for socioeconomic status of a school population is the number of kids who qualify for free or reduced price lunches. Indeed, the F/R percentage is the criterion for Title I money from the feds. VDOE has settled on a more general measure, the percentage of students who are economically disadvantaged. That term includes any student who "1) is eligible for Free/Reduced Meals, or 2) receives TANF, or 3) is eligible for Medicaid, or 4) [is] identified as either Migrant or experiencing Homelessness."
The enrollment data are available here and the SOL scores are here for both the general population and the economically disadvantaged.
So let's look at the Virginia division pass rates as a function of the percentage of students whom VDOE identifies as "economically disadvantaged." First, the reading test:
The data give a decent least squares fit (R² = 0.64), suggesting that the ED percentage indeed correlates with the scores. On this graph, Richmond is the gold square (recall that Richmond had the lowest reading score in the Commonwealth this year). The red diamonds are, from the left, Hampton, Newport News, Norfolk, and Petersburg. Charles City is the green diamond.
Here is the same graph for the math test:
The color codes are the same and the correlation is not as good.
Plainly, Richmond is grossly underperforming both the fitted line and a number of other divisions that have similar socioeconomic situations. Hampton and Newport News, both old, urban jurisdictions, are particularly instructive. In short: Socioeconomics does not provide an excuse, much less a reason, for Richmond's lousy performance.
Delving further: The database also gives data for students who are or are not economically disadvantaged. Plotting the pass rates of the economically disadvantaged students v. the rates for the students who are not, by division, gives the following for the math tests:
The R² of 0.48 indicates a pretty good fit, suggesting that variations in the average pass rate of the kids who are not economically disadvantaged in a division explain about half of the variation in the rate for those who are economically disadvantaged.
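For readers who want to reproduce this kind of analysis, here is a minimal sketch; the x and y values are invented placeholders standing in for the VDOE division data, and `numpy.polyfit` does the least squares work:

```python
import numpy as np

# Hypothetical division-level data: percent economically disadvantaged (x)
# and SOL pass rate (y). The real figures come from the VDOE database.
x = np.array([20, 35, 45, 55, 65, 75])
y = np.array([85, 80, 74, 70, 62, 58])

# Least squares linear fit, as in the scatter plots above.
slope, intercept = np.polyfit(x, y, 1)
predicted = slope * x + intercept

# R^2: the share of the variance in pass rates "explained" by the fit.
ss_res = np.sum((y - predicted) ** 2)
ss_tot = np.sum((y - np.mean(y)) ** 2)
r_squared = 1 - ss_res / ss_tot
print(round(float(r_squared), 2))
```

An R² near 1 means the fitted line accounts for nearly all of the division-to-division variation; a value like 0.48 means it accounts for about half, leaving the rest to other factors.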
The red diamonds are, from the top, Hampton, Newport News, and Norfolk. Richmond is the gold square. Charles City is the green diamond. No surprises there. The largest outliers from the fitted line are Middlesex County, the purple diamond, and Lexington, the yellow diamond.
The graph for the reading test provides an interesting contrast:
The color codes are the same as for the math test, with the outliers again being Middlesex and Lexington. The R² suggests a much poorer correlation for this test than for the math test.
Of interest here is the relatively poor performance of Richmond's economically disadvantaged students. Those same students performed near the fitted line on the math test. In light of the very low scores on both tests, even after considering the large percentage of economically disadvantaged students in Richmond (see above), the implication is that Richmond is doing a poor job teaching both subjects but an even worse job teaching English to its economically disadvantaged students. Given Richmond's large proportion of economically disadvantaged students, that is consistent with Richmond's position as fourth worst division on the math test and worst of all on the reading test.
I used to think that, despite the obvious rigging of the Accreditation scores, the SOL scores were fairly honest. Then I learned how the Maggie Walker scores go to the high schools the kids would attend if they did not attend Maggie Walker (more likely, "might attend"; the adjusters ignore the reality that many of these kids would attend private schools if they did not get into Walker). In contrast, the schools -- esp. the middle schools -- can ship their troublemakers off to the CCP (Capitol City Program, aka Richmond Alternative) and have both the troublemakers and their scores land at CCP. So the Richmond schools keep the scores of good students who do not attend those schools and get rid of the scores of lousy students. Nice!
This year they put up the SOL scores on July 1. I had the division and Richmond math scores downloaded when they took the database back down. When I filed a Freedom of Information Act request for the data, they denied the request. Their story, in an affidavit they filed after I sued them, is:
The alternative assessments given to significantly cognitively impaired special education students, known as the . . . VAAP tests, are always among the last results to be reported, not becoming available to VDOE until the end of July.
VDOE then gives the schools the opportunity to access the VAAP data through a secure web-based portal and reassign the scores appropriately per federal law [i.e., to meet the 1% passing cap]. These reassigned scores are then included in the VDOE data set. If the testing records are released prior to all the data being reported to VDOE, given the much smaller number of children represented in this last batch of test scores, one could discern how the Commonwealth's special education children scored on their tests based upon the impact on overall pass rates.
Let's put aside, for the moment, the absurd notion that the addition of a number of VAAP passing scores that cannot exceed 1% of the number taking the SOL would change the average enough to reveal the identities of the students taking the VAAP. Let's look at the actual change in the Richmond math scores from July 1 to the general release on August 20:
|School||Subject||July 1||Aug. 20||Change|
|John Marshall High||Mathematics||27.14%||70.31%||43.17%|
|Richmond Community High||Mathematics||55.64%||91.27%||35.63%|
|George Wythe High||Mathematics||32.28%||53.18%||20.90%|
|George W. Carver Elementary||Mathematics||50.00%||70.78%||20.78%|
|Franklin Military Academy||Mathematics||47.02%||63.76%||16.74%|
|Westover Hills Elementary||Mathematics||45.19%||60.71%||15.52%|
|Martin Luther King Jr. Middle||Mathematics||13.32%||25.23%||11.91%|
|Amelia Street Special Education||Mathematics||44.74%||55.17%||10.43%|
|Thomas Jefferson High||Mathematics||40.38%||48.59%||8.21%|
|Lucille M. Brown Middle||Mathematics||49.65%||56.85%||7.20%|
|Oak Grove/Bellemeade Elementary||Mathematics||46.43%||51.68%||5.25%|
|Albert Hill Middle||Mathematics||55.23%||57.52%||2.29%|
|Fred D. Thompson Middle||Mathematics||39.92%||41.01%||1.09%|
|Ginter Park Elementary||Mathematics||36.44%||36.52%||0.08%|
|Miles Jones Elementary||Mathematics||46.79%||45.58%||-1.21%|
|J.B. Fisher Elementary||Mathematics||56.86%||55.20%||-1.66%|
|William Fox Elementary||Mathematics||83.65%||81.78%||-1.87%|
|J.L. Francis Elementary||Mathematics||49.28%||47.29%||-1.99%|
|Mary Munford Elementary||Mathematics||80.50%||76.64%||-3.86%|
|Fairfield Court Elementary||Mathematics||73.04%||69.03%||-4.01%|
|E.S.H. Greene Elementary||Mathematics||66.67%||62.50%||-4.17%|
|Summer Hill/Ruffin Road Elementary||Mathematics||57.40%||52.17%||-5.23%|
|Patrick Henry School Of Science And Arts||Mathematics||48.21%||40.30%||-7.91%|
|Linwood Holton Elementary||Mathematics||70.13%||60.57%||-9.56%|
|Elizabeth D. Redd Elementary||Mathematics||54.65%||45.03%||-9.62%|
|G.H. Reid Elementary||Mathematics||51.45%||40.52%||-10.93%|
|Broad Rock Elementary||Mathematics||89.73%||76.47%||-13.26%|
|Thomas C. Boushall Middle||Mathematics||51.50%||34.62%||-16.88%|
|George Mason Elementary||Mathematics||60.78%||43.43%||-17.35%|
|J.E.B. Stuart Elementary||Mathematics||68.79%||49.63%||-19.16%|
|John B. Cary Elementary||Mathematics||69.05%||46.74%||-22.31%|
|Clark Springs Elementary||Mathematics||88.03%||60.56%||-27.47%|
Whoops! The math score at John Marshall went up 43 points from July 1 to August 20. There is no way on this planet that the small number of passing scores from kids with "significant cognitive disabilities," which is capped by law at 1% of the class size, could raise the average by forty-three points.
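A quick calculation shows why. Under the 1% cap VDOE itself cites, adding the maximum number of VAAP scores to a school's pool moves the pass rate by less than one point, even in the most favorable case where every added score is a pass:

```python
# Upper bound on the bump from adding VAAP scores capped at 1% of the
# n students who took the SOL. Assume, most favorably, that every
# added score is a pass: new rate = (p*n + 0.01*n) / (n + 0.01*n).
def max_boost(p, cap=0.01):
    """Largest possible increase in pass rate p (a fraction) from the capped scores."""
    return (p + cap) / (1 + cap) - p

# Even a school that passed nobody on July 1 gains less than one point,
# nowhere near John Marshall's 43-point jump.
print(round(100 * max_boost(0.0), 2))
print(round(100 * max_boost(0.2714), 2))  # John Marshall's July 1 rate
```

Whatever explains a 43-point swing, it is not the late-arriving VAAP scores.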
Looking at the division scores, the differences are smaller but, on the scale of the SOLs, huge. Here are the divisions whose scores improved by more than one percent from July 1 to Aug. 20:
|Division||July 1||Aug. 20||Change|
|Virginia Beach City||70.37%||72.03%||1.66%|
|King George County||69.82%||70.90%||1.08%|
|Charles City County||44.95%||46.00%||1.05%|
As you see, Richmond improved its score by over five points over that fifty-day period.
It will take some more digging to figure out what is going on here. What is already clear, however, is that there are HUGE variations in the scores from July to August and VDOE wants to keep them (and, doubtless, the reasons for them) secret.