Editor’s Note:
A bit of a kerfuffle ensued recently (9/13/2018) on my Facebook page when I reposted a blogpost by my friend and colleague, John Butcher. Some readers took great umbrage that Butcher and I were somehow accusing Mary Munford Elementary School principal Greg Muzik and his staff of cheating on Standards of Learning (SOL) tests.
For the record, neither Butcher nor I ever suggested any such thing. We did, however, question the likelihood that both the math and reading scores would be exactly the same, in multiple years.
The purpose of Butcher’s piece was to turn from reporting on the worst scores in our public schools to analyzing the best schools. Butcher, a former chemistry professor at Hampden-Sydney College and a graduate of the University of Virginia Law School, served as an Assistant Attorney General representing the Department of Environmental Quality (DEQ) in the Virginia Office of the Attorney General (OAG). I mention Butcher’s bona fides to let folks know that he understands the minutiae of analysis, percentages, charts and Virginia law. He is also an expert on the Freedom of Information Act (FOIA). As officials in the Richmond Public Schools (RPS), Richmond City government and the Virginia Department of Education (VDOE) can attest, Butcher is indeed wicked smart and has never suffered fools gladly. (Click HERE for additional information on Butcher.)
To anyone who questions his accuracy: I have never known Butcher to be wrong in the ten years that he and I have collaborated on our respective blogs to hold RPS and VDOE officials accountable. To anyone who questions our intent, the answer is simple: to hold all schools accountable, regardless of zip code.
Midway through a series of robust exchanges on Sept. 13, several FB posters and I concluded that the problem at the heart of the matter was most likely bad data management. Both former Supt. Dana Bedden and the current superintendent have asked for additional money to hire enough personnel to make use of the data the district collects. Both requests were denied.
Butcher and I have plenty of reasons to question both RPS and VDOE data. Butcher earlier quoted Scott Adams for the notion that “whenever you have large stakes, an opportunity for wrong-doing, and a small risk of getting caught, wrong-doing happens. . . . When humans can cheat, they do.” That certainly is what we’ve seen wholesale in Atlanta, earlier in Virginia with the VGLA, and most recently at Carver Elementary School.
Something has to give. The way VDOE and the districts now collect and report these scores clearly reduces the accuracy that is supposed to derive from the SOL tests and from the NCLB and new ESSA accountability measures. Proper measurement is a management tool and encourages accountability. To have no safeguards in place to determine the credibility of the data is absurd.
This method of measuring and reporting a school’s supposed SOL scores without checks and balances in place simply ensures that the data, which VDOE and the local districts spend a fortune to collect, have very little reliability or usefulness. Reducing that accuracy reduces the possibility that policy makers can reach informed and wise decisions that will help improve public education. If the data are faulty, how can the conclusions be accurate? And if the schools’ problems are intentionally disguised and distorted, how can we expect to fix them?
The following comments from Butcher are in direct response to the suggestion that his data were incorrect (they weren’t). ~ Carol A.O. Wolf
Muzik Non Rilassante
By John R. Butcher
September 19, 2018 by cranky
Note added 9/20: Mr. Muzik emailed a response to this post. I’ve appended it here.
Still later: Plus a note from Charles Pyle of VDOE.
The Principal at Munford, Greg Muzik, posted a couple of comments to Carol Wolf’s repost of my Munford post. Yesterday, I copied him on a note to Carol and Chuck Pyle of VDOE that included a spreadsheet analyzing his data and reexamining my own. Mr. Muzik replied to the email. I now respond:
Mr. Muzik,
I write in response to your email of yesterday to me with copies to Mrs. Wolf and Mr. Pyle (copy attached below). Your email begins:
I am not sure what you mean by bowderlized. I sent the data based on what you had put on our blog related to student pass rates (just grade 3 and 5 from 2017 and 2018 and what Carol W. had requested.
Notes: The parenthesis is not closed in the original. The “what you had put on our blog” is nonsense: I posted originally to my own blog; Mrs. Wolf reposted to hers; I have not “put” anything on any blog of yours. As set out below, my analysis manifestly was not restricted to “just grade 3 and 5.”
By “bowdlerized,” I mean the adjective form of the verb, bowdlerize:
Bowdlerize verb . . .
2 : to modify by abridging, simplifying, or distorting in style or content . . .
Synonyms -- censor, clean (up), expurgate, launder, red-pencil
The pdf you sent Mrs. Wolf contains eight pages. These report “CAT” data for Munford: for 2018, one page each for 3d grade reading and math and one each for 5th grade reading and math; and the same four datasets for 2017.
These data appear to be the basis for your objection to my remarks.
Those remarks were, in relevant part: “It would be unusual to see scores in the 86’s for both lower grades and some 8 points higher in grade 5 in either subject, much less in both. It must be beyond coincidence that, as well, the reading and math scores are the same at each grade level and when averaged, either by grade or by student.”
Your response, posted in comments on Mrs. Wolf’s site, discussed at length the CAT pass rate differences at Munford between the third and fifth grades, with no mention of the fourth grade and without any principled explanation of the pattern of reading/math pass rates at all three grade levels in the official VDOE data containing the results of all the tests.
You thrashed that straw man after admitting “I am not sure where Butcher is pulling his data.” If you had attended to either my post or Mrs. Wolf’s repost, you would have seen the link to the VDOE database (containing the scores I list above) in the second line of each post.
(On the subject of attention or the lack of it, I would point out that my name is “Butcher,” not “Bollard” as you call me at one point in your comment on the Wolf site.)
If you had wished to dispute my data, the principled approach would have been to download the VDOE data, check my analysis, point out any discrepancy, and suggest a data exchange to resolve any dispute. Instead, you used a partial dataset that is not available to the public to attack my use of the official data that I had sourced and that you admitted were foreign to you.
More to the point, your analysis, such as it was, focused on the aspect of the data that I called “unusual” and, aside from your analysis of the wrong, partial database, failed to respond to the more serious issue, the identical pass rates.
I found ten instances in the six-year period where a Richmond elementary school reported identical reading and math pass rates at grade 3, 4, or 5.
Exactly half of those cases were at Munford.
The presence of Carver in this list carries a particularly unfortunate implication. The presence of Oak Grove, where there was a 2005 cheating scandal, casts a weaker aura.
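Readers who want to run the same check against the public VDOE download can do so with a few lines of code. The sketch below is only an illustration, not the spreadsheet analysis described above: the file name, column names and subject labels are assumptions that would have to be matched to the actual export.

# Illustrative sketch only. Assumes a CSV export of VDOE SOL pass-rate data with
# (hypothetical) columns: School Year, School Name, Test Level, Subject, Pass Rate.
import pandas as pd

df = pd.read_csv("vdoe_sol_pass_rates.csv")          # hypothetical file name
df["Pass Rate"] = pd.to_numeric(df["Pass Rate"], errors="coerce")

# Keep the elementary grade levels and the two subjects at issue.
df = df[df["Test Level"].isin(["Grade 3", "Grade 4", "Grade 5"])]
df = df[df["Subject"].isin(["English: Reading", "Mathematics"])]

# One row per school/year/grade, with reading and math pass rates side by side.
wide = df.pivot_table(index=["School Year", "School Name", "Test Level"],
                      columns="Subject", values="Pass Rate").reset_index()

# Flag the cases where the two pass rates are identical.
matches = wide[wide["English: Reading"] == wide["Mathematics"]]
print(matches.sort_values(["School Name", "School Year"]).to_string(index=False))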
Friar Occam would counsel the same conclusion that I suggested:
Cheating (by the school, not the students), done systematically, could easily produce equal pass rates in the two subjects. Coincidence is most unlikely here. No other explanation presents itself.
I now would amend that to include another unlikely possibility, error by VDOE in processing the data. While possible, that also is not a likely explanation. I have been looking at their data for more than a decade. I recall only one instance where they made a mistake (it was obvious on the face of the data) and one other where they failed to demand that Richmond complete a faulty submission.
As you saw (or should have seen) from my earlier email, I have asked Pyle whether VDOE has an explanation for the discrepancy between your and VDOE’s pass rate data. The copies you provided, along with the discussion in your last email, give us that explanation: You chose to rely upon a limited set of data, not available to the public, that reports the results of CAT tests, not the full range of tests included in the VDOE pass rate data that I cited and used in my analysis.
To the more fundamental point, VDOE has already taken a first look at the Munford data: Charles Pyle of VDOE wrote me on the 17th:
"Not the final word, but I wanted you to know that we looked at the Mary Munford Elementary data. When you breakdown the grades 3-5 reading and math pass rates, the advanced/pass-proficient percentages are not identical. Given the number of students tested in each grade level, coincidence is not out of the realm of possibility."
Of course, coincidence is not out of the range of possibility; neither is VDOE error. Neither is probable, however, and coincidence is especially unlikely in light of the data above.
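How unlikely is a chance match? One rough way to put a number on it is to simulate a single grade level. The sketch below is a back-of-the-envelope Monte Carlo illustration, not a rigorous analysis; the cohort size and pass probabilities are assumptions chosen to resemble the Munford numbers, and the two subjects are treated as independent.

# Back-of-the-envelope simulation. The cohort size (70) and per-student pass
# probabilities (0.86 in each subject, treated as independent) are assumptions.
import random

def p_identical_rates(n_students=70, p_read=0.86, p_math=0.86, trials=100_000):
    """Estimate how often reading and math pass rates match exactly by chance."""
    hits = 0
    for _ in range(trials):
        reading = sum(random.random() < p_read for _ in range(n_students))
        math = sum(random.random() < p_math for _ in range(n_students))
        if reading == math:
            hits += 1
    return hits / trials

print(f"chance of an identical pass rate in one grade: {p_identical_rates():.2f}")

Under these assumptions a single-grade match is not rare, on the order of one in ten, which is Pyle’s point. The question raised above is whether repeated matches, across grades and years and concentrated at one school, remain plausible as coincidence.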
BTW: You have overlooked a feature of the State data that supports the notion that the Munford scores were not rigged. One giveaway to the Carver data was the sudden rise to best in the City. For (a startling) example:
In contrast, the Munford data, particularly in the third grade, show year-to-year variations, with an improving trend, all in a good but not spectacular range. For example:
If your people were cooking the data, these numbers could be expected to be more uniform and, especially, higher, contributing to a statewide all-subject average rank much better than Munford’s current #203 (of 1,714).
On another subject, you said, “Butcher seems to be complaining about our tax dollars used to support education.” Again, you have revealed your lack of attention to the material you are criticizing. My comment, “98.2 million dollars of our tax money at ‘work,’” was the sarcastic conclusion to a jeremiad about VDOE’s ability to easily spot cheating such as persisted at Carver for at least four years, and its abiding failure to do so.
Indeed, in the Carver case, VDOE demonstrated that it has a key to falsification of the cheating hypothesis as to Munford. Their analysis of the Carver situation comported with the obvious notion that students whose scores are artificially raised in elementary school will generally score much lower once they enter middle school. I already have asked Pyle whether VDOE intends to perform a similar cohort analysis of the Munford students who have recently entered middle school.
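For illustration, a cohort check of the kind described could be sketched as follows. This is not VDOE’s actual method: a real analysis would follow individual student records, which are not public, and the file name, column names and school names here are placeholders.

# Illustrative cohort comparison using school-level pass rates. Assumes a
# (hypothetical) table with columns: School Year, School Name, Test Level,
# Subject, Pass Rate. VDOE's actual analysis would use student-level records.
import pandas as pd

rates = pd.read_csv("vdoe_sol_pass_rates.csv")       # hypothetical file name

def pass_rate(school, level, subject, year):
    rows = rates.query("`School Name` == @school and `Test Level` == @level and "
                       "Subject == @subject and `School Year` == @year")
    return rows["Pass Rate"].mean()

def cohort_drop(elementary, middle, subject, grade5_year, grade6_year):
    """Grade 5 pass rate at the elementary school minus the same cohort's grade 6
    pass rate at the receiving middle school the following year. A large positive
    drop is the pattern VDOE found in the Carver cohorts."""
    return (pass_rate(elementary, "Grade 5", subject, grade5_year)
            - pass_rate(middle, "Grade 6", subject, grade6_year))

# Placeholder school names and years, for illustration only.
print(cohort_drop("Mary Munford Elementary", "Receiving Middle School",
                  "Mathematics", "2016-2017", "2017-2018"))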
I hope you will join me in asking that they conduct that study for all of our elementary schools. Even though it cannot identify problems with this year’s grade 5 pass rates, that study can root out past problems and, more importantly, should serve as prophylaxis for next year’s testing.
The email:
I am not sure what you mean by bowderlized. I sent the data based on what you had put on our blog related to student pass rates (just grade 3 and 5 from 2017 and 2018 and what Carol W. had requested. The data sent is “raw” data that shows the actual number of students who took the test and who passed or failed. The data from the state you get may be the adjusted pass rates. This takes into account students who were “recovery” that can add to your pass rate, those who transferred into the school form another division after the first 30 days of school (Failing scores don’t count) and English Language learners whose scores are not counted when DOE adjusts the pass rates.
It also does not include special Education students who take the Va. Alternative Assessment (VAAP). That data is reported separately, but included in the state published pass rates. As I told Ms. Wolf, the scores at Munford are at or just a little higher than what would be expected based on the population we serve. While I would love to say the reason our students perform so well is they great teachers and principal, the results reflect what we see all over Virginia and the nation.
I did neglect to include a sheet for the Plain English Math Assessment that one student took that so the math and reading pass rates in 2018 were the same (4 students failed both math and reading our of the 71 who took the tests). but having the same number of kids passing or failing a test is not unusual and does not mean much. In our case, 3 of the 4 students were special education students and this is a challenge for all schools, and one of the gap groups. In grade 5 reading and math go hand in hand as there is a great deal of reading students must do in the math assessment. The pass rate has nothing to do with how student performed as passing scores range from 400 – 600. A better way to look at grade level performance is the mean scaled score. The mean scaled score for math was 510.5 and the mean scaled score for reading was 497.9. So even students who passed had great variation in their scores (as expected).
For schools, the raw data is what is used to look at instruction, addtional support services and programs to support students. The adjusted pass rates are really just a school accountability process for accreditation. For instruction use, we use the raw data.
On Tue, Sep 18, 2018 at 9:54 AM, John Butcher wrote:
Carol and Chuck,
Here . . . are the (obviously bowderlized) Pearson pdf that Carol received from Munford along with my spreadsheet that compares the Pearson data for Munford (4th grade numbers were absent from the Pearson data I received), my original Munford numbers extracted from a large dl from the VDOE site, and a fresh Munford dl from the VDOE site. Yellow headings denote my calculations.
The VDOE pass rates are identical in both downloads. There is a clear disconnect between the Pearson and VDOE data.
[Chuck]:
Do your data folks have an explanation?
Carol tells me that the Principal told her that the Pearson data include SGPs. Does the FOIA still define “public record” to include data “prepared or owned by, or in the possession of a public body or its officers, employees or agents in the transaction of public business”?
Muzik email of 9/20:
1. The data I provided is the raw data from Pearson and was not altered data in any way. This does not account for SOA adjustments that relate more to accreditation that student performance. The Raw data is what drives instructional priories and programs to address remediation and student support.
2. I only addressed grade 3 and 5 because there seemed to be a concern about the SOL pass rate growth of 8 points from grade 3 – 5 which matches the growth state wide from grade 3 – 5. I provided several reasons why we see growth from grade 3 – 5.
3. It it not unusual to have the same pass rates in reading and math in a specific grade level. In grade 5 last year, it just happened that there were 4 students who failed the SOL math and 4 who failed the reading. This does not mean the results are the same as the range of failing is 0 – 400. It is also not unusual for students who fail the reading test to also fail the math especially if they are students with disabilities. Many students who struggle with reading comprehension, vocabulary and fluency may also have difficulty with math fluency. However, I reviewed the specific students scores and in grade 5. Only one student failed both the reading and math SOL tests. The rest of the students failed one and passed the other with scores ranging from a low of 301 to a high of 397. A student with a score of 397 was one question away from passing!
As already indicated, there is nothing unusual about the SOL results at Munford. We are one of very few schools in the city that has very few children living in poverty. Our student population reflects the neighborhood we serve. Our scores reflect what is seen across the state based on the population of students. All other outcome indicators at Munford have similar results such as the PALS, NWEA MAP and ARDT when we used it.
Mary Munford has outstanding teachers who provide high quality instruction to students and this does impact SOL results, but may not be seen if simply looking at pass rates. High quality instruction helps all students, but pass/fail end of course testing does not show student growth during the year.
Pyle email:
I’ll just add that I am not familiar with the form of the data schools receive directly from Pearson. The 2017-2018 pass rates reported by VDOE in August were calculated according to business rules under ESSA, just as pass rates were previously calculated according to business rules under NCLB. The adjusted accreditation pass rates we’ve reported for SOA purposes reflect the state Board of Education’s regulations and another set of business rules.
***
Next week, VDOE will report accreditation data aligned with the revised SOA.