Yelp! … But For Schools?!
Friday, 27 Sep 2019
Written by redqueeninla in Education, LAUSD, Privatization
Tags
growth, growth modeling, LAUSD Board Of Education, Performance framework, performance growth, privatization, School Performance Framework, SPF, statistical modeling, Summative assessment, VAM
Is it a good idea to label schools with a really simple star-rating?
Can we do for Education and schools what Yelp has done for eating and restaurants?
No! Of course not.
And we all know it. It’s not just that summarizing a complex ecosystem like a school in one single number is simplistic. It’s not just that your child’s needs differ from my child’s, even while both may be served perfectly well (even mutually beneficially) in adjacent seats. It’s not just that your opinion differing from mine may reflect nothing more than our legitimate, genuine differences.
It’s that an opinion aggregator like Yelp is fundamentally, technically different from the “School Performance Framework (SPF)” that LAUSD resolved in the spring of 2018 to develop for stakeholders. On further reflection, the school board will be reconsidering that summative framework (the wording of the resolution is here, at “tab 26”) at its BOE meetings next month.
Yelp works by taking a hard number, your assessment, and “averaging” that opinion with everyone else’s. Statistical theory says that with enough of these assessments, the averages settle into a bell-curve shape whose peak is the “best guess” of “most” folks’ opinion.
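To make that concrete, here is a minimal sketch (in Python, with made-up ratings rather than anything pulled from Yelp) of what an aggregator of this kind actually does: it averages the hard numbers people submit, and with enough of them those averages cluster in the bell-curve shape described above.

```python
# Minimal sketch of what an opinion aggregator does: it averages the hard
# numbers users submit. The ratings below are invented for illustration;
# no real Yelp data or API is involved.
import numpy as np

rng = np.random.default_rng(0)

# 500 hypothetical 1-5 star ratings for one restaurant
ratings = rng.integers(1, 6, size=500)

# The displayed "score" is nothing more than the arithmetic mean.
print(f"Displayed star rating: {ratings.mean():.1f}")

# Averages of repeated samples of those ratings cluster in a bell-curve
# shape around the overall mean (the central limit theorem at work);
# that is the sense in which the peak is "most" folks' best-guess opinion.
sample_means = [rng.choice(ratings, size=50).mean() for _ in range(1000)]
print(f"Spread of the sample averages: +/-{np.std(sample_means):.2f} stars")
```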
The School Performance Framework would cough up a single number just like Yelp does, but it’s derived completely differently. The SPF depends on a guess, an estimate, instead of simply characterizing hard data. (The leap of faith in Yelp scores lies in presuming that the average will resemble your own opinion, but that’s another issue.)
That’s what the much-vaunted “Growth score” is about. “Growth” depends on a bottom-line expectation, a prediction, an interpolation: a guess.
On the basis of a statistical model that generates an “expectation” of performance, “growth” is defined as the difference between assessed and expected performance. And since “growth” constitutes most of the SPF score, SPF is therefore principally composed of a modeled estimate only. SPF essentially launders a prediction through a process so opaque that it comes out the other end looking like a hard assessment. But it isn’t; it’s still just a guess.
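For the curious, here is a deliberately simplified sketch of how such a “growth” number gets manufactured. The figures are invented, and a plain linear regression stands in for whatever proprietary model a vendor might use; this is not the SPF’s actual model, only an illustration of the point that “growth” is a residual from a prediction, not a measurement.

```python
# Toy value-added-style "growth" calculation with invented numbers.
# This is NOT the SPF's (or any vendor's) actual model; it only shows
# that "growth" = assessed score minus a model's *predicted* score.
import numpy as np

rng = np.random.default_rng(1)

# Invented district-wide prior- and current-year test scores, used to fit
# the prediction model.
district_prior = rng.normal(350, 40, size=200)
district_actual = 0.9 * district_prior + 40 + rng.normal(0, 15, size=200)

# Step 1: a statistical model (here, a plain linear regression) generates
# an "expectation" of current performance from prior performance.
slope, intercept = np.polyfit(district_prior, district_actual, deg=1)

# Invented scores for one school's students.
school_prior = np.array([310.0, 340.0, 360.0, 385.0, 400.0])
school_actual = np.array([325.0, 338.0, 372.0, 380.0, 415.0])
school_expected = slope * school_prior + intercept

# Step 2: "growth" is the gap between assessed and expected performance,
# i.e. a residual from the model: an estimate, not a direct measurement.
growth = school_actual - school_expected

# Step 3: the school's headline "growth score" aggregates those residuals,
# so it inherits every assumption baked into the prediction model.
print("Per-student growth:", np.round(growth, 1))
print("School growth estimate:", round(growth.mean(), 1))
```

Every choice in that chain (which model, which prior scores, which student characteristics to control for) changes the “expectation,” and therefore changes the “growth” a school is credited with.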
And everyone should know by now that “past performance is no guarantee of future results.” Today’s schools are still hurting from the budget cuts inflicted in 2008 following widespread, collective embrace of venture speculation.
To be sure, the impulse to generate a best guess about “performance” is urgent and understandable. Parents (and guardians) today, everywhere, feel impelled to do the best we possibly can for our charges, and that argues for identifying, somehow, “the best.”
But as that other great truism suggests, “you can’t always get what you want.” Just because for-profit real estate websites erroneously render schools’ performance as a single meaningless number doesn’t mean the public entity overseeing schools should condone or contribute to specious data engineering.
There’s no shortage of information available for analyzing our schools. The trouble is using it appropriately to inform policy, whether public or personal. Parents may access the state’s “dashboard” of school indicators. It takes a little time to understand and hold multiple factors in mind at once, but that is our job as advocates and there is no shortcut for it. This matters to us as individuals, and because each family’s experience and needs are unique, there is no way to assess an individual’s needs globally.
As it happens, wrestling lots of different variables, and different kinds of variables, down into one single “answer” removes a parent’s prerogative, indeed responsibility, to evaluate the parts that matter most to us personally. Not incidentally, the complexity and opacity that mark the SPF index and its “growth” component wind up facilitating a route for bias and politicization. Halting implementation does not “deny” anyone access to magical growth modeling; it restrains a system so rococo and esoteric that it can camouflage statistical or modeling problems, whether intentional or inadvertent.
Tellingly, the “Performance Framework” is a way of thinking that hails not from education policy but from the world of business management. It arose out of 1990s-era strategic management theory and its establishment of “management processes and setting performance goals and objectives.” Which is to say that the process of setting performance goals and objectives is itself political and subjective. Driven by neoliberal public choice theory, this framework champions private business in lieu of public service. And the procedure for deriving performance metrics, though numeric, is all the same vulnerable to the ideological bias of its consultants and their overseers, who are neither statisticians nor even psychometricians but promoters of this ulterior agenda.
There are other technical objections to the index, but in the end it is enough simply to respect one’s sense that a single rating like this is “too good to be true.” It is the market theorists, after all, who point out there is “no such thing as a free lunch.” So it is hardly surprising that burying the SPF is not “hiding growth data,” as the charter lobbying front group SpeakUp insists, but protecting us from ideological and political manipulation.
Data is an important tool for understanding and influencing group dynamics. But its use in making individual decisions is another matter altogether. Read the excellent 2014 statement from the American Statistical Association, the professional association of statisticians, on precisely this matter (the SPF uses a “Value-Added Assessment Model”). And support your school board member in protecting us from other people manipulating our decisions for us. Let them know you support their decision (find contacts here) to rely on the state’s dashboard rather than the SPF, and to limit unknown consultants’ filtering of data and instead let the public access education data directly.