Two Aprils ago the District resolved to "Establish A Framework For Continuous Improvement" of its diverse complex of approximately one thousand schools: traditional, charter, special education, special programming, and more.

The idea was to "identify and track a uniform set of measures for each school's overall annual performance that would enable district and parents alike to understand and evaluate school performance."

A little off schedule (about nine months late), two trios of educational consultants converged from Washington, DC ("Collaborative Communications") and Madison, WI ("Education Analytics") to present trial dashboard-like screens of the new School Performance Framework (SPF) to parents assembled during four daytime sessions, still ongoing this week (click here to attend).

The trouble is that, at least at Tuesday's session, fewer than a dozen parents were present, and most of them had heard of the meeting through word of mouth, via a shallow pool of friends and acquaintances who were, essentially by definition, demographically similar; not through any direct invitation or message from the District. Sharply questioned by a parent about outreach, a District official explained that only parents already engaged with elite, District-level councils were notified during this holiday period, and that the meetings were not advertised or even posted on LAUSD's website. Curiously, the demographic directly represented reflected just those special-interest lobbying groups originally invoked (@1:27.07) fifteen months earlier at the time of the Resolution's hasty passage: Parent Revolution and SpeakUp. Conspicuously absent were parents not previously engaged with the District, or holding a perspective not actively curated.

The imported consultants took the temperature of potential parent-users regarding individual metrics (e.g., standardized test scores, school suspension rates, reclassification rates of English-language learners); summary baskets holding a suite of these metrics (e.g., "school climate," "growth," "college/career readiness"); and the overall mixture of these summary baskets (e.g., 40% contributed by basket A, 25% contributed by basket B, and so on). The resulting weighted composite score is used to compute a single, Yelp-like, five-star rating.
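To make concrete what such a scheme entails, here is a minimal sketch of a weighted-composite star rating. The basket names, scores, weights, and star cut-points are hypothetical illustrations for arithmetic's sake, not LAUSD's actual formula, which has not been published in this form.

```python
# Hypothetical weighted-composite-to-stars sketch; none of these
# numbers or basket names reflect the District's real methodology.

def composite_score(baskets, weights):
    """Weighted sum of basket scores (each assumed pre-scaled to 0-100)."""
    return sum(weights[name] * score for name, score in baskets.items())

def star_rating(score, cuts=(20, 40, 60, 80)):
    """Map a 0-100 composite onto a 1-5 star scale via fixed cut-points."""
    return 1 + sum(score >= c for c in cuts)

baskets = {"growth": 72, "climate": 85, "readiness": 55}        # hypothetical
weights = {"growth": 0.40, "climate": 0.25, "readiness": 0.35}  # hypothetical

score = composite_score(baskets, weights)  # 0.40*72 + 0.25*85 + 0.35*55 = 69.3
print(round(score, 2), star_rating(score))  # → 69.3 4
```

Note how much machinery sits between a school's raw data and the star a parent sees: a scaling choice per metric, a grouping choice per basket, a weight per basket, and cut-points for the stars, every one of them a judgment call.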

Because complex ecosystems like a school can be collapsed into a single, summative rating just like that. Because inherently non-comparable metrics (say, reclassification rates of non-native speakers; or student fitness; or academic achievement among schools with significantly different proportions of special education students) can be manipulated onto a single quantitative scale and thereby termed comparable. Because the individuality of every unique learning child can be subsumed in a homogenizing generalization that erases needs and personality: just because.

Most of LAUSD's metrics are already available from the State Department of Education. And those that are not could simply be released by the District without the associated qualitative interpretation, color-coding, star-rating, or ranking.

Because the artificial constructs of market baskets, and the subsequent weighting of these, contribute dramatically to unstable summative interpretations. When these translate into serious, "high-stakes" choices like choosing a school, labeling a failure, or censuring a teacher, equivocal conclusions assume a false authority. The American Statistical Association was very clear about value-added assessment (VAA, aka VAM) in a rare 2014 public statement: "VAMs should be viewed within the context of quality improvement, which distinguishes aspects of quality that can be attributed to the system from those that can be attributed to individual teachers, teacher preparation programs, or schools" [emphasis added].
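The instability is not hypothetical hand-wraving; it is arithmetic. The sketch below uses invented scores for two imaginary schools to show how a modest shift in basket weights, with no change whatsoever in the underlying data, reverses which school "wins."

```python
# Hypothetical demonstration of weight-driven ranking instability.
# The scores and weights are invented for illustration only.

def composite(scores, weights):
    """Weighted sum across two baskets."""
    return sum(w * s for w, s in zip(weights, scores))

school_a = (90, 50)  # strong on basket 1, weak on basket 2
school_b = (60, 80)  # the reverse

w1 = (0.6, 0.4)      # weighting that emphasizes basket 1
w2 = (0.4, 0.6)      # weighting that emphasizes basket 2

print(composite(school_a, w1), composite(school_b, w1))  # → 74.0 68.0 (A ahead)
print(composite(school_a, w2), composite(school_b, w2))  # → 66.0 72.0 (B ahead)
```

The schools did not change; only the weights did. Yet a parent reading the composite would conclude, in one world, that School A outperforms School B, and in the other world the opposite.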

Please read the whole paper: it is excellent and terse, and it should be well heeded. VAM by any other name (VAA, summative assessment, SPF) is a fraught technique with a ghastly history. We must never forget the horrible sequelae of the LA Times' ingenuous embrace of its false promise. Artificially bolstering support for the SPF among astroturf parent groups, and camouflaging its pre-ordained, inexorable imposition by nominally conforming to policy mandates for stakeholder engagement with inadequately noticed and barely heeded "parent focus groups," serves no one well at all. Not students, not teachers, not the district, not statistical science.

Fact-based, quantitatively evidenced science is the only kind there is. But substituting a superficial, mathy facsimile for the complex socio-developmental process of education is to betray our children's educational prerogative to high-priced, commercial, partisan stakeholders.