This blog is committed to the idea that research matters, that research methods matter, and that they matter to regular teachers, principals, and district policymakers. It is also dedicated to figuring out why real research often doesn’t matter in the least, why it is so often disregarded, ignored, or simply unknown to practitioners and policymakers, and what we can all do about this sorry state of affairs.
And now, just as we’ve been trying to get this blog going, along comes a gift-wrapped controversy: the NCTQ report on the woes of our university-based teacher preparation programs. It’s perfect because this report manages to be off-base and laughably preposterous in every specific claim while at the same time making a broad claim that is undeniably true. The idea that this report’s methodology can accurately rank one teacher preparation program as higher quality than another is just plain silly, but the underlying idea that teacher preparation programs are collectively weak is just plain common knowledge.
So, first: this report got a ton of attention and was taken by many as a serious indictment of the quality of teacher preparation, despite being based exclusively on an examination of course syllabi and online program materials.
Seriously. This report purports to rate the quality of our nation’s ed schools based on their course syllabi.
Yet, it garnered headlines such as this one from the LA Times: “New teacher training study decries California universities.” And here is the Times’ summary: “A controversial policy group singles out teacher training programs at UCLA and Loyola Marymount as hardly worth attending. But the schools say the report is flawed.” And their analysis of the methods: “The researchers were trying to develop a consistent, relevant rating scale, including such measures as whether incoming teachers learn to analyze student performance data and whether they learn about phonics-based reading instruction. The council said its effort will evolve and should become increasingly reliable.”
And here was the Superintendent’s reaction: “It’s widely agreed upon that there’s a problem” with teacher training, said L.A. schools Supt. John Deasy. “The report points out that California has an acute set of problems.”
Now, to be fair to the media, the report has also been widely and prominently criticized for its methodological weaknesses and inaccuracies (Linda Darling-Hammond) and ideological bias (Diane Ravitch). So I won’t reiterate the details here.
But because the NCTQ report points to a problem that is widely perceived (you’d be hard pressed to find a teacher who would say s/he was well prepared for the job on day one), and because the report’s critics are often perceived as politically motivated, the “controversy” is playing out much the way the LA Times’ subheading frames it: “the schools say the report is flawed.”
This report ends up feeling a lot like the LA Times’ value-added controversy: an inflammatory and deeply flawed way of attracting attention to a problem that everyone already knows about. Surely there are better ways to fix the rotten planks in our education system than singling out individual teachers or ed schools to poke them in the eye?
This blog will attempt to be a place to find those better ways: a place for strong opinions about these problems, but also for honest appraisals of the evidence on all sides.
And as for this NCTQ report? It’s not that it had nothing of value to report. In my experience, UCLA, for instance, could likely benefit from a course focused on classroom management, with readings from Fred Jones, Rick Morris, Harry Wong, and others. And analyzing teacher education syllabi and course descriptions can probably yield broad lessons about the inconsistent approaches of various programs.
But the true impact of the NCTQ report ought to be a cautionary tale: when reading a “study,” read the Limitations before the Conclusion.