David G. Tuerck
January 20, 2001
Education reform in Massachusetts
took an important turn this month with the state Department of Education's release of its cornerstone School Performance Rating Process (SPRP) report. The report
was issued as a step toward the implementation of the
Education Reform Act of 1993, under which the state assumed
a much expanded role in funding and managing what were
once locally funded and controlled public school districts.
The SPRP report examined schools for their
improvement on MCAS tests over the period 1998 to 2000.
Unfortunately, the methodology used to rate schools failed
in what should have been its central purpose, which was
to identify schools that do an exceptionally good
or an exceptionally bad job of teaching.
The reason is that the
ratings given individual schools are based on nothing
more than the subjective and arbitrary views of the report's
authors. This disregard for statistical rigor matters because of its consequences for schools found to fall short of, or to exceed, the authors' standards. Once
identified as failing to meet those standards, schools
can receive warnings or referrals for review.
School districts found to be chronically under-performing
can be put in state receivership. Conversely, schools
identified as exceeding DOE standards become eligible
for recognition as exemplary schools or role
models for others to emulate.
The hope behind these
penalties and incentives is to encourage schools to do
a better job of teaching. However, the design of the school
rating system doomed this hope from the start.
The problem is that, in
designing the rating system, the state ignored what everyone
knows, namely, that a school's performance depends
mainly on factors beyond the control of its administrators
and teachers, in particular, the socioeconomic character
of the community in which it operates. A business owner cannot decide what to expect of a store's manager without considering that manager's customer base. The owner of a chain of swimsuit shops does not expect his Maine shops to match his Florida shops' January sales. The same goes for different schools striving to
teach students from very different communities.
Education officials defend the SPRP on the ground that we should not permit socioeconomic factors to limit our expectations of students
in disadvantaged schools. But all the research shows that
socioeconomic factors overwhelmingly determine school
performance. To ignore these factors, therefore, is to
give low marks to, and hence penalize, some schools that deserve to be recognized and rewarded for their improvement, and to give high marks to, and hence reward, some schools that deserve to be penalized for their failure.
An education assessment
model developed by the Beacon Hill Institute shows how
a carefully constructed statistical model can lead to
results that are very different from, and far more reliable than, those obtained by DOE. The BHI model predicts school performance on MCAS tests with an extraordinarily high degree of accuracy. Schools whose students do much better on the tests than the model would predict can fairly be deemed to exceed expectations. Conversely, schools doing much worse can be deemed to fall short of expectations.
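To make the logic concrete, here is a minimal sketch of the residual-based approach just described, written in Python. Every figure, variable name, and socioeconomic indicator below is invented for illustration; this is not the BHI model's actual specification, only the general technique of regressing scores on socioeconomic factors and ranking districts by how far they beat their predicted scores.

import numpy as np

# Illustrative inputs, one row per district. The columns of X are
# hypothetical socioeconomic indicators (median household income in
# $000s, share of low-income students); y holds mean MCAS scores.
# All numbers here are made up for the sketch.
districts = ["Hadley", "Everett", "Clinton", "Shrewsbury", "Springfield"]
X = np.array([[58.0, 0.12],
              [41.0, 0.35],
              [44.0, 0.28],
              [62.0, 0.09],
              [33.0, 0.55]])
y = np.array([242.0, 231.0, 236.0, 244.0, 218.0])

# Ordinary least squares with an intercept: predict each district's
# score from its socioeconomic profile alone.
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

# The residual (actual minus predicted) is the performance measure:
# a positive residual means a district beat its statistical expectation.
residuals = y - A @ coef
for name, r in sorted(zip(districts, residuals), key=lambda t: -t[1]):
    print(f"{name:12s} {r:+6.2f}")

The point of the design is that the ranking uses residuals rather than raw scores, so a disadvantaged district that outperforms its circumstances rises to the top, while an affluent district coasting on its demographics sinks.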
Consider, for example,
the Hadley school district. DOE rated Hadley 4th
and 10th graders as having failed
to meet expectations and 8th graders as having
only approached expectations. Yet Hadley ranks among the top ten districts in the success of its students in outperforming the BHI model.
Thus, while DOE officials
would slap Hadley on the wrist for doing at best a mediocre
job of meeting their expectations, they should instead
go to Hadley to learn how to run a school. If they did so, they would find a lean administrative staff, teachers who voluntarily give extra time to meet learning targets, and a community committed to keeping its school system
small in order to maintain control of the curriculum.
Other lessons could be learned from Everett 4th graders, Clinton 4th and 8th graders, and Shrewsbury 10th graders, none of whom exceeded
DOE expectations but all of whom exceeded statistical
expectations.
Faced with the prospect
of having to deny graduation to students unable to pass
the MCAS tests, Massachusetts schools need to know which of them have a track record of good teaching. This they can do only when the state is willing to put no-nonsense statistics ahead of well-intentioned but entirely
unrealistic education standards.
This article appeared
in the January 20, 2001 edition of the Boston Globe.