ANNA MARIE MURPHY
GARY WAYNE GILBERT
On a balmy spring evening in May -- what lately passes for protest season
in the public schools of Massachusetts -- Boston College education
professor Walter Haney sits cross-legged on the floor of the stage
of the Oak Hill Middle School auditorium in suburban middle-class
Newton, going over notes for his presentation. A compact man with
disheveled white hair, he is the picture of studiousness as the
room bubbles to life around him.
In the auditorium seats, some 200 parents, students, teachers, and
other interested parties are gathering for a forum on the three-year-old
Massachusetts Comprehensive Assessment System (MCAS), a springtime
battery of challenging tests required of students primarily in grades
four, eight, and 10 by the state's 1993 Education Reform Act. Thus
far the tests -- mostly multiple-choice, with some open-ended or essay
questions -- have been used only to help identify weak students, ineffectual
schools, and struggling school districts. Now MCAS (pronounced em-kas)
is on the brink of becoming the state standard for graduation -- that
is, to use educators' jargon, on the brink of becoming "high stakes."
Beginning with the high school class of 2003 (the 2000-01 sophomores),
virtually all public school students will have to pass the 10th-grade
version of the math and English MCAS tests in order to receive a
high school diploma. They'll get five chances. And if current trends
persist, many students will need all five and more. Last year, 34
percent of sophomores failed the English section and 45 percent
failed the math. Among African-American students the figures were
more grim: 60 percent failed the English portion and 77 percent
the math. Latino students fared even worse.
Most in Haney's audience already feel a personal or professional
stake in the matter, and when they stand at their seats to speak,
their tones are charged and angry. A mother assails what she calls
the "demoralization" of children who fail -- a particular
concern for parents whose children have learning disabilities or
attend vocational school and must take the tests along with students
on the honors track. A principal decries the disruption of ordinary
curriculum and the sacrifice of enrichment programs to prepare for
the tests, as well as the class time lost to the tests themselves,
which can take upwards of 30 hours to complete. (His own eighth
graders happened to have scored best in the state in the previous
year; when reporters asked how that felt, his reply was, "I
feel outraged.") A teacher on the board of the Massachusetts
Teachers Association warns that if large numbers of students continue
to fail the MCAS -- which seems likely -- opportunists will push hard
for the privatization of public schools. "Public education
is the place to make money today," he says, citing charter
schools and vouchers. But first, "you have to prove the system
doesn't work." Students are heard from too. A teenager from
a nearby high school reports that 20 sophomores in her school had
a zero averaged in to their English grade as punishment for boycotting
The evening turns ugly when Deputy Commissioner of Education Alan
Safran rises from the audience to defend the MCAS tests. Parents
and teachers shout him down. "Are you a teacher?" several demand.
On the stage, Walt Haney remains silent through the most heated
moments. His role in the forum is to provide hard information, and
he is the only speaker whose presentation includes overhead slides
of tables and charts.
For a professional lifetime, and for the 17 years he's been associated
with Boston College's Center for the Study of Testing, Evaluation,
and Educational Policy (CSTEEP), Walt Haney's research has focused
on the unintended negative consequences of high-stakes testing.
Over the years, his work has fueled legal challenges to government-mandated
tests in states including North Carolina, New York, Mississippi,
and Texas. "I have this terrible weakness for civil rights
lawyers," he says.
Haney is no stranger to grass-roots forums on MCAS like the one
at Oak Hill, and he makes no secret of his opposition to using tests -- even
relatively well-crafted ones -- as the sole gateway to student advancement.
He shares with his audience the results of his studies and those
of some of his colleagues: about how the MCAS tests compare inconsistently
with older, well-established tests; how their handwriting requirements
prompt schools to pull students away from computers for retrograde
longhand exercises; and how high-stakes tests elsewhere have spurred
a rise in student dropouts.
CSTEEP (pronounced see-steep) was founded at Boston College
in 1980 within the Lynch School of Education by George Madaus, who
is the Boisi Professor of Education and Public Policy, and Associate
Professor Joseph Pedulla. What began as an operational umbrella
for a handful of scholars conducting research on testing has expanded
today to include some 80 staff and associates. The group outgrew
its headquarters in Campion Hall about three years ago. Some of
its researchers now have offices in the football stadium; others
occupy a pair of trailers in the parking lot of the Merkert Chemistry
Center on BC's lower campus.
During the 1990s, the center was home to the Third International
Mathematics and Science Study (TIMSS) -- which remains active in some
40 countries and represents the largest international assessment
of student achievement in math and science. Two years ago, TIMSS
spun off to form the International Study Center at BC. Current programs
under CSTEEP include the National Board on Educational Testing and
Public Policy, described by one associate as "a sort of Consumer
Reports for people using tests." The board monitors and evaluates
testing programs around the country. Researchers in another program
are working with school systems in six U.S. cities -- Minneapolis;
San Diego; Chattanooga; Corpus Christi; Louisville, Kentucky; and
Long Beach, California -- to improve their assessments of student
performance. In 1999-2000, CSTEEP took in $2.5 million in grant
money, placing it among the University's leading recipients of outside funding.
Not long ago, the conservative Thomas B. Fordham Foundation published
a censorious paper by Richard P. Phelps, a proponent of high-stakes
tests, entitled "Why Testing Experts Hate Testing." Phelps,
who has provided research for the Indiana Department of Education
and the federal General Accounting Office, accused both CSTEEP
and UCLA's Center for Research on Evaluation, Standards, and Student
Testing -- the nation's top two programs in educational measurement
studies -- of harboring antitesting bias; accused them, in effect,
of allowing politics through the door. Although some members of
the Lynch School faculty do enter the political fray over high-stakes
testing -- usually, like Walt Haney, in opposition -- the full picture,
as ever, is more complicated.
The fact is that educational testing in this country has been entangled
with political purposes almost from its beginnings. Research programs
on testing at BC, for instance, got a big boost from the National
Defense Education Act of 1958. Among that law's goals was "the
fullest development of the mental resources" of the country's
youth. The federal money it unleashed nourished not only university
studies but also, through the conduits of state and local education
departments, the nascent testing industry.
The high-stakes testing movement that is now capturing state capitals
grew out of concerns that the emphasis on student self-esteem and
the softening of public school curricula during the 1960s and 1970s
had resulted in the need for remedial classes on US college campuses -- the
"rising tide of mediocrity" cited in the National Commission
on Excellence in Education's 1983 report "A Nation at Risk."
Some of the most distinguished names in testing scholarship and
practice appear on the Lynch School's faculty rolls. Consult almost
any treatise on the use of standardized tests and you will find
citations for research by Walt Haney and George Madaus.
Professor Albert Beaton's career in the testing industry encompasses
key posts at the most prominent testing organizations in the country,
including the Educational Testing Service, home of the SATs. Professor
Diana Pullin, who is also a lawyer, has provided counsel in nearly
every significant court case addressing high-stakes testing for
high school graduation in the past 22 years.
As the MCAS protest season looms again in Massachusetts -- indeed,
as controversies over high-stakes testing heat up nationwide -- the
Lynch School is likely to be a center of attention, and a font of expertise.
Among the studies that Walt Haney described at the Oak Hill forum
was one undertaken by George Madaus, together with Professor Irwin
Blumer and doctoral students Catherine Horn and Miguel Ramos. To
see how the MCAS measured up against other tests, the team compared
3,728 Massachusetts eighth-grade students' ratings on the mathematics
portion of the test (four ratings are awarded: Failing, Needs Improvement,
Proficient, and Advanced) with the same students' scores on the
nationwide Stanford 9 math test. What they found raises questions
about how the MCAS categorizes students. Although the highest-scoring
students tended to do well on both tests, and low-scoring youngsters
were also fairly consistent, there were plenty of exceptions. Indeed,
the 40 percent of pupils who did best on the Stanford test showed
up all over the MCAS scoring range: Some 18 percent placed in the
MCAS Failing category, and 27 percent were designated Needs Improvement.
Madaus, a lean man with a sharp gaze, likens standardized tests
to the protagonist of an English song, "Cosmo, the Fairly Accurate
Knife Thrower." If there's one thing that he thinks the general
public ought to know about tests, it's that they are fallible. "People
tend to take test scores and reify them," he says. "But
decisions made on the basis of test scores can often be wrong -- in
either direction." The high-stakes decision about whether to
graduate a student should turn not on a single test, he says, but
on a variety of indicators, including classroom performance. "If
a doctor gets a bad number back on a prostate-cancer test,"
he observes, "the doctor gets other measures before doing anything."
The passing score on each MCAS test is 220 (on a scale from 200
to 280). But, says Madaus, "If you look at the work product
on tests between a 219 and a 220 -- or even between 218 and 222 -- there's
really no difference. I'm of the opinion that these cut scores are
taken too damn seriously."
Madaus is not opposed to standardized tests. In fact, he serves
on the state's technical advisory committee for the MCAS. The MCAS
is, by his account, "state of the art." But, he says,
when a test, by virtue of being high stakes, comes to represent
broad educational standards, a "corruption" typically
follows. It's called the Tradition of the Past Exam -- teachers using
old tests to prepare students for the next one.
In a paper he published in 1985 with Vincent Greaney, Ph.D.'73,
now an education specialist for the World Bank, Madaus described
the preposterously dramatic effect this practice can have. He and
Greaney looked at student essays written to obtain the Primary Leaving
Certificate in Ireland during the 1940s. In three consecutive years,
students were asked to write about a bicycle ride (1946), a day
in the bog (1947), and a bus tour (1948). There must have been a
great deal of rehearsal from the same script, however, because despite
changes in subject matter, passing essays by different students
in 1947 and 1948 contained the identical sentence, "Early summer
roses grew all over the pergola which stood in the middle of the
garden." No doubt the teachers of 1947 and 1948 thought they'd
taught their pupils well, but at that point, says Madaus, the test
wasn't measuring writing anymore. It was measuring the ability to
"memorize stock phrases and give them back."
The MCAS tests had been around for only a year when the Tradition
of the Past Exam began to take root in Massachusetts. After the
first administration of the eighth-grade test in 1998, the Department
of Education published an example of a high-scoring student essay
on the Internet, as a guide to teachers. A somewhat gory indictment
of television violence, the essay began with the command: "Imagine
a woman lying on the ground. . . ." In 1999, say test scorers,
quite a few essays began with the word "Imagine."
Does it matter that teachers are teaching to the test if the test
reflects high standards? "No test is good enough to be a curriculum," says Madaus.
Nearly all states test their students to see how they measure up
to curriculum standards, if only for diagnostic purposes. Over the
next few years, 28 states will require the passage of an exit test
for graduation. One state that is years ahead of Massachusetts -- and
indeed most of the country -- in its high-stakes testing experience
is Texas. The Texas Assessment of Academic Skills (TAAS) has been
administered since the 1990-91 school year. Passing the test became
a graduation requirement in 1992-93.
As with the MCAS, students in Texas take the TAAS primarily in the
fourth, eighth, and 10th grades. Like the MCAS, the TAAS is meant
to serve multiple purposes: to gauge the competence of individual
students and their schools, and to push students and schools toward
more rigorous standards by testing for advanced academic skills
and "higher-order thinking."
In 1998, however, concern about the lagging scores of African-Americans
and Latinos on the TAAS prompted the Mexican American Legal Defense
and Education Fund in Texas to approach Walt Haney and ask if he
would study the data and serve as an expert witness in an upcoming
court challenge (G.I. Forum v. Texas Education Agency).
Haney ended up spending four days on the stand in the federal district
court in San Antonio during the fall of 1999, discussing his findings
on, among other things, dropout rates and the numbers of Texas students
who were being kept back from grade to grade.
The challenge failed, but Haney was intrigued enough by what he
had learned that he continued his Texas research. (His willingness
to take up causes has earned him the sobriquet "resident Don
Quixote" at the Lynch School.) Haney's findings were published
by the online Education Policy Analysis Archives on August 19 in
a report entitled "The Myth of the Texas Miracle in Education."
Owing to the presidential campaign, the study attracted a fair amount of attention.
Haney had managed, in effect, an end run around a daunting roadblock
to determining the efficacy of the TAAS: namely, the notoriously
inaccurate dropout statistics kept by the Texas Education Agency.
Several years ago, a state audit determined that dropouts were being
underestimated by as much as a hundred percent (the old counting
methods are now being revised). So Haney started from scratch. He
collected statewide high school graduation data and enrollment figures
by race for every grade for 20 years. What he found was that in
the first year of the TAAS, 1990-91, the high school minority population
took a slide from which it has never fully recovered.
Students -- mostly minorities, but whites also -- who had stayed the
course from first grade onward were being held back to repeat the
ninth grade in increasingly large numbers. By the late 1990s, says
Haney, Texas schools were keeping back 25 to 30 percent of minority
students and about 10 percent of white students just on the eve
of the high-stakes 10th-grade test. Although these were undoubtedly
the weakest students, who could well benefit from some kind of remedial
help, Haney suspects that in too many cases teachers and administrators
were holding them back to make their schools' 10th-grade scores
"look better." In Texas, not only are poor TAAS scores
a blot on a school and its teachers, but teachers and administrators
may qualify for cash bonuses -- sometimes as much as $25,000 -- when
their students make gains on the tests.
Scholars elsewhere have uncovered the dampening effect that being
separated from peers and staying back a grade can have on teenage
students and their will to persevere to graduation. In Texas, Haney
has found, only two out of every 10 ninth-graders who are held back
ever finish high school. Two will eventually earn a national GED
certificate; another two will end up in Texas's official count of
dropouts. And four will simply disappear from the books.
As a sidebar to his research on dropouts, Haney has become something
of an accidental expert on test cheating by educators. In Texas
there appear to have been at least a dozen instances so far, he
says, including a case involving suspicious erasures in Houston,
in which three teachers were implicated. Two years ago the Austin
Independent School District was indicted for tampering with student test records.
Massachusetts, with its less centralized education system, has no
cash rewards policy (though, like Texas, the state does threaten
to take over schools where high failure rates persist). But already
some Massachusetts communities -- the Nauset Regional School District,
for example, and the town of Milford -- have begun unilaterally tying
school wages to MCAS scores.
Massachusetts differs from Texas in another essential way. While
in Texas the dropout rate approaches 30 percent, Massachusetts embarks
on its own high-stakes era with one of the lowest dropout rates
in the country, less than 4 percent. At the same time, the TAAS
is a relatively easy test; its 10th-grade math questions, for instance,
align with eighth-grade (some specialists say sixth-grade) expectations.
The MCAS tests, on the other hand, are considered to be among the
most challenging exit exams in the country. To put the high rate
of MCAS failures in human dimensions, of the 69,000 high school
sophomores who will take the MCAS this spring, an estimated 30,000
are expected to fall short of the passing mark.
Already, the number of students in Boston who are repeating at least
one subject in the ninth grade has doubled in the past year. "I
personally think," says Haney, "that the ramifications
of my work on Texas portend a tragedy that's going to happen here in Massachusetts."
Proponents of MCAS tend to stress the goals of the 1993 Education
Reform Act and its achievements to date. They point out, for instance,
that before 1993, the only statewide requirements for graduation
were a single year of American history and four years of gym. Although
often the center of controversy, and prone to delays, new comprehensive
curriculum guides -- called the Frameworks -- now exist in Math; English
and Language Arts; Science, Technology, and Engineering; History
and Social Science; Foreign Languages; Health; and the Arts.
Students have also seen a new emphasis on writing in Massachusetts
classrooms -- a response to the long composition requirement and to
the open-ended questions that appear on all MCAS tests. According
to the federally sponsored National Assessment of Educational Progress,
sometimes billed as the Nation's Report Card, Massachusetts eighth-graders
ranked second in the country in writing performance in 1998. Ironically,
they achieved this even as 45 percent earned the label Failing or
Needs Improvement in their English MCAS scores that same year.
And then there is the investment that has accompanied education
reform and the MCAS: State spending for public education has risen
in Massachusetts from $1.25 billion in 1992 to $2.8 billion in 2000.
As Republican Governor Paul Cellucci '70, JD'73, has made clear,
MCAS testing is the price that schools and students must pay for
this more than doubling of state funding. It is the means to determine
whether the money is being well spent. "We made a deal in 1993
with all the players," Cellucci has been quoted as saying.
"We said, We'll provide a lot more money . . . but we ask for
one thing in return: accountability."
Nearly as important as the money has been the attention. Thanks
to the controversial test, a brighter light now shines on public
education issues in the state than at any time since the days of
Horace Mann. The test has prompted new remedial efforts -- $3.7 million
appropriated in Boston, for example, for a program on teaching math,
and mandatory summer school for that city's failing students.
The bottom line is that high-stakes testing carries with it a sometimes
brutal calculus of human gains and human losses. Where to place
the values in this calculus is a political decision.
BC Education Professor Albert Beaton, a former director of CSTEEP,
is more favorably disposed to high-stakes standardized tests than
either Walt Haney or George Madaus. Beaton has spent most of his
career in the testing industry. For 27 years, he worked at the Educational
Testing Service, where he oversaw research and data analysis. In
1966, his third year at ETS, Beaton served as the chief technical
expert for what journalist Nicholas Lemann, in his 1999 book The
Big Test, has called "the single best-known piece of quantitative
social science in American history," the Coleman Report, an
investigation of black student underperformance. Beaton went on
to serve as director of design, research, and data analysis for
the National Assessment of Educational Progress. Upon arriving at
Boston College in 1990, he launched the international TIMSS project.
Three years ago data from TIMSS were used to show that eighth-grade
science students in Massachusetts ranked ahead of their peers in
Japan, Germany, England, and 37 other countries (behind only Singapore).
To Beaton, the problem is not so much that mixing high standards
with high stakes produces more failures -- "of course when you
do that more people will flunk," he says. The problem is that
there are no clear or happy answers for what comes next. If the
cost of high standards "is to flunk kids out, never to return,"
he says, "we don't want that in this country."
Last November the Massachusetts Association of School Committees,
by a lopsided count of 137 to 30, voted to recommend suspending
the MCAS graduation requirement indefinitely. The association maintained
that because of state delays in producing the Frameworks, schools
had not had enough time to adjust their curricula to the tests.
Indeed, a survey by the Massachusetts Teachers Association suggests
that only about half have managed to do so in math.
Elizabeth Reilinger, chairwoman of the Boston School Committee,
was one of the few dissenting voices. She sees the MCAS as an ally
in an effort to channel state support to the school districts that
need it most. "You could get into a discussion about whether
this is the right test or a fair test or a perfect test," she
says, "and it's not necessarily any of those. However, it is
a form of establishing some baseline for accountability."
It's also, she says, a way to measure whether the new remedial programs
are working. The reality, she says, is that "If MCAS is not
high stakes, people won't take it seriously." Research at BC
and elsewhere suggests that teenage test-takers in particular are
apt to slack off.
State Commissioner of Education David Driscoll '64, Ph.D.'81, agrees
that "until kids try," there will be no way to tell how
they measure up against the new tougher standards. Even so, he says
he is "very comfortable" with the passing score that the
state has set for the MCAS. "I'm confident that [students]
who don't pass MCAS frankly don't have the basic skills," he
says, noting that education reform began because high schools were
graduating students who lacked the basic competencies needed to
hold down a job, let alone continue on to a college or university.
"There's no question that a lot of our kids, particularly in
our urban areas, face challenges in their lives that make it much
harder for them to succeed academically," he says. But it is
schools "that offer the hope. These kids need to be held to
the standard or else they're never going to get out of this life
of limited opportunity."
Driscoll is braced for a "short-term" rise in the dropout
rate. "Hopefully," he says, "we will find different
ways to keep kids in school -- by providing a strong retest strategy,
tutoring, and other tools to help." As a show of accountability,
and faith, Driscoll has tied his future salary increases to student
progress on the tests.
Few, of course, would impugn the motives behind educational reform.
It is the narrow method employed to measure its progress that raises
hackles. The Massachusetts Teachers Association has said it will
propose legislation this year to base graduation decisions on an
array of measures, rather than just one test. In fact, this option
is already embedded in the Education Reform Act of 1993, which contains
language embracing a "comprehensive" assessment system.
Within a year of the act's passage, the state Department of Education
was discussing "portfolio evaluation, performance tests, and
other more authentic assessment techniques." But early talk
of factoring in students' class work has petered out at the state
level. When it comes to mandating a portfolio assessment system
as a way of evaluating samples of students' accomplishments over
time, James A. Peyser, chairman of the state's Board of Education,
has said, "the practical issues and fairness issues are just
insurmountable." (In a reverse twist, Vermont tried relying
on statewide portfolio assessment for about five years; it has since
added standardized testing to the mix.) The dispute over the MCAS
tests' exclusive reign will likely end up in the courts.
A landmark federal case waged in Florida from 1978 to 1984 points
to the issues that will arise should testing go on trial in Massachusetts.
It was litigated by Diana Pullin, now a professor of education administration
and higher education at Boston College. An outgoing woman with a
pleasant, take-charge manner, Pullin at the time was a practicing
attorney with no aspirations to join the academy. Among her expert
witnesses in the Florida case, which centered on the use of testing
to deny high school diplomas, were George Madaus, Walt Haney, and
Peter Airasian, a professor of educational evaluation and former
director of CSTEEP. That association, says Pullin, "was the
beginning of my long-term relationship with this university."
The judge's decision in the Florida case, Debra P. v.
Turlington, made plain that a state could indeed use a test
to award high school diplomas. But several conditions would have
to be met. First, students must have had a fair opportunity to learn
the material covered by the test. Second, the test would have to
be a valid and reliable measure of their competence. And third,
the test could not perpetuate racial or ethnic discrimination. In
1982, Pullin worked on the appeal of a case in Illinois -- Brookhart
v. Peoria School District -- that laid down a fourth condition:
The test could not discriminate against people with disabilities.
"In each of these areas," says Pullin, "Massachusetts
has some particular vulnerabilities that could result in successful
lawsuits against the state."
As in Florida, the court here would have the option of shutting
down the tests entirely or simply eliminating the graduation requirement.
Given the widespread belief that educational reform is necessary
in this country, Pullin does not expect any judge to toss out the
MCAS completely. Beyond that, no outcome is certain.
The Texas case in which Walt Haney testified against the TAAS went
down to defeat even though the judge found that students had been
victims of racial and ethnic discrimination. The judge, Edward C.
Prado, said it was the hardest decision he'd ever made, but in his
opinion the test served "legitimate educational goals."
The plaintiffs, he wrote, had not shown that "other alternatives
could sufficiently motivate students to perform to their highest potential."
Ironically, while the Texas case was underway, the American Educational
Research Association issued its standards for educational and psychological
testing. The standards state that a high-stakes decision should
not be based on any one test. Pullin served on the committee that
set the standards.
Walt Haney may be right when he says that "traditional scholarly
research" is not going to determine how standardized tests
are used. It will only inform the debate. The choice of whether
to adopt high-stakes testing is at its core a political one, and
even a philosophical one. It is a debate over what motivates people
to learn and to do their best. It is a debate over how we define
and measure success. And it is a debate over who we hold to be the
first client of public education -- the individual student, the economy,
or the society.
"Look," says George Madaus, "having a high school
diploma in America is no big deal" -- as researchers elsewhere
have demonstrated, it won't land anyone a great job, and it's no
guarantee of affluence -- "but not having one is a disaster."
From this perspective, the MCAS and other tests like it are not
so much exit exams as they are entrance exams. The aims of the MCAS -- high
academic standards, an educated work force, students fit for higher
education -- are attractive. But standing alone, the MCAS tests may
represent a narrow rendering of fitness for life after high school.
They automatically exclude students whose math or verbal skills
may be substandard but whose dependability, social intelligence,
strength of character, or other attributes leave open the possibility
that their limits have not yet been reached.
At a lecture on campus last fall, Walt Haney asked an audience of
academics and students how many had ever been stopped for speeding.
A field of hands went up. How many had received a ticket? About
half the hands went down. "We expect a police officer to use
discretion," Haney said. "Teachers know a lot more about
their students than police officers do about us when they stop us.
How much more sense it would make to return standardized tests to
their appropriate role -- to inform human judgment, not mechanically replace it."
As the forum last May at the Oak Hill Middle School wound down,
anger among the participants sometimes made way for alternative
proposals and plans. Early on, Haney had suggested that the decision
to bestow a high school diploma should be reached the way admissions
decisions at colleges are -- by taking into account a student's grades
and personal qualities as well as standardized test scores. Now
the middle-school principal who had shared his outrage addressed
how to better motivate teenage test-takers: Offer free state-college
tuition to all who pass the MCAS, he said. "Give the test on
Saturday, and make it voluntary. Why be harsh? Why be punitive?"
Others in the audience talked about swaying state officials to their
side. A parent from CARE, the Coalition for Authentic Reform in
Education, provided form letters to anyone who wanted to send a
message to representatives in the legislature. And a student from
SCAM -- the Student Coalition for Alternatives to MCAS -- invited everyone
present to join a rally near the State House two weeks later.
With academic research lending scant support to high-stakes testing,
protest over the MCAS grows -- at the grassroots and among local governments.
At the same time, legislative leaders, the governor, and the Board
of Education say their support for the test is unequivocal. The
battle may last for years.