This is the story of how that came to happen.
Cartersville is a typical Georgia town. It isn't a wealthy suburb; it's a middle-class, mixed-race community with an average household income of around $36,000.
In 1988, John Bridges, a Cartersville school board member, attended a textbook fair and met John Saxon, the publisher of a series of math textbooks. Saxon had been an instructor at the Air Force Academy in Colorado Springs. Bridges, who had toured the Academy, knew its excellent reputation and was willing to listen to Saxon's story. Saxon made his pitch and gave Bridges samples to take back to Cartersville.
The books ended up with the sixth grade math teachers, who were asked to evaluate them. Saxon gave the district enough books to test the program with one of the sixth grade classes. At the end of three months, the entire sixth grade was tested using the ITBS. The results were significant enough to convince Dr. Harold Barnett, Cartersville's school superintendent, that the series was worth adopting for the entire sixth grade.
As that class moved up through the grades, the Saxon texts were added at each level. When the original group of sixth graders was tested two years later, it ranked ninth in Georgia. Since 1992, Cartersville's eighth grade has ranked first among Georgia's eighth grade math classes. As the students progressed to high school, SAT math scores rose from the low 400s to hover around 500. The nature of the math classes being offered has changed as well: more girls are taking calculus in Cartersville's high schools than before, and more calculus classes are being offered.
Saxon's books were not on Georgia's approved textbook list. Some states that maintain approved math text lists, such as California, follow the guidelines proposed by the National Council of Teachers of Mathematics (NCTM). The NCTM suggests teaching methods, and its suggestions are accepted at face value; they are not subjected to any statistical testing to see whether they are effective.
John Saxon contacted Barnett to ask for help in getting onto the Georgia approved textbook list. Saxon's approach of persistent pencil-and-paper work runs contrary to the approach favored by the NCTM, whose guidelines favor calculator use instead.
Barnett told Saxon that the Cartersville data were sufficient to request a re-hearing before the committee. The committee rejected Saxon's appeal. Bridges, the school board member who had started the ball rolling in '88, said he had the distinct impression the committee was looking for ways to reject the text. One committee member said the texts weren't "cultural enough." Barnett then told Saxon how to appeal the committee's decision to a state review panel.
The day after the election, Saxon carried his appeal of the committee's decision to the state review panel. The review panel engaged a statistician who reported that the data showed there were no differences between schools that used the Saxon method and those schools that didn't. Based on the report, the panel rejected the appeal.
Both Barnett and Saxon were perplexed. Other districts in Georgia had adopted Saxon's books even though the books were off the approved list, and because Georgia requires school districts to report their standardized test scores to the state, Barnett knew Cartersville wasn't alone. The districts that used Saxon's books tended to rank in the upper third of the state, while the districts fighting the approval were in the bottom third. Given these facts, how could the statistician find no differences? Barnett and Saxon asked to see the raw data the panel's statistician had relied on. The review panel refused to release the data.
By now, Barnett was annoyed. He contacted the assemblyman from his district and asked him to get the data. The assemblyman's request was also refused, and he ended up filing suit against the Georgia Department of Education.
Rather than go to trial, the department released the data. The data the review board had relied on were preliminary pilot data. The statistician had ignored the data from the schools that had bypassed the approved list and adopted Saxon. When those data were incorporated, there was a clear difference between the schools that used Saxon and those that did not.
California has no similar requirement for standardized testing and reporting across the state.
In 1996, each district does what it feels is correct in evaluating its performance. Districts may choose whatever technique they wish to evaluate how good a job they're doing. It's a bit like the fox guarding the henhouse. Not until the children reach 11th grade, when some of them take the Scholastic Aptitude Test, does any meaningful data emerge. Want to find the elementary district that does the best job of teaching spelling? No one can say. Math? Same story.
The State of California doesn't fare much better in all of this. In an attempt to set standards, the California Board of Education draws up an approved list of texts. There are no "proof of performance" requirements for a spot on the approved list; admission is based on how closely a committee believes the texts adhere to the state's curriculum Framework.
The Framework's suggestions are taken at face value and are not subjected to any tests. The committee's choices aren't tested except by the children. In my son's school district, the math textbook adoption committee was allowed to evaluate only the books on the approved list.
The selection process ignores proven efficacy. Instead, it selects for adherence to a set of untested proposals.
In some cases, textbook publishers wine and dine committee members to win certification. As Richard Feynman documented in his autobiography, to some committee members, restaurant quality was more important than text quality.
A simple standard would be to require districts to administer a specific, nationally normed test that measures basic skills. The Stanford Achievement Test and the Iowa Tests of Basic Skills (ITBS) are but two candidates. Whichever test was chosen statewide, the results would be required to go to the state as well as to the districts. The state could then deliver the data to anyone who wished to see it. In addition, requiring districts to report which class used which text would identify the textbooks that were more effective than others.
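To make that last point concrete, here is a minimal sketch of how such reports could be used to rank texts. The file name and column names (district, grade, textbook, avg_score) are hypothetical, invented purely for illustration; they do not reflect any actual state reporting format.

    import csv
    from collections import defaultdict

    # Hypothetical input: one row per class, giving the text the class
    # used and its average score on the statewide test.
    #   district,grade,textbook,avg_score
    scores = defaultdict(list)
    with open("class_scores.csv", newline="") as f:
        for row in csv.DictReader(f):
            scores[row["textbook"]].append(float(row["avg_score"]))

    # Rank the texts by the mean of their classes' average scores.
    for text, vals in sorted(scores.items(),
                             key=lambda kv: sum(kv[1]) / len(kv[1]),
                             reverse=True):
        print(f"{text}: {sum(vals) / len(vals):.1f} across {len(vals)} classes")

Even a crude ranking like this, published statewide, would let parents and school boards see which texts correlate with better scores, which is precisely the kind of information Georgia's reporting requirement surfaced in the Saxon dispute.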
Given the paltry test scores in California, it's clear that the process being used today isn't working in most districts.
mgreene@greenes.com. April, 1996