TEA may create system to hunt TAKS cheaters; Criticism of private firm spurs call for agency to analyze scores itself

By Joshua Benton
Staff Writer

Page 1A

Unhappy with the way an outside company did the job, the Texas Education Agency appears ready to take on the hunt for cheaters itself.

An agency task force has recommended that TEA build its own system for analyzing scores on the TAKS test to look for suspicious patterns.

But it could be another year or more before the system is ready.

Until then, it appears likely that testing data will go unscrutinized.

“Investigating these anomalies a year or two later is very difficult,” said Michael Donley, the agency’s inspector general. “I have a feeling that old data will be skipped.”

The recommendation is one of 10 made by the state’s test-security task force, which was formed last fall.

An agency spokesperson said Education Commissioner Shirley Neeley was broadly supportive of the recommendations, although details about implementation remain to be worked out.

The task force’s conclusions come as it finishes an unusual investigation into 700 schools whose scores on the 2005 test were flagged as suspicious by the Utah test-security firm Caveon. Nearly 600 of those schools were recently cleared by TEA – most of them solely on the basis of a questionnaire sent to school officials about their testing practices.

State officials have criticized Caveon, saying the company flagged too many schools and questioning parts of its methodology. Just over 100 schools remain under investigation, a process state officials hope to conclude by the end of February.

About 30 more schools will receive on-site visits from state investigators in the coming weeks, including campuses in the Dallas, McKinney and Mesquite districts. About 65 schools have already received on-site visits.

State’s actions defended

Mr. Donley defended the state’s treatment of the 700 Caveon schools, citing the difficulties caused by the nearly two years between the possible cheating and the investigations’ conclusion.

“Given the time barrier, I think our efforts have been effective,” he said. “We did everything we could do.”

It remains unclear why a year passed between the time when Caveon first shared a draft of its findings with TEA, in the fall of 2005, and the beginning of investigations. But no matter the delay’s cause, Mr. Donley said, it severely hampered the agency’s ability to hunt for possible offenders.

Using statistical methods to look for cheating is a practice the agency has argued against in the past, particularly when The Dallas Morning News used statistical analysis to identify schools where cheating may have occurred. Agency officials have said they needed some sort of supporting evidence of cheating – such as eyewitness testimony – before an investigation was merited.

“I don’t want to say it’s a complete reversal, because we have used data in the past,” agency spokeswoman Suzanne Marchman said. “But some of our policies are changing.”

Ms. Marchman said the system would be designed to be transparent and open, so that schools would know what behaviors trigger an investigation. That would address one of the agency’s complaints about Caveon’s analysis. As a private company, Caveon refused to share its exact methodology for detecting cheating with TEA.

“The biggest problem [with Caveon’s analysis] was not knowing what their algorithms were,” Mr. Donley said. “You can’t tell how strong an indicator is if you don’t know how they did the math.”

Although Caveon would not reveal its exact methods, the company was willing to share more detailed information about its findings, such as the identities of students suspected of cheating and which cases seemed most severe. TEA declined to obtain that information from Caveon.
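
Neither Caveon’s algorithms nor the design of TEA’s planned system is described in this article. Purely as an illustration of the kind of statistical indicator at issue, the hypothetical sketch below flags campuses whose year-over-year gains in passing rates are extreme outliers compared with other campuses; every name, number and threshold in it is invented rather than drawn from either organization’s methods.

```python
from statistics import mean, stdev

def flag_suspicious_gains(campus_rates, threshold=3.0):
    """campus_rates maps campus -> (prior_year_rate, current_year_rate).

    Flags campuses whose score gain sits more than `threshold` standard
    deviations above the gains of the other campuses (computed leave-one-out,
    so one extreme campus cannot mask itself). A flag marks an anomaly worth
    a closer look, not proof of cheating.
    """
    gains = {c: cur - prior for c, (prior, cur) in campus_rates.items()}
    flagged = []
    for campus, gain in gains.items():
        others = [g for c, g in gains.items() if c != campus]
        if len(others) < 2:
            continue  # too little comparison data to judge
        mu, sigma = mean(others), stdev(others)
        if sigma > 0 and (gain - mu) / sigma > threshold:
            flagged.append(campus)
    return flagged

# Made-up passing rates (percent) for four hypothetical campuses:
rates = {"Campus A": (62.0, 65.0), "Campus B": (58.0, 61.0),
         "Campus C": (55.0, 57.0), "Campus D": (60.0, 93.0)}
print(flag_suspicious_gains(rates))  # ['Campus D']
```

The leave-one-out comparison is one simple way to keep a single extreme gain from inflating the baseline it is judged against; how strong such an indicator really is depends on exactly these kinds of modeling choices, which is the transparency concern Mr. Donley raised.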

No details yet

The details of the new TEA system have yet to be worked out, and the system itself is still months or years from being built. Mr. Donley said it would not be in place in time for the TAKS tests that will be administered this spring. He said he wasn’t sure whether it would be ready for the 2008 testing.

Until then, it appears likely that TAKS scores won’t be analyzed for cheating. That would include scores from last spring and the current school year.

Mr. Donley said the agency’s new system would be developed in-house, although outside experts might be hired to add technical expertise.

The task force also recommended that TEA create a “model policy” for test security, determining best practices on how to conduct an honest testing session.

But the decision to adopt best practices will be left up to school districts, which could choose to ignore them. Mr. Donley said the task force did not want to interfere too much with local control.

“I would guess that the task force is trying there not to be too heavy-handed,” Mr. Donley said.

Among the other recommendations the task force made:

* Make it easier for educators to report possible cheating anonymously.

* Audit the test-security practices of a random selection of school districts annually.

* Consider adding a test-security component to the state’s accountability system, which produces the much-watched school ratings each fall.

* Require school districts to retain important test-security materials for up to five years after test day.

The task force’s recommendations are not binding as policy; Dr. Neeley could choose to enact some or all of them. In a prepared statement, she said the agency “will give strong consideration to all their suggestions.”

“We will do whatever it takes to make sure our students’ test results are valid and reliable and our testing program above reproach,” she said.