By Joshua Benton

Staff Writer

Page 8A

Caveon, as a for-profit company, has declined to reveal how, exactly, it does its work.

“Companies don’t publish much about their methods, because then everyone could do it,” said Chris McManus, a professor of psychology and medical education at University College London who has researched cheating.

But Caveon’s report to state officials offers clues to how the company crunches its numbers.

It appears that, like many other cheating researchers, Caveon focuses on wrong answers, not correct ones. If two brilliant students both got perfect scores on the TAKS, Caveon wouldn’t consider that suspicious – even though, by definition, all their answers would be exactly the same.

An appendix to Caveon’s report says that the company calculates the probability that pairs of students would have the same answers if they had acted independently. That’s similar to a portion of the method used at Canada’s McGill University. But it’s unclear how similar two students’ answer sheets must be to trigger Caveon’s suspicion.
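Caveon hasn't published its formula, but the general idea the appendix describes — asking how likely two students are to share answers if they worked independently — can be sketched with a toy calculation. Everything here (the option-choice rates, the item counts, the binomial simplification) is illustrative, not Caveon's actual method:

```python
import math

def match_prob(wrong_option_rates):
    # Chance two *independent* students pick the same wrong option on
    # one item, given how often each wrong option is chosen statewide.
    return sum(p * p for p in wrong_option_rates)

def binom_sf(k, n, p):
    # P(X >= k) for X ~ Binomial(n, p): the chance of k or more matches
    # arising purely by coincidence.
    return sum(math.comb(n, i) * p**i * (1 - p)**(n - i)
               for i in range(k, n + 1))

# Hypothetical item: among students who miss it, the three wrong options
# are chosen at rates 0.5, 0.3 and 0.2.
p_same = match_prob([0.5, 0.3, 0.2])  # 0.38

# If a pair of students both missed 20 items and matched on 18 of them,
# how likely is that by chance? (A real analysis would use per-item
# rates rather than one average.)
p_value = binom_sf(18, 20, p_same)
```

Under these made-up numbers the chance is well below one in ten thousand — the kind of coincidence that would make a pair of answer sheets look suspicious.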

There’s one big difference between Caveon’s methods and those of most academics in the field. Technically speaking, Caveon isn’t searching for students who might be cheating. It’s searching for classrooms and schools where an unusually high number of students might be cheating.

Caveon’s analysis expects classrooms to have a certain number of students with very similar answer sheets, based on statewide averages; the expected rate varies by grade and test. In third-grade math, for instance, Caveon expects about 1.2 percent of answer sheets to look suspicious. In 11th-grade math, the figure is 6.1 percent.

Caveon flagged a school or classroom only if it had many more suspicious answer sheets than other schools in the state. For instance, an 11th-grade math class where 6 percent of answer sheets were exactly identical wouldn’t be flagged – because that would be merely average in Texas.
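That logic — flag a classroom only when its count of suspicious sheets is far beyond what the statewide base rate would produce — can be illustrated with a simple statistical test. The base rate comes from the article; the test itself and its threshold are an assumption, not Caveon's disclosed procedure:

```python
import math

def binom_sf(k, n, p):
    # P(X >= k) under Binomial(n, p).
    return sum(math.comb(n, i) * p**i * (1 - p)**(n - i)
               for i in range(k, n + 1))

# Statewide rate of suspicious-looking sheets for 11th-grade math,
# per the Caveon report.
EXPECTED_RATE = 0.061

def flag_classroom(n_students, n_suspicious, alpha=0.001):
    # Flag only if this many suspicious sheets would be very unlikely
    # if the classroom matched the statewide average (alpha is a
    # hypothetical cutoff).
    return binom_sf(n_suspicious, n_students, EXPECTED_RATE) < alpha

flag_classroom(30, 2)   # ~6.7% suspicious: roughly average, not flagged
flag_classroom(30, 12)  # 40% suspicious: far beyond the base rate, flagged
```

A class of 30 where two sheets look alike sits right at the Texas average, so it passes; a dozen matching sheets in the same class would be a statistical red flag.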

Similar answer sheets were the most common suspicious pattern Caveon flagged in schools, but the company also screened for three other kinds of potential problems:

* Schools where students had unusually large jumps in test scores.

* Schools where answer sheets had unusually high numbers of erasures that changed wrong answers to right ones.

* Schools where students had unusual answer patterns, such as answering difficult questions with ease but missing easy ones.
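The erasure screen, for example, amounts to an outlier test: compare each school's rate of wrong-to-right erasures against the statewide norm. This sketch, with entirely made-up numbers and a hypothetical three-standard-deviation cutoff, shows the shape of such a screen:

```python
def erasure_screen(school_avgs, state_mean, state_sd, z_cutoff=3.0):
    # Flag schools whose average wrong-to-right erasures per answer
    # sheet sit far above the statewide norm. All thresholds and
    # statistics here are illustrative, not Caveon's.
    cutoff = state_mean + z_cutoff * state_sd
    return [school for school, avg in school_avgs.items() if avg > cutoff]

# Made-up data: statewide average of 1.0 wrong-to-right erasure per
# sheet, standard deviation 0.5. School "E" averages 6.5.
erasure_screen({"A": 1.1, "B": 0.8, "E": 6.5},
               state_mean=1.0, state_sd=0.5)  # -> ["E"]
```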