What does "SAT" stand for? As of 1997, the acronym no longer has any meaning. But in 1926, when the test first appeared, those three letters meant "Scholastic Aptitude Test." In 1994, the acronym was redefined as "Scholastic Assessment Test," but accusations of eugenics (or something) arose. Therefore, the College Board's official position now is that "SAT" doesn't stand for anything.
Achievement on the SAT has become an important part of the college-admissions process. Some students begin preparing for the test as early as sixth grade. I'm not kidding! I've worked with some of those driven young people and their parents. Most students, however, wait until their high-school years to begin formal preparation. They purchase one or more of the many study guides available. Students seeking admission to prestigious universities work through thick volumes of material, taking sample tests and studying various techniques to "beat the test," including the selective skipping of questions. In addition, students enroll in special preparatory classes to fill in gaps and to boost their scores. Gone are the days when students seeking college admission could rely on what they learned in the classroom to get them into college.
Since last year's revamping of the SAT, taking the test has become even more stressful for college applicants. The verbal section no longer contains analogies, which used to be the bane of many test-takers; now students wade through more reading passages, both short and long, as well as sentence-completion items with advanced vocabulary. A new multiple-choice portion, "Writing," tests elements of style and grammar and also includes editing a provided composition. Verbal items are no longer limited to multiple choice: students have twenty-five minutes to handwrite an original essay in answer to a prompt. In the mathematics section, the level of material has been raised to reflect the earlier introduction of algebra and the addition of pre-calculus, and students are allowed to use a calculator. Because of the change in the test's format and a lack of statistics as to what the scores mean, many colleges admit uncertainty about how to interpret them properly.
As of last week, an unexpected issue has arisen with regard to the new SAT: the validity of recent scoring. According to an article in the March 20, 2006 edition of Newsweek, thousands of students who took the SAT have received incorrect scores, scores raised or lowered according to the whim of the scanners used to read the answer sheets:
"...Last Wednesday, five months after he took the exam, [high-school senior Robert] Smith received an e-mail from the College Board, which administers the tests, telling him that his SAT had been scored inaccurately. He had actually earned a 1780 (out of 2400 on the three-part test), 50 points higher than what had been reported to his family—and his first-choice school, Hofstra University. 'I would've never imagined something like this could happen,' Smith says. 'This is the SAT—not the math quiz you take on Friday.'...
"[A] handful of students who got low numbers last fall might have lowered their sights accordingly. That's what happened to Robert Smith. Last December, a rep from Boston University—the school he'd really wanted to go to—told him his SAT wasn't high enough, 'so I got discouraged and didn't apply. Now, with the change, my math score is at the top end. I missed the early-admissions deadline. By the time I apply for admission in March, they could be full.'
"How was the whole mess ever detected in the first place? The College Board says that late last year, two students asked for their tests to be rescored. When the results showed that they had been improperly scanned, the board asked for a larger sample of tests to be checked."
Apparently, the students' bubble sheets, on which answers are marked during the testing process, were stored for just a few hours in a damp warehouse in Austin, Texas. The paper expanded enough to affect alignment, and, therefore, the scanners used to score these tests inaccurately read the students' answers.
According to a March 10, 2006 article in the Washington Post, SAT scores are but one of the criteria used for admission to college:
"Ted O'Neill, admissions dean at the University of Chicago, said the mistakes in the SAT scores made no difference to any of his institution's 35 affected applicants because 'scores don't really matter very much, and in most cases, not at all.'
"'We have a lot of information about the kids that we think is more important,' he said."
Such is the official position of the colleges. But the truth is that, for colleges requiring the test, SAT scores have assumed greater importance, especially as the reliability of high-school grades has deteriorated. Part of the reason for changing the SAT's format last year was inflated grading on the part of high schools. Another reason was the proliferation of cheating on college applications; for a fee, students can hire someone to write the application essay for them. As a result of the tainted materials submitted to admissions offices, many colleges have been forced to deal with incompetence in freshmen's writing skills, and that incompetence has led to a proliferation of remedial courses whose cost is a sizeable burden on the finances of American colleges of all types. Even the non-remedial English courses have, of necessity, been dumbed down.
So far as is yet known, only a very small percentage of SAT scores seems to have been affected by the recent scanning errors. Most are small errors in point totals. But in a few cases, the scores were off by 100 to 130 points, a significant difference in today's competitive atmosphere for college admission. In addition, another question has now arisen: Are even more SAT scores skewed? That question remains to be answered; indeed, it may be unanswerable.
Talk of litigation has appeared, of course. But the impact of errors in SAT scoring goes beyond anything a court can solve.