Gaming the SAT

Headlines from several sources the other day announced that the folks who run the college SAT testing for high-schoolers were not only revising the tests significantly but even making the essay part optional. Well, I thought, this sounds like just one more step in dumbing down the education system. I was wrong, and things aren’t always what they seem (he said tritely). There was much more to the story than the superficial treatment given by most sources, most of which implied that public complaints were the primary cause of the changes.

[Image: SAT sign. Credit: online.wsj.com]

Educational testing is a big deal. Really big. It’s grown into a $4.5 billion industry that seeks to satisfy the demand to capture the academic potential of students and to sum that up in just a few numbers. It works poorly, so poorly in fact that a few colleges are already abandoning testing and relying only on school records and testimonials. Studies seeking to find correlation between test scores and success in college have been consistent in finding almost none. But what about the essay? Surely a student’s native ability for self-expression is an essential element in any formula for success, so how can they abandon that? (Students who want to continue that tradition will still be able to, and colleges that want it can specify that they need it.)

A New York Times reporter investigated and reported on how the changes came about. It was enlightening. The new president of the College Board, one David Coleman, is a major force behind the changes. This is a guy who thinks outside the box and who has been a long-time critic of the testing system. In talking to an MIT professor, one Dr. Perelman, he found that the essay part of the test can be gamed. Here, from the NYT article, is how:

Since 2005, when the College Board added an essay to the SAT (raising the total possible score from 1,600 to 2,400), Perelman had been conducting research that highlighted what he believed were the inherent absurdities in how the essay questions were formulated and scored. His earliest findings showed that length, more than any other factor, correlated with a high score on the essay. More recently, Perelman coached 16 students who were retaking the test after having received mediocre scores on the essay section. He told them that details mattered but factual accuracy didn’t. “You can tell them the War of 1812 began in 1945,” he said. He encouraged them to sprinkle in little-used but fancy words like “plethora” or “myriad” and to use two or three preselected quotes from prominent figures like Franklin Delano Roosevelt, regardless of whether they were relevant to the question asked. Fifteen of his pupils scored higher than the 90th percentile on the essay when they retook the exam, he said.

And this:

Over the course of their two-hour conversation, Perelman told Coleman that he wasn’t opposed to an essay portion of the test, per se; he thought it was a good idea, if done well. But “when is there a situation in either college or life when you’re asked to write on demand about something you’ve never once thought about?” he asked. “I’ve never gotten an email from a boss saying: ‘Is failure necessary for success? Get back to me in 25 minutes?’ But that’s what the SAT does.” Perelman said that tutors commonly taught their students to create and memorize an all-purpose essay that contained the necessary elements for a top score — “a personal anecdote, a few historical references; Florence Nightingale seems a strangely popular reference.” When test day comes, they regurgitate what they’ve committed to memory, slightly reshaping depending on the question asked. But no one is actually learning anything about writing.

What the article does not say, explicitly anyway, is what flows naturally from this. The SAT people are using software to grade essays. It makes sense. Just consider the problem if they tried to hire English majors to do the job. They would have to give them something like these instructions.

We would like to hire you part-time for six months to do nothing but grade essays by high-school students. Since we have hundreds of thousands of these, throughput is important, so you will need to grade quickly, fairly, and consistently with the output of the other 500 English majors we’ve hired. We expect you to complete at least 200 papers per week. Your results will, of course, be sampled by peer review for quality.

Would you take that job? I wouldn’t, not unless I were starving and desperate. No, they’ve got to be doing it with software, hence Perelman’s analysis of how to trick the system.
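If that’s right, Perelman’s coaching tips point straight at what such software would have to be doing. Here is a deliberately naive sketch of an automated scorer, my own illustration and not the College Board’s actual algorithm, showing why a grader built on length, fancy vocabulary, and quotations rewards exactly the gaming Perelman taught:

```python
# A deliberately naive automated essay scorer (hypothetical illustration,
# NOT the College Board's real software). It rewards length, "fancy" words,
# and quotations -- and never once checks factual accuracy, which is
# precisely why Perelman's coached students could game such a system.

FANCY_WORDS = {"plethora", "myriad", "quintessential", "egregious"}

def naive_essay_score(essay: str) -> float:
    words = essay.lower().split()
    # Longer essays earn more, capped so the score doesn't grow forever.
    length_points = min(len(words) / 100.0, 4.0)
    # One point per "fancy" word, punctuation stripped before matching.
    fancy_points = sum(1 for w in words if w.strip('.,;!?"') in FANCY_WORDS)
    # Any quoted material counts, relevant to the question or not.
    quote_points = essay.count('"') / 2
    return length_points + fancy_points + quote_points

# A short, factually correct essay loses to padded, quote-stuffed filler.
short_plain = 'The War of 1812 began in 1812. It was a war.'
long_fancy = ('A plethora of examples, indeed a myriad, as Franklin '
              'Roosevelt said: "the only thing we have to fear is fear '
              'itself." ') * 10

print(naive_essay_score(short_plain) < naive_essay_score(long_fancy))  # True
```

The point of the sketch is not that the real grader looks like this, only that any scorer driven by surface features alone will prefer the long, ornamented essay no matter what it actually says, which is what Perelman’s experiment with his sixteen students demonstrated.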

Self-expression is vital, but I see no better way to gauge it than by lessening the testing burden on schools and freeing teachers to teach the way they should, not just “teach to the test”.

New York Times reporter Todd Balf did a fine job with “The Story Behind the SAT Overhaul”. Those interested in education will, I think, find more to like at the link.


About Jim Wheeler

U. S. Naval Academy, BS, Engineering, 1959; Naval line officer and submariner, 1959-1981, Commander, USN; The George Washington U., MSA, Management Eng.; Aerospace Engineer, 1981-1999; Resident Gadfly, 1999-present. Political affiliation: Democratic.
This entry was posted in Education. Bookmark the permalink.

15 Responses to Gaming the SAT

  1. Jeff Little says:

    Good posting. I learned something.


  2. Jim in IA says:

    ‘…lessening the testing burden…’ I agree.

I have a lot of opinions about education and teaching. I don’t have enough time at this moment to say much more than that. Sounds like a cop-out, I know. Company is coming in a few minutes.

    This article on the SAT I found interesting this morning. http://wapo.st/1jSsntp

    I will check back later.


    • Jim Wheeler says:

Thanks for that link, JII. Those data are familiar to me, but given the insight into how SAT essays are processed, I’m wondering just how much all the data might be skewed simply by tutoring in test-taking. It’s a component; it’s just not clear how large it is.


      • Jim in IA says:

        I don’t feel the data is skewed. I have no references to back up that claim. Only my anecdotal experience with kids in high school for many years.

They get signed up to take the tests, ACT and SAT. When their date draws near, they show up and do it. Some buy test-preparation books beforehand.

        As the time draws nearer, a smaller fraction will actually open the prep books. A smaller fraction still will take the time to go through the exercises and sample tests to make some improvements to their skill.

        Many kids are good at testing strategies. The things they learn in the books will help them a little. But I don’t feel it is much.

        The biggest impact on score is the retest a year later. Our son moved up several points on his ACT just because he retested after a year of maturity and some more content under his belt. I know for a fact he didn’t put in much time with the test prep materials.


      • Jim in IA says:

        As to tutoring, I don’t recall hearing about kids at my school being in tutoring sessions for the test. Most of the teachers I know would not do that.

We did offer ideas, strategies, and ways of thinking to the kids as general approaches for any test or challenging situation. But the students were expected to do the SAT/ACT prep on their own. I could never justify using valuable class time for that. It is hard enough to cover the curriculum as it is.

        Maybe they got help from tutors outside of school. I can’t back that up with first hand knowledge.


  3. PiedType says:

    I would have assumed they used computers for grading, simply because of the number of tests to be scored. And because grading by humans would be too subjective. Still, it seems to me human grading would be the only valid method, since reasoning, transitions, proper phraseology, grammar, vocabulary, etc., need to be evaluated. Otherwise requiring an essay question is pointless. Frankly, until I saw the recent reports, I wasn’t aware an essay had been added to the SATs.


    • Jim Wheeler says:

      I too hadn’t really thought about how such essays might be graded, PT, but now I’m convinced that the only sensible way to assess writing ability is to strengthen English programs in high school, and that means luring teaching talent there with better pay. As far as the SAT tests, I’m wondering now if grammar is being adequately tested? That’s an area that was generally disliked when I went to high school, although I found it interesting. I wonder what’s become of it?


      • PiedType says:

        When I attended the Univ. of Okla., all students were required to take an English proficiency exam between freshman and sophomore years. It involved long essay questions and was strictly graded by hand for grammar, spelling, and generally good use of the language. You had to pass it to advance. I think something similar should be instituted in our high schools and no one should ever be graduated without a minimum proficiency in English composition. Of course, that would really throw a wrench into the “teach to the test” philosophy of most schools.


  4. henrygmorgan says:

Jim: Thoughtful and interesting article. I believe that we can learn more about a student from the essay than from any fill-in-the-blank, multiple-guess test yet devised. Not the ideas alone, but the way they are expressed reveals a “plethora” (had to get that in!) of information about the author of those ideas and that form of expression. How in command of the subject matter are they? How deftly do they organize the material? Do they have the ability to see which parts of the subject are more important than others? Can they use subordination to reveal these differences? How well do they handle the language? These are just a glimpse into the information revealed by writing.

    At many colleges and universities, senior staff in the English Department are exempt from teaching Freshman English, admittedly a difficult course for a teacher, as, indeed, it is for the student. At three of the four universities in which I’ve taught, the English Department made it policy that every faculty member would teach at least one Freshman English course, the current policy at MSSU. The thinking is that the exempting of senior staff removes from this critical course the most experienced teachers, leaving the task in the hands of the newest and least experienced teachers. I heartily approve this doctrine, because learning to write well is the entry key to success in every course in a college, not just English.

The essay component of the SAT presents an excellent preview of the students’ ability to deal successfully with college work and with future employment. When I receive requests for recommendations from prospective employers of my former students, they often specifically ask about the students’ writing skills. No matter the area of employment, it is likely that writing skills will be necessary: communicating with other business contacts, writing reports, applying for grants, and a “plethora” of other tasks.

Regardless of the professor’s results in what appears to be a pointed study, and even with the synthetic, artificial devices that he urges the students to use, he still has available (though he seems to have overlooked it) a valuable insight into the students’ abilities.

    I am reminded of Alexander Pope’s description of fine writing as “what oft was thought, but ne’er so well expressed.” Bud


    • Jim Wheeler says:

      Bud, I’m honored to have your professional and experienced views on this subject, and they mirror my own. The ability to write cogently is evidence that the student can reason and organize cogently, as you point out. The point of my post, of course, is that if the SAT essays are being graded by a software program, which they clearly are, then the system is not measuring writing ability but rather test-taking ability.

      I was unaware that MSSU and other universities had policies ensuring top teaching talent for freshmen and I’m delighted to learn of it, especially since I’ve heard that many courses are taught by TA’s. Prospective students and parents would do well to assess policies of this kind in the selection process.


  5. Well I’ve certainly learned something interesting from you today. When I first heard the SAT was dropping the essay, I had the same initial reaction that you did Jim. But you have cast another light.
    About fifteen years ago (or longer?) I was asked to be a reader. The college gave me the information and the readers’ test — meaning the test that tests the readers for consistency. I was given some sample essays to score. The point was to see whether I (as a prospective reader) gave the same score to these samples as most other readers. I “passed”, meaning that the scores I gave to the samples were consistent with the scores given by the actual readers. I don’t recall what the job paid, but it was not enticing and I did not follow through with accepting the work. But Jim, I can support your observation regarding who might be willing to do this. I guess I didn’t follow through because starvation wasn’t imminent.
    I didn’t realize they had moved to scoring essays by computer, but your evidence certainly points to that conclusion.


    • Jim Wheeler says:

      I’m glad to have your background on being a “reader”, Helen. Software is bound to have progressed significantly since your experience a couple of decades ago, so I rather doubt they are hiring readers now. After all, even smart phones now respond to voice commands and iPads will read to us.

      Scoring well on the SAT and ACT tests is a frenzied pursuit in some social circles, something I was reminded of by an article in The New Yorker. It was written by a helicopter mom who made a project of taking the SAT herself six times in an effort to tutor her kid in test-taking.

