Monday, July 17, 2017

What Is Test Prep?

Yesterday I fell into a discussion of test prep on Twitter where a participant tossed forward the notion that test prep actually decreases test results. Others asserted that test prep doesn't really help. I'm pretty sure that both of those assertions are dead wrong, but I also suspect part of the problem is that "test prep" is an Extremely Fuzzy Term that means a variety of things.

The research itself is not exactly stunning. One study that turns up from time to time is a 2008 Chicago study looking at test prep and ACT results (in Illinois, everyone takes the ACT, so congrats to whatever salesperson/lobbyist from ACT's parent company landed that contract-- ka-ching! We'll skip over all the reasons that's a bad idea for now). A quick look at the summary shows that this study didn't exactly prove that test prep is a bust:

CPS students are highly motivated to do well on the ACT, and they are spending extraordinary amounts of time preparing for it. However, the predominant ways in which students are preparing for the ACT are unlikely to help them do well on the test or to be ready for college-level work. Students are training for the ACT in a last-minute sprint focused on test practice, when the ACT requires years of hard work developing college-level skills.

That's a nice piece of sleight of hand there. Test prep wasn't failing-- just one particular type was. There are, of course, many other test prep alternatives, but the study ignores those, shrugs, and says, "I guess our only alternative is to believe the ACT PR about how the test measures 'years of hard work' on college level skills."

Meanwhile, the College Board has been touting how their special brand of test prep totally works on the SAT. I'm going to summarize the test prep research by saying that there isn't much, what exists is kind of sketchy, and clear patterns fail to emerge. So let me get back to my main question.

First of all-- which test? For our purposes, when we talk about test prep, we're talking about the Big Standardized Test that the Common Core reform wave inflicted on every state. On the subject of test prep, those are the tests that matter most because that brand of test-centered high-stakes data-generation is the thing that has twisted our schools into test prep factories.

Lots of folks have tried to define test prep very narrowly as simply drilling or rote-working the specific information that is going to be on the test. That definition serves members of the testing cult because by that definition, not much test prep goes on. But I suspect virtually no actual classroom teachers would define test prep that way.

How would I define it?

Test prep is anything that is given time in my classroom for the sole purpose of having a positive impact on test scores.

Right up front, I'll note there is some grey area. There are some test prep things that I can turn into useful learning experiences, and there are some actual education things that may have a positive impact on test scores.

But if I'm only doing it because it will help with test scores, I say it's test prep, and I say to hell with it.

This covers a broad range of activities. It is necessary, particularly in the younger grades, to teach students how to deal with a multiple choice test-- doubly necessary if the test is going to be taken on a computer.

But once we've introduced that, we never let it go. Fifteen years ago, the amount of time I would have spent in my English class on activities in which students read a short passage and then answered a few multiple choice questions would have been pretty close to zero. Short excerpts and context-free passages are a crappy way to build reading skills or interest in reading, and multiple choice questions are just about the worst way to assess anything, ever. But now, like English departments across the country, we have bought stacks of workbooks chock full of short passages coupled with sets of multiple choice questions. We don't buy them because we think they represent a great pedagogical approach; we buy them because they are good practice for the sort of thing the students will deal with on the BS Test. They are test prep, pure and simple, and if I were deciding strictly on educational merit, I wouldn't include them in my class at all. Not only are they a lousy way to teach reading, but they reinforce the mistaken notion that for every piece of reading, there's only one correct way to read it and that the whole purpose of reading is to be able to answer questions that somebody else asks you with the answers that somebody else wants.

Writing is even easier to do test prep for, and my department is the proof. We teach students some quick and simple writing strategies:
   1) Rewrite the prompt as your first sentence.
   2) Write neatly.
   3) Fill up as much paper as you can. Do not worry about redundancy or wandering.
   4) Use big words. It doesn't matter if you use them correctly (I always teach my students "plethora").
   5) Indent paragraphs clearly. If that's a challenge, skip a line between paragraphs.

With those simple techniques, we were able to get a consistent mid-ninety percent of our students rated proficient in writing.

In addition, because the state wants the BS Test to drive curriculum, they make sure to let us know about the anchor standards (the standards that will actually be on the test) so we can be sure to include them-- which, to be effective, has to be done using the state's understanding of the standards. Our professional judgment is not only irrelevant, but potentially gets in the way. This can cover everything from broader standards to specific terms likely to appear.

And, of course, we need to familiarize the students with the state's style of questioning. For instance, PA likes to test context clue use by giving students a familiar-ish word used with an uncommon meaning, to make sure that the students decipher the word using only the sentence context and not knowledge already in their brains.

None of this is rote memorization of details for the test. All of it is test prep, and all of it is effective up to a point. Particularly for students who are neither good nor enthusiastic test-takers, it can make the difference between terrible and mediocre results. And every year it leaves a bad taste in my mouth, and every year all of us struggle to maintain a balance between that educational malpractice and doing the teaching jobs we signed up for when we started our careers.

Test prep does, in a sense, carry beyond the classroom. The article that kicked off yesterday's conversation was a piece by Matt Barnum about the shuffling of weaker teachers to younger grades. That is absolutely a thing-- I suspect every single teacher in the country can tell a story about administration moving teachers to where they won't "hurt us on the test results." Teachers who can do good test prep are moved to the testing windows; those who can't are moved out of the BS Test Blast Zone. There are far better ways to assign staff, but many administrations, eyes on their test scores, are afraid not to make test scores Job One.

In fact, there are districts where the structure of the schools is changed in response to testing. Eighth graders do notoriously badly on BS Tests, so it's smart to put sixth graders in your middle school with the eighth graders to mitigate the testing hit.

And there is the test prep that goes beyond instruction, because teachers understand the biggest obstacle to student performance on the BS Test-- the students have to care enough to bother to try. In a state like PA, where my students take a test that will affect my rating and my school's rating, but which has absolutely no stakes for them, that's an issue. The BS Tests are long and boring and, in some cases, hard. YouTube is filled with peppy videos and songs and cheers from the pep rallies and other endless attempts to make students actually care enough to try. This kind of test prep is not so much toxic to actual instruction as corrosive to the foundation of trust in the school itself. Elementary teachers may feel it's helping, but by high school the students have figured out that it was all, as one student told me, "a big line of bullshit. You just want us to make you look good."

The most authentic assessment is the assessment that asks students to do what they've practiced doing. The reverse is also true-- the most effective preparation for an assessment task is to repeatedly do versions of that exact task. And so all across the country, students slog through various versions of practice tests. If you want students to get good at writing essays, you have them write essays. If you want them to get good at reading short stories, you read short stories. And if you want them to get good at taking bad multiple choice standardized tests, you take a steady diet of bad multiple choice standardized tests.

That's test prep, and it's effective. It won't make every student score above average for a variety of reasons, not the least of which is that the tests are meant to create a bell curve and not everybody can be above average.

But if you think the solution to getting ready for the BS Tests is to just teach students really, really well and the scores will just appear, like magic, then I would like to sell you a bridge in Florida that crosses the candy cane swamp to end in a land of unicorns that poop rainbows.


  1. Everyone knows about the John Oliver segment on charter schools that riled the corporate ed reform crowd no end, and provoked them to put a bounty on his head.

    What some might not remember was Oliver's takedown of standardized testing. As with the charter segment, a memo of sorts must have gone out, as it provoked unprecedented vitriol from some very prominent corporate reformers.

    Here's the Oliver segment on standardized tests (which is approaching 10 million views):

    A measure of the effectiveness of Oliver's piece can be seen in the confused, yet hysterical reaction of folks such as Peter Cunningham over at the Broad-funded EDUCATION POST:

    Cunningham condemns Oliver, claiming that he "devoted 18 tedious minutes to attacking something (the testing industry) that protects students at risk from being neglected, ignored and condemned to second-class citizenship."

    Yeah, right. That's what they're doing over at Pearson and the other testing behemoths. Imagine the conversations going on over there:

    "Say, how can we better protect students at risk from being neglected, ignored and condemned to second-class citizenship?" SAID NO TESTING COMPANY EXECUTIVE OR EMPLOYEE EVER.

    Cunningham doesn't address the two dozen or so points made in the Oliver piece, but offers such nonsense as:

    "John Oliver’s facile mockery of standardized testing does not put him on the side of disadvantaged children afflicted by underperforming schools. Instead, John Oliver sides with the comfortable bureaucrats, self-serving union leaders, and the complacent middle class that abdicates any responsibility for extending the American Dream beyond their own insular worlds."

    For some great pieces defending Oliver and attacking Cunningham, read here:

    In the latter, Mitchell Robinson quotes Cunningham saying the following:

    "We know these things because we force the educational bureaucracy to test kids, publish results and take action. Until we demanded real accountability, many states, with a few exceptions, simply ignored these kids."

    Robinson ain't having it: "For a person who has never taught, and holds no elected office, that's a pretty gaudy resume. Mr. Cunningham seems to believe he has the power to 'force' schools to test students according to his demands, and to enforce accountability on the unruly masses. To listen to Mr. Cunningham, before he came to the rescue no one in education ever thought to assess students' learning--and in fact, we were simply warehousing children with no thought of their futures. This is beyond arrogant--it's delusional; and the fact that wealthy benefactors are subsidizing these beliefs should give us all pause.

    "The facts are that teachers know how to assess their students, and have been doing so very well for a long time. This recent obsession with standardized tests adds nothing to our arsenal of measurement tools that will help achieve the primary goal of any form of meaningful assessment: to improve instruction. Standardized test results present only a gross measurement of student progress, and usually are returned to teachers far too late to be of any real assistance in adjusting lesson plans or assignments. Those that depend on tests as a useful tool are placing all of their eggs in a torn and broken basket."

  2. Sometimes you have to take allies where you can find them, regardless of who they are.

    For example, Project Veritas is a group that has done some pretty heinous stuff, particularly in their manipulative editing of hidden video footage to create a false impression that conforms to their extreme right-wing views.

    However, in the following Project Veritas hidden video segment, they did good work:

    Here you have a testing company executive Dianne Barrow, (from HOUGHTON-MIFFLIN-HARCOURT) admitting,

    DIANNE BARROW: "It's all about the money. You don't think that the educational publishing companies are in it for the kids. Do you? No, they're in it for the money."

    PROJECT VERITAS JOURNALIST: "You seem like you're in it for the kids, though. You seem like, you know-"

    DIANNE BARROW: "No, I hate kids. (LAUGHS) I'm in it to sell books. Don't even kid yourself for a heartbeat."

    Barrow lost her job as a result of this.

  3. I don't know much about the standardized tests within the school systems, as I teach in a non-standard setting. But I also tutor ACT and SAT prep for a fine old commercial test-prep company, and I know "test prep" works in that context, for the reasons you outline. Standardized means predictable in format and design, with only the actual content unknown ahead of time. Students can be taught with drills, instruction, and practice tests both what to expect, and how best to problem-solve when unable to see the answer immediately.

    Because it's for college admission, not scoring districts and schools, of course we see a bit more application on the part of the students than, as you hint, a school might see for a state-mandated test with no direct consequences for surly and bored adolescents.

    If in fact these state tests have closely aped the existing college-admission tests, then to hear officials claim that 'test prep' doesn't work is as funny to me as reading that the SAT was said to be immune to prep up until a few decades ago, when the College Board 'fessed up and began issuing its own 'test prep' handbooks for its own products.

    PS. I used to score the practice essays for the SAT and ACT prep, and I made an effort not to reward bloviation and inappropriate use of inflated vocab words, teaching the kids to focus on clarity, logical exposition, and supporting examples as the rubric required. I was never sure I was actually scoring the essays as the real scorers were, although I knew the kids were writing better essays on the third or fourth try using my approach. And then the SAT and ACT made their essays harder but optional, so we don't cover it in test-prep nearly as much any more. I suspect people were figuring out that a standardized essay test doesn't say a lot more about writing ability than the standardized multiple-choice section about writing, sorry, about proofreading skills does. If I remember, the client colleges back in the 90s found, almost as soon as the tests instituted a called-for 'Writing' component, that the scores paralleled the old "Verbal" (mostly Reading) scores almost exactly, adding no new information to the colleges' admission folders.