Popham, Chapter 6 Pondertime & Chapter 7 Pondertime, Due February 8, 2012

Chapter 6 Pondertime (p. 161, #1, #3)
1.  If you were asked to take part in a mini-debate about the respective virtues of selected-response items versus constructed-response items, what do you think would be your major points if you were supporting the use of selected response test items?

Selected-response test items can be more numerous because students can complete them more quickly than constructed-response items, and selected-response tests can also be graded more quickly.  I would readily admit, though, that selected-response items do not give nearly the amount of insight into the real thought processes of the test-taker that constructed-response items provide.  And there is a greater chance of guessing a correct answer on a selected-response test than on a constructed-response test, as the sketch below suggests.
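To put a rough number on that guessing advantage, here is a minimal sketch (assuming a hypothetical 20-item test with four answer choices per item; the figures are illustrative, not from Popham):

```python
# Minimal sketch: expected score from pure guessing on a four-option
# selected-response test versus a constructed-response test.
# All numbers are hypothetical, chosen only for illustration.
num_items = 20            # assumed test length
options_per_item = 4      # assumed number of answer choices

p_guess_selected = 1 / options_per_item   # 0.25 chance of guessing each item
p_guess_constructed = 0.0                 # blind guessing earns essentially no credit

expected_selected = num_items * p_guess_selected        # 5.0 of 20 items
expected_constructed = num_items * p_guess_constructed  # 0.0 of 20 items

print(f"Expected guessing score (selected-response): {expected_selected:.1f}/{num_items}")
print(f"Expected guessing score (constructed-response): {expected_constructed:.1f}/{num_items}")
```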

3.  Why do you think that multiple-choice tests have been so widely used in nationally standardized norm-referenced achievement tests during the past half-century?

To put it quite simply, multiple-choice tests are easier to grade and easier to analyze.  The range of possible scores on a multiple-choice test is only a function of the number of questions on the test, which is to say, you can explicitly enumerate all the possible scores and then determine how many students fall into each score category or grouping, as the sketch below illustrates.
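As a quick illustration of that kind of analysis (a sketch using invented scores, not data from any real test): on an n-item test scored right/wrong, the only possible scores are 0 through n, so tallying how many students landed on each score is straightforward.

```python
from collections import Counter

# Sketch: on an n-item multiple-choice test scored right/wrong, the only
# possible scores are 0..n, so the score distribution is easy to tabulate.
# The student scores below are invented for illustration.
num_items = 12
student_scores = [10, 7, 12, 9, 7, 11, 5, 10, 9, 9]

distribution = Counter(student_scores)
for score in range(num_items + 1):
    count = distribution.get(score, 0)
    print(f"{score:2d}/{num_items}: {'*' * count}  ({count} students)")
```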

Chapter 7 Pondertime (p. 184, #1, #2)
1.  What do you think has been the instructional impact, if any, of the widespread incorporation of student writing samples in high-stakes educational achievement tests used in numerous states?

Teachers who want their students to do well on high-stakes educational achievement tests are under enormous pressure to tailor their instruction to help students succeed on such tests.  For student writing samples, this means teachers deliver instruction that has students practice the skills needed to produce high-quality writing samples.  Tips such as using the five-paragraph essay format, narrowing the topic quickly and precisely, and writing with the right combination of detail and brevity become key.  Another instructional impact may be the loss of time for other classroom topics as preparation is made for doing well on the writing-sample portions.

According to Wikipedia, the writing section of the SAT was added in March 2005 (almost 20 years after I took the SAT).  I should do some research on whether the data since that time have shown the writing section to be valuable or useful.


2.  How would you contrast short-answer items and essay items with respect to their elicited levels of cognitive behavior?  Are there differences in the kinds of cognitive demands called for by the two item types?  If so, what are they?

Short-answer items demand limited levels of cognitive behavior, at least in comparison to essay items.  The longer form demands, on average, more recall and reasoning from the student, or at least more regurgitation of opinions heard or given during classroom discussions.  The real issue, I think, is the cognitive demands.  A short-answer item, by definition, may be a mere phrase or paragraph, and thus may not require much application of new learning or thinking (i.e., extrapolation or interpolation of the sources and opinions of others).  The essay-based test question forces a student to mentally perambulate through sources and opinions, hearsay and argumentation, and prove that they themselves can either replicate a standard argument or come up with a new one.  In this case I am equating “argument” with a chain of assertions, more or less supported by logic or sources, that necessitates some development (i.e., proposition, inference, conclusion).

That perambulation requires considerable cognitive effort in order not to fall into some slough of whimsy or crevasse of fallacy.  A student who succeeds on an essay item has definitely met higher cognitive demands than those of the short-answer question.  An interesting follow-up question might be:  “Is more better?”

According to the Wikipedia article on the SAT (2012), the writing section on that test has been studied since its inception in 2005, and there is some evidence that the longer the essay answer, the better the score.  This may be an artifact that favors the rambling student!
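If I ever follow up on that, a quick way to eyeball the length-score relationship might look like this (a sketch using invented word counts and scores, not actual SAT data):

```python
from statistics import correlation  # available in Python 3.10+

# Sketch with invented numbers (not actual SAT data): does essay length
# track with essay score?  A positive Pearson r would suggest that longer
# essays tend to receive higher scores.
word_counts  = [150, 220, 310, 180, 400, 260, 350, 120]
essay_scores = [  3,   4,   5,   3,   6,   4,   5,   2]

r = correlation(word_counts, essay_scores)
print(f"Pearson r between essay length and score: {r:.2f}")
```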


References

Popham, W. J. (2011). Classroom assessment: What teachers need to know (6th ed.). Boston, MA: Pearson Education, Inc.

SAT. (2012). In Wikipedia. Retrieved February 10, 2012, from http://en.wikipedia.org/wiki/SAT


Comments

  • tanshu  On February 12, 2012 at 9:02 pm

    I definitely agree with you about selected response questions. One of my students who has missed about 3 weeks of school took the benchmark and managed to get a 10/12. I asked him how he managed to do that and he responded “I just guessed.” We are always told to support our argument when we write essays, and in order to do that we have to write more. I wouldn’t say rambling makes your essay better but if you get to the point with evidence, they will give you a good score.
