Monday, October 21, 2013

Shout out for multiple choice tests

Don't drop your jaw, your drink, or your keyboard, but today's post is in praise of the multiple-choice test.

Readers know that I've pondered "best practice" with quiz tools in recent years, prompted by the surge in LMS adoption, and I've promoted the rich feature set in quiz options.

So many options: randomize questions, randomize answers, allow multiple attempts, keep the highest attempt, plus the wide range of question types now available. It changes the game, certainly, but it's still a "multiple choice test" (said with disdain in my best teaching-excellence voice), and I cautioned about weighing the time savings for us ("It grades itself!") against meaningful learning for students.
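
For readers who like to see the knobs themselves, here is a minimal sketch of those options expressed as settings for a Canvas-style quiz endpoint. The key names follow Canvas's documented Quizzes API, but treat the exact names and values as an assumption to check against your own LMS.

```python
# Sketch only: quiz options as Canvas-style "create a quiz" parameters.
# Verify key names against your institution's LMS API documentation.
quiz_settings = {
    "quiz[title]": "Week 8 self-assessment",
    "quiz[quiz_type]": "assignment",         # graded quiz, feeds the gradebook
    "quiz[shuffle_answers]": True,           # randomize answer order per student
    "quiz[allowed_attempts]": 3,             # allow multiple attempts
    "quiz[scoring_policy]": "keep_highest",  # record the highest attempt
}
```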

I was torn. I loved allowing multiple attempts as a self-assessment tool, and I loved the tie to research on the value of in-situ, timely feedback.

So, ambiguity in place, I was recently running a workshop exploring the "flipped classroom." Participants who had tried flipping noted the difficulty of motivating students to do the work before coming to class - a requirement in flipping. We talked about solutions, and the best one offered? Multiple-choice online quizzes that close before the start of class. A required pre-quiz gives students an incentive to prepare, read, and study before class. Participants offered the idea hesitantly, since it meant recommending (yes, that inner voice again) multiple-choice exams.
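
Here is a minimal sketch of that setup, assuming a Canvas-style REST endpoint and an instructor API token. The base URL, course ID, token, and class time below are placeholders; the parameter names follow the Canvas Quizzes API, but double-check them against your own instance.

```python
import requests
from datetime import datetime, timedelta, timezone

# Placeholders - substitute your own instance, course, and token.
BASE_URL = "https://canvas.example.edu/api/v1"
COURSE_ID = 12345
TOKEN = "YOUR_API_TOKEN"

# Class meets Monday at 09:00 UTC; lock the pre-quiz 30 minutes before.
class_start = datetime(2013, 10, 28, 9, 0, tzinfo=timezone.utc)
lock_time = (class_start - timedelta(minutes=30)).isoformat()

payload = {
    "quiz[title]": "Pre-class reading check",
    "quiz[quiz_type]": "assignment",
    "quiz[due_at]": lock_time,
    "quiz[lock_at]": lock_time,   # no submissions accepted after this time
    "quiz[published]": True,
    # The randomization/attempt options sketched earlier can be added here too.
}

response = requests.post(
    f"{BASE_URL}/courses/{COURSE_ID}/quizzes",
    headers={"Authorization": f"Bearer {TOKEN}"},
    data=payload,
)
response.raise_for_status()
print("Created pre-class quiz:", response.json()["id"])
```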

OK, here's the news for which my stories prime the pump: researchers publishing recently in Psychological Science (Little, Bjork, Bjork, & Angello, 2012) make a convincing case that, when constructed properly, multiple-choice tests can and do engage productive retrieval processes, and do so effectively. More than that, the authors report that multiple-choice practice can actually support learning more effectively than cued recall: multiple-choice questions aided recall of information pertaining to the incorrect alternatives, whereas cued-recall questions did not.

Read it and weep, ancient souls still fighting the adoption of the LMS and online learning resources. Combining the deep feature set of online tests with the evidence for the value of reinforcing retrieval and enhancing understanding of common incorrect choices puts me firmly in the "I love Canvas, I love the LMS, I love self-assessment" camp.

Thanks to ISPI for publicizing this work to the design community, and for those who can get past the paywall via your university library, here's the primary research.

GREAT reads. Guilt relievers. Emotional support for a new practice that saves teaching time and deepens learning.
