Failure of online quizzing to improve performance in introductory psychology courses

Document Type

Article

Publication Date

6-2015

Publisher

American Psychological Association

Abstract

Online quizzing tools cost both money (students) and time (faculty and students) to implement; if online quizzes boost class performance, then the extra cost can be justified. Although many studies have found that students who use online quizzes do better on tests than their classmates who do not, these studies are frequently confounded by a variety of factors. For example, some studies allow students to self-select into user and nonuser groups (e.g., Grimstad & Grabe, 2004), leaving open the possibility that stronger students are simply more likely to use online quizzing tools. Others have found that online quizzing produces only marginal and selective improvement in course performance (e.g., Bartini, 2008). In contrast to these equivocal findings, other areas of the literature reveal robust evidence for a number of best practices for improving student learning (see Dunn, Saville, Baker, & Marek, 2013), such as testing and spacing effects. Our objective was to determine whether online study tools could produce these effects in actual undergraduate classes rather than in the laboratory. Undergraduate students enrolled in introductory psychology courses were given access to publisher-provided content (Norton) and were required to use online quizzing. In the first experiment, the spacing of the online content was manipulated. In the second experiment, the requirement for completing online quizzing was manipulated within subjects. Online tools did not improve performance on in-class quizzes, nor did they influence in-class exam performance. The findings of our study underscore the importance of creating online content that demonstrably improves student learning.
