Delen, E., Liew, J., & Willson, V. (2014). Effects of interactivity and instructional scaffolding on learning: Self-regulation in online video-based environments. Computers & Education, 78, 312–320. https://doi.org/10.1016/j.compedu.2014.06.018
This study used an experimental design to compare a control group viewing common video with an experimental group interacting with enhanced video. Common video is defined here, as elsewhere in the literature, as video with standard playback controls (termed micro-level activities). The enhanced video added interactivity features for the experimental group, including note-taking affordances, supplemental resources, and practice questions (termed macro-level activities). Three data sources were collected: a survey of student self-regulation (the SRSI instrument), a content recall test, and the frequency of use of the interactive features. Results showed that the interactive environment produced superior learning performance and, among graduate students, higher self-regulation.
This study has several strong elements. Structurally, the clearly delineated headings for the literature review strands, research questions, methods, and results gave the reader clarity. While this might be expected of all research, it is surprisingly absent from some studies. The experimental design was also well documented. In terms of results, the three data streams supported triangulation, making the findings richer.
I found this study useful for several reasons. First, the researchers created an interface; I am always interested in research that produces a tangible product that is then tested for effectiveness. Second, and more importantly, the topic is directly related to my work and research interests. At our institution, we do a great deal of lecture capture for online viewing; in fact, in several of the colleges, students are not required to attend lecture. I am very interested in ways to enhance this online experience by incorporating active learning techniques. From a technological standpoint there are a few options, but they come at a cost from vendors. If we are to incorporate these features and take on that cost, we must have evidence that they will improve outcomes, and this study provides exactly that kind of evidence. Finally, I found the mention of the SRSI instrument useful; it is an instrument I might be able to use in my own research on similar questions.