Engaging Students through Video: Integrating Assessment and Instrumentation
Citation: MacWilliam, Tommy, R. J. Aquino, and David J. Malan. Forthcoming. Engaging students through video: Integrating assessment and instrumentation. In 18th Annual ACM Conference on Innovation and Technology in Computer Science Education. Canterbury, England. July 2013.
Abstract: CS50 is Harvard’s introductory course for majors and non-majors alike. For years, we have posted videos of the course’s lectures and sections online for both review and distance education. But students’ experience with these videos has been historically passive. Students have been able to watch the
course’s content on demand, rewinding and fast-forwarding at will, but they have not had means to engage interactively with the content or to check their understanding of material while watching videos. Furthermore, while we collected basic usage data (e.g., how many times a video was viewed), we lacked detailed analytics
describing, for example, which portions of a video were commonly skipped or watched multiple times by students.
To make videos more immersive and engaging for students, we developed CS50 Video, an open-source video player for desktop and mobile devices. CS50 Video allows instructors to embed assessment questions directly into the video player, to be answered by students at their own pace or at specific points in time. CS50 Video also allows students to search over video transcripts to find content easily, and to view videos at variable playback speeds (in order to make videos more accessible for ESL learners). Finally, CS50 Video integrates with third-party analytics solutions so that instructors can view detailed usage statistics describing how students interact with videos (e.g., which videos or portions of videos are commonly watched or skipped over).
We have deployed CS50 Video to students taking CS50 online and have obtained preliminary results. Because CS50 Video stores responses to questions server-side, we have been able to track students’ performance on in-video assessments. Thus far, we have observed that only 28% of students who watch online videos have
engaged with assessment questions. Students who answer an assessment question incorrectly on their first attempt will often try again until reaching a correct answer, with 84.5% of correct answers reached in at most three attempts. We next plan to analyze the effects of in-video assessments on students’ mastery of material and introduce A/B-testing functionality for questions. We also plan to use students’ performance on assessments to understand the topics with which students struggle.
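The timestamped-question mechanism described above can be sketched roughly as follows. This is a hypothetical illustration only, not CS50 Video's actual API: the object shape and function name are assumptions. The idea is that questions are keyed to playback positions, and on each time update the player checks whether an unanswered question has become due, pausing to display it and recording the response server-side.

```javascript
// Hypothetical sketch of in-video assessment questions keyed to
// timestamps; names and structure are illustrative, not CS50 Video's API.
const questions = [
  { time: 90,  prompt: "What does printf return?",   answered: false },
  { time: 240, prompt: "How large is an int, in bytes?", answered: false },
];

// Return the first question that is due at the current playback position
// (within one second of its timestamp) and has not yet been answered;
// the player would then pause, display it, and store the response.
function dueQuestion(questions, currentTime) {
  return (
    questions.find(
      (q) => !q.answered && currentTime >= q.time && currentTime < q.time + 1
    ) || null
  );
}
```

In practice the player would attach `dueQuestion` to the video element's `timeupdate` event and mark each question as answered once a response has been submitted, so it is not shown again on rewind.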
Citable link to this page: http://nrs.harvard.edu/urn-3:HUL.InstRepos:10629764