Increasing Impact of Coursework Through Deep Analytics

Wednesday, 17 December 2014
Lev Horodyskyj1, David Schönstein2, Sanlyn Buxner3, Steven C Semken1 and Ariel D Anbar1, (1)Arizona State University, School of Earth and Space Exploration, Tempe, AZ, United States, (2)Smart Sparrow, Sydney, Australia, (3)Planetary Science Institute, Tucson, AZ, United States
Over the past few years, ASU has developed the online astrobiology lab course Habitable Worlds, which has been offered to more than 1,500 students across seven semesters. The course is delivered through Smart Sparrow's intelligent tutoring system, which records student answers, time spent on each question, simulation setups, and additional data that we refer to as "analytics." As development of the course has stabilized, we have been able to devote more time to analyzing these data, extracting patterns of student behavior and tracking how those patterns have changed as the course has evolved.

During the two most recent semesters, we administered pre- and post-tests of content knowledge related to the greenhouse effect to assess changes in students' understanding. Using the results of the Fall 2013 content assessment together with a step-by-step analysis of every activity drawn from the course platform analytics, we identified problematic concepts and lesson elements, which we redesigned for the following semester. In Spring 2014 we observed a statistically significant improvement from pre- to post-instruction. Preliminary results suggest that several interactive activities, which replaced written and spoken content, contributed to this positive outcome. Our study demonstrates the value of deep analytics for thorough analysis of student results and rapid iteration, allowing significantly improved exercises to be redeployed quickly.
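The abstract does not specify the statistical test used for the pre/post comparison, but one common choice for matched pre- and post-instruction scores is a paired t-test. The sketch below, with entirely synthetic scores (not the study's data) and a hypothetical helper name, shows how such a gain comparison can be computed with only the Python standard library:

```python
import math
from statistics import mean, stdev

def paired_t_statistic(pre, post):
    """Paired t-statistic for matched pre/post scores (hypothetical helper).

    A large positive value indicates post-instruction scores exceed
    pre-instruction scores by more than chance alone would suggest.
    """
    diffs = [b - a for a, b in zip(pre, post)]  # per-student gains
    n = len(diffs)
    return mean(diffs) / (stdev(diffs) / math.sqrt(n))

# Synthetic pre/post scores for illustration only -- not real course data.
pre = [4, 5, 3, 6, 4, 5, 4, 3]
post = [6, 7, 5, 7, 6, 6, 6, 5]
print(round(paired_t_statistic(pre, post), 2))  # → 10.69
```

In practice the t-statistic would be compared against a t-distribution with n − 1 degrees of freedom (e.g. via `scipy.stats.ttest_rel`) to obtain the significance level reported for the Spring 2014 results.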

Which misconceptions students hold and retain varies from student to student, although certain patterns do emerge in the class as a whole. These patterns are visible in students' discussion board behavior, the types of answers they submit, and the mistakes they make repeatedly. By interrogating this wealth of data, we seek to identify the patterns that outstanding, struggling, and failing students display and to determine how early in the class these patterns can be detected. If such patterns can be detected early in the semester, instructors can intervene sooner, prodding unmotivated students while devoting focused attention to students who struggle yet remain interested in the subject matter. This offers a marked improvement over the more common practice of waiting (often futilely) for students to seek help.
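The early-intervention idea above can be sketched as a simple rule-based flag over per-student analytics features. The feature names, thresholds, and labels below are illustrative assumptions, not the study's actual model; the point is only that engagement signals (attempts, forum activity) can separate "struggling but engaged" students, who merit focused help, from disengaged ones, who need prodding:

```python
def flag_student(attempts_per_question, error_rate, forum_posts):
    """Return a coarse early-semester label from hypothetical analytics
    features: average answer attempts per question, fraction of incorrect
    submissions, and number of discussion board posts."""
    # Re-attempting questions or posting to the forum is read as engagement.
    engaged = attempts_per_question >= 1.5 or forum_posts >= 2
    if error_rate >= 0.5 and engaged:
        return "struggling-engaged"   # candidate for focused instructor help
    if error_rate >= 0.5:
        return "unmotivated"          # candidate for a motivational nudge
    return "on-track"

print(flag_student(2.1, 0.6, 3))  # → struggling-engaged
print(flag_student(1.0, 0.7, 0))  # → unmotivated
print(flag_student(1.2, 0.2, 1))  # → on-track
```

A production early-warning system would likely replace these hand-set thresholds with a classifier trained on prior semesters' outcomes, but the rule form makes the intervention logic easy to inspect.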