Telling Delays: What Students’ Study Sign-up Timing in Sona Reveals

Published October 19, 2022

An overdue acknowledgement of a study on procrastination

One of the upshots of fielding so many questions from users around the world is the holistic, interdisciplinary perspective it provides. Partly, this perspective informs the continual development of our platform to accommodate the ever-changing landscape of scientific research and higher education. But it also enables us to occasionally get creative in how our current tools and services can be used, and to offer recommendations accordingly.

However, we’re not alone here. And, if we’re honest, some of the ideas for novel or “unconventional” uses of the Sona platform that we share originally came from the research literature. Previously, we couldn’t readily share these finds with Sona users more broadly, let alone point to the original sources for your personal perusal. That’s changed. So you can expect more posts like this one, which shares just such a nugget and gives credit where credit is due.

In this case, the credit goes to the authors of “Self-Report Measures of Procrastination Exhibit Inconsistent Concurrent Validity, Predictive Validity, and Psychometric Properties” (Vangsness, L., Voss, N. M., Maddox, N., Devereaux, V., & Martin, E., 2022, Frontiers in Psychology, 13, 784471).

This research (part of the Frontiers in Psychology collection New Perspectives on Procrastination) utilized multiple measures to carefully assess the relationship between self-reported measures of procrastination and behavioral measures. We found both of the behavioral measures particularly interesting, but we’ll let the authors speak for themselves here:

Behavioral Measures of Delay

In addition to providing self-reports, we also employed two behavioral measures of delay to test the predictive validity of the self-report measures of procrastination. These measures were derived from the dates of students’ research appointments recorded in the Sona Systems database (Sona Systems, 2021), our institution’s experiment management system. Research participation requirements are shared in syllabi, worth course credit, and have deadlines. The students in our sample all needed to complete 16 credits before the end of the semester. These students were aware that failing to complete their research credits would be disadvantageous—it would negatively impact their grade—and were reminded of this fact several times throughout the semester. Research appointments were available throughout the semester. Therefore, delaying the completion of a single research credit by a few weeks (especially early in the semester) would not place students in danger of failing the assignment. However, a pattern of delay exhibited across the course of the semester would, as research appointments are a limited resource. These circumstances gave rise to “weak” situations (Mischel, 1977) in which individual differences in pacing style were expected to be especially pronounced (Gevers et al., 2015). Thus, these measures represented a meaningful way to test predictive validity of these self-report instruments (Vangsness and Young, 2020).

The authors go on to specify in greater detail both the “Days to Completion” and the “Pacing Styles” measures, which you can read about in their paper (don’t worry, it’s open access!).

In addition to being an example of excellent research, this study contains a fantastic example of a truly creative use of Sona records to construct empirical measures for data collection. We want to congratulate the authors, and to let them know we will probably be borrowing from their ingenuity in the future. Thanks!
