Inferring Cognitive Abilities from Response Times to Web-Administered Survey Items in a Population-Representative Sample

General Information

Author
Doerte U. Junghaenel, Stefan Schneider, Bart Orriens, Haomiao Jin, Pey-Jiuan Lee, Arie Kapteyn, Erik Meijer, Elizabeth Zelinski, Raymond Hernandez, an
Publication Type
Journal paper
Outlet
Journal of Intelligence
Year
2022
Abstract
Monitoring of cognitive abilities in large-scale survey research is receiving increasing attention. Conventional cognitive testing, however, is often impractical at the population level, highlighting the need for alternative means of cognitive assessment. We evaluated whether response times (RTs) to online survey items could be used to infer cognitive abilities. We analyzed >5 million survey item RTs from >6000 individuals, administered over 6.5 years in an internet panel, together with cognitive tests (numerical reasoning, verbal reasoning, task switching/inhibitory control). We derived measures of mean RT and intraindividual RT variability from a multilevel location-scale model, as well as from an expanded version that separated intraindividual RT variability into systematic RT adjustments (variation of RTs with item time intensities) and residual intraindividual RT variability (residual error in RTs). RT measures from the location-scale model showed weak associations with cognitive test scores. However, RT measures from the expanded model explained 22–26% of the variance in cognitive scores and had prospective associations with cognitive assessments over lag periods of at least 6.5 years (mean RTs), 4.5 years (systematic RT adjustments), and 1 year (residual RT variability). Our findings suggest that RTs in online surveys may be useful for gaining information about cognitive abilities in large-scale survey research.
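
Illustrative sketch (not from the paper): the abstract describes deriving person-level measures of mean RT, systematic RT adjustment, and residual RT variability from a multilevel model of item RTs. The Python code below approximates that idea with a linear mixed model fitted to simulated log RTs using statsmodels; the simulated data, variable names, and model specification are assumptions for illustration, not the authors' exact multilevel location-scale model.

# Sketch only: simulate item-level log RTs with person differences in mean
# speed (location), sensitivity to item time intensity, and residual noise,
# then recover person-level measures from a linear mixed model.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_persons, n_items = 200, 50
time_intensity = rng.normal(0.0, 1.0, n_items)   # assumed item-level predictor

rows = []
for p in range(n_persons):
    person_mean = rng.normal(0.0, 0.3)            # person deviation in mean log RT
    adjustment = 1.0 + rng.normal(0.0, 0.2)       # "systematic RT adjustment"
    resid_sd = np.exp(rng.normal(-1.0, 0.3))      # "residual RT variability"
    for j in range(n_items):
        log_rt = (2.0 + person_mean + adjustment * time_intensity[j]
                  + rng.normal(0.0, resid_sd))
        rows.append({"person": p, "time_intensity": time_intensity[j],
                     "log_rt": log_rt})
df = pd.DataFrame(rows)

# Random intercept (person mean RT) and random slope on item time intensity
# (systematic RT adjustment) for each person.
model = smf.mixedlm("log_rt ~ time_intensity", df, groups=df["person"],
                    re_formula="~time_intensity")
fit = model.fit()

# Person-level measures: estimated random effects (intercept and slope
# deviations) plus the per-person standard deviation of the model residuals.
person_measures = pd.DataFrame(fit.random_effects).T
person_measures.columns = ["mean_rt_dev", "rt_adjustment_dev"]
df["resid"] = np.asarray(fit.resid)
person_measures["residual_rt_sd"] = df.groupby("person")["resid"].std()
print(person_measures.head())

Note that in the expanded model described in the abstract, residual RT variability is modeled directly (a scale submodel); the sketch above simply summarizes it afterwards as the per-person standard deviation of the residuals.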