Knowledge extraction from pointer movements and its application to detect uncertainty

Catia Cepeda, Maria Camila Dias, Dina Rindlisbacher, Hugo Gamboa, Marcus Cheetham

Research output: Contribution to journal › Article › peer-review


Abstract

Pointer-tracking methods can capture a real-time trace at high spatio-temporal resolution of users' pointer interactions with a graphical user interface. This trace is potentially valuable for research on human-computer interaction (HCI) and for investigating perceptual, cognitive and affective processes during HCI. However, little research has reported spatio-temporal pointer features for the purpose of tracking pointer movements in on-line surveys. In two studies, we identified a set of pointer features and movement patterns and showed that these can be easily distinguished. In a third study, we explored the feasibility of using patterns of interactive pointer movements, or micro-behaviours, to detect response uncertainty. Using logistic regression and k-fold cross-validation in model training and testing, the uncertainty model achieved an estimated performance accuracy of 81%. These findings suggest that micro-behaviours provide a promising approach toward developing a better understanding of the relationship between the dynamics of pointer movements and underlying perceptual, cognitive and affective psychological mechanisms.
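The modelling pipeline described in the abstract — deriving spatio-temporal features from pointer traces, then training a logistic-regression classifier evaluated with k-fold cross-validation — can be sketched as follows. This is a minimal illustrative sketch, not the authors' code: the specific features (path length, duration, mean speed), the from-scratch gradient-descent training loop, and all function names are assumptions for illustration.

```python
# Hypothetical sketch (not the published implementation): simple
# spatio-temporal pointer features plus logistic regression with
# k-fold cross-validation, using only the standard library.
import math
import random

def features(trace):
    """trace: list of (x, y, t) samples. Returns [path_length, duration, mean_speed]."""
    path = sum(math.hypot(x2 - x1, y2 - y1)
               for (x1, y1, _), (x2, y2, _) in zip(trace, trace[1:]))
    duration = trace[-1][2] - trace[0][2]
    return [path, duration, path / duration if duration else 0.0]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train_logreg(X, y, lr=0.1, epochs=200):
    """Fit logistic regression by stochastic gradient descent on log-loss."""
    w = [0.0] * (len(X[0]) + 1)  # bias followed by one weight per feature
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            p = sigmoid(w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi)))
            g = p - yi  # gradient of the log-loss w.r.t. the logit
            w[0] -= lr * g
            for j, xj in enumerate(xi):
                w[j + 1] -= lr * g * xj
    return w

def predict(w, xi):
    return int(sigmoid(w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi))) >= 0.5)

def kfold_accuracy(X, y, k=5, seed=0):
    """Estimate held-out accuracy with k-fold cross-validation."""
    idx = list(range(len(X)))
    random.Random(seed).shuffle(idx)
    folds = [idx[i::k] for i in range(k)]
    correct = 0
    for fold in folds:
        train = [i for i in idx if i not in fold]
        w = train_logreg([X[i] for i in train], [y[i] for i in train])
        correct += sum(predict(w, X[i]) == y[i] for i in fold)
    return correct / len(X)
```

In practice one would extract one feature vector per survey question from the recorded trace, label it certain/uncertain, and report the mean cross-validated accuracy, analogous to the 81% figure reported in the abstract.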

Original language: English
Article number: e05873
Journal: Heliyon
Volume: 7
Issue number: 1
DOIs
Publication status: Published - Jan 2021

Keywords

  • Decision uncertainty
  • Human-computer interaction
  • Machine learning
  • Mouse movement dynamics
  • On-line survey
  • Pointer-tracking
  • Spatio-temporal features

