Image Features Influence Reaction Time: A Learned Probabilistic Perceptual Model for Saccade Latency
SIGGRAPH · May 5, 2022 · Best Paper
We aim to ask and answer an essential question: "How quickly do we react after
observing a displayed visual target?" To this end, we present psychophysical
studies that characterize the remarkable disconnect between human saccadic
behaviors and spatial visual acuity. Building on the results of our studies, we
develop a perceptual model to predict temporal gaze behavior, particularly
saccadic latency, as a function of the statistics of a displayed image.
Specifically, we implement a neurologically-inspired probabilistic model that
mimics the accumulation of confidence that leads to a perceptual decision. We
validate our model with a series of objective measurements and user studies
using an eye-tracked VR display. The results demonstrate that our model
prediction is in statistical alignment with real-world human behavior. Further,
we establish that many sub-threshold image modifications commonly introduced in
graphics pipelines may significantly alter human reaction timing, even if the
differences are visually undetectable. Finally, we show that our model can
serve as a metric to predict and alter the reaction latency of users in interactive
computer graphics applications, and may thus improve gaze-contingent rendering,
design of virtual experiences, and player performance in e-sports. We
illustrate this with two examples: estimating competition fairness in a video
game with two different team colors, and tuning display viewing distance to
minimize player reaction time.
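
The abstract describes a neurologically inspired accumulation of confidence toward a perceptual decision. The sketch below is a minimal, illustrative evidence-accumulation (drift-diffusion style) simulation of saccade latency; the drift rate, threshold, and noise values are placeholder assumptions rather than the paper's fitted parameters, and the link from image statistics to drift rate is only hinted at in a comment.

```python
import numpy as np

def simulate_saccade_latency(drift_rate, threshold=1.0, noise_sd=0.1,
                             dt=0.001, non_decision_time=0.05, max_time=1.0):
    """Simulate one trial of an accumulation-to-threshold decision process
    and return the resulting saccade latency in seconds.

    drift_rate: mean rate of confidence accumulation. In the spirit of the
    paper, this would be driven by image statistics (e.g., target contrast,
    spatial frequency, eccentricity); here it is simply a free parameter.
    """
    evidence = 0.0
    t = 0.0
    while evidence < threshold and t < max_time:
        # Confidence accrues with Gaussian noise at each time step.
        evidence += drift_rate * dt + np.random.normal(0.0, noise_sd * np.sqrt(dt))
        t += dt
    return non_decision_time + t  # add a fixed sensory/motor delay

# Example: a stronger stimulus (larger drift rate) should yield shorter,
# less variable latencies than a weaker one.
n_trials = 500
weak_target = [simulate_saccade_latency(drift_rate=4.0) for _ in range(n_trials)]
strong_target = [simulate_saccade_latency(drift_rate=8.0) for _ in range(n_trials)]
print(f"weak-target mean latency:   {np.mean(weak_target) * 1000:.0f} ms")
print(f"strong-target mean latency: {np.mean(strong_target) * 1000:.0f} ms")
```

Running many such trials yields a latency distribution per stimulus condition, which is the kind of probabilistic prediction the model compares against eye-tracked human data.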