Authors

Luis Andres Lesmes and Michael Dorr

Abstract

We present Quantitative Visual Acuity (qVA), a novel active-learning algorithm for assessing visual acuity. It combines Monte Carlo simulations and an information-maximization strategy for stimulus selection with Bayesian inference to iteratively update the best estimate of the true underlying function. Compared to the state of the art, qVA uses a richer model of observer behaviour, and we use simulations to show its excellent test-retest repeatability and ability to detect change. In simulations of clinical studies with 50 "control" subjects demonstrating no visual change and 50 "treatment" subjects demonstrating a 0.10 logMAR change (corresponding to one line of the gold-standard ETDRS letter chart), qVA detected visual change with an AUC of 93%, compared to 78% for the ETDRS standard, given the same number of presented letters.

Read preprint here
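
To make the active-learning loop described in the abstract more concrete, below is a minimal, hypothetical sketch of a Bayesian adaptive acuity test: a posterior over the parameters of a simple two-parameter psychometric function is updated after each letter, and the next letter size is chosen to minimise expected posterior entropy (i.e. maximise expected information gain). The psychometric model, parameter grids, and all names here are illustrative assumptions only, not the qVA implementation, which uses a richer observer model as described in the preprint.

```python
import numpy as np

# Hypothetical sketch of a Bayesian active-learning acuity test (not the qVA
# implementation). A two-parameter psychometric function (threshold, slope)
# stands in for qVA's richer observer model; all names are illustrative.

def p_correct(size, threshold, slope, guess_rate=0.1):
    """Probability of correctly identifying a letter of a given logMAR size."""
    # Logistic psychometric function with a guessing floor.
    p = 1.0 / (1.0 + np.exp(-(size - threshold) / slope))
    return guess_rate + (1.0 - guess_rate) * p

# Discretised parameter grid and uniform prior over (threshold, slope).
thresholds = np.linspace(-0.3, 1.0, 66)      # acuity threshold (logMAR)
slopes = np.linspace(0.02, 0.2, 10)          # psychometric slope (logMAR)
TH, SL = np.meshgrid(thresholds, slopes, indexing="ij")
prior = np.full(TH.shape, 1.0 / TH.size)

candidate_sizes = np.linspace(-0.3, 1.0, 27)  # candidate letter sizes (logMAR)

def entropy(p):
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def select_stimulus(posterior):
    """Choose the letter size that minimises expected posterior entropy."""
    best_size, best_h = candidate_sizes[0], np.inf
    for s in candidate_sizes:
        pc = p_correct(s, TH, SL)                      # P(correct | parameters)
        p_c = np.sum(posterior * pc)                   # marginal P(correct)
        post_correct = posterior * pc / p_c            # posterior if correct
        post_wrong = posterior * (1 - pc) / (1 - p_c)  # posterior if incorrect
        h = p_c * entropy(post_correct) + (1 - p_c) * entropy(post_wrong)
        if h < best_h:
            best_size, best_h = s, h
    return best_size

def update(posterior, size, correct):
    """Bayesian update of the parameter posterior after one response."""
    likelihood = p_correct(size, TH, SL) if correct else 1 - p_correct(size, TH, SL)
    post = posterior * likelihood
    return post / post.sum()

# Example run: 30 simulated letters for an observer with an assumed true
# threshold of 0.4 logMAR and slope of 0.05.
rng = np.random.default_rng(0)
posterior = prior
for _ in range(30):
    size = select_stimulus(posterior)
    correct = rng.random() < p_correct(size, 0.4, 0.05)
    posterior = update(posterior, size, correct)

acuity_estimate = np.sum(posterior * TH)   # posterior-mean acuity (logMAR)
print(f"Estimated acuity: {acuity_estimate:.2f} logMAR")
```

In this sketch the posterior mean over the threshold parameter serves as the acuity estimate; repeating the simulated run twice per observer would give a simple way to probe test-retest repeatability, in the spirit of the simulations summarised in the abstract.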