Poster Session B, Wednesday, May 20, 2:30 – 3:15 pm
Board 8
Virtual Reality Balances Precise Control and Ecological Validity in Vision Science
Jade Guénot1, Andrew Freedman1, Preeti Verghese1; 1Smith-Kettlewell Eye Research Institute, San Francisco, United States
Virtual reality (VR) is a valuable tool for vision science, providing a middle ground between exquisitely controlled but limited laboratory conditions and the complexity of our 3D world. This is particularly important for clinical populations whose everyday visual challenges are poorly captured by static or 2D tasks. VR enables continuous recording of eye, head, hand, and foot movements during active, engaging interactions within safe immersive environments, while also allowing controlled simulation of visual impairments. Despite these advantages, stimulus control has been a major challenge in VR experiments; new tools such as the Perception Toolbox for Virtual Reality (PTVR) ease this process by allowing experiments to be designed in Python and stimuli and responses to be specified by angular subtense. We illustrate VR’s potential with two experiments on macular degeneration (MD) participants and controls, tested with and without gaze-contingent artificial scotomas. The first examines hand-eye coordination in 3D: participants (4 MD, 9 controls) followed a butterfly with gaze alone and with a controller-anchored net, while gaze and hand positions were recorded. The second investigates visual search during walking: participants (2 MD, 9 controls) walked through a virtual conference room while searching for a target object among distractors, and search efficiency was assessed via rotational head movements and search performance. In the butterfly-catching experiment, combining hand and eye movements enhanced gaze precision in MD participants, reducing angular error by 2° and pursuit latency by 36 ms on average. In the search experiment, MD participants were less efficient than controls, requiring twice as much head rotation to locate the target. These examples demonstrate how VR, combined with tools like PTVR, enables the study of visual function in ecological yet controlled environments. This approach has the potential to identify how visual impairments affect everyday behavior, and which factors facilitate interaction with the surrounding world.
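
To make the abstract's point about specifying stimuli "by angular subtense" concrete: this is not PTVR's actual API, but a minimal Python sketch of the underlying geometry, converting between a desired visual angle and the metric size an object must have at a given viewing distance. The function names are illustrative, not taken from PTVR.

    import math

    def size_from_angle(angle_deg: float, distance_m: float) -> float:
        """Metric size that subtends `angle_deg` of visual angle at `distance_m`.

        Uses the exact relation size = 2 * d * tan(theta / 2); for small theta
        this reduces to the small-angle approximation size ~= d * theta (radians).
        """
        return 2.0 * distance_m * math.tan(math.radians(angle_deg) / 2.0)

    def angle_from_size(size_m: float, distance_m: float) -> float:
        """Visual angle (degrees) subtended by an object of `size_m` at `distance_m`."""
        return math.degrees(2.0 * math.atan(size_m / (2.0 * distance_m)))

    # Example: a target meant to subtend 2 degrees at 1 m must be ~3.5 cm wide.
    print(f"{size_from_angle(2.0, 1.0) * 100:.2f} cm")  # 3.49 cm

Specifying stimuli this way keeps retinal size constant regardless of where an object is placed in the 3D scene, which is what makes psychophysical comparisons across viewing distances meaningful.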
Acknowledgements: NIH grant R01 EY27390