Poster Session B, Wednesday, May 20, 2:30 – 3:15 pm
Board 5
Perceptual Consequences of Visual Feature Reduction in Naturalistic VR
Jiaming Xu1, Sai Prashanth Raja Sundaram1, Youjin Oh1, Robbe L. T. Goris1, Mary Hayhoe1; 1UT Austin
Efficient prediction of visual input is critical for stable perception during self-motion and has direct implications for VR rendering, display optimization, and perceptually guided compression. Under natural viewing conditions, predictive structure arises from the spatial distribution of luminance, chromatic, and binocular information. In practical VR systems, the amount of visual information that can be presented is constrained by hardware limits such as resolution, bandwidth, and latency, requiring prioritization of some visual features over others. Yet it remains unclear which visual features are essential and which can be reduced without compromising predictability. Here, we quantify the contribution of individual visual features to predictive structure under naturalistic VR viewing conditions. We developed a controlled VR psychophysics pipeline to assess how color and binocular disparity shape predictive perceptual representations during egocentric motion. Real-world environments were reconstructed from head-mounted locomotion videos using Gaussian Splatting, preserving fine-grained 3D scene structure. The original camera trajectories were replayed in VR, enabling precise manipulation of visual features while maintaining natural motion statistics. Three stimulus conditions were tested: fully naturalistic, color removed (grayscale), and color and stereo removed (monocular grayscale). Predictive structure was quantified behaviorally using an AXB discrimination task to estimate perceptual distances between all pairs of video frames. These distances were used to construct perceptual trajectories and to compute their curvature, a geometric summary of temporal predictability, with lower curvature indicating more predictable perceptual transitions. Measurements were collected in two environments with distinct spatial layouts. The results show that the perceptual impact of removing chromatic or binocular information depends on scene structure, indicating that predictive perceptual representations flexibly adapt to scene statistics to support context-dependent temporal prediction. The proposed VR-based psychophysical framework provides a practical tool for quantifying the relative contribution of visual features in immersive display systems and informing perceptually optimized rendering decisions.
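
For readers wanting a concrete sense of the curvature measure, the sketch below illustrates one plausible route from a matrix of pairwise perceptual distances (as estimated from AXB judgments) to a trajectory curvature: embed the frames in a low-dimensional space with classical multidimensional scaling, then average the angles between successive displacement vectors. The function names, the MDS embedding step, and the embedding dimensionality are illustrative assumptions, not the authors' actual pipeline.

```python
import numpy as np

def mds_embed(dist, dim=3):
    """Classical MDS: embed points from a pairwise distance matrix.
    NOTE: using an MDS embedding here is an illustrative assumption,
    not necessarily the authors' procedure."""
    n = dist.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n          # centering matrix
    B = -0.5 * J @ (dist ** 2) @ J               # double-centered Gram matrix
    vals, vecs = np.linalg.eigh(B)
    order = np.argsort(vals)[::-1][:dim]         # keep the largest eigenvalues
    vals_top = np.clip(vals[order], 0, None)
    return vecs[:, order] * np.sqrt(vals_top)    # n x dim coordinates

def mean_curvature(points):
    """Average discrete curvature of a trajectory: the mean angle (radians)
    between successive displacement vectors. Lower values = straighter,
    i.e. more temporally predictable, perceptual trajectories."""
    diffs = np.diff(points, axis=0)
    diffs = diffs / np.linalg.norm(diffs, axis=1, keepdims=True)
    cosines = np.sum(diffs[:-1] * diffs[1:], axis=1)
    angles = np.arccos(np.clip(cosines, -1.0, 1.0))
    return float(np.mean(angles))

# Example with synthetic data standing in for a 20-frame perceptual distance matrix.
rng = np.random.default_rng(0)
latent = np.cumsum(rng.normal(size=(20, 3)), axis=0)             # a random walk
dist = np.linalg.norm(latent[:, None] - latent[None, :], axis=-1)
traj = mds_embed(dist, dim=3)
print(f"mean curvature: {np.degrees(mean_curvature(traj)):.1f} deg")
```

The angle-between-successive-steps definition follows the general logic of perceptual-straightening analyses; the exact mapping from AXB distances to curvature used in the study may differ.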



