Abstract
As virtual reality (VR) headsets become more comfortable and accessible, their growing use in high-stakes, time-critical settings raises concerns about fatigue. Fatigue impairs perceptual and cognitive functioning: it reduces oculomotor accuracy and visual focus, and can lead to an early decline in depth estimation performance. By augmenting the visual presentation with depth information cues, adaptive designs can reduce fatigue-related depth perception errors, enhancing safety and task effectiveness. Explicit cues present depth information directly through text or color, while subtle cues adjust scene properties, such as depth-dependent blur, to convey depth information implicitly without drawing overt attention. We examined how fatigue interacts with different cues in a 27-hour within-subject protocol. Across six overnight sessions (20:00–07:00), twenty-three participants completed a VR depth perception task at varying fatigue levels and under four cue conditions: no cue (baseline), text, color, and blur. Over the night, vigilance declined, sleepiness and mental effort increased, and simulator sickness rose before stabilizing, independent of cue condition. All augmented cues reduced depth estimation error relative to baseline. Text yielded the largest and most consistent accuracy gains, especially for farther targets and later sessions. At peak fatigue, response probability dipped for text but remained stable for blur and color, indicating an accuracy versus responsiveness trade-off. These results support mixed adaptive designs that default to subtle cues to preserve responsiveness at low alertness and introduce explicit overlays when precise metric information is needed.
| Original language | English |
|---|---|
| Title of host publication | IEEE VR Conference |
| Publisher | IEEE |
| Publication status | E-pub ahead of print - 25 Mar 2026 |
| Event | 33rd IEEE Conference on Virtual Reality and 3D User Interfaces, IEEE VR 2026 - Daegu, Republic of Korea. Duration: 21 Mar 2026 → 25 Mar 2026. https://ieeevr.org/ |
Conference
| Conference | 33rd IEEE Conference on Virtual Reality and 3D User Interfaces, IEEE VR 2026 |
|---|---|
| Country/Territory | Korea, Republic of |
| City | Daegu |
| Period | 21/03/26 → 25/03/26 |
| Internet address | https://ieeevr.org/ |
UN SDGs
This output contributes to the following UN Sustainable Development Goals (SDGs)
- SDG 9 Industry, Innovation, and Infrastructure
Keywords
- virtual reality
- depth perception
- fatigue
- sleep deprivation
- user study
- visual cues
- subtle cues
ASJC Scopus subject areas
- Human-Computer Interaction
Fields of Expertise
- Information, Communication & Computing
Fingerprint
Dive into the research topics of 'Depth Perception Cues in VR Under Sleep Deprivation'. Together they form a unique fingerprint.

Projects
- 1 Active
DDIA - Data Driven Immersive Analytics in Digital Industries
Schreck, T. (Project manager on research unit), Pock, T. (Project manager on research unit), Veas, E. E. (Attendee / Assistant), Lindstaedt, S. (Project manager on research unit), Müller-Putz, G. (Project manager on research unit), Pammer-Schindler, V. (Attendee / Assistant) & Kowald, D. (Attendee / Assistant)
1/01/22 → 31/12/26
Project: Research project