
Depth Perception Cues in VR Under Sleep Deprivation

  • Ammaar Zaman
  • James Baumeister
  • Ernst Kruijff
  • Eduardo Enrique Veas
  • Aleksandra Krajnc
  • Neven El Sayed

Research output: Chapter in Book/Report/Conference proceeding › Conference paper › peer-review

Abstract

As virtual reality (VR) headsets become more comfortable and accessible, their growing use in high-stakes, time-critical settings raises concerns about fatigue. Fatigue impairs perceptual and cognitive functioning. It reduces oculomotor accuracy and visual focus, and can lead to an early decline in depth estimation performance. By augmenting the visual presentation with depth information cues, adaptive designs can reduce fatigue-related depth perception errors, enhancing safety and task effectiveness. Explicit cues present depth information directly through text or color, while subtle cues adjust scene properties, such as depth-dependent blur, to convey depth information implicitly without drawing overt attention. We examined how fatigue interacts with different cues in a 27-hour within-subject protocol. Across six overnight sessions (20:00–07:00), twenty-three participants completed a VR depth perception task at varying fatigue levels and under four cue conditions: no cue (baseline), text, color, and blur. Over the night, vigilance declined, sleepiness and mental effort increased, and simulator sickness rose before stabilizing, independent of cue condition. All augmented cues reduced depth estimation error relative to baseline. Text yielded the largest and most consistent accuracy gains, especially for farther targets and later sessions. At peak fatigue, response probability dipped for text but remained stable for blur and color, indicating an accuracy versus responsiveness trade-off. These results support mixed adaptive designs that default to subtle cues to preserve responsiveness at low alertness and introduce explicit overlays when precise metric information is needed.
Original language: English
Title of host publication: IEEE VR Conference
Publisher: IEEE
Publication status: E-pub ahead of print - 25 Mar 2026
Event: 33rd IEEE Conference on Virtual Reality and 3D User Interfaces, IEEE VR 2026 - Daegu, Korea, Republic of
Duration: 21 Mar 2026 – 25 Mar 2026
https://ieeevr.org/

Conference

Conference: 33rd IEEE Conference on Virtual Reality and 3D User Interfaces, IEEE VR 2026
Country/Territory: Korea, Republic of
City: Daegu
Period: 21/03/26 – 25/03/26
Internet address: https://ieeevr.org/

UN SDGs

This output contributes to the following UN Sustainable Development Goals (SDGs)

  1. SDG 9 - Industry, Innovation, and Infrastructure

Keywords

  • virtual reality
  • depth perception
  • fatigue
  • sleep deprivation
  • user study
  • visual cues
  • subtle cues

ASJC Scopus subject areas

  • Human-Computer Interaction

Fields of Expertise

  • Information, Communication & Computing
