Lost in Style: Gaze-driven Adaptive Aid for VR Navigation (CHI 2019)

A key challenge for virtual reality level designers is striking a balance between maintaining the immersiveness of the virtual experience and providing users with an on-screen navigation aid. Aids like mini-maps can obstruct users’ vision and cause frustration, and aids like arrows can be distracting; yet such aids are often necessary for wayfinding in virtual environments with complex paths. We present an adaptive navigation aid for VR. We capture the user’s gaze using the VR headset, and our adaptive aid uses their gaze patterns to predict when they need navigation help, displaying an aid accordingly. Here, the user feels the need for navigation help once she sees a barrier blocking her path. Our classifier detects this need from her gaze patterns and displays an arrow that guides her to her destination. Compared with a permanent arrow, which drew more gaze points into the arrow’s region, our adaptive arrow kept more of users’ gaze on the scene itself. To validate our method we designed a new scene, in which we predicted users’ need for a navigation aid with 80% accuracy.
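The classifier itself is not detailed in this summary. Below is a minimal sketch of how such a gaze-driven trigger could work, assuming a sliding window of 2D gaze coordinates from the headset’s eye tracker, a hand-picked feature set (gaze speed, spatial dispersion, fixation ratio), and a random forest classifier; the features, the classifier choice, and the show_arrow callback are all illustrative assumptions, not the paper’s actual design.

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    def gaze_features(window):
        """Summarize a window of gaze samples (N x 2 array of x, y
        coordinates) into features for a 'needs navigation aid'
        classifier. Feature set is illustrative only."""
        deltas = np.diff(window, axis=0)
        speeds = np.linalg.norm(deltas, axis=1)
        return np.array([
            speeds.mean(),               # average gaze speed (saccade-like motion)
            speeds.std(),                # variability of gaze speed
            window.std(axis=0).mean(),   # spatial dispersion (scanning vs. fixating)
            (speeds < 0.01).mean(),      # fraction of near-stationary samples (fixations)
        ])

    # Hypothetical training data: gaze windows labeled 1 when the user
    # needed help, 0 otherwise. Random placeholders stand in here so the
    # sketch runs end to end; a real system would use recorded sessions.
    rng = np.random.default_rng(0)
    X = np.stack([gaze_features(rng.random((60, 2))) for _ in range(200)])
    y = rng.integers(0, 2, size=200)

    clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

    def update_navigation_aid(gaze_window, show_arrow):
        """Called once per window at runtime: show the arrow only
        while the classifier predicts the user is lost."""
        needs_aid = clf.predict(gaze_features(gaze_window)[None, :])[0] == 1
        show_arrow(needs_aid)  # hypothetical hook into the VR scene

In practice the model would be trained on real gaze windows labeled by whether the user actually needed help (for example, from self-reports or task failures); the random data above is only a placeholder to keep the sketch self-contained.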
