Navigation beyond Wayfinding: Robots Collaborating with Visually Impaired Users for Environmental Interactions
Abstract: Robotic guidance systems have shown promise in supporting blind and visually impaired (BVI) individuals with wayfinding and obstacle avoidance. However, most existing systems assume a clear path and do not support a critical aspect of navigation—environmental interactions that require manipulating objects to enable movement. These interactions are challenging for a human–robot pair because they demand (i) precise localization and manipulation of interaction targets (e.g., pressing elevator buttons) and (ii) dynamic coordination between the user’s and robot’s movements (e.g., pulling out a chair to sit). We present a collaborative human–robot approach that combines our robotic guide dog’s precise sensing and localization capabilities with the user’s ability to perform physical manipulation. The system alternates between two modes: lead mode, where the robot detects and guides the user to the target, and adaptation mode, where the robot adjusts its motion as the user interacts with the environment (e.g., opening a door). Evaluation results show that our system enables navigation that is safer, smoother, and more efficient than both a traditional white cane and a non-adaptive guiding system, with the performance gap widening as tasks demand higher precision in locating interaction targets. These findings highlight the promise of human–robot collaboration in advancing assistive technologies toward more generalizable and realistic navigation support.
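The alternation between the two modes can be pictured as a small state machine: the robot leads until the user is within reach of the target, yields while the user manipulates the environment, and resumes leading afterward. Below is a minimal Python sketch of that loop; all names (Mode, guidance_step, the boolean perception inputs) are hypothetical illustrations, not the authors' implementation, and the actual sensing and motion control are omitted.

```python
from enum import Enum, auto

class Mode(Enum):
    LEAD = auto()   # robot localizes the target and guides the user to it
    ADAPT = auto()  # robot yields initiative, adapting to the user's motion

def guidance_step(mode, user_within_reach, interaction_complete):
    """One tick of a hypothetical mode-switching loop.

    user_within_reach / interaction_complete stand in for the robot's
    perception of the interaction state (sensing details omitted).
    """
    if mode is Mode.LEAD and user_within_reach:
        # Hand over: the user performs the physical manipulation
        return Mode.ADAPT
    if mode is Mode.ADAPT and interaction_complete:
        # Resume guiding once the interaction (e.g., opening a door) is done
        return Mode.LEAD
    return mode

# Example: approach a door, the user opens it, then navigation continues
mode = Mode.LEAD
for reach, done in [(False, False), (True, False), (True, False), (True, True)]:
    mode = guidance_step(mode, reach, done)
    print(mode)
```

The point of the sketch is the handover: the robot never attempts the manipulation itself, and the user never has to locate the target unaided.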
For more details, see the PDF.
Team:
- Shaojun Cai
- Nuwan Janaka
- Ashwin Ram
- Janidu Shehan
- Yingjia Wan
- Kotaro Hara
- David Hsu