2025-10-13 University of Michigan

Yuxuan Liu (left, with headset) and Chen Liang (right), both doctoral students in computer science and engineering, demonstrate how HandProxy follows voice commands inside a demo app. Image credit: Marcin Szczepanski, Michigan Engineering
<Related Information>
- https://news.umich.edu/this-digital-hand-enables-hands-free-virtual-reality/
- https://dl.acm.org/doi/10.1145/3749484
HandProxy: Expanding the Affordances of Speech Interfaces in Immersive Environments with a Virtual Proxy Hand
Chen Liang, Yuxuan Liu, Martez Mott, Anhong Guo
Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies
Published: 03 September 2025
DOI: https://doi.org/10.1145/3749484
Abstract
Hand interactions are increasingly used as the primary input modality in immersive environments, but they are not always feasible due to situational impairments, motor limitations, and environmental constraints. Speech interfaces have been explored as an alternative to hand input in research and commercial solutions, but are limited to initiating basic hand gestures and system controls. We introduce HandProxy, a system that expands the affordances of speech interfaces to support expressive hand interactions. Instead of relying on predefined speech commands directly mapped to possible interactions, HandProxy enables users to control the movement of a virtual hand as an interaction proxy, allowing them to describe the intended interactions naturally while the system translates speech into a sequence of hand controls for real-time execution. A user study with 20 participants demonstrated that HandProxy effectively enabled diverse hand interactions in virtual environments, achieving a 100% task completion rate with an average of 1.09 attempts per speech command and 91.8% command execution accuracy, while supporting flexible, natural speech input with varying levels of control and granularity.
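
As a rough illustration of the pipeline the abstract describes (speech in, a sequence of hand controls out, executed in order by a proxy hand), here is a minimal Python sketch. All names in it (HandControl, parse_command, ProxyHand), the rule-based parsing, and the example coordinates are assumptions made for illustration only, not HandProxy's actual implementation, which the paper describes in full.

    from dataclasses import dataclass
    from typing import List, Optional, Tuple


    @dataclass
    class HandControl:
        """One low-level step for the proxy hand (e.g. move, pinch, release)."""
        action: str
        target: Optional[Tuple[float, float, float]] = None  # world position, if any


    def parse_command(utterance: str) -> List[HandControl]:
        """Toy rule-based stand-in for the translation stage: map a natural
        phrase to a control sequence. The real system handles far richer,
        free-form speech; this only shows the shape of the pipeline."""
        utterance = utterance.lower()
        if "grab" in utterance and "cube" in utterance:
            return [
                HandControl("move", target=(0.3, 1.1, -0.5)),  # assumed object position
                HandControl("pinch"),
            ]
        if "release" in utterance or "let go" in utterance:
            return [HandControl("release")]
        return []  # unrecognized command: no controls issued


    class ProxyHand:
        """Executes control steps in order, acting as the user's stand-in hand."""

        def execute(self, controls: List[HandControl]) -> None:
            for step in controls:
                if step.action == "move":
                    print(f"hand -> move to {step.target}")
                else:
                    print(f"hand -> {step.action}")


    if __name__ == "__main__":
        hand = ProxyHand()
        # Speech recognition is simulated with plain strings here.
        hand.execute(parse_command("Grab the cube"))
        hand.execute(parse_command("Now let go"))

Separating interpretation from execution mirrors the abstract's framing: the spoken description is first translated into a sequence of hand controls, which the virtual proxy hand then plays out in real time.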