
Abstract

We integrated touch menus into a cohesive smartphone-based VR controller. Smartphone touch surfaces offer new interaction styles and also aid VR interaction when tracking is absent or imprecise, or when users have limited arm mobility or experience fatigue. In Handymenu, the touch surface is split into two areas: one for menu interaction and the other for spatial interactions such as VR object selection, manipulation, navigation, or parameter adjustment. Users in our studies transitioned between the two areas and performed nested, repeated selections. A formal experiment compared VR object selection methods (ray vs. touch), menu selection methods (ray vs. touch), menu layouts (pie vs. grid), and, in some conditions, touch and visual feedback sizes (two levels each).
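
The split-surface idea described above can be pictured roughly as follows. This is a minimal illustrative sketch, not the authors' implementation: the normalized coordinates, region names, and the 0.35 menu fraction are assumptions introduced only for the example.

```kotlin
// Illustrative sketch: route a touch on the phone surface to either the
// menu region or the spatial-interaction region, mirroring the two-area
// split described in the abstract. All names and thresholds are assumed.

data class TouchPoint(val x: Float, val y: Float)   // normalized 0..1 coordinates

enum class Region { MENU, SPATIAL }

// Assumption: the lower portion of the surface is reserved for the menu,
// the rest for spatial interaction (selection, manipulation, navigation).
fun classify(touch: TouchPoint, menuFraction: Float = 0.35f): Region =
    if (touch.y > 1f - menuFraction) Region.MENU else Region.SPATIAL

fun handleTouch(touch: TouchPoint) {
    when (classify(touch)) {
        Region.MENU -> println("menu interaction at (${touch.x}, ${touch.y})")
        Region.SPATIAL -> println("spatial interaction at (${touch.x}, ${touch.y})")
    }
}

fun main() {
    handleTouch(TouchPoint(0.5f, 0.9f))  // falls in the menu region
    handleTouch(TouchPoint(0.2f, 0.3f))  // falls in the spatial region
}
```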
