Hybrid and Coordinated 3D Interaction in Immersive Virtual Environments


Through immersive stereoscopic displays and natural user interfaces, virtual reality (VR) can offer the user a sense of presence in the virtual space, and has long been expected to revolutionize how people interact with virtual content across application scenarios. However, although many technical challenges have been solved over the last three decades to bring low cost and high fidelity to VR experiences, VR technology is still rarely used in many seemingly suitable applications. Part of the reason is the lack of expressiveness and efficiency of traditional “simple and reality-based” 3D user interfaces (3DUIs). The challenge is especially apparent in complex interaction tasks with diverse requirements, such as editing virtual objects across multiple scales, angles, perspectives, reference frames, and dimensions. A common way to overcome such problems is through hybrid user interface (HUI) systems that combine complementary interface elements to leverage their strengths.

Based on this approach, the first contribution of this dissertation is Force Extension, an interaction technique that seamlessly integrates position-controlled touch and rate-controlled force input for efficient multi-touch interaction in virtual environments. Using carefully designed mapping functions, it offers fluid transitions between the two contexts and realistically simulates shear force input for multi-touch gestures.

The second contribution extends the HUI concept into immersive VR with a Hybrid Virtual Environment (HVE) level-editing system that combines a tablet and a head-mounted display (HMD). The HVE system improves user performance and experience in complex high-level world-editing tasks by using a “World-In-Miniature” and a 2D GUI rendered on a multi-touch tablet to compensate for the interaction limitations of a traditional HMD- and wand-based VR system.
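To illustrate the general idea of blending position- and rate-controlled input, the following minimal sketch maps finger displacement directly to motion while mapping force beyond a threshold to additional velocity. The function name, threshold, and gain here are purely illustrative assumptions, not the dissertation's actual mapping functions.

```python
# Hypothetical sketch of a hybrid position/rate mapping in the spirit of
# Force Extension; all names and constants are illustrative.

FORCE_THRESHOLD = 2.0   # N: below this, pure position control applies
RATE_GAIN = 0.05        # m/s of extra motion per N of excess force

def hybrid_displacement(touch_delta, force, dt):
    """Combine position-controlled touch motion with
    rate-controlled force input for one frame of duration dt."""
    # Position control: finger displacement maps 1:1 to object motion.
    displacement = touch_delta
    # Rate control: force beyond the threshold contributes velocity-based
    # motion, extending the gesture's reach without clutching.
    excess = max(0.0, force - FORCE_THRESHOLD)
    displacement += RATE_GAIN * excess * dt
    return displacement
```

Below the force threshold the mapping behaves like ordinary touch dragging; pressing harder smoothly adds a velocity term, which is one simple way to realize a continuous transition between the two control contexts.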
The concept of Interaction Context (IC) is introduced to explain the relationship between tablet interaction and immersive interaction, and four coordination mechanisms are proposed to keep the perceptual, functional, and cognitive flow continuous during IC transitions.

To offer intuitive and realistic interaction experiences, most immersive 3DUIs are centered on the user's virtual avatar and obey the physics rules of the real world. However, this design paradigm also imposes unnecessary limitations that hinder certain tasks, such as selecting objects in cluttered space, manipulating objects in six degrees of freedom, and inspecting remote spaces. The third contribution of this dissertation is the Object Impersonation technique, which breaks the common assumption that one can be immersed in the VE only through a single avatar, and instead allows the user to impersonate objects in the VE and interact from their perspectives and reference frames. This hybrid of avatar- and object-based interaction blurs the line between travel and object selection, creating a unique cross-task interaction experience in the immersive environment.

Many traditional 3DUIs in immersive VR use simple and intuitive interaction paradigms derived from real-world metaphors, but these can be just as limiting and ineffective as in the real world. Using the coordinated HUI or HVE systems presented in this dissertation, one can benefit from the complementary advantages of multiple heterogeneous interfaces (Force Extension), VE representations (HVE Level Editor), and interaction techniques (Object Impersonation). This advances traditional 3D interaction into a more powerful hybrid space, allowing future VR systems to serve more application scenarios and provide not only presence but also improved productivity in people's everyday tasks.
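At its core, interacting from an object's perspective amounts to re-expressing the scene in that object's reference frame. The sketch below shows this for a simplified planar case, using only a position and a yaw angle; the function and parameter names are hypothetical and do not reflect the dissertation's implementation.

```python
import math

def world_to_object(point, obj_pos, obj_yaw):
    """Express a world-space 2D point in an impersonated object's frame.

    point, obj_pos: (x, z) tuples in world coordinates.
    obj_yaw: object's heading in radians.
    """
    # Translate so the object sits at the origin of the new frame.
    dx = point[0] - obj_pos[0]
    dz = point[1] - obj_pos[1]
    # Rotate by the inverse of the object's heading.
    c, s = math.cos(-obj_yaw), math.sin(-obj_yaw)
    return (c * dx - s * dz, s * dx + c * dz)
```

Once inputs and views are expressed in the impersonated object's frame, selection and manipulation can proceed exactly as they would from the avatar's own frame, which is what lets a single mechanism span both travel and object selection.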

Language
  • English
Identifier
  • etd-042915-005402
Defense date
  • 2015
Date created
  • 2015-04-29