Abstract
Virtual Reality (VR) has the potential to support mobile knowledge workers by complementing traditional input devices with a large three-dimensional output space and spatial input. Previous research on supporting VR knowledge work explored domains such as text entry using physical keyboards and spreadsheet interaction using combined pen and touch input. Inspired by such work, this paper probes the VR design space for authoring presentations in mobile settings. We propose PoVRPoint, a set of tools coupling pen- and touch-based editing of presentations on mobile devices, such as tablets, with the interaction capabilities afforded by VR. We study the utility of the extended display space to, for example, assist users in identifying target slides, support spatial manipulation of objects on a slide, create animations, and facilitate arrangements of multiple, possibly occluded shapes or objects. Among other things, our results indicate that 1) the wide field of view afforded by VR results in significantly faster target slide identification times compared to a tablet-only interface for visually salient targets; and 2) the three-dimensional view in VR enables significantly faster object reordering in the presence of occlusion compared to two baseline interfaces. A user study further confirmed that the interaction techniques were usable and enjoyable.
| Original language | English |
|---|---|
| Pages (from-to) | 2069-2079 |
| Number of pages | 11 |
| Journal | IEEE transactions on visualization and computer graphics |
| Volume | 28 |
| Issue number | 5 |
| Early online date | 15 Feb 2022 |
| DOIs | |
| Publication status | Published - May 2022 |
Keywords
- Virtual Reality
- Presentation Authoring
- Mobile Knowledge Work
- Pen and Touch Interaction