PoVRPoint: Authoring presentations in mobile virtual reality

Verena Biener, Travis Gesslein, Daniel Schneider, Felix Kawala, Alexander Otte, Per Ola Kristensson, Michel Pahud, Eyal Ofek, Cuauhtli Campos, Matjaž Kljun

Research output: Contribution to journal › Article › peer-review

Abstract

Virtual Reality (VR) has the potential to support mobile knowledge workers by complementing traditional input devices with a large three-dimensional output space and spatial input. Previous research on supporting VR knowledge work explored domains such as text entry using physical keyboards and spreadsheet interaction using combined pen and touch input. Inspired by such work, this paper probes the VR design space for authoring presentations in mobile settings. We propose PoVRPoint, a set of tools coupling pen- and touch-based editing of presentations on mobile devices, such as tablets, with the interaction capabilities afforded by VR. We study the utility of extended display space to, for example, assist users in identifying target slides, support spatial manipulation of objects on a slide, create animations, and facilitate arrangements of multiple, possibly occluded, shapes or objects. Among other things, our results indicate that 1) the wide field of view afforded by VR results in significantly faster target slide identification times compared to a tablet-only interface for visually salient targets; and 2) the three-dimensional view in VR enables significantly faster object reordering in the presence of occlusion compared to two baseline interfaces. A user study further confirmed that the interaction techniques were found to be usable and enjoyable.
Original language: English
Pages (from-to): 2069-2079
Number of pages: 11
Journal: IEEE Transactions on Visualization and Computer Graphics
Volume: 28
Issue number: 5
Early online date: 15 Feb 2022
DOIs
Publication status: Published - May 2022

Keywords

  • Virtual Reality
  • Presentation Authoring
  • Mobile Knowledge Work
  • Pen and Touch Interaction
