ISMAR 2018

Erwan Normand and Michael McGuffin. Enlarging a Smartphone with AR to Create a Handheld VESAD (Virtually Extended Screen-Aligned Display). In Proceedings of the IEEE International Symposium on Mixed and Augmented Reality (ISMAR 2018), to appear. 2018.

Abstract

We investigate using augmented reality to extend the screen of a smartphone beyond its physical limits with a virtual surface that is co-planar with the phone and follows it as the phone is moved. We call this extension a VESAD, or Virtually Extended Screen-Aligned Display. We illustrate and describe several ways that a VESAD could complement the physical screen of a phone, and describe two novel interaction techniques: one where the user performs a quick rotation of the phone to switch the information shown in the VESAD, and another, called "slide-and-hang", whereby the user detaches a VESAD and leaves it hanging in mid-air, using the phone to establish the initial position and orientation of the virtual window. We also report an experiment that compared three interfaces for an abstract classification task: the first using only a smartphone, the second using the phone for input but a VESAD for output, and the third where the user performed input in mid-air on the VESAD (as detected by a Leap Motion). The second interface was superior in completion time and selection count (a metric of mistakes made by users) and was also subjectively preferred over the other two. This demonstrates the added value of a VESAD for output over the phone's physical screen alone, and also shows that, in our experiment, input on the phone's screen was better than input in mid-air.
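
The attached/detached behaviour described above reduces to a small pose computation: while attached, the virtual window copies the phone's orientation and is translated within the phone's screen plane; slide-and-hang simply freezes the pose that the phone defines at the moment of release. The Python sketch below is a minimal illustration of that logic, not the authors' implementation. It assumes the tracker delivers the phone pose as a 4x4 world-space transform, and the names vesad_pose, SlideAndHang, and the offset_right_m parameter are hypothetical.

import numpy as np

def vesad_pose(phone_pose: np.ndarray, offset_right_m: float = 0.0) -> np.ndarray:
    """Pose of a virtual panel co-planar with the phone's screen.

    The panel shares the phone's orientation (so it stays screen-aligned)
    and is shifted along the phone's local right axis, e.g. to place extra
    content beside the physical display.
    """
    pose = phone_pose.copy()
    right_axis = phone_pose[:3, 0]          # phone's local +X in world space
    pose[:3, 3] += right_axis * offset_right_m
    return pose

class SlideAndHang:
    """Keep the panel attached to the phone, or leave it hanging at a fixed pose."""

    def __init__(self) -> None:
        self.hung_pose = None               # None means still attached to the phone

    def update(self, phone_pose: np.ndarray) -> np.ndarray:
        # Called every frame with the latest tracked phone pose.
        if self.hung_pose is not None:
            return self.hung_pose
        return vesad_pose(phone_pose)

    def hang(self, phone_pose: np.ndarray) -> None:
        # Freeze the window in mid-air at the pose the phone currently defines.
        self.hung_pose = vesad_pose(phone_pose)

    def reattach(self) -> None:
        self.hung_pose = None

In a render loop, update() would be called each frame with the tracked phone pose, and hang() on whatever gesture triggers slide-and-hang; how that gesture is detected is outside the scope of this sketch.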