Prototyping reality

One of our ongoing goals at the lab is to understand how best to take advantage of Augmented Reality (AR) to annotate physical objects with digital media. Unfortunately, the objects we tend to focus on (such as multi-function devices or printers) are often large and relatively immobile, making it difficult for us to visit remote sites to demonstrate our technologies.

To address this problem, we are experimenting with paper-based models of the physical objects we want to augment. These models are far more lightweight and portable, while still approximating the embodied experience of a 3D device (see Figure 1). To register a paper-based model with AR tracking systems, we can either scan the entire object or, if it corresponds to a cube or rectangular box, register each side as an independent image (these images may in fact be the same registration images used on the actual device). In either case, the paper-based object is mobile and easily reconfigurable, giving us much more flexibility in how, when, and where we present AR content (Figure 2).

Figure 1. Our printer paper prototype.
Figure 2. Viewing digital content affixed to the paper printer prototype with a mobile AR tool.
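To make the per-face registration concrete, here is a minimal sketch using ARKit's image detection. This is our illustration rather than a description of any particular toolkit; the asset group name ("PaperPrinter") and the overlay content are hypothetical placeholders.

```swift
import UIKit
import ARKit
import SceneKit

// A minimal sketch: each face of the paper box is registered as an
// independent reference image, and digital content is attached when
// a face is detected.
class PaperPrototypeViewController: UIViewController, ARSCNViewDelegate {
    @IBOutlet var sceneView: ARSCNView!

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)

        let configuration = ARWorldTrackingConfiguration()
        // Each side of the box lives in an asset catalog group named
        // "PaperPrinter" (a placeholder name for this sketch).
        if let faces = ARReferenceImage.referenceImages(
                inGroupNamed: "PaperPrinter", bundle: nil) {
            configuration.detectionImages = faces
            configuration.maximumNumberOfTrackedImages = 2
        }
        sceneView.delegate = self
        sceneView.session.run(configuration)
    }

    // Called when ARKit recognizes one of the registered faces.
    func renderer(_ renderer: SCNSceneRenderer,
                  didAdd node: SCNNode, for anchor: ARAnchor) {
        guard let imageAnchor = anchor as? ARImageAnchor else { return }

        // Overlay a semi-transparent plane matching the detected face;
        // AR annotations can then be positioned relative to it.
        let size = imageAnchor.referenceImage.physicalSize
        let plane = SCNPlane(width: size.width, height: size.height)
        plane.firstMaterial?.diffuse.contents =
            UIColor.cyan.withAlphaComponent(0.4)

        let planeNode = SCNNode(geometry: plane)
        planeNode.eulerAngles.x = -.pi / 2  // lie flat against the face
        node.addChildNode(planeNode)
    }
}
```

Scanning the entire paper object is also possible on this platform (via ARKit's object-scanning support and ARReferenceObject), at the cost of a more involved capture step.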

This approach is something of an inversion of typical paper prototyping, in which the user-interface elements are prototyped rather than the physical object against which they are registered (for most 2D interfaces, no such physical object exists). Marc Rettig introduced lo-fi prototyping with paper UI elements in his influential paper Prototyping for Tiny Fingers, and the method was rapidly adopted throughout the user experience community. Recently, researchers have extended it to AR scenarios as well.

PapAR was one of the first tools to adapt paper prototyping techniques to AR head-mounted displays. Its design is straightforward: a base layer with real-world elements drawn on paper, as in a typical paper prototype, plus a transparent overlay onto which AR interactors are drawn. This is a simple and elegant “glass pane” approach that will be familiar to user experience professionals.

Figure 3. In PapAR, authors move a transparent AR overlay over a sketched real-world scene.

Michael Nebeling’s work at the University of Michigan School of Information pushes this concept further. Inspired by issues with an earlier AR creation toolkit (the influential DART system), Nebeling et al. first built ProtoAR, which allows AR designers to integrate 2D paper sketches as well as 3D Play-Doh mockups into a prototype AR scene. The toolkit includes a desktop app and a mobile app that creators can use to scan physical objects, integrate them into an AR scene, and link them to real-world markers.

The researchers later extended this toolkit to allow authors to adjust the representation of AR content live, facilitating Wizard-of-Oz style user testing (see their CHI presentation on this work).

Closer to our approach are tools that augment paper prototypes with digital resources to experiment with AR content. For example, the ARcadia system supports authoring AR-based tangible computing interfaces. In this system, content creators attach markers to paper prototypes, then use a desktop tool to augment the prototypes with digital content.
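ARcadia itself is browser-based, but the underlying pattern of linking each marker to a piece of digital content can be sketched briefly. The following continues the ARKit framing from above; the marker names and overlay shapes are our own hypothetical examples, not ARcadia's actual design.

```swift
import UIKit
import ARKit
import SceneKit

// A sketch of mapping physical markers to digital overlays, in the
// spirit of ARcadia. Marker names ("power_button", "status_panel")
// are hypothetical placeholders.
func overlay(forMarkerNamed name: String) -> SCNNode? {
    let node = SCNNode()
    switch name {
    case "power_button":
        // A small colored disc standing in for an interactive control.
        let disc = SCNCylinder(radius: 0.01, height: 0.002)
        disc.firstMaterial?.diffuse.contents = UIColor.red
        node.geometry = disc
    case "status_panel":
        // A translucent panel standing in for a status display.
        let panel = SCNPlane(width: 0.06, height: 0.03)
        panel.firstMaterial?.diffuse.contents =
            UIColor.green.withAlphaComponent(0.5)
        node.geometry = panel
    default:
        return nil
    }
    return node
}

// Inside an ARSCNViewDelegate, the overlay is attached as soon as
// its marker is recognized:
func renderer(_ renderer: SCNSceneRenderer,
              didAdd node: SCNNode, for anchor: ARAnchor) {
    guard let imageAnchor = anchor as? ARImageAnchor,
          let name = imageAnchor.referenceImage.name,
          let content = overlay(forMarkerNamed: name) else { return }
    node.addChildNode(content)
}
```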

We have a long tradition of using and extending lightweight prototyping methods at FXPAL. In light of recent events, we expect to focus future work on extending lightweight AR prototyping tools to support remote experimentation and design iteration.
