Embedded Data Representations

People

Wesley Willett, Yvonne Jansen, Pierre Dragicevic

Abstract

We introduce embedded data representations, the use of visual and physical representations of data that are deeply integrated with the physical spaces, objects, and entities to which the data refers. Technologies like lightweight wireless displays, mixed reality hardware, and autonomous vehicles are making it increasingly easy to display data in-context. While researchers and artists have already begun to create embedded data representations, the benefits, trade-offs, and even the language necessary to describe and compare these approaches remain unexplored. In this paper, we formalize the notion of physical data referents – the real-world entities and spaces to which data corresponds – and examine the relationship between referents and the visual and physical representations of their data. We differentiate situated representations, which display data in proximity to data referents, and embedded representations, which display data so that it spatially coincides with data referents. Drawing on examples from visualization, ubiquitous computing, and art, we explore the role of spatial indirection, scale, and interaction for embedded representations. We also examine the trade-offs between non-situated, situated, and embedded data displays, including both visualizations and physicalizations. Based on our observations, we identify a variety of design challenges for embedded data representations, and suggest opportunities for future research and applications.

Presentation

Watch the presentation given by Wesley Willett during the InfoVis 2016 conference:

InfoVis 2016: [TVCG] Embedded Data Representations from VGTCommunity on Vimeo.

Figures

Situated representations extend the traditional visualization pipeline to the physical world. Raw data and data presentations are both linked to a physical referent.

Embedded representations are composed of multiple physical referents and physical presentations that can be interpreted together.

Set relationships between the different representation types.

Future scenario: Displays embedded in garments or attached to the skin could visualize medical or fitness data in-context.

Future scenario: Embedded visualization tools could be particularly useful for tasks like search and rescue that take place in large outdoor environments.

Future scenario: Plates, cutlery, or serving ware with embedded displays could visualize nutrition information about a meal in real time.

Read the Paper