HCI Approaches

The goal of our group is to understand how people perceive and interact with information and technologies, and how to augment technologies to increase users’ expertise and support their cognition. A central characteristic of our group is that we use and combine multiple approaches to address HCI challenges in a wide range of applications.

Approach

Empirical Studies serve to understand human perception and behavior or to validate hypotheses, models, or prototypes. We conduct both qualitative and quantitative experiments, in controlled environments as well as in the field.

Models serve to synthesize complex phenomena into theoretical foundations that can then guide the design of interactive systems. We consider both descriptive models (taxonomies) and predictive models (behavioral and cognitive models).
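To give a concrete, if simplified, sense of what a predictive model looks like in practice, the sketch below applies Fitts’ law, a classic behavioral model of pointing, to estimate how long a user needs to acquire a target of a given width at a given distance. This is a minimal illustration rather than one of our project models: the function name is ours, and the constants a and b are placeholder values that would normally be fitted from empirical study data.

```python
import math

def fitts_movement_time(distance: float, width: float,
                        a: float = 100.0, b: float = 150.0) -> float:
    """Predict pointing time in milliseconds using Fitts' law
    (Shannon formulation). The intercept a and slope b are
    device- and user-specific constants normally fitted from
    empirical study data; the defaults here are placeholders."""
    index_of_difficulty = math.log2(distance / width + 1)
    return a + b * index_of_difficulty

# A wider target at the same distance is predicted to be faster to acquire.
print(fitts_movement_time(distance=600, width=20))  # small, distant target
print(fitts_movement_time(distance=600, width=80))  # larger target
```

Such predictions let designers compare alternative layouts (for example, menu-item sizes and positions) before running a study, which the study can then validate or refine.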

Interaction Design serves to explore the scientific design space of interaction and visualization techniques. We design, implement, and evaluate both hardware and software solutions for goals such as improving performance, facilitating the transition from novice to expert behavior, or leveraging users’ cognition.

Engineering. Our experience has shown the need to develop tools that help both HCI researchers and designers create and study interaction techniques.

Applications

As illustrated by our previous research projects presented below, we consider a wide range of applications using existing, emerging, and future technologies for desktop as well as beyond-desktop interaction, featuring mobile devices, wearable devices, tangible interaction, augmented reality (AR), virtual reality (VR), large displays, or shape-changing interfaces.

Research Projects

VersaPen

An Adaptable, Modular and Multimodal I/O Pen

Embedded Vis

Representations of data deeply integrated with the physical environment

TouchToken

Combining tangible interaction and multi-touch input

MapSense

Multi-sensory technologies for children living with visual impairments

iSkin

A novel class of skin-worn sensors for touch input on the body

Menu Modeling

Understanding and predicting how users interact with menus

ShoeSense

A new perspective on hand gestures and wearable computing