So far, metamaterials were understood as materials—we want to think of them as machines.
Recently, researchers started to engineer not only the outer shape of objects, but also their internal microstructure. Such objects, typically based on 3D cell grids, are also known as metamaterials. Metamaterials have been used, for example, to create materials with soft and hard regions.
So far, metamaterials were understood as materials—we want to think of them as machines. We demonstrate metamaterial objects that perform a mechanical function. Such metamaterial mechanisms consist of a single block of material the cells of which play together in a well-defined way in order to achieve macroscopic movement. Our metamaterial door latch, for example, transforms the rotary movement of its handle into a linear motion of the latch.
dragging a physical tactor across the user’s skin produces a stronger tactile stimulus than vibrotactile
We propose a new type of tactile display that drags a physical tactor across the skin in 2D. We call this skin drag. We demonstrate how this allows us to communicate geometric shapes or characters to users. The main benefit of our approach is that it simultaneously produces two types of stimuli, i.e., (1) it moves a tactile stimulus across skin locations and (2) it stretches the user’s skin. Skin drag thereby combines the essential stimuli produced by vibrotactile and skin stretch. In our study, skin drag allowed participants to recognize tactile shapes significantly better than a vibrotactile array of comparable size. We present two arm-worn prototype devices that implement our concept.
providing location awareness of multiple moving objects in a detail view on large displays
Overview+Detail interfaces can be used to examine the details of complex data while retaining the data’s overall context. Dynamic data introduce challenges for these interfaces, however, as moving objects may exit the detail view, as well as a person’s field of view if they are working at a large interactive surface. To address this “off-view” problem, we propose a new information visualization technique, called Canyon. This technique attaches a small view of an off-view object, including some surrounding context, to the external boundary of the detail view. The area between the detail view and the region containing the off-view object is virtually “folded” to conserve space.
a study of human preferences in usage of gesture types for HCI
David Lindlbauer and I conducted the qualitative study reported in this paper. We investigated which hand gestures people would use for specific actions. These included selecting, moving, removing, and identifying objects. Tasks were done in pairs, where one participant gave instructions solely through gestures while the other participant executed them. The participants were located in separate rooms and communicated via video cameras. The gestures were analyzed and categorized.