we integrate simple computation into 3D printed objects’ cell structure
In this paper, we explore how to embody mechanical computation into 3D printed objects, i.e., without electronic sensors, actuators, or controllers typically used for this purpose. A key benefit of our approach is that the resulting objects can be 3D printed in one piece and thus do not require assembly.
We build on 3D printed cell structures, also known as metamaterials. We introduce a new type of cell that propagates a digital mechanical signal using an embedded bistable spring. When triggered, the embedded spring discharges and the resulting impulse triggers one or more neighboring cells, thereby propagating the signal. We extend this basic mechanism to implement simple logic functions.
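As a rough software analogy, the propagation rule can be sketched as follows. The chain topology, the discharge rule, and the AND-gate cell here are invented for illustration only; they are not the paper's actual cell geometry.

```python
# Toy model of mechanical signal propagation through bistable cells.
# Each cell is a charged (True) or discharged (False) bistable spring.

def propagate(chain, start=0):
    """Discharge cells one per step along a chain; return the firing order.

    A charged cell discharges and passes the impulse to its neighbor;
    an already-discharged cell absorbs the impulse and stops the signal.
    """
    order = []
    for i in range(start, len(chain)):
        if chain[i]:          # spring still charged?
            chain[i] = False  # discharge it
            order.append(i)
        else:
            break             # discharged cell: signal dies here
    return order

def and_gate(a, b):
    """A hypothetical logic cell that fires only if impulses arrive on both inputs."""
    return a and b
```

For example, `propagate([True, True, False, True])` fires cells 0 and 1, then stops at the discharged cell 2, mirroring how a spent spring blocks further mechanical transmission.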
So far, metamaterials were understood as materials—we want to think of them as machines.
Recently, researchers have started to engineer not only the outer shape of objects, but also their internal microstructure. Such objects, typically based on 3D cell grids, are also known as metamaterials. Metamaterials have been used, for example, to create materials with soft and hard regions.
So far, metamaterials were understood as materials—we want to think of them as machines. We demonstrate metamaterial objects that perform a mechanical function. Such metamaterial mechanisms consist of a single block of material whose cells play together in a well-defined way to achieve macroscopic movement. Our metamaterial door latch, for example, transforms the rotary movement of its handle into the linear motion of the latch.
dragging a physical tactor across the user’s skin produces a stronger tactile stimulus than vibrotactile feedback
We propose a new type of tactile display that drags a physical tactor across the skin in 2D. We call this skin drag. We demonstrate how this allows us to communicate geometric shapes or characters to users. The main benefit of our approach is that it simultaneously produces two types of stimuli, i.e., (1) it moves a tactile stimulus across skin locations and (2) it stretches the user’s skin. Skin drag thereby combines the essential stimuli produced by vibrotactile and skin stretch displays. In our study, skin drag allowed participants to recognize tactile shapes significantly better than a vibrotactile array of comparable size. We present two arm-worn prototype devices that implement our concept.
providing location awareness of multiple moving objects in a detail view on large displays
Overview+Detail interfaces can be used to examine the details of complex data while retaining the data’s overall context. Dynamic data introduce challenges for these interfaces, however, as moving objects may exit the detail view, as well as a person’s field of view if they are working at a large interactive surface. To address this “off-view” problem, we propose a new information visualization technique, called Canyon. This technique attaches a small view of an off-view object, including some surrounding context, to the external boundary of the detail view. The area between the detail view and the region containing the off-view object is virtually “folded” to conserve space.
an interactive movie
Last Call is an interactive theatrical film in which the viewer can communicate with the protagonist. One viewer in the audience is randomly picked and called by the protagonist, who is seeking help. The viewer hears the protagonist’s breathing and questions through the phone, as well as on screen.
I developed the software that manages the requests to and responses from the external voice recognition software and plays the corresponding movie parts. This project was realized during my employment at Powerflasher (now Interactive Pioneers) in 2009. It won several awards, including honors from the Cannes Lions Festival, ADC Germany, and the New York Festivals.
a MIDI controller
KontrollWerk is a multitouch MIDI controller that lets DJs and VJs create their very own layout, arranging all the controls they need wherever they want on the surface. We designed this MIDI controller with live performances in mind: by projecting the DJ’s actions live during the performance, the audience can truly become part of it.
It was a team project with David Lindlbauer and Stefan Wasserbauer, developed as a term project at the University of Applied Sciences in Hagenberg, Austria, in 2008/09.