Multimodal Assistance Systems for Child Cyclists
Child cyclists are at greater risk of traffic accidents, partly because their motor and perceptual-motor abilities are still developing. Our approach focuses on assisting children on the road with multimodal feedback: visual, auditory, and vibrotactile. We developed an indoor bicycle simulator and explored how alerts and warning signals embedded in the helmet and the bicycle can be conveyed to children in an understandable and intuitive way. To increase the ecological validity of our findings from the bicycle simulator, we also investigated the multimodal assistance on a controlled test track.
This work was part of the larger research project Safety4Bikes, which is presented in the following video:
Tangible Awareness Systems for Working Colleagues
Our metaphor-based ambient tangible artifacts, AwareCups and AwareHouse, employ the tin-can telephone and the open-door policy metaphors as the basis of their design.
AwareCups: A person writes the name of a colleague on a cup with a marker, similar to how coffee shops (e.g., Starbucks) label cups with their visitors' names. The colleague's availability is represented via an RGB LED strip positioned around the cup: red indicates that the person is busy, green that the person is free, and the off state that no availability information is known. When a person wants more sensitive information about a colleague (e.g., amount of free time and location), she brings the cup to her ear and listens to a voice recording, such as, "In the lounge, free for the next 5 minutes."
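The state-to-light mapping described above can be summarized in a short sketch. This is purely illustrative: the function names, color values, and message format are assumptions, not the actual AwareCups firmware.

```python
# Illustrative sketch of the AwareCups availability mapping (hypothetical names).

LED_COLORS = {
    "busy": (255, 0, 0),     # red: colleague is busy
    "free": (0, 255, 0),     # green: colleague is free
    "unknown": (0, 0, 0),    # off: no availability information
}

def cup_led_color(availability: str) -> tuple:
    """Return the RGB value shown on the cup's LED strip."""
    return LED_COLORS.get(availability, LED_COLORS["unknown"])

def ear_gesture_message(location: str, free_minutes: int) -> str:
    """Voice message played when the cup is brought to the ear."""
    return f"In the {location}, free for the next {free_minutes} minutes."
```

Keeping the sensitive details (location, free time) behind the ear gesture, while the LED shows only coarse availability, mirrors the tin-can telephone metaphor: private information requires a deliberate listening action.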
AwareHouse: Each door is mapped to one colleague, with a label indicating the person's name. Following the open-door policy metaphor, an open door shows that the colleague is free and a closed door that the colleague is busy. A doorbell button on the right side of the door can be pressed to receive more detailed information, and each door contains an integrated 7-segment display showing a number of minutes. When the door is closed and the button is pressed, the user hears a voice recording of the colleague's location and availability, and the display shows the minutes until the colleague is free from other commitments; for example, "In the office, free in 5 minutes." When the door is open and the button is pressed, the voice recording again conveys location and availability, and the display shows the amount of the colleague's remaining free time.
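The door logic above amounts to a small state machine, sketched below. The function names and message wording are hypothetical; the real AwareHouse is a physical artifact, and this only restates its described behavior.

```python
# Illustrative sketch of the AwareHouse door logic (hypothetical names).

def door_state(busy: bool) -> str:
    # Open-door policy metaphor: open means free, closed means busy.
    return "closed" if busy else "open"

def doorbell_pressed(busy: bool, location: str, minutes: int):
    """Return (display_minutes, voice_message) for a doorbell press.

    When the door is closed, the display counts the minutes until the
    colleague is free; when open, it shows the remaining free minutes.
    """
    if busy:
        return minutes, f"In the {location}, free in {minutes} minutes."
    return minutes, f"In the {location}, free for the next {minutes} minutes."
```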
StoryBox is a tangible device for sharing everyday life over distance: users can exchange photos, crafted objects, written messages, and audio recordings in an asynchronous manner. (Collaboration with Torben Wallbaum)
AwareKit is a tangible toolkit that supplements existing electronic calendar systems and is designed for quick, fun, and playful interaction. It integrates an attractive design and uses touch and rotation as interaction techniques to access different types of information.
CubeLendar is an interactive calendar in the form of a cube that provides an overview of calendar events, weather, time, and date. It integrates an attractive design and uses rotation as an interaction technique to access the different types of information presented on each side of the cube. CubeLendar notifies users about calendar events via light and indicates potential opportunities for spontaneous communication with remote co-workers.
Ambient Light Displays and Light-based Navigation for Car Drivers
We established light patterns and guidelines for building new ambient light systems. To decrease driver distraction when using a navigation system, we explored ambient light as an in-car navigation aid, shifting navigation cues to the periphery of the driver's attention.
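One way such a peripheral cue can work is an LED strip that lights up on the side of the upcoming turn and grows as the maneuver approaches. The sketch below is a hedged illustration of that idea only; the strip length, distance threshold, and function names are assumptions, not the patterns established in the study.

```python
# Hypothetical sketch of a peripheral light-based turn cue.

STRIP_LEN = 16  # assumed number of LEDs per side of the strip


def turn_cue(direction: str, distance_m: float, max_distance_m: float = 200.0):
    """Return (side, lit LED indices) for an upcoming turn.

    The closer the turn, the more LEDs are lit, so the cue grows
    gradually in the driver's peripheral vision.
    """
    fraction = max(0.0, min(1.0, 1.0 - distance_m / max_distance_m))
    n_lit = round(fraction * STRIP_LEN)
    return direction, list(range(n_lit))
```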
Interactive Desktop Workspaces
We presented two gaze-based interaction techniques that eliminate the need for direct touch interaction with a vertical screen; direct touch is only required on the horizontal screen. The two techniques are called ITSS and ITOS. Indirect Touch Screen Selection (ITSS) lets users select the screen they intend to interact with simply by looking at it, and absolutely maps touch input from the horizontal to the vertical screen. Indirect Touch Object Selection (ITOS) highlights the object the user is looking at and uses relative touch mapping to manipulate it.
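The difference between the two mappings can be sketched as follows. This is a minimal illustration of absolute versus relative mapping under assumed coordinate conventions; the function names and screen sizes are not from the original system.

```python
# Illustrative contrast of the ITSS and ITOS touch mappings (hypothetical names).

def itss_absolute_map(touch_xy, horiz_size, vert_size):
    """ITSS-style absolute mapping: a touch point on the horizontal screen
    is scaled to the corresponding point on the gaze-selected vertical screen."""
    tx, ty = touch_xy
    hw, hh = horiz_size
    vw, vh = vert_size
    return (tx / hw * vw, ty / hh * vh)

def itos_relative_map(object_xy, touch_delta):
    """ITOS-style relative mapping: the gazed-at (highlighted) object is
    moved by the relative displacement of the touch gesture."""
    ox, oy = object_xy
    dx, dy = touch_delta
    return (ox + dx, oy + dy)
```

The design trade-off is the usual one for indirect input: absolute mapping gives direct spatial correspondence between the two screens, while relative mapping avoids large hand movements when manipulating an already-selected object.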