Technologies to enable multimodal interaction are now sufficiently mature that research is turning away from pure technology development and looking towards interaction and design issues. Robust solutions exist to display audio and haptic feedback in many forms – for instance as speech and non-speech sounds and through tactile and force feedback sensations. Furthermore, it has been demonstrated that the novel interactions supported by these modalities can confer benefits for all users. However, many questions remain: how can we design effective haptic, audio and multimodal interfaces? In what new application areas can we apply these techniques? Are there design methods that are particularly useful, or evaluation techniques that are especially appropriate?
While multimodal interfaces are attracting more and more attention, there is relatively little work on how the haptic and auditory modalities can be efficiently and effectively combined in an interface. Is there information which is better communicated using one modality rather than another? How can we link haptic and auditory displays so that changes in one modality are reflected in the other? Can we create complementary relationships between the information displayed to each sense? Additionally, how should we interact with these new displays and interfaces? Is a direct manipulation interaction style still appropriate? It works well with a force feedback device, but may not suit all types of displays. How should we interact with a tactile display, or manipulate a sonified graph?
Whilst audio and haptic interaction have been shown to be useful tools, neither sense has the bandwidth of the visual modality. Careful, considered and informed interaction design will play a vital role if multimodal systems are to move beyond the lab and into the real world. This workshop seeks novel research addressing this human-centric challenge.
More information at: www.haid2007.org/index.shtml