Exploring True Multi-User Multimodal Interaction over a Digital Table

Abstract
True multi-user, multimodal interaction over a digital table lets co-located people simultaneously gesture and speak commands to control an application. We explore this design space through a case study in which we implemented an application that supports the KJ creativity method as used by industrial designers. Four key design issues emerged that have a significant impact on how people would use such a multi-user, multimodal system. First, parallel work is affected by the design of multimodal commands. Second, individual mode switches can be confusing to collaborators, especially if speech commands are used. Third, establishing personal and group territories can hinder particular tasks that require artefact neutrality. Finally, timing needs to be considered when designing joint multimodal commands. We also describe our model-view-controller architecture for true multi-user, multimodal interaction.
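The model-view-controller architecture mentioned above can be sketched in miniature: a shared model of the table's contents, a view that reflects model changes, and a controller that routes per-user speech and gesture events. This is a hypothetical illustration, not the paper's implementation; all names (`InputEvent`, `TableModel`, `MultiUserController`, and the commands) are invented for the example. It assumes a simple fusion rule in which a speech command names an action and a subsequent gesture from the same user completes it, so different users can act in parallel without their partial commands interleaving.

```python
from dataclasses import dataclass

@dataclass
class InputEvent:
    user: str     # which co-located user produced the event
    mode: str     # "speech" or "gesture"
    command: str  # recognized command, e.g. "group" or "tap-note-7"

class TableModel:
    """Shared application state: actions applied to notes on the table."""
    def __init__(self):
        self.listeners = []

    def apply(self, user, action):
        for cb in self.listeners:
            cb(user, action)

class TableView:
    """Renders model changes; here it simply records them in order."""
    def __init__(self, model):
        self.log = []
        model.listeners.append(lambda u, a: self.log.append((u, a)))

class MultiUserController:
    """Routes per-user speech and gesture events to the shared model.
    Pending speech commands are kept per user, so one user's partial
    multimodal command never mixes with another's."""
    def __init__(self, model):
        self.model = model
        self.pending = {}  # user -> speech verb awaiting a gesture

    def handle(self, ev: InputEvent):
        if ev.mode == "speech":
            # Speech names the action; wait for a gesture to complete it.
            self.pending[ev.user] = ev.command
        elif ev.mode == "gesture":
            verb = self.pending.pop(ev.user, None)
            action = f"{verb}:{ev.command}" if verb else ev.command
            self.model.apply(ev.user, action)

model = TableModel()
view = TableView(model)
ctrl = MultiUserController(model)
ctrl.handle(InputEvent("alice", "speech", "group"))
ctrl.handle(InputEvent("bob", "gesture", "tap-note-7"))      # bob acts in parallel
ctrl.handle(InputEvent("alice", "gesture", "circle-notes"))
print(view.log)  # [('bob', 'tap-note-7'), ('alice', 'group:circle-notes')]
```

Note how bob's gesture reaches the model immediately while alice's speech command stays pending until her gesture arrives, illustrating why per-user state in the controller matters for parallel work.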
Keywords
Computer Science