Abstract
In this paper we suggest the use of tangible user interfaces (TUIs)
for human-robot interaction (HRI) applications. We discuss the potential
benefits of this approach, focusing on tasks with a low level of autonomy. We
present an experimental robotic interaction testbed we implemented to support
our investigation. We used the testbed to explore two HRI-related task-sets:
robotic navigation control and robotic posture control. We discuss the
implementation of these two task-sets using an AIBO robot dog. Both task-sets
were mapped to two different robotic control interfaces: a keypad interface,
which resembles the interaction approach common in HRI, and a gesture-input
mechanism based on Nintendo Wiimotes and Nunchuks. We discuss the
implementation of these interfaces and conclude with a detailed user study we
performed comparing these HRI techniques across the two robotic task-sets.