In this video, we suggest the use of tangible user interfaces (TUIs)
for human-robot interaction (HRI) applications. We present an
experimental robotic interaction test bed to support our
investigation. We use the test bed to explore two HRI-related
tasks: robotic navigation control and robotic posture control.
Each task was mapped to two different robotic control interfaces: a
keypad interface, which resembles the interaction approach currently
common in HRI, and a gesture input mechanism based on the Nintendo Wii™