Browsing by Author "Guo, Cheng"
Now showing 1–8 of 8
- Item (Open Access): Exploring Tangible User Interfaces in Human-Robot Interaction (2007-10-17). Guo, Cheng; Sharlin, Ehud.
  Mouse, keyboard and graphical user interfaces are commonly used in the field of human-robot interaction (HRI) for robot control. Although these traditional user interfaces are accepted as the standard for the majority of computational tasks, their generic nature and interaction styles may not fit well with robot navigation tasks. In our proposed research, we intend to explore alternative UIs that take advantage of innate human skills in physical object manipulation and spatial perception to overcome the problems associated with traditional UIs. We suggest the use of tangible user interfaces (TUIs) for HRI applications, especially for one-to-many robot navigation tasks. We hope our proposed idea will offer insight into future HRI interface design.
- Item (Open Access): Exploring the Use of Tangible User Interfaces for Human-Robot Interaction: A Comparative Study (2007-09-26). Guo, Cheng; Sharlin, Ehud.
  In this paper we suggest the use of tangible user interfaces (TUIs) for human-robot interaction (HRI) applications. We discuss the potential benefits of this approach while focusing on tasks with a low level of autonomy. We present an experimental robotic interaction testbed we implemented to support our investigation. We used the testbed to explore two HRI-related task-sets: robotic navigation control and robotic posture control. We discuss the implementation of these two task-sets using an AIBO robot dog. Both tasks were also mapped to two different robotic control interfaces: a keypad interface, which resembles the interaction approach common in HRI, and a gesture input mechanism based on Nintendo Wiimotes and Nunchuks. We discuss the interfaces' implementation and conclude with a detailed user study we performed to compare these different HRI techniques on the two robotic task-sets.
- Item (Open Access): Monitoring the Home Environment Using Domestic Robot (2007-04-11). Guo, Cheng; Boyd, Jeffery; Greenberg, Saul; Sharlin, Ehud.
  Today, robots are no longer limited to laboratory experiments; they have found their way into our homes. As a physical entity, a robot can provide functionality that a regular computer does not possess. In the AIBO Monitor project, we use a domestic robot dog as a mediator for people to remotely monitor their home environment. Several design ideas have been explored and are presented in an attempt to maximize users' awareness of the robot's interaction with the environment.
- Item (Open Access): New paradigms for human-robot interaction using tangible user interfaces (2008). Guo, Cheng; Sharlin, Ehud.
- Item (Open Access): Touch and Toys - new techniques for interaction with a remote group of robots (2008-09-26). Guo, Cheng; Young, James E.; Sharlin, Ehud.
  Interaction with a remote team of robots in real time is a difficult human-robot interaction (HRI) problem, exacerbated by the complications of unpredictable real-world environments, with solutions often resorting to a larger-than-desirable ratio of operators to robots. We present two innovative interfaces that allow a single operator to interact with a group of remote robots. Using a tabletop computer, the user can configure and manipulate groups of robots directly, either by using their fingers (touch) or by manipulating a set of physical toys (tangible user interfaces). We recruited participants for an extensive user study that required them to interact with a small group of remote robots on simple tasks, and we present our findings as a set of design considerations.
- Item (Open Access): Using Touch and Toys for Multiple Robots Control: video report (2009-04-13). Guo, Cheng; Young, James; Sharlin, Ehud.
  Interaction with a remote team of robots in real time is a difficult human-robot interaction (HRI) problem, exacerbated by the complications of unpredictable real-world environments, with solutions often resorting to a larger-than-desirable ratio of operators to robots. We present two innovative interfaces that allow a single operator to interact with a group of remote robots. Using a tabletop computer, the user can configure and manipulate groups of robots directly, either by using their fingers (touch) or by manipulating a set of physical toys (tangible user interfaces).
- Item (Open Access): Using Wiimote for Single Robot Control: video report (2009-04-13). Guo, Cheng; Sharlin, Ehud.
  In this video, we suggest the use of tangible user interfaces (TUIs) for human-robot interaction (HRI) applications. We present an experimental robotic interaction testbed to support our investigation. We use the testbed to explore two HRI-related task-sets: robotic navigation control and robotic posture control. Both tasks were mapped to two different robotic control interfaces: a keypad interface, which resembles the interaction approach currently common in HRI, and a gesture input mechanism based on Nintendo Wii™ game controllers.
- Item (Open Access): Utilizing Physical Objects and Metaphors for Human Robot Interaction (2008-02-04). Guo, Cheng; Sharlin, Ehud.
  Mouse, keyboard and graphical user interfaces are commonly used in the field of human-robot interaction (HRI) for robot control. Although these traditional user interfaces (UIs) are accepted as the standard for the majority of computational tasks, their generic nature and interaction styles may not offer ideal mappings to various robotic tasks, such as locomotion and navigation. In our research we intend to explore alternative UIs that take advantage of innate human skills in physical object manipulation and spatial perception, and that overcome some of the problems associated with traditional UIs. We suggest the use of tangible user interfaces (TUIs) for HRI applications, leveraging existing and well-learned physical metaphors for interaction with robots, and exploring new ways to tangibly control one-to-many robot group interaction tasks. In this paper we describe our current research efforts and findings, and outline our proposed research plans.