Centre for Robotics and Intelligent Systems of the University of Plymouth - Robotics Intelligence Laboratory


  Teaching a robot how to play a card game

MIBL: Multimodal Instruction-Based Learning for Personal Robots
With the kind support of Nuance

Aim: The overall aim is the development of human-robot interfaces that allow untrained users to instruct robots using communication methods natural to humans.

This project focuses on card game instructions, in a scenario where the user of a personal robot wishes to play a new card game with the robot and must first explain its rules. Game instructions are a good example of more general instructions to a personal robot, due to the range of instruction types they contain: sequences of actions to perform and rules to apply.
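To make the distinction concrete, taught knowledge of this kind could be held in two parts: action sequences (e.g. how to set up the game) and condition-action rules (when a move is legal). The following is a minimal illustrative sketch only; the class and field names are hypothetical and not taken from the project.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Action:
    """One instructed action, e.g. "deal seven cards to each player"."""
    name: str   # illustrative, e.g. "deal"
    args: dict  # illustrative, e.g. {"count": 7, "target": "each_player"}

@dataclass
class Rule:
    """A taught rule: a condition and the actions it permits."""
    condition: str          # illustrative condition expression
    actions: List[Action]

@dataclass
class GameKnowledge:
    setup: List[Action] = field(default_factory=list)  # sequence performed once
    rules: List[Rule] = field(default_factory=list)    # applied during play

# A toy game as the robot might store it after instruction
game = GameKnowledge(
    setup=[Action("shuffle", {}),
           Action("deal", {"count": 7, "target": "each_player"})],
    rules=[Rule("card.suit == top_card.suit",
                [Action("play", {"card": "matching"})])],
)
print(len(game.setup), len(game.rules))
```

The point of the split is that setup instructions are executed in order, while rules must be stored and matched against game states later, which is what makes rule instructions harder to integrate.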

The objective is to develop a robot student able to understand the instructions from the human teacher and integrate them in a way that supports a game-playing behaviour.

The project started with the recording of a corpus of instructions between a human teacher and a human student (Figure 1, Publication 1).

Starting a robot development project by recording users is an approach termed "corpus-based robotics" (Publication 2).


Fig 1. Setup for corpus collection. The teacher communicates with the student (on the left) by using spoken instructions and gestures mediated by the touch screen.

One of the problems to be solved was the synchronization of chunks of verbal instruction and the corresponding chunks of gestural demonstrations (Publications 3 & 4).

Fig 2: Time-lines of speech and gesture, where diagonal lines indicate which utterance and gesture are paired together.
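One simple way to picture the pairing problem in Fig 2 is as interval matching: each gesture chunk is assigned to the utterance whose time span overlaps it most. This is a hypothetical greedy heuristic for illustration, not the method of Publications 3 & 4.

```python
def overlap(a, b):
    """Length of temporal overlap between two (start, end) intervals, in seconds."""
    return max(0.0, min(a[1], b[1]) - max(a[0], b[0]))

def pair_chunks(utterances, gestures):
    """Pair each gesture with the utterance it overlaps most.

    Both arguments are lists of (start, end) times; returns a list of
    (utterance_index, gesture_index) pairs. Gestures with no overlapping
    utterance are left unpaired.
    """
    pairs = []
    for gi, g in enumerate(gestures):
        # pick the utterance with maximal overlap; ties go to the earlier one
        best = max(range(len(utterances)), key=lambda ui: overlap(utterances[ui], g))
        if overlap(utterances[best], g) > 0:
            pairs.append((best, gi))
    return pairs

# Toy timelines: gesture 0 falls within utterance 0, gesture 1 within utterance 1
utterances = [(0.0, 2.0), (2.5, 4.0)]
gestures = [(0.5, 1.5), (3.0, 4.5)]
print(pair_chunks(utterances, gestures))  # [(0, 0), (1, 1)]
```

In practice the difficulty is that speech and gesture chunks are often offset in time rather than neatly nested, which is why the synchronization problem required dedicated analysis.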

Current work covers:

- The development of a semi-automatic method for the design of speech recognition grammars starting from a corpus.

- The analysis of game rule instructions to infer and implement cognitive functions required from a learner robot.
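The first of these points, deriving a recognition grammar from corpus utterances, can be sketched as follows: members of hand-defined word classes are replaced by class labels, and the distinct templates that remain become grammar rules. The word classes and utterances below are illustrative assumptions, not the project's actual data or method.

```python
# Hand-defined word classes (illustrative; a real system would need many more)
WORD_CLASSES = {
    "NUMBER": {"one", "two", "three", "seven"},
    "SUIT": {"hearts", "spades", "clubs", "diamonds"},
}

def to_template(utterance):
    """Replace class members in an utterance with their class labels."""
    words = []
    for w in utterance.lower().split():
        label = next((c for c, members in WORD_CLASSES.items() if w in members), None)
        words.append(f"<{label}>" if label else w)
    return " ".join(words)

def induce_rules(corpus):
    """Collapse corpus utterances into distinct grammar templates."""
    return sorted({to_template(u) for u in corpus})

corpus = [
    "deal seven cards to each player",
    "deal two cards to each player",
    "play a card of hearts",
    "play a card of spades",
]
for rule in induce_rules(corpus):
    print(rule)
# deal <NUMBER> cards to each player
# play a card of <SUIT>
```

The method is only semi-automatic because the word classes, and the decision of which templates generalize safely, still require human judgement.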

Joerg Wolf
Guido Bugmann


1. "Multimodal Corpus Collection for the Design of User-Programmable Robots"
Wolf J.C. and Bugmann G. (2005)
Proc. TAROS'05, London, pp. 251-255.

2. "The Impact of Spoken Interfaces on the Design of Service Robots"
Bugmann G., Wolf J.C. and Robinson P. (2005)
Industrial Robot, 32(6), pp. 499-504.

3. "Timing of Visual and Spoken Input in Robot Instructions"
Wolf J.C. and Bugmann G. (2006)
Proc. EUROS'06 International Workshop on Vision Based Human-Robot, 18 March 2006, Palermo.

4. "Linking Speech and Gesture in Multimodal Instruction Systems"
Wolf J.C. and Bugmann G. (2006)
Proc. IEEE RO-MAN'06, 6-8 Sept. 2006, Hatfield, UK, pp. 141-144.

5. "Understanding Rules in Human-Robot Instructions"
Wolf J.C. and Bugmann G. (2007)
Proc. IEEE RO-MAN'07, Jeju Island, Korea, pp. 714-719.

6. "Converting Multi-Modal Task Instructions to Rule-Based Robot Instructions"
Wolf J.C. and Bugmann G. (2008)
Proc. IEEE RO-MAN'08, Munich, Germany.

Guido Bugmann, 15 October 2008

A video of the MIBL robot is available: "MIBL Robot Learning to deal" (WMA video).