

Despite great advances in robot navigation, current methods still face significant challenges. Humans and animals, however, overcome these challenges seamlessly, displaying navigational capabilities that far outperform those of today's robots.

Advances in neuroscience now allow researchers to observe the brain activity of animals (mainly rodents) while they perform navigational tasks. This provides remarkably accurate insight into how the brain solves navigational problems, even though only a very small part of the whole mechanism can be observed at once, owing to its scale and complexity (the number of brain cells and their intricate interconnectivity).

This neuroscientific research has led to the discovery of several types of brain cells that are thought to play specific roles in animal navigation. These cells include place cells, head-direction cells, grid cells, border cells and spatial view cells.

Colleagues at Keele University and I have implemented a computational model of head-direction cells for a small mobile robot (pictured below). References to published work describing the details of this work can be found here.

Figure: The LEGO® Mindstorms® NXT robot. The robot is equipped with an on-board omnidirectional video camera (above the NXT brick), a gyroscopic sensor and an acceleration sensor (pictured to the left and right above the wheels). For locomotion, the robot uses two active wheels (seen at the front) and a dummy castor.
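To illustrate the general idea behind head-direction cell models (not the specific published Keele implementation), the sketch below shows a minimal rate-based population of head-direction cells in Python. The cell count, bump width, and the use of a gyroscope-style angular-velocity signal for path integration are all illustrative assumptions.

```python
import numpy as np

# Illustrative sketch of a head-direction (HD) cell population.
# Parameter values are assumptions, not the published model.

N = 100  # number of HD cells tiling 360 degrees
preferred = np.linspace(0, 2 * np.pi, N, endpoint=False)

def bump(heading, width=0.3):
    """Idealised population activity: a Gaussian bump of firing
    centred on cells whose preferred direction matches `heading`."""
    d = np.angle(np.exp(1j * (preferred - heading)))  # wrapped angular difference
    return np.exp(-d**2 / (2 * width**2))

def decode(activity):
    """Population-vector readout of the heading represented by the cells."""
    vec = np.sum(activity * np.exp(1j * preferred))
    return np.angle(vec) % (2 * np.pi)

def integrate(heading, angular_velocity, dt=0.01):
    """Path integration: update the represented heading from an
    angular-velocity signal (e.g. the robot's gyroscopic sensor)."""
    return (heading + angular_velocity * dt) % (2 * np.pi)

# Example: start facing 0 rad and turn at 1 rad/s for 0.5 s.
h = 0.0
for _ in range(50):
    h = integrate(h, 1.0)
print(decode(bump(h)))  # decoded heading, close to 0.5 rad
```

In real HD-cell models the bump is typically maintained and shifted by recurrent attractor dynamics rather than recomputed directly, but the sketch captures the two essential operations: representing heading as a population bump and updating it from self-motion cues.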

Ongoing research at Keele is investigating computational models of the other cell types (mentioned above) that are thought to play a role in biological navigation. In addition to single-cell-type models, models incorporating multiple cell types are being investigated, with the aim of producing more complete robot navigation systems.

Models are applied to miniature mobile robots such as the Mark III robots and robots created using the LEGO® Mindstorms® NXT sets. Larger mobile platforms are also used such as the SCITOS G5 robot (by MetraLabs, Germany).

Figure: The SCITOS G5 robot. The robot contains a variety of sensors (laser rangefinder, sonar and a video camera) and can be autonomously controlled by an on-board computer.

Figure: The Mark III robot. The robot employs relatively simple infra-red rangefinder and encoder sensors. It is controlled by an on-board microcontroller but can be integrated with digital data radio transceivers that allow it to be remotely controlled by a PC.
