Page 166 - Introduction to Autonomous Mobile Robots
Perception
We will not present a detailed derivation here but will use equation (4.60) to solve an
example problem in section 4.3.1.1.
4.3 Feature Extraction
An autonomous mobile robot must be able to determine its relationship to the environment
by making measurements with its sensors and then using those measured signals. A wide
variety of sensing technologies are available, as shown in the previous section. But every
sensor we have presented is imperfect: measurements always have error and, therefore,
uncertainty associated with them. Sensor inputs must therefore be used in a way that
enables the robot to interact with its environment successfully in spite of measurement
uncertainty.
There are two strategies for using uncertain sensor input to guide the robot’s behavior.
One strategy is to use each sensor measurement as a raw and individual value. Such raw
sensor values could, for example, be tied directly to robot behavior, whereby the robot’s
actions are a function of its sensor inputs. Alternatively, the raw sensor values could be
used to update an intermediate model, with the robot’s actions being triggered as a function
of this model rather than the individual sensor measurements.
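The two variants of the first strategy might be sketched as follows; the sensor quantity, thresholds, and smoothing rule here are illustrative assumptions rather than anything prescribed by the text:

```python
# Sketch of the first strategy: raw sensor values drive behavior.
# Thresholds and the smoothing rule are assumptions for illustration only.

def reactive_speed(front_range_m):
    """Variant (a): the action is a direct function of one raw reading."""
    if front_range_m < 0.3:                    # immediate obstacle
        return 0.0                             # stop
    return min(1.0, front_range_m / 2.0)       # slow down as obstacles near

class RunningModel:
    """Variant (b): raw readings update an intermediate model, and the
    action is a function of the model, not of any single reading."""

    def __init__(self, alpha=0.2):
        self.alpha = alpha                     # smoothing factor
        self.estimate = None                   # filtered range estimate

    def update(self, raw_range_m):
        if self.estimate is None:
            self.estimate = raw_range_m
        else:                                  # exponential smoothing damps noise
            self.estimate = (1 - self.alpha) * self.estimate \
                            + self.alpha * raw_range_m

    def speed(self):
        return reactive_speed(self.estimate)
```

The difference in robustness is visible even in this toy: a single spurious short reading halts variant (a) outright, whereas variant (b) reacts only if the filtered estimate actually drops.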
The second strategy is to extract information from one or more sensor readings first,
generating a higher-level percept that can then be used to inform the robot’s model and
perhaps the robot’s actions directly. We call this process feature extraction, and it is this
next, optional step in the perceptual interpretation pipeline (figure 4.34) that we will now discuss.
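As a concrete taste of feature extraction, many noisy range points can be condensed into a single compact line feature. The sketch below uses the classical closed-form least-squares fit to the line model x·cos(α) + y·sin(α) = r; the exact estimator developed later in the chapter may differ in detail:

```python
# Minimal feature-extraction sketch: fit one line feature to a cloud of
# (x, y) points measured by a range sensor. One classical closed-form
# least-squares solution; illustrative, not the chapter's full derivation.
import math

def fit_line(points):
    """Fit x*cos(alpha) + y*sin(alpha) = r to a list of (x, y) points."""
    n = len(points)
    xm = sum(x for x, _ in points) / n      # centroid
    ym = sum(y for _, y in points) / n
    sxx = sum((x - xm) ** 2 for x, _ in points)
    syy = sum((y - ym) ** 2 for _, y in points)
    sxy = sum((x - xm) * (y - ym) for x, y in points)
    alpha = 0.5 * math.atan2(-2.0 * sxy, syy - sxx)   # line normal direction
    r = xm * math.cos(alpha) + ym * math.sin(alpha)   # distance to origin
    if r < 0:                               # normalize so that r >= 0
        r = -r
        alpha += math.pi if alpha <= 0 else -math.pi
    return alpha, r
```

Dozens of raw readings thus collapse into two numbers, (α, r), whose uncertainty can be characterized far more compactly than that of the individual measurements.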
In practical terms, mobile robots do not necessarily use feature extraction and scene
interpretation for every activity. Instead, robots will interpret sensors to varying degrees
depending on each specific functionality. For example, in order to guarantee emergency
stops in the face of immediate obstacles, the robot may make direct use of raw forward-
facing range readings to stop its drive motors. For local obstacle avoidance, raw ranging
sensor strikes may be combined in an occupancy grid model, enabling smooth avoidance
of obstacles meters away. For map-building and precise navigation, the range sensor values
and even vision sensor measurements may pass through the complete perceptual pipeline,
being subjected to feature extraction followed by scene interpretation to minimize the
impact of individual sensor uncertainty on the robustness of the robot’s mapmaking and
navigation skills. The pattern that thus emerges is that, as one moves into more sophisticated,
long-term perceptual tasks, the feature extraction and scene interpretation aspects of
the perceptual pipeline become essential.
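The first two interpretation tiers described above might look like the following rough sketch; the grid resolution, hit-count update rule, and stop threshold are assumptions made for illustration:

```python
# Illustrative sketch of two interpretation tiers: a raw emergency stop,
# and an occupancy-grid model fed by individual range strikes.
# Cell size, update rule, and threshold are assumed values.
import math

GRID_RES_M = 0.1                 # assumed 10 cm occupancy-grid cells
grid = {}                        # sparse grid: (ix, iy) -> strike count

def emergency_stop(front_ranges_m, threshold_m=0.25):
    """Tier 1: raw forward-facing readings gate the drive motors directly."""
    return min(front_ranges_m) < threshold_m

def add_range_strike(x, y, theta, bearing, range_m):
    """Tier 2: fold one raw range strike into the occupancy-grid model."""
    hx = x + range_m * math.cos(theta + bearing)   # strike endpoint (world frame)
    hy = y + range_m * math.sin(theta + bearing)
    cell = (math.floor(hx / GRID_RES_M), math.floor(hy / GRID_RES_M))
    grid[cell] = grid.get(cell, 0) + 1             # repeated strikes raise confidence
```

The third tier, feature extraction followed by scene interpretation, builds on such models and is the subject of the remainder of this section.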
Feature definition. Features are recognizable structures of elements in the environment.
They usually can be extracted from measurements and mathematically described. Good
features are always perceivable and easily detectable from the environment. We distinguish