It is recommended to have available a steady representation of intensity statistics and their trends in the image sequence: averages and variances of maximum and minimum image intensities and of maximum and minimum intensity gradients in representative regions. When surfaces are wet and the sun comes out, light reflections may lead to highlights. Water surfaces (like puddles) rippled by wind may exhibit relatively large glaring regions, which have to be excluded from image interpretation for meaningful results. Driving toward a low-standing sun under these conditions can make vision impossible. When there are multiple light sources, as at night in an urban area, regions with stable visual features have to be found that allow tracking and orientation while avoiding highlighted regions.
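The following Python sketch shows one possible form of such a running representation. It assumes fixed representative regions, an exponential forgetting factor, and a near-saturation threshold for flagging glare regions; all class, parameter, and variable names are illustrative assumptions, not part of the system described here.

    import numpy as np

    class RegionIntensityStats:
        """Running averages and variances of intensity and gradient extremes
        in fixed image regions, with a simple glare (highlight) test."""

        def __init__(self, alpha=0.05, glare_level=250):
            self.alpha = alpha              # exponential forgetting factor (assumed)
            self.glare_level = glare_level  # near-saturation level for 8-bit images (assumed)
            self.mean = {}                  # running means, keyed by (region, feature)
            self.var = {}                   # running variances, keyed by (region, feature)

        def update(self, region_name, gray_patch):
            # Intensity gradients of the region (rows, columns).
            gy, gx = np.gradient(gray_patch.astype(float))
            grad = np.hypot(gx, gy)
            features = {"int_max": float(gray_patch.max()),
                        "int_min": float(gray_patch.min()),
                        "grad_max": float(grad.max()),
                        "grad_min": float(grad.min())}
            # Exponentially weighted running mean and variance per feature.
            for name, value in features.items():
                key = (region_name, name)
                m = self.mean.get(key, value)
                v = self.var.get(key, 0.0)
                d = value - m
                self.mean[key] = m + self.alpha * d
                self.var[key] = (1.0 - self.alpha) * (v + self.alpha * d * d)
            # Exclude the region from interpretation if many pixels are near saturation.
            return float(np.mean(gray_patch >= self.glare_level)) > 0.2

A region flagged by the last test would then be treated as a glaring region and skipped during feature extraction, in line with the exclusion discussed above.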
Headlights of other vehicles may also become hard to deal with in rainy conditions. Rear lights and stop lights during braking are relatively easy to handle but require color cameras for proper recognition. In RGB color representation, stop lights are most efficiently found in the R-image, while the flashing blue lights of ambulances or police cars are most easily detected in the B-channel. Yellow or orange lights for signaling intentions (turn indicators) require evaluation of several RGB channels or just the intensity signal. Stationary flashing lights at construction sites (light sequencing that looks like a hopping light), indicating an unusual traffic routing, require good temporal resolution and correlation with perturbations of the subject vehicle to be perceived correctly.
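As an illustration of these channel-based cues, the following Python sketch forms candidate masks for red, blue, and amber lights and estimates the flashing frequency of a tracked light blob from its brightness over time. All thresholds, function names, and the frame-rate parameter are illustrative assumptions.

    import numpy as np

    def signal_light_masks(rgb):
        """Per-pixel candidate masks for red stop lights, blue emergency lights,
        and amber turn indicators in an RGB (uint8) image. Thresholds are
        illustrative assumptions, not calibrated values."""
        r, g, b = (rgb[..., i].astype(float) for i in range(3))
        stop_mask  = (r > 150) & (r > 1.6 * g) & (r > 1.6 * b)   # dominant R-channel
        blue_mask  = (b > 150) & (b > 1.5 * r) & (b > 1.2 * g)   # dominant B-channel
        amber_mask = (r > 150) & (g > 100) & (b < 0.6 * g)       # R and G jointly high
        return stop_mask, blue_mask, amber_mask

    def blink_frequency(brightness_trace, frame_rate):
        """Dominant flashing frequency (Hz) of a tracked light blob, estimated
        from its brightness over time with a discrete Fourier transform."""
        trace = np.asarray(brightness_trace, dtype=float)
        if trace.size <= 2:
            return 0.0
        trace = trace - trace.mean()                      # remove constant component
        spectrum = np.abs(np.fft.rfft(trace))
        freqs = np.fft.rfftfreq(trace.size, d=1.0 / frame_rate)
        return float(freqs[np.argmax(spectrum[1:]) + 1])  # skip the DC bin

Estimating a blink frequency in this way presupposes the good temporal resolution mentioned above; comparing the estimated phase across several lights would reveal the hopping pattern of sequenced construction-site lamps.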
Recognition of weather conditions (Appendix A.1.3) is especially important when they affect the interaction of the vehicle with the ground (acceleration and deceleration through friction between tires and surface material). Recognizing rain, hail, and snow conditions and adjusting behavior to them may prevent accidents through cautious driving. Slush and loose or wet dirt or gravel on the road may have similar effects and should therefore also be recognized. Heavy winds and gusts can have a direct effect on driving stability; however, they are not directly visible and can be perceived only through secondary effects such as whirling dust or leaves, or moving grass, plants, and tree branches. Advanced vision systems should be able to perceive these weather conditions (possibly supported by inertial sensors directly sensing the accelerations acting on the body). Recognizing fine shades of texture may be a capability needed for achieving this; at present, this is beyond the performance level of microprocessors available at low cost, but the next decade may open up this avenue.
Roadway recognition (Appendix A.2) has been developed to a reasonable state since recursive estimation techniques and differential geometry descriptions were introduced two decades ago. For freeways and other well-kept, high-speed roads (Appendices A.2.1 and A.2.2), lane and road recognition can be considered state of the art. Additional developments are still required for surface state recognition, for understanding the semantics of lane markings, arrows, and other lines painted on the road, and for detailed perception of the infrastructure along the road. This concerns repeated poles with different reflecting lights on both sides of the roadway, whose meaning may differ from one country to the next, guide rails on road shoulders, and the many different kinds of traffic and navigation signs that have to be distinguished from advertisements. On these types of roads, traffic is usually unidirectional (one-way), and navigation has to be done by proper lane selection.
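The differential geometry description referred to above is commonly a clothoid model, in which road curvature varies linearly with arc length and whose parameters are tracked by recursive estimation from the measured positions of lane markings. The following Python sketch illustrates only the geometric part; it uses the usual small-angle approximation, and all function names and numerical values are illustrative assumptions.

    import numpy as np

    def clothoid_lateral_offset(l, c0, c1):
        """Lateral offset of the lane center after arc length l for a clothoid
        segment with curvature C(l) = c0 + c1 * l (small-angle approximation)."""
        return c0 * l**2 / 2.0 + c1 * l**3 / 6.0

    def predicted_marking_offsets(lookahead, c0, c1, lateral_offset, heading_error):
        """Lateral positions at which lane markings should appear at the given
        look-ahead distances, combining road curvature with the vehicle's own
        lateral offset and heading error relative to the lane."""
        l = np.asarray(lookahead, dtype=float)
        return clothoid_lateral_offset(l, c0, c1) - lateral_offset - heading_error * l

    # Example (assumed values): gentle curve of radius 2000 m (c0 = 5e-4 1/m),
    # constant curvature, 0.2 m lateral offset, 0.01 rad heading error.
    print(predicted_marking_offsets([10.0, 20.0, 40.0], 5e-4, 0.0, 0.2, 0.01))

In a recursive estimation loop, the differences between these predicted offsets and the marking positions actually measured in the image would drive the update of the curvature parameters and of the vehicle state.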