whereas the IQ components are the UV components rotated through an angle of 33°. In
composite (NTSC and PAL) video, the chroma signals were then low-pass filtered horizon-
tally before being modulated and superimposed on top of the Y’ luma signal. Backward
compatibility was achieved by having older black-and-white TV sets effectively ignore the
high-frequency chroma signal (because of slow electronics) or, at worst, superimposing it as
a high-frequency pattern on top of the main signal.
While these conversions were important in the early days of computer vision, when frame
grabbers would directly digitize the composite TV signal, today all digital video and still
image compression standards are based on the newer YCbCr conversion. YCbCr is closely
related to YUV (the Cb and Cr signals carry the blue and red color difference signals and have
more useful mnemonics than UV) but uses different scale factors to fit within the eight-bit
range available with digital signals.
For video, the Y’ signal is re-scaled to fit within the [16 ... 235] range of values, while
the Cb and Cr signals are scaled to fit within [16 ... 240] (Gomes and Velho 1997; Fairchild
2005). For still images, the JPEG standard uses the full eight-bit range with no reserved
values,
\[
\begin{bmatrix} Y' \\ C_b \\ C_r \end{bmatrix}
=
\begin{bmatrix}
 0.299 & 0.587 & 0.114 \\
-0.168736 & -0.331264 & 0.5 \\
 0.5 & -0.418688 & -0.081312
\end{bmatrix}
\begin{bmatrix} R' \\ G' \\ B' \end{bmatrix}
+
\begin{bmatrix} 0 \\ 128 \\ 128 \end{bmatrix},
\tag{2.115}
\]
where the R’G’B’ values are the eight-bit gamma-compressed color components (i.e., the
actual RGB values we obtain when we open up or display a JPEG image). For most appli-
cations, this formula is not that important, since your image reading software will directly
provide you with the eight-bit gamma-compressed R’G’B’ values. However, if you are trying
to do careful image deblocking (Exercise 3.30), this information may be useful.
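As a concrete illustration, the following NumPy sketch applies the conversion in (2.115) to eight-bit R'G'B' values; the function name and the clipping of out-of-range results are our own illustrative choices:

import numpy as np

# Illustrative helper (the name and the final clipping are our own choices);
# the matrix and offset come directly from (2.115).
def rgb_to_ycbcr(rgb):
    """Convert eight-bit gamma-compressed R'G'B' values (shape (..., 3),
    range [0, 255]) to full-range JPEG YCbCr."""
    M = np.array([[ 0.299,     0.587,     0.114   ],
                  [-0.168736, -0.331264,  0.5     ],
                  [ 0.5,      -0.418688, -0.081312]])
    offset = np.array([0.0, 128.0, 128.0])
    return np.clip(rgb @ M.T + offset, 0.0, 255.0)

# A pure red pixel maps to Y' ≈ 76.2, Cb ≈ 85.0, and Cr clipped to 255.
print(rgb_to_ycbcr(np.array([255.0, 0.0, 0.0])))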
Another color space you may come across is hue, saturation, value (HSV), which is a pro-
jection of the RGB color cube onto a non-linear chroma angle, a radial saturation percentage,
and a luminance-inspired value. In more detail, value is defined as either the mean or maxi-
mum color value, saturation is defined as scaled distance from the diagonal, and hue is defined
as the direction around a color wheel (the exact formulas are described by Hall (1989); Foley,
van Dam, Feiner et al. (1995)). Such a decomposition is quite natural in graphics applications
such as color picking (it approximates the Munsell chart for color description). Figure 2.32l–
n shows an HSV representation of a sample color image, where saturation is encoded using a
gray scale (saturated = darker) and hue is depicted as a color.
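For reference, Python's standard colorsys module implements the usual hexcone HSV conversion (value = maximum component, saturation = chroma divided by value, hue = angle around the color wheel), which matches the description above; the sample triplet is purely illustrative:

import colorsys

# The hexcone formulas: value is the maximum component, saturation is the
# chroma divided by the value, and hue is the fractional angle around the
# color wheel. Inputs are normalized R'G'B' values in [0, 1].
r, g, b = 0.8, 0.4, 0.2            # illustrative pixel
h, s, v = colorsys.rgb_to_hsv(r, g, b)
print(f"hue={h:.3f}  saturation={s:.3f}  value={v:.3f}")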
If you want your computer vision algorithm to only affect the value (luminance) of an
image and not its saturation or hue, a simpler solution is to use either the Yxy (luminance +
chromaticity) coordinates defined in (2.104) or the even simpler color ratios,
\[
r = \frac{R}{R+G+B}, \quad
g = \frac{G}{R+G+B}, \quad
b = \frac{B}{R+G+B}
\tag{2.116}
\]
(Figure 2.32e–h). After manipulating the luma (2.112), e.g., through the process of histogram
equalization (Section 3.1.4), you can multiply each color ratio by the ratio of the new to old
luma to obtain an adjusted RGB triplet.
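A minimal sketch of this adjustment, assuming the Rec. 601 luma weights of (2.112) and an illustrative function name, might look like:

import numpy as np

# Sketch of the luma-only adjustment described above; adjust_luma is a
# hypothetical name, and the luma weights are those of (2.112).
def adjust_luma(rgb, new_luma):
    """Scale an RGB triplet so its luma becomes new_luma. Because all three
    channels are multiplied by the same factor, the color ratios r, g, b of
    (2.116) -- and hence the hue and saturation -- are unchanged."""
    old_luma = 0.299 * rgb[0] + 0.587 * rgb[1] + 0.114 * rgb[2]
    return rgb * (new_luma / old_luma)

pixel = np.array([200.0, 100.0, 50.0])   # luma ≈ 124.2
print(adjust_luma(pixel, new_luma=150.0))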
While all of these color systems may sound confusing, in the end, it often may not mat-
ter that much which one you use. Poynton, in his Color FAQ, http://www.poynton.com/
ColorFAQ.html, notes that the perceptually motivated L*a*b* system is qualitatively similar