could be replaced with robust pairwise potentials; Boykov, Veksler, and Zabih (2001) devel-
oped iterative binary graph cut algorithms for optimizing multi-label MRFs; Kolmogorov
and Zabih (2004) characterized the class of binary energy potentials required for these tech-
niques to work; and Freeman, Pasztor, and Carmichael (2000) popularized the use of loopy
belief propagation for MRF inference. Many additional references can be found in
Sections 3.7.2 and 5.5, and Appendix B.5.
3.9 Exercises
Ex 3.1: Color balance Write a simple application to change the color balance of an image
by multiplying each color value by a different user-specified constant. If you want to get
fancy, you can make this application interactive, with sliders. (A minimal code sketch of
the basic operation is given after this exercise.)
1. Do you get different results if you take out the gamma transformation before or after
doing the multiplication? Why or why not?
2. Take the same picture with your digital camera using different color balance settings
(most cameras control the color balance from one of the menus). Can you recover what
the color balance ratios are between the different settings? You may need to put your
camera on a tripod and align the images manually or automatically to make this work.
Alternatively, use a color checker chart (Figure 10.3b), as discussed in Sections 2.3 and
10.1.1.
3. If you have access to the RAW image for the camera, perform the demosaicing yourself
(Section 10.3.1) or downsample the image resolution to get a “true” RGB image. Does
your camera perform a simple linear mapping between RAW values and the color-
balanced values in a JPEG? Some high-end cameras have a RAW+JPEG mode, which
makes this comparison much easier.
4. Can you think of any reason why you might want to perform a color twist (Sec-
tion 3.1.2) on the images? See also Exercise 2.9 for some related ideas.
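The following is a minimal sketch of the basic color-balance operation in Ex 3.1. It assumes
an 8-bit RGB input and a simple 2.2 power-law gamma rather than the exact sRGB curve;
the gain values and file name in the commented usage are placeholders:

    # Color-balance sketch for Ex 3.1: multiply each channel by a user-specified
    # gain, either directly on the gamma-encoded values or after linearization.
    import numpy as np
    import imageio.v2 as imageio  # only needed for the commented usage below

    def color_balance(image, gains, undo_gamma=True, gamma=2.2):
        img = image.astype(np.float64) / 255.0
        gains = np.asarray(gains, dtype=np.float64)   # e.g., [1.2, 1.0, 0.8]
        if undo_gamma:
            linear = img ** gamma                     # remove (approximate) gamma
            out = np.clip(linear * gains, 0, 1) ** (1.0 / gamma)
        else:
            out = np.clip(img * gains, 0, 1)          # balance gamma-encoded values
        return (out * 255.0 + 0.5).astype(np.uint8)

    # Hypothetical usage: compare the two variants on the same photo.
    # img = imageio.imread("photo.jpg")
    # out_lin = color_balance(img, [1.2, 1.0, 0.8], undo_gamma=True)
    # out_gam = color_balance(img, [1.2, 1.0, 0.8], undo_gamma=False)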
Ex 3.2: Compositing and reflections Section 3.1.3 describes the process of compositing
an alpha-matted image on top of another. Answer the following questions and optionally
validate them experimentally:
1. Most captured images have gamma correction applied to them. Does this invalidate the
basic compositing equation (3.8)? If so, how should it be fixed? (See the sketch after
this exercise.)
2. The additive (pure reflection) model may have limitations. What happens if the glass is
tinted, especially to a non-gray hue? How about if the glass is dirty or smudged? How
could you model wavy glass or other kinds of refractive objects?
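Below is a minimal compositing sketch for Ex 3.2, taking equation (3.8) to be the usual
over operator C = (1 − α)B + αF and using a simple 2.2 power-law gamma as a stand-in for
the camera's response curve; it lets you compare blending gamma-encoded values against
blending in linearized space:

    # Compositing sketch for Ex 3.2: C = (1 - alpha) * B + alpha * F, computed
    # either directly on gamma-encoded values or after linearizing with a 2.2
    # power law. foreground/background are float RGB images in [0, 1] and
    # alpha is a single-channel matte in [0, 1].
    import numpy as np

    def composite(foreground, background, alpha, linearize=True, gamma=2.2):
        F = np.asarray(foreground, dtype=np.float64)
        B = np.asarray(background, dtype=np.float64)
        a = np.asarray(alpha, dtype=np.float64)
        if a.ndim == F.ndim - 1:
            a = a[..., np.newaxis]                 # broadcast matte over channels
        if linearize:
            F, B = F ** gamma, B ** gamma          # undo gamma before blending
            C = (1.0 - a) * B + a * F
            return np.clip(C, 0, 1) ** (1.0 / gamma)
        return np.clip((1.0 - a) * B + a * F, 0, 1)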
Ex 3.3: Blue screen matting Set up a blue or green background, e.g., by buying a large
piece of colored posterboard. Take a picture of the empty background, and then of the back-
ground with a new object in front of it. Pull the matte using the difference between each
colored pixel and its assumed corresponding background pixel, using one of the techniques
described in Section 3.1.3 or by Smith and Blinn (1996).
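A minimal difference-matting sketch is given below. It is not Smith and Blinn's method,
just a simple per-pixel color-distance matte against the known background, with illustrative
thresholds that will need tuning for a real setup:

    # Difference-matting sketch for Ex 3.3 (a simple alternative to Smith and
    # Blinn's method): estimate alpha from the per-pixel color distance between
    # the new photograph and the previously captured empty background.
    import numpy as np

    def difference_matte(composite, background, low=0.05, high=0.25):
        # Distances below `low` map to alpha = 0 (background), distances above
        # `high` map to alpha = 1 (foreground), with a linear ramp in between.
        # The thresholds are illustrative and need tuning for a real setup.
        C = np.asarray(composite, dtype=np.float64) / 255.0
        B = np.asarray(background, dtype=np.float64) / 255.0
        dist = np.linalg.norm(C - B, axis=-1)      # Euclidean RGB distance
        return np.clip((dist - low) / (high - low), 0.0, 1.0)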