Hartley (1928) introduced a logarithmic measure of information in
the context of communication theory. Hartley, and later Shannon (1948), framed their measures as quantifications of information in terms of uncertainty. Hartley's information measure is essentially the logarithm of the cardinality, or source alphabet size (see Definition 8.1), while Shannon formulated his measure in terms of probability theory. Both are information measures, and hence measures of complexity. However, Shannon called his measure "entropy" because of its similarity to the mathematical form of the entropy used in statistical mechanics.
Hartley’s information measure (Hartley 1928) can be used to explore
the concepts of information and uncertainty in a mathematical frame-
work. Let X be a finite set with a cardinality |X| n. For example, X
chart, or a set of DPs in a product DFSS project. A sequence can be
generated from set X by successive selection of its elements. Once a
selection is made, all possible elements that might have been chosen
selection is made, all possible elements that might have been chosen
are eliminated except for one. Before a selection is made, ambiguity is
experienced. The level of ambiguity is proportional to the number of
alternatives available. Once a selection is made, no ambiguity remains. Thus, the amount of information obtained can be defined as the amount of ambiguity eliminated.
Hartley's information measure I is given by I = log₂ N (bits), where N = n^s and s is the number of successive selections made from X. The conclusion is that the amount of uncertainty to be resolved in a situation, or the amount of complexity to be reduced in a design problem, is equivalent to the potential information involved. A gain of I bits of information represents a reduction of I bits in complexity or uncertainty.
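As a concrete illustration, the short Python sketch below (the function name and the numeric values are illustrative assumptions, not from the text) evaluates Hartley's measure I = log₂(n^s) = s log₂ n for s successive selections from an alphabet of n alternatives:

from math import log2

def hartley_information(n: int, s: int = 1) -> float:
    """Hartley information (bits) of s successive selections from an
    alphabet of n equally likely alternatives: I = log2(n**s)."""
    if n < 1 or s < 1:
        raise ValueError("alphabet size and selection count must be >= 1")
    # log2(n**s) = s * log2(n); the product form avoids huge intermediates.
    return s * log2(n)

# Selecting one element from a set of 8 alternatives (say, 8 candidate DPs)
# eliminates log2(8) = 3 bits of ambiguity.
print(hartley_information(8))      # 3.0
# A sequence of s = 4 such selections resolves 4 * 3 = 12 bits.
print(hartley_information(8, 4))   # 12.0

Because log₂(n^s) = s log₂ n, the information obtained grows linearly with the length of the selection sequence.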
Definition 8.1. A source of information is an ordered pair (X, P), where X = {x₁, x₂, …, xₙ} is a finite set, known as a source alphabet, and P is a probability distribution on X. We denote the probability of xᵢ by pᵢ.
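Definition 8.1 can be made concrete in a few lines of code. The minimal sketch below (the alphabet of tolerance intervals and the probability values are illustrative assumptions, not from the text) pairs a finite alphabet with a probability distribution and evaluates Shannon's entropy, the probabilistic measure mentioned above:

from math import log2

# A source of information (X, P): a finite source alphabet together
# with a probability distribution over it (Definition 8.1).
X = ["tight", "nominal", "loose"]   # e.g., tolerance intervals of a DP
P = [0.25, 0.50, 0.25]              # p_i for each x_i; must sum to 1

assert abs(sum(P) - 1.0) < 1e-9

# Shannon's entropy of the source: the expected information gained
# (equivalently, uncertainty removed) by one sampling of X.
H = -sum(p * log2(p) for p in P if p > 0)
print(f"H(X) = {H:.3f} bits")       # 1.500 bits for this distribution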
The elements of set X provide specific representations in a certain
context. For example, it may represent the set of all possible tolerance
intervals of a set of DPs. The association of set X with probabilities
suggests the consideration of a random variable as a source of infor-
mation. It conveys information about the variability of its behavior
around some central tendency. Suppose that we select at random an
arbitrary element of X, say xᵢ, with probability pᵢ. Before the sampling
occurs, there is a certain amount of uncertainty associated with the
outcome. However, an equivalent amount of information is gained
about the source after sampling, and therefore uncertainty and infor-
mation are related. If X {x 1 }, then there is no uncertainty and no
information gained. At the other extreme, maximum uncertainty