


R Square. The square of multiple R, it measures the proportion of total variation about the mean explained by the regression. For the example, R² = 0.717, which indicates that the fitted equation explains 71.7 percent of the total variation about the average satisfaction level.
Adjusted R Square. A measure of R² “adjusted for degrees of freedom,” which is necessary when there is more than one independent variable.
                                   Standard error. The standard deviation of the residuals. The residual is
                                   the difference between the observed value of y and the predicted value
                                   y’ based on the regression equation.
                                   Observations. Refer to the number of cases in the regression analysis, or n.
                                   ANOVA, or ANalysis Of Variance. A table examining the hypothesis that
                                   the variation explained by the entire regression is zero. If this is so,
                                   then the observed association could be explained by chance alone. The
                                   rows and columns are those of a standard one-factor ANOVA table.
                                   For this example, the important item is the column labeled “Significance
                                   F.”  The  value  shown,  0.00,  indicates  that  the  probability  of  getting
                                   these results due to chance alone is less than 0.01; that is, the association
                                   is probably not due to chance alone. Note that the ANOVA applies to
                                   the entire model, not to the individual variables. In other words, the
                                   ANOVA tests the hypothesis that the explanatory power of all of the
                                   independent variables combined is zero.
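These summary statistics can be reproduced with standard regression software. The short sketch below is only an illustration, assuming Python's statsmodels library and invented staff/food ratings rather than the data from the book's example; it shows where each quantity described above appears in a fitted two-variable model.

```python
# A minimal sketch, not the book's data set: the ratings below are invented
# purely to show where the summary statistics described above appear when a
# similar two-variable regression is fit with Python's statsmodels library.
import numpy as np
import statsmodels.api as sm

satisfaction = np.array([4, 5, 3, 2, 5, 4, 3, 1, 4, 5])  # hypothetical overall ratings
staff        = np.array([4, 5, 3, 2, 5, 4, 3, 2, 4, 5])  # hypothetical staff ratings
food         = np.array([3, 4, 3, 2, 5, 3, 2, 1, 4, 4])  # hypothetical food ratings

X = sm.add_constant(np.column_stack([staff, food]))      # adds the intercept term a
model = sm.OLS(satisfaction, X).fit()

print("R Square:          ", model.rsquared)             # proportion of variation explained
print("Adjusted R Square: ", model.rsquared_adj)         # adjusted for degrees of freedom
print("Standard error:    ", np.sqrt(model.mse_resid))   # std. deviation of the residuals
print("Observations:      ", int(model.nobs))            # n
print("Significance F:    ", model.f_pvalue)             # ANOVA test of the whole model
```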

                                   The next table in the output examines each of the terms in the linear
                                model separately. The intercept is as described above; it corresponds to our
                                term a in the linear equation. Our model uses two independent variables.
In our terminology, staff = b₁, food = b₂. Thus, reading from the coefficients column, the linear model is:

        Satisfaction = –1.188 + 0.902 * staff + 0.379 * food + error
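For instance, with hypothetical ratings of staff = 4 and food = 3, the fitted model predicts a satisfaction level of about –1.188 + 0.902(4) + 0.379(3) ≈ 3.56, apart from the error term.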

                                   The remaining columns test the hypotheses that each coefficient in the
                                model is actually zero.
Standard error column. Gives the standard deviation of each term; for example, the standard deviation of the intercept is 0.565.
t Stat column. The coefficient divided by its standard error; that is, it shows how many standard deviations the observed coefficient is from zero.

P-value. Shows the area in the tail of a t distribution beyond the computed t value. For most experimental work a P value less than 0.05 is accepted as an indication that the coefficient is significantly different from zero.
                                   Lower 95% and upper 95% columns. A 95 percent confidence interval on
                                   the coefficient. If the confidence interval does not include zero, we will
                                   reject the hypothesis that the coefficient is zero.
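Continuing the illustrative sketch given earlier (and again assuming the statsmodels fit stored in the hypothetical `model` object), the per-coefficient quantities in this list can be read from the same output; the numbers will differ from the book's example because the data were invented.

```python
# Per-coefficient output from the fitted model of the earlier sketch.
coef = model.params                 # intercept a, then b1 (staff) and b2 (food)
se   = model.bse                    # standard error of each coefficient
t    = model.tvalues                # t Stat: coefficient divided by its standard error
p    = model.pvalues                # P-value for the test that the coefficient is zero
ci   = model.conf_int(alpha=0.05)   # lower 95% and upper 95% confidence limits

print(np.allclose(t, coef / se))    # True: confirms the t Stat relationship
print(ci)                           # reject "coefficient = 0" if an interval excludes zero
```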






