Making point estimates by using the regression line

When you have a line that estimates y given x, you can use it to give a one-number estimate for the (average) value of y for a given value of x. This is called making a point estimate. The basic idea is to take a reasonable value of x, plug it into the equation of the regression line, and see what you get for the value of y.

In the textbook-weight example, the best-fitting line (or model) is the line y = 3.69 + 0.113x. For an average student who weighs 60 pounds, for example, a one-number point estimate of the average textbook weight is 3.69 + (0.113 * 60) = 10.47 pounds (those poor little kids!). If the average student weighs 100 pounds, the estimated average textbook weight is 3.69 + (0.113 * 100) = 14.99, or nearly 15 pounds, plus or minus something. (You find out what that something is in the following section.)
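If you’d rather let software plug in the numbers, here’s a rough sketch in Python that reproduces the calculation above. The coefficients 3.69 and 0.113 come straight from the textbook-weight example; the function name is just for illustration.

def predict_textbook_weight(student_weight, intercept=3.69, slope=0.113):
    """Point estimate of average textbook weight (in pounds) for a
    given student weight, using the line y = 3.69 + 0.113x."""
    return intercept + slope * student_weight

# Reproduces the point estimates from the example
print(round(predict_textbook_weight(60), 2))   # 10.47 pounds
print(round(predict_textbook_weight(100), 2))  # 14.99 pounds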


No Conclusion Left Behind: Tests and Confidence Intervals for Regression



After you have the slope of the best-fitting regression line for your data (see the previous sections), you need to step back and take into account the fact that sample results will vary. You shouldn’t just say, “Okay, the slope of this line is 2. I’m done!” It won’t be exactly 2 the next time. This variability is why statistics professors harp on adding a margin of error to your sample results; you want to be sure to cover yourself by adding that plus or minus.

In hypothesis testing, you don’t just compare your sample mean to the population mean and say, “Yep, they’re different alright!” You have to standardize your sample result using the standard error so that you can put your results in the proper perspective (see Chapter 3 for a review of confidence intervals and hypothesis tests).
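As a quick refresher on what that standardization looks like, here’s a rough sketch in Python. Every number in it is made up purely to show the arithmetic of (sample mean minus claimed mean) divided by standard error:

import math

# Hypothetical numbers, just to illustrate the standardization step
sample_mean = 14.2
claimed_mean = 15.0      # the value from the null hypothesis
sample_sd = 3.1
n = 35

standard_error = sample_sd / math.sqrt(n)
test_statistic = (sample_mean - claimed_mean) / standard_error
print(round(test_statistic, 2))  # about -1.53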

The same idea applies here with regression. The data were used to figure out the best-fitting line, and you know it fits well for that data. That’s not to say the best-fitting line will work perfectly for a new data set taken from the same population. So, in regression, all your results should include the standard error, to allow for the fact that sample results vary. That goes for estimating and testing the slope and y-intercept and for any predictions that you make.
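For example, a confidence interval for the slope takes the estimated slope plus or minus t* times the slope’s standard error, where t* comes from the t-distribution with n – 2 degrees of freedom. Here’s a rough sketch in Python, assuming you’ve already pulled the slope estimate, its standard error, and the sample size from your regression output; the numbers below are hypothetical:

from scipy import stats

# Hypothetical regression output
slope = 2.0        # estimated slope
se_slope = 0.45    # standard error of the slope
n = 25             # number of (x, y) pairs in the sample

# t* for a 95% confidence interval uses n - 2 degrees of freedom
t_star = stats.t.ppf(0.975, df=n - 2)
margin_of_error = t_star * se_slope

lower = slope - margin_of_error
upper = slope + margin_of_error
print(round(lower, 2), round(upper, 2))  # roughly 1.07 to 2.93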











