Chapter 6
LINEAR EQUATIONS-A DIRECT APPROACH
6.1. INTRODUCTION
So far we have been concerned with solving linear least-squares problems. Now
the usually simpler problem of linear equations will be considered. Note that a
program designed to solve least-squares problems will give solutions to linear
equations. The residual sum of squares must be zero if the equations are
consistent. While this is a useful way to attack sets of equations which are
suspected to possess singular coefficient matrices, since the singular-value decomposition permits such to be identified, in general the computational cost will be
too high. Therefore this chapter will examine a direct approach to solving systems
of linear equations. This is a variant of the elimination method taught to students
in secondary schools, but the advent of automatic computation has changed only
its form, showing its substance to be solid.
6.2. GAUSS ELIMINATION
Let us now look at this approach to the solution of equations (2.2). First note that
if A is upper-triangular so that A_{ij} = 0 for i > j, then it is easy to solve for x by a
back-substitution, that is
x_n = b_n / A_{nn}                                                    (6.1)
x_{n-1} = (b_{n-1} - A_{n-1,n} x_n) / A_{n-1,n-1}                     (6.2)
and generally
x_i = (b_i - Σ_{j=i+1}^{n} A_{ij} x_j) / A_{ii}    for i = (n-1), (n-2), . . . , 1    (6.3)
These equations follow directly from equations (2.2) and the supposed upper- or
right-triangular structure of A. The Gauss elimination scheme uses this idea to
find solutions to simultaneous linear equations by constructing the triangular form
Rx = f (6.4)
from the original equations.
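As an illustration only (the function and variable names, and the use of Python with NumPy, are my own choices rather than the book's), a minimal sketch of the back-substitution (6.1)-(6.3) for an upper-triangular system might read:

    import numpy as np

    def back_substitute(A, b):
        # Solve A x = b when A is upper-triangular (A[i, j] = 0 for i > j),
        # following (6.1)-(6.3); assumes the diagonal elements are non-zero.
        A = np.asarray(A, dtype=float)
        b = np.asarray(b, dtype=float)
        n = len(b)
        x = np.zeros(n)
        for i in range(n - 1, -1, -1):
            # subtract the already-computed terms A[i, j] * x[j] for j > i ...
            s = b[i] - A[i, i + 1:] @ x[i + 1:]
            # ... then divide by the diagonal element, as in (6.3)
            x[i] = s / A[i, i]
        return x

    R = np.array([[2.0, 1.0, 1.0],
                  [0.0, 3.0, 2.0],
                  [0.0, 0.0, 4.0]])
    f = np.array([7.0, 8.0, 4.0])
    print(back_substitute(R, f))   # -> [2. 2. 1.]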
Note that each of the equations (2.2), that is
Σ_{j=1}^{n} A_{ij} x_j = b_i        for i = 1, 2, . . . , n        (6.5)
can be multiplied by an arbitrary quantity without altering the validity of the
equation; if we are not to lose any information this quantity must be non-zero.
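A minimal sketch of the elimination itself, again in Python with names of my own choosing, builds the triangular form (6.4) by subtracting non-zero multiples of each pivot row from the rows beneath it; it simply assumes that every pivot encountered is non-zero:

    import numpy as np

    def gauss_eliminate(A, b):
        # Reduce A x = b to the upper-triangular system R x = f of (6.4).
        # A bare sketch without any safeguard against a zero pivot R[k, k].
        R = np.array(A, dtype=float)   # work on copies so the caller's data survive
        f = np.array(b, dtype=float)
        n = len(f)
        for k in range(n - 1):               # eliminate below the pivot in column k
            for i in range(k + 1, n):
                m = R[i, k] / R[k, k]        # multiplier chosen to annihilate R[i, k]
                R[i, k:] -= m * R[k, k:]     # subtract a multiple of the pivot row
                f[i] -= m * f[k]             # keep the right-hand side consistent
        return R, f

The solution then follows by the back-substitution of (6.1)-(6.3), for instance as back_substitute(*gauss_eliminate(A, b)) with the earlier sketch.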