Numerical Analysis - 4. Root-Finding (Overview)
ROOT-FINDING
Solution to the Nonlinear Equation f(x) = 0.
Bisection Method
This is the simplest and most robust method for finding a root of an equation.
One of its main drawbacks is that we need two initial guesses a and b which
bracket the root.
Let f(a) and f(b) be the function values at the two guesses, with f(a) f(b) <= 0.
Clearly, if f(a) f(b) = 0 then a or b (or both) is already a root of f(x) = 0;
otherwise f changes sign on [a, b], so by the Intermediate Value Theorem a root
lies in the interval (a, b).
The basic algorithm for the bisection method relies on repeated application of the following step:
Let c = (a + b)/2;
if f(c) = 0 then x = c is an exact solution;
else if f(a) f(c) < 0 then the root lies in the interval (a, c);
else the root lies in the interval (c, b).
In either case the new bracket is half the width of the old one, and the step is repeated until the bracket is as small as required.
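As a concrete illustration of these steps, a minimal Python sketch of the bisection loop is given below. The function name bisect, the tolerance tol and the iteration cap are illustrative choices, not part of the slides.

def bisect(f, a, b, tol=1e-10, max_iter=100):
    # Requires a bracketing pair: f(a) and f(b) must have opposite signs.
    fa, fb = f(a), f(b)
    if fa * fb > 0:
        raise ValueError("f(a) and f(b) must have opposite signs")
    for _ in range(max_iter):
        c = (a + b) / 2.0          # midpoint of the current bracket
        fc = f(c)
        if fc == 0.0 or (b - a) / 2.0 < tol:
            return c               # exact root, or bracket small enough
        if fa * fc < 0:
            b = c                  # root lies in (a, c)
        else:
            a, fa = c, fc          # root lies in (c, b)
    return (a + b) / 2.0

For example, bisect(lambda x: x*x - 2.0, 1.0, 2.0) repeatedly halves the bracket [1, 2] and returns an approximation to sqrt(2) = 1.41421356...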
Figure 1: Graphical representation of the bisection method, showing two initial guesses
a and b bracketing the root.
False Position Method
Figure 2: Graphical representation of the false position method, showing two initial guesses
a and b bracketing the root.
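The false position method keeps a bracketing pair (a, b) just as bisection does, but it takes the new estimate at the point where the chord joining (a, f(a)) and (b, f(b)) crosses the x axis. A minimal Python sketch under that standard update follows; the name false_position and the stopping test on |f(c)| are illustrative assumptions, not part of the slides.

def false_position(f, a, b, tol=1e-12, max_iter=100):
    fa, fb = f(a), f(b)
    if fa * fb > 0:
        raise ValueError("f(a) and f(b) must have opposite signs")
    c = a
    for _ in range(max_iter):
        c = b - fb * (b - a) / (fb - fa)   # where the chord crosses the x axis
        fc = f(c)
        if abs(fc) < tol:
            return c
        if fa * fc < 0:
            b, fb = c, fc                  # root lies in (a, c): replace b
        else:
            a, fa = c, fc                  # root lies in (c, b): replace a
    return c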
Newton-Raphson Method
Consider the Taylor series expansion of f(x) about some point x = x_0:
f(x) = f(x_0) + (x - x_0) f'(x_0) + (1/2)(x - x_0)^2 f''(x_0) + O(|x - x_0|^3).
Setting the quadratic and higher-order terms to zero and solving the linear
approximation of f(x) = 0 for x gives
x_1 = x_0 - f(x_0) / f'(x_0).
Subsequent iterations are defined in a similar manner as
x_{n+1} = x_n - f(x_n) / f'(x_n).
Geometrically, x_{n+1} can be interpreted as the value of x at which a line, passing
through the point (x_n, f(x_n)) and tangent to the curve f(x) at that point,
crosses the x axis.
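A minimal Python sketch of this iteration is shown below. The name newton_raphson, the stopping tolerance and the explicit check for a vanishing derivative are illustrative assumptions rather than part of the slides.

def newton_raphson(f, dfdx, x0, tol=1e-12, max_iter=50):
    # Iterate x_{n+1} = x_n - f(x_n)/f'(x_n) from the single starting guess x0.
    x = x0
    for _ in range(max_iter):
        slope = dfdx(x)
        if slope == 0.0:
            raise ZeroDivisionError("f'(x) vanished at the current estimate")
        x_new = x - f(x) / slope
        if abs(x_new - x) < tol:
            return x_new
        x = x_new
    return x

For instance, newton_raphson(lambda x: x*x - 2.0, lambda x: 2.0*x, 1.0) reaches sqrt(2) = 1.41421356... in a handful of iterations.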
When it works, Newton-Raphson converges much more rapidly than the bisection or linear interpolation methods.
Figure 3: Graphical representation of the Newton-Raphson method with one initial guess x_0.
However, if f'(x) vanishes at an iteration point, or indeed even between the current estimate and the root, then the method will fail to converge.
Figure 4: Divergence of the Newton-Raphson method due to the presence of a turning point close to
the root.
Secant (Chord) Method
This method is essentially the same as Newton-Raphson except that the
derivative f'(x) is approximated by a finite difference based on the current and
the preceding estimate for the root, i.e.
f'(x_n) ≈ (f(x_n) - f(x_{n-1})) / (x_n - x_{n-1}),
and this is substituted into the Newton-Raphson algorithm to give
x_{n+1} = x_n - f(x_n) (x_n - x_{n-1}) / (f(x_n) - f(x_{n-1})).
This formula is identical to that for the Linear Interpolation (false position) method. The
difference is that instead of replacing one of the two estimates so that the root
is always bracketed, the oldest point is always discarded in favour of the new one.
This means it is not necessary to have two initial guesses bracketing the root,
but, on the other hand, convergence is not guaranteed.
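A minimal Python sketch of the secant iteration, assuming the update derived above, is given below; the name secant and the stopping test are illustrative, not part of the slides.

def secant(f, x0, x1, tol=1e-12, max_iter=50):
    # Two starting guesses are needed, but they need not bracket the root.
    f0, f1 = f(x0), f(x1)
    for _ in range(max_iter):
        if f1 == f0:
            raise ZeroDivisionError("f(x_n) - f(x_{n-1}) vanished; cannot continue")
        x2 = x1 - f1 * (x1 - x0) / (f1 - f0)   # Newton step with finite-difference slope
        if abs(x2 - x1) < tol:
            return x2
        x0, f0 = x1, f1                        # discard the oldest point
        x1, f1 = x2, f(x2)
    return x1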
Direct Iteration
A simple and often useful method involves rearranging and possibly transforming
the function f(x) to obtain a new function g(x), so that the equation f(x) = 0 becomes x = g(x).
The only restriction on g(x) is that solutions to g(x) = x have a one-to-one
relationship with solutions to f(x) = 0 for the roots being sought.
The iteration formula for this method is then just
x_{n+1} = g(x_n).
The method will converge only if |g'(x)| < 1 in the neighbourhood of the root. The sign of g'(x) determines whether the
convergence (or divergence) is monotonic (positive g'(x)) or oscillatory (negative g'(x)).
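A minimal Python sketch of this fixed-point iteration is given below; the name direct_iteration, the stopping test and the example rearrangement are illustrative choices, not part of the slides.

import math

def direct_iteration(g, x0, tol=1e-12, max_iter=200):
    # Repeatedly apply x_{n+1} = g(x_n); converges only if |g'(x)| < 1 near the root.
    x = x0
    for _ in range(max_iter):
        x_new = g(x)
        if abs(x_new - x) < tol:
            return x_new
        x = x_new
    return x

For example, direct_iteration(math.cos, 1.0) converges to the fixed point of g(x) = cos x at x = 0.739085..., since |g'(x)| = |sin x| < 1 in that neighbourhood.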
Figure 7: Convergence to the root using the Direct Iteration method.
Prepared by: ENGR. ALEXANDER S. CARRASCAL