Last modified: July 01, 2022


Newton's Method

Newton's method (or the Newton-Raphson method) is a powerful root-finding algorithm that exploits both the value of a function and its first derivative to rapidly refine approximations to its roots. Unlike bracketing methods that work by enclosing a root between two points, Newton's method is an open method, requiring only a single initial guess. When it does converge, it does so very quickly, often achieving quadratic convergence. However, the method offers no guarantees of convergence from any arbitrary starting point, and it requires that you can compute (or approximate) the derivative of the function.

Physically and geometrically, Newton's method can be viewed as follows: starting from an initial guess $x_0$, you draw the tangent line to the curve $y = f(x)$ at the point $(x_0, f(x_0))$. The intersection of this tangent line with the x-axis provides a better approximation to the root than $x_0$, assuming certain conditions on the function and the initial guess. Repeating this procedure iteratively refines the approximation.

Conceptual Illustration:

Imagine the curve $f(x) = x^2 - 4$:


At $x_0 = 3$, the tangent line is drawn. Its intersection with the x-axis gives the next approximation $x_1$. Repeating this yields successively better approximations that converge rapidly to the exact root $x = 2$.
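This single tangent step can be checked numerically. A minimal sketch in Python (function names are illustrative):

```python
# One Newton step for f(x) = x^2 - 4, viewed as a tangent-line construction.
def f(x):
    return x**2 - 4

def df(x):
    return 2 * x

x0 = 3.0
# Tangent line at x0: y = f(x0) + f'(x0) * (x - x0).
# Setting y = 0 and solving for x gives the tangent's x-intercept,
# which is exactly the next Newton iterate.
x1 = x0 - f(x0) / df(x0)
print(x1)  # 2.1666... , already much closer to the root 2 than x0 = 3
```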

Mathematical Formulation

Let $f : \mathbb{R} \to \mathbb{R}$ be a differentiable function. We wish to solve:

$$f(x) = 0$$

Newton's method starts with an initial guess $x_0$ and generates a sequence $\{x_n\}$ defined by:

$$x_{n+1} = x_n - \frac{f(x_n)}{f'(x_n)}$$

where $f'(x)$ is the first derivative of $f(x)$.

Geometric Derivation:

Consider the first-order Taylor expansion of $f$ around $x_n$:

$$f(x) \approx f(x_n) + f'(x_n)(x - x_n)$$

If $x$ is the root we want (i.e., $f(x) = 0$), we set the above approximation to zero and solve for $x$:

$$0 = f(x_n) + f'(x_n)(x - x_n) \implies x = x_n - \frac{f(x_n)}{f'(x_n)}$$

This gives the iterative scheme for Newton’s method.

Derivation

I. Assumption of Differentiability:

We assume $f$ is at least once differentiable near the root. This allows us to use local linearization.

II. Local Linear Approximation:

Near a guess $x_n$, write:

$$f(x) \approx f(x_n) + f'(x_n)(x - x_n)$$

III. Solving for the Root of the Linear Approximation:

Set $f(x) = 0$ in the linear approximation:

$$0 \approx f(x_n) + f'(x_n)(x - x_n)$$

Hence:

$$x \approx x_n - \frac{f(x_n)}{f'(x_n)}$$

IV. Iteration:

Define:

$$x_{n+1} = x_n - \frac{f(x_n)}{f'(x_n)}$$

As $n \to \infty$, if the sequence converges, $x_n \to x^*$ where $f(x^*) = 0$. Under suitable conditions (e.g., if $f$ is well-behaved and the initial guess is sufficiently close to the actual root), Newton's method converges quadratically to the root.
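Quadratic convergence can be observed numerically: for $f(x) = x^2 - 4$ the error roughly squares at each step, and the ratio $e_{n+1}/e_n^2$ settles near $|f''(x^*) / (2 f'(x^*))| = 2/8 = 0.25$. A minimal sketch:

```python
# Numerical check of quadratic convergence for f(x) = x^2 - 4 (root x* = 2).
f = lambda x: x**2 - 4
df = lambda x: 2 * x

x, root = 3.0, 2.0
errors = [abs(x - root)]
for _ in range(4):
    x = x - f(x) / df(x)
    errors.append(abs(x - root))

# The ratio e_{n+1} / e_n^2 approaches |f''(x*) / (2 f'(x*))| = 0.25,
# the hallmark of quadratic convergence.
for e_prev, e_next in zip(errors, errors[1:]):
    print(e_next / e_prev**2)
```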

Algorithm Steps

Input:

A function $f(x)$, its derivative $f'(x)$, an initial guess $x_0$, a tolerance $\epsilon > 0$, and a maximum number of iterations $N_{\max}$.

Initialization:

Set iteration counter $n = 0$.

Iteration:

I. Evaluate $f(x_n)$ and $f'(x_n)$.

II. If $f'(x_n) = 0$, the method cannot proceed (division by zero). Stop or choose a new $x_n$.

III. Update:

$$x_{n+1} = x_n - \frac{f(x_n)}{f'(x_n)}$$

IV. Check Convergence: if $|x_{n+1} - x_n| < \epsilon$ or $|f(x_{n+1})| < \epsilon$, stop.

V. Increment $n = n + 1$ and repeat from step I, stopping if $n \geq N_{\max}$.

Output:

The final iterate $x_{n+1}$, an approximation to the root.
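The steps above can be sketched as a short Python routine; the function name, tolerance, and iteration cap are illustrative defaults, not part of the method itself:

```python
# A sketch of the algorithm steps: iterate the Newton update until successive
# iterates agree to within a tolerance, guarding against a vanishing derivative.
def newton(f, df, x0, tol=1e-10, max_iter=50):
    x = x0
    for _ in range(max_iter):
        d = df(x)
        if d == 0:                      # Step II: cannot divide by zero
            raise ZeroDivisionError(f"f'(x) vanished at x = {x}")
        x_next = x - f(x) / d           # Step III: Newton update
        if abs(x_next - x) < tol:       # Step IV: convergence check
            return x_next
        x = x_next                      # Step V: next iteration
    raise RuntimeError("no convergence within max_iter iterations")

root = newton(lambda x: x**2 - 4, lambda x: 2 * x, x0=3.0)
print(root)  # approximately 2.0
```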

Example

Function:

$f(x) = x^2 - 4$.

Known Root:

$x = 2$ satisfies $f(2) = 0$.

Apply Newton's Method:

The derivative is $f'(x) = 2x$. Take $x_0 = 3$ as the initial guess.

Iteration 1:

$$x_1 = x_0 - \frac{f(x_0)}{f'(x_0)} = 3 - \frac{5}{6} = 3 - 0.8333 = 2.1667$$

Iteration 2:

$$x_2 = 2.1667 - \frac{0.6945}{4.3333} \approx 2.1667 - 0.1602 = 2.0065$$

Iteration 3:

$$x_3 = 2.0065 - \frac{0.0260}{4.013} \approx 2.0065 - 0.00648 = 2.0000$$

After a few iterations, we have $x_3 \approx 2.0000$, which is very close to the actual root $x = 2$.
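The hand computation above can be reproduced in a few lines of Python; last-digit differences from the rounded hand arithmetic are expected:

```python
# Reproduce the worked iterates for f(x) = x^2 - 4 starting from x0 = 3.
f = lambda x: x**2 - 4
df = lambda x: 2 * x

x = 3.0
iterates = []
for n in range(1, 4):
    x = x - f(x) / df(x)
    iterates.append(x)
    print(f"x{n} = {x:.4f}")
# x1 = 2.1667
# x2 = 2.0064
# x3 = 2.0000
```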

Advantages

  1. Fast convergence makes Newton's method highly efficient when the initial guess is close to the actual root and $f'(x) \neq 0$, as it exhibits quadratic convergence.
  2. Simplicity in implementation requires only evaluations of the function and its derivative at each step, making the method conceptually straightforward.
  3. Fewer iterations are typically needed to achieve a desired accuracy compared to bracketing methods like the bisection method, assuming the method converges.

Limitations

  1. Requirement of the derivative means that $f'(x)$ must either be computable or approximated, which can be challenging or computationally expensive for certain functions.
  2. No guaranteed convergence occurs if the initial guess is far from the root, or if $f'(x)$ is small or zero near the approximation, potentially leading to divergence.
  3. Wrong root or complex behavior may arise in cases where the function has multiple roots or inflection points, causing convergence to an unintended root or erratic behavior.
  4. Division by zero is a critical issue if $f'(x_n) = 0$ at any iteration, requiring safeguards or modifications to the standard algorithm to handle such cases.
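The erratic behavior described in points 2 and 3 is easy to reproduce: for $f(x) = x^3 - 2x + 2$ with $x_0 = 0$, the iterates cycle between 0 and 1 forever and never converge. A short demonstration:

```python
# A classic failure case: Newton's method on f(x) = x^3 - 2x + 2 with x0 = 0
# falls into a 2-cycle (0 -> 1 -> 0 -> ...) instead of converging to a root.
f = lambda x: x**3 - 2 * x + 2
df = lambda x: 3 * x**2 - 2

x = 0.0
seen = []
for _ in range(6):
    x = x - f(x) / df(x)
    seen.append(x)
print(seen)  # [1.0, 0.0, 1.0, 0.0, 1.0, 0.0] -- a 2-cycle, no convergence
```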
