Newton's method (or the Newton-Raphson method) is a powerful root-finding algorithm that exploits both the value of a function and its first derivative to rapidly refine approximations to its roots. Unlike bracketing methods that work by enclosing a root between two points, Newton's method is an open method, requiring only a single initial guess. When it does converge, it does so very quickly, often achieving quadratic convergence. However, the method offers no guarantees of convergence from any arbitrary starting point, and it requires that you can compute (or approximate) the derivative of the function.
Physically and geometrically, Newton's method can be viewed as follows: starting from an initial guess, we repeatedly slide down tangent lines of the curve until we land on the $x$-axis.

Conceptual Illustration:

Imagine the curve $y = f(x)$ crossing the $x$-axis at an unknown root $x^*$.

At a current guess $x_n$, draw the tangent line to the curve at the point $(x_n, f(x_n))$; the point where this tangent crosses the $x$-axis becomes the next approximation $x_{n+1}$.

Let $f$ be a differentiable function with a root $x^*$, i.e. $f(x^*) = 0$.

Newton's method starts with an initial guess $x_0$ and refines it step by step, each tangent line typically landing closer to the root than the previous guess.
Geometric Derivation:

Consider the first-order Taylor expansion of $f$ about the current iterate $x_n$:

$$f(x) \approx f(x_n) + f'(x_n)(x - x_n).$$

If $x_{n+1}$ is the point where this linear approximation equals zero, then solving $f(x_n) + f'(x_n)(x_{n+1} - x_n) = 0$ for $x_{n+1}$ yields

$$x_{n+1} = x_n - \frac{f(x_n)}{f'(x_n)}.$$

This gives the iterative scheme for Newton's method.
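The tangent-line update derived above can be sketched as a tiny Python helper (an illustrative sketch; `newton_step` is a hypothetical name, not a library function):

```python
def newton_step(f, df, x):
    """One Newton step: the x-intercept of the tangent line at x."""
    return x - f(x) / df(x)

# One tangent-line step for f(x) = x^2 - 4 (so f'(x) = 2x), starting at x = 3:
x1 = newton_step(lambda x: x**2 - 4, lambda x: 2 * x, 3.0)
print(x1)  # 3 - 5/6, i.e. approximately 2.1667
```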
I. Assumption of Differentiability:

We assume $f$ is continuously differentiable in a neighborhood of the root, so that the tangent line at each iterate is well defined.

II. Local Linear Approximation:

Near a guess $x_n$, approximate the function by its tangent line:

$$f(x) \approx f(x_n) + f'(x_n)(x - x_n).$$

III. Solving for the Root of the Linear Approximation:

Set the linear approximation to zero:

$$f(x_n) + f'(x_n)(x - x_n) = 0.$$

Hence:

$$x = x_n - \frac{f(x_n)}{f'(x_n)}.$$

IV. Iteration:

Define:

$$x_{n+1} = x_n - \frac{f(x_n)}{f'(x_n)}.$$

As $n \to \infty$, if the initial guess is close enough to the root and $f'(x^*) \neq 0$, the sequence $(x_n)$ converges to $x^*$, typically at a quadratic rate.
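The quadratic-convergence claim is easy to check numerically. A sketch (the helper name `newton_iterates` is an illustrative choice; the test function $f(x) = x^2 - 4$, whose positive root is $2$, is used for concreteness):

```python
def newton_iterates(f, df, x0, n):
    """Return [x0, x1, ..., xn] produced by the Newton update."""
    xs = [x0]
    for _ in range(n):
        xs.append(xs[-1] - f(xs[-1]) / df(xs[-1]))
    return xs

# f(x) = x^2 - 4 has the root x* = 2; start from x0 = 3.
xs = newton_iterates(lambda x: x**2 - 4, lambda x: 2 * x, 3.0, 4)
errors = [abs(x - 2.0) for x in xs]
# The error is roughly squared at each step: quadratic convergence.
```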
Input:

- A differentiable function $f(x)$.
- Its derivative $f'(x)$.
- An initial guess $x_0$.
- A tolerance $\epsilon > 0$ or a maximum iteration count $n_{\max}$.
Initialization:

Set the iteration counter $n = 0$.

Iteration:

I. Evaluate $f(x_n)$ and $f'(x_n)$.

II. If $f'(x_n) = 0$, stop: the tangent line is horizontal and the update is undefined.

III. Update:

$$x_{n+1} = x_n - \frac{f(x_n)}{f'(x_n)}.$$

IV. Check Convergence:

- If $|x_{n+1} - x_n| < \epsilon$ or $|f(x_{n+1})| < \epsilon$, stop.
- If $n > n_{\max}$, stop.

V. Increment $n \leftarrow n + 1$ and return to step I.

Output:

- Approximate root $x_{n+1}$.
- Number of iterations performed.
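The steps above can be collected into a short Python routine (a minimal sketch; the name `newton_method` and the default tolerance and iteration budget are illustrative choices, not from a particular library):

```python
def newton_method(f, df, x0, eps=1e-10, n_max=50):
    """Newton's method; returns (approximate root, iterations performed)."""
    x_n = x0
    for n in range(n_max):
        fx, dfx = f(x_n), df(x_n)            # I. evaluate f and f'
        if dfx == 0.0:                       # II. horizontal tangent: update undefined
            raise ZeroDivisionError("f'(x_n) = 0: Newton step undefined")
        x_next = x_n - fx / dfx              # III. update
        # IV. convergence checks on step size and residual
        if abs(x_next - x_n) < eps or abs(f(x_next)) < eps:
            return x_next, n + 1
        x_n = x_next                         # V. increment and repeat
    return x_n, n_max                        # iteration budget exhausted

root, iters = newton_method(lambda x: x**2 - 4, lambda x: 2 * x, 3.0)
```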
Function: $f(x) = x^2 - 4$.

Known Root: $x^* = 2$ (the function also has a root at $x = -2$).

Apply Newton's Method:

- Initial guess: $x_0 = 3$.
- Derivative: $f'(x) = 2x$.

Iteration 1:

- Evaluate $f(x_0) = f(3) = 3^2 - 4 = 9 - 4 = 5$.
- Evaluate $f'(3) = 2 \cdot 3 = 6$.
- Update: $x_1 = x_0 - \frac{f(x_0)}{f'(x_0)} = 3 - \frac{5}{6} \approx 2.1667$.

Iteration 2:

- Evaluate $f(2.1667) = (2.1667)^2 - 4 \approx 4.6945 - 4 = 0.6945$.
- Evaluate $f'(2.1667) = 2 \cdot 2.1667 = 4.3333$.
- Update: $x_2 = 2.1667 - \frac{0.6945}{4.3333} \approx 2.0064$.

Iteration 3:

- Evaluate $f(2.0064) = (2.0064)^2 - 4 \approx 4.0256 - 4 = 0.0256$.
- Evaluate $f'(2.0064) = 2 \cdot 2.0064 = 4.0128$.
- Update: $x_3 = 2.0064 - \frac{0.0256}{4.0128} \approx 2.0000$.

After a few iterations, we have $x_3 \approx 2.0000$, which is already very close to the true root $x^* = 2$.
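The hand computation can be reproduced in a few lines of Python (an illustrative sketch; variable names are ad hoc). Full floating-point precision is used internally, so later digits may differ slightly from rounded hand arithmetic:

```python
f = lambda x: x**2 - 4      # the example function
df = lambda x: 2 * x        # its derivative

x = 3.0                     # initial guess x0
trace = []
for _ in range(3):
    x = x - f(x) / df(x)    # Newton update
    trace.append(round(x, 4))
print(trace)                # [2.1667, 2.0064, 2.0]
```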
- Fast convergence: when the initial guess is close to the actual root and $f'(x) \neq 0$, Newton's method is highly efficient, exhibiting quadratic convergence.
- Simplicity of implementation: each step requires only evaluations of the function and its derivative, making the method conceptually straightforward.
- Fewer iterations: compared to bracketing methods like the bisection method, far fewer iterations are typically needed to achieve a desired accuracy, assuming the method converges.
- Requirement of the derivative: $f'(x)$ must either be computable or approximated, which can be challenging or computationally expensive for certain functions.
- No guaranteed convergence: if the initial guess is far from the root, or if $f'(x)$ is small or zero near the iterates, the method may diverge.
- Wrong root or erratic behavior: when the function has multiple roots or inflection points, the iteration may converge to an unintended root or behave erratically.
- Division by zero: if $f'(x_n) = 0$ at any iteration, the update is undefined, so safeguards or modifications to the standard algorithm are required.
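Two of these limitations, needing an analytic derivative and the risk of dividing by a near-zero $f'(x_n)$, can be mitigated with standard safeguards. A sketch (the function name, the central-difference step `h`, and the threshold `dmin` are illustrative choices, not a standard API):

```python
def safeguarded_newton(f, x0, h=1e-6, eps=1e-10, dmin=1e-12, n_max=100):
    """Newton iteration with a finite-difference derivative and a
    small-derivative guard (illustrative, not production-hardened)."""
    x = x0
    for _ in range(n_max):
        dfx = (f(x + h) - f(x - h)) / (2 * h)  # central-difference estimate of f'(x)
        if abs(dfx) < dmin:                    # tangent nearly horizontal: bail out
            raise RuntimeError("derivative too small; Newton step unreliable")
        x_next = x - f(x) / dfx
        if abs(x_next - x) < eps:              # step-size convergence check
            return x_next
        x = x_next
    return x

root = safeguarded_newton(lambda x: x**2 - 4, 3.0)
```

Replacing the analytic derivative with a finite difference trades a little accuracy per step for not having to supply $f'(x)$ at all; the small-derivative guard turns a silent division-by-near-zero into an explicit error.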