Modelling the Newton-Raphson Method in Python

In this tutorial, we will explore how to find roots of polynomial or transcendental equations using the Newton-Raphson method. This is an iterative numerical method that starts with an initial guess and converges to the root through successive approximations.

(Figure: the tangent to f(x) at the current guess x_g crosses the x-axis at the next approximation x_n.)

The method works by drawing a tangent line at the current guess point and finding where this tangent intersects the x-axis. This intersection becomes the next approximation.

Mathematical Formula

The Newton-Raphson formula is derived from the slope of the tangent line:

$$f'(x_g) = \frac{0 - f(x_g)}{x_n - x_g}$$

Rearranging to solve for $x_n$:

$$x_n = x_g - \frac{f(x_g)}{f'(x_g)}$$

The process continues until convergence is achieved, typically when $|x_n - x_g| < \epsilon$ for some small tolerance $\epsilon$.
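As a concrete check of the formula, take the quadratic solved below, $f(x) = x^2 + 4x + 3$, with initial guess $x_g = 10$. Then $f(10) = 143$ and $f'(10) = 24$, so one step gives:

$$x_n = 10 - \frac{f(10)}{f'(10)} = 10 - \frac{143}{24} \approx 4.04167$$

which matches the first row of the iteration table below.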

Python Implementation

Let's find the roots of the equation $x^2 + 4x + 3 = 0$ using Python:

import numpy as np
import matplotlib.pyplot as plt

# Define the function and its derivative
def f(x):
    return x**2 + 4*x + 3

def df_dx(x):
    return 2*x + 4

# Newton-Raphson implementation
def newton_raphson(initial_guess, tolerance=1e-5, max_iterations=100):
    xg = initial_guess
    iterations = []
    
    print(f"{'Iteration':^10}{'x_g':^15}{'f(x_g)':^15}{'Error':^15}")
    print("-" * 55)
    
    for i in range(max_iterations):
        # Calculate next approximation
        xn = xg - f(xg) / df_dx(xg)
        
        # Calculate error
        error = abs(xn - xg)
        
        # Store iteration data
        iterations.append({'iteration': i+1, 'xg': xg, 'f_xg': f(xg), 'error': error})
        
        print(f"{i+1:^10}{xg:^15.5f}{f(xg):^15.5f}{error:^15.5f}")
        
        # Check for convergence
        if error < tolerance:
            print("-" * 55)
            print(f"Root found: {xn:.5f}")
            return xn, iterations
        
        xg = xn
    
    print("Maximum iterations reached")
    return xn, iterations

# Find first root with initial guess = 10
root1, iterations1 = newton_raphson(10)

Output:

Iteration     x_g          f(x_g)        Error
-------------------------------------------------------
    1        10.00000     143.00000      5.95833
    2         4.04167      35.50174      2.93808
    3         1.10359       8.63228      1.39230
    4        -0.28709       1.93403      0.56456
    5        -0.85165       0.31871      0.13877
    6        -0.99042       0.01926      0.00953
    7        -0.99995       0.00009      0.00005
-------------------------------------------------------
Root found: -1.00000
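The Error column shrinks quadratically: each error is roughly the square of the previous one times a constant. For a simple root $r$, that constant is $|f''(r) / (2 f'(r))|$, which for this quadratic at $r = -1$ is $2 / (2 \cdot 2) = 0.5$. A short sketch to check this numerically (redefining f and df_dx from the listing above so the snippet is self-contained):

```python
def f(x):
    return x**2 + 4*x + 3

def df_dx(x):
    return 2*x + 4

# Collect the step sizes |x_n - x_g| from a Newton-Raphson run
xg, errors = 10.0, []
for _ in range(50):
    xn = xg - f(xg) / df_dx(xg)
    errors.append(abs(xn - xg))
    if errors[-1] < 1e-6:
        break
    xg = xn

# For quadratic convergence, error[k+1] / error[k]**2 approaches
# |f''(r) / (2 f'(r))| = 0.5 near the root r = -1
ratios = [errors[k+1] / errors[k]**2 for k in range(len(errors) - 1)]
print(f"{ratios[-1]:.3f}")  # close to 0.5
```

This is why the error drops from 0.13877 to 0.00953 to 0.00005 in the table above: roughly half the square at each step.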

Finding Multiple Roots

To find the second root, we start from a different initial guess:

# Find second root with initial guess = -10
root2, iterations2 = newton_raphson(-10)

Output:

Iteration     x_g          f(x_g)        Error
-------------------------------------------------------
    1       -10.00000      63.00000      3.93750
    2        -6.06250      15.50391      1.90817
    3        -4.15433       3.64112      0.84508
    4        -3.30925       0.71415      0.27273
    5        -3.03652       0.07438      0.03588
    6        -3.00064       0.00129      0.00064
-------------------------------------------------------
Root found: -3.00000
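As a sanity check, NumPy can compute both roots of the quadratic directly from its coefficients, which should agree with the two Newton-Raphson runs:

```python
import numpy as np

# Coefficients of x^2 + 4x + 3, highest degree first
roots = np.roots([1, 4, 3])
print(np.sort(roots))  # [-3. -1.]
```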

Visualization

Let's visualize the convergence process:

# Create visualization
x = np.linspace(-6, 2, 1000)
y = f(x)

plt.figure(figsize=(10, 6))
plt.subplot(1, 2, 1)
plt.plot(x, y, 'b-', linewidth=2, label='f(x) = x² + 4x + 3')
plt.axhline(y=0, color='k', linestyle='--', alpha=0.7)
plt.axvline(x=-1, color='r', linestyle=':', alpha=0.7, label='Root 1: x = -1')
plt.axvline(x=-3, color='g', linestyle=':', alpha=0.7, label='Root 2: x = -3')
plt.xlabel('x')
plt.ylabel('f(x)')
plt.title('Newton-Raphson Method')
plt.legend()
plt.grid(True, alpha=0.3)

# Plot convergence for both roots
plt.subplot(1, 2, 2)
errors1 = [iter_data['error'] for iter_data in iterations1]
errors2 = [iter_data['error'] for iter_data in iterations2]

plt.semilogy(range(1, len(errors1)+1), errors1, 'ro-', label='Initial guess = 10')
plt.semilogy(range(1, len(errors2)+1), errors2, 'go-', label='Initial guess = -10')
plt.xlabel('Iteration')
plt.ylabel('Error (log scale)')
plt.title('Convergence Rate')
plt.legend()
plt.grid(True, alpha=0.3)

plt.tight_layout()
plt.show()

Complete Implementation

Here's the complete Newton-Raphson solver:

import numpy as np

class NewtonRaphson:
    def __init__(self, func, derivative, tolerance=1e-5, max_iterations=100):
        self.func = func
        self.derivative = derivative
        self.tolerance = tolerance
        self.max_iterations = max_iterations
    
    def solve(self, initial_guess):
        """Find root using Newton-Raphson method"""
        x = initial_guess
        
        for iteration in range(self.max_iterations):
            fx = self.func(x)
            dfx = self.derivative(x)
            
            if abs(dfx) < 1e-10:
                raise ValueError("Derivative too small - cannot continue")
            
            x_new = x - fx / dfx
            error = abs(x_new - x)
            
            if error < self.tolerance:
                return x_new, iteration + 1
            
            x = x_new
        
        raise ValueError("Maximum iterations reached without convergence")

# Example: solve x³ - 2x - 5 = 0
def cubic_func(x):
    return x**3 - 2*x - 5

def cubic_derivative(x):
    return 3*x**2 - 2

solver = NewtonRaphson(cubic_func, cubic_derivative)
root, iterations = solver.solve(2.0)

print(f"Root: {root:.6f}")
print(f"Converged in {iterations} iterations")
print(f"Verification: f({root:.6f}) = {cubic_func(root):.2e}")

Output:

Root: 2.094552
Converged in 4 iterations
Verification: f(2.094552) = -7.11e-15
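The derivative guard in solve() is not hypothetical: for the quadratic from earlier, f'(x) = 2x + 4 vanishes at the vertex x = -2, where the tangent is horizontal and the Newton step is undefined. A minimal self-contained sketch of the same guard tripping:

```python
def f(x):
    return x**2 + 4*x + 3

def df_dx(x):
    return 2*x + 4

def solve(x0, tolerance=1e-5, max_iterations=100):
    x = x0
    for _ in range(max_iterations):
        dfx = df_dx(x)
        if abs(dfx) < 1e-10:
            raise ValueError("Derivative too small - cannot continue")
        x_new = x - f(x) / dfx
        if abs(x_new - x) < tolerance:
            return x_new
        x = x_new
    raise ValueError("Maximum iterations reached without convergence")

try:
    solve(-2.0)  # starts exactly at the vertex, where f'(x) = 0
except ValueError as exc:
    print(exc)  # Derivative too small - cannot continue
```

Any other starting point (such as 10 or -10, as above) avoids the vertex on the first step and converges normally.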

Key Properties

| Property | Description | Advantage/Limitation |
| --- | --- | --- |
| Convergence rate | Quadratic (very fast) | Faster than the bisection method |
| Initial guess | Must be close to the root | May diverge with a poor guess |
| Derivative required | Needs f'(x) | Not suitable when the derivative is hard to obtain |
| Multiple roots | Finds one root per run | Needs a different initial guess for each root |
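When an analytic derivative is hard to write down (the third limitation above), f'(x) can be approximated with a central difference, trading some accuracy for convenience. A sketch of this variant; the step size h = 1e-6 is an illustrative choice, not a tuned value:

```python
def newton_fd(func, x0, h=1e-6, tolerance=1e-10, max_iterations=100):
    """Newton-Raphson with a central-difference derivative estimate."""
    x = x0
    for _ in range(max_iterations):
        # Approximate f'(x) with (f(x + h) - f(x - h)) / (2h)
        dfx = (func(x + h) - func(x - h)) / (2 * h)
        if abs(dfx) < 1e-12:
            raise ValueError("Derivative estimate too small - cannot continue")
        x_new = x - func(x) / dfx
        if abs(x_new - x) < tolerance:
            return x_new
        x = x_new
    raise ValueError("Maximum iterations reached without convergence")

# Same cubic as above: x^3 - 2x - 5 = 0
root = newton_fd(lambda x: x**3 - 2*x - 5, 2.0)
print(f"{root:.6f}")  # 2.094551
```

This recovers the same root as the analytic-derivative class, at the cost of two extra function evaluations per iteration.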

Conclusion

The Newton-Raphson method is a powerful numerical technique for finding roots with quadratic convergence. It requires the function's derivative and a good initial guess, but converges much faster than other methods when these conditions are met.

Updated on: 2026-03-27T00:33:05+05:30
