
Root-Finding Methods: Newton-Raphson and Secant Methods

June 1, 2025

1 Introduction
This report discusses two iterative numerical techniques used to find approximate roots of real-valued nonlinear equations of the form f(x) = 0: the Newton-Raphson method and the Secant method.

2 Newton-Raphson Method
2.1 Theory
The Newton-Raphson method utilizes the concept of tangents. Starting with an initial guess x_0, the next approximation x_{n+1} is calculated using the formula:

    x_{n+1} = x_n - f(x_n) / f'(x_n)    (1)

where:
• f(x) is the function for which we seek the root.
• f'(x) is the derivative of the function.
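
For illustration (a hand-worked step, with values rounded to four decimal places), applying one iteration to the function used in the code below, f(x) = cos(x) - x*e^x, starting from x_0 = 1:

    f(1) ≈ -2.1780,    f'(1) = -sin(1) - 2e ≈ -6.2780
    x_1 = 1 - (-2.1780) / (-6.2780) ≈ 0.6531

which already moves the estimate much closer to the root near x ≈ 0.518.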

2.2 Algorithm
1. Choose an initial guess x_0.

2. Compute f(x_n) and f'(x_n).

3. Update the guess using:

       x_{n+1} = x_n - f(x_n) / f'(x_n)    (2)

4. Repeat until the desired accuracy is achieved.

2.3 Code
import numpy as np
import matplotlib.pyplot as plt

def fun(x):
    # f(x) = cos(x) - x * e^x, the function whose root we seek
    return np.cos(x) - x * np.exp(x)

def derivative(x):
    # f'(x) = -sin(x) - e^x - x * e^x
    return -np.sin(x) - np.exp(x) - x * np.exp(x)

def NewtonRaphson(a, e):
    # a: initial guess, e: desired accuracy (tolerance on |f(x)|)
    x = a
    iterations = [0]
    errors = [abs(fun(x))]
    iter_count = 0
    for _ in range(10):
        b = x - fun(x) / derivative(x)
        error = abs(fun(b))
        iter_count += 1
        iterations.append(iter_count)
        errors.append(error)
        x = b
        if error < e:  # stop once the desired accuracy is reached
            break
    return iterations, errors

iters, errs = NewtonRaphson(a=1, e=0.0001)

plt.plot(iters, errs, marker='o', linestyle='-', color='b')
plt.xlabel('Iteration')
plt.ylabel('Error (|f(x)|)')
plt.title('Newton-Raphson Convergence')
plt.grid(True)
plt.yscale('log')
plt.show()

2.4 Results
The Newton-Raphson method converges rapidly to the root, as observed in the
plotted graph of error versus iteration. The error decreases significantly after
each iteration, indicating the method’s effectiveness.

Figure 1: Newton-Raphson method
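
As a quick sanity check on the convergence rate (a minimal sketch that reuses the NewtonRaphson function defined above; the tight tolerance 1e-12 is chosen here only so that several iterations are recorded), the ratio error_{n+1} / error_n^2 should stay roughly bounded for a quadratically convergent iteration while the errors remain well above machine precision:

_, check_errs = NewtonRaphson(a=1, e=1e-12)
for n in range(len(check_errs) - 1):
    if check_errs[n] > 0:
        # For quadratic convergence, err[n+1] is roughly C * err[n]**2
        print(f"step {n}: error = {check_errs[n]:.3e}, "
              f"ratio err[n+1]/err[n]^2 = {check_errs[n + 1] / check_errs[n]**2:.3f}")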

3 Secant Method
3.1 Theory
The Secant method is similar to the Newton-Raphson method but does not require derivatives. It uses two initial guesses, x_0 and x_1, to approximate the derivative. The formula for updating the guess is:

    x_{n+1} = x_n - f(x_n) * (x_n - x_{n-1}) / (f(x_n) - f(x_{n-1}))    (3)
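
For illustration (a hand-worked step, values rounded to four decimal places), using the same function f(x) = cos(x) - x*e^x and the starting pair used in the code below, x_0 = 1 and x_1 = 1.1:

    f(1) ≈ -2.1780,    f(1.1) ≈ -2.8510
    x_2 = 1.1 - (-2.8510) * (1.1 - 1) / (-2.8510 - (-2.1780)) ≈ 0.6764

slightly farther from the root than the single Newton-Raphson step worked above, consistent with the Secant method's slower convergence.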

3.2 Algorithm
1. Choose two initial guesses x_0 and x_1.

2. Compute f(x_0) and f(x_1).

3. Update the guess using:

       x_{n+1} = x_n - f(x_n) * (x_n - x_{n-1}) / (f(x_n) - f(x_{n-1}))    (4)

4. Repeat until the desired accuracy is achieved.

3.3 Code
import numpy as np
import matplotlib.pyplot as plt
import seaborn as sns
import pandas as pd

sns.set_theme(style="darkgrid", palette="deep", font_scale=1.2)

def fun(x):
    # f(x) = cos(x) - x * e^x, the function whose root we seek
    return np.cos(x) - x * np.exp(x)

def Secant(a, e):
    # a: first initial guess, e: desired accuracy (tolerance on |f(x)|)
    x0 = a
    x1 = a + 0.1
    iterations = [0]
    errors = [abs(fun(x0))]
    iter_count = 0

    for _ in range(10):
        fa = fun(x0)
        fb = fun(x1)

        # Avoid division by zero
        if abs(fb - fa) < 1e-10:
            print("Difference in function values too small, stopping early.")
            break

        x_new = x1 - fb * (x1 - x0) / (fb - fa)
        error = abs(fun(x_new))
        x0 = x1
        x1 = x_new

        iter_count += 1
        iterations.append(iter_count)
        errors.append(error)

        if error < e:  # stop once the desired accuracy is reached
            break

    return iterations, errors

iters, errs = Secant(a=1, e=0.0001)

data = pd.DataFrame({'Iteration': iters, 'Error': errs})

plt.figure(figsize=(10, 6))  # Larger figure for clarity

sns.lineplot(data=data, x='Iteration', y='Error', marker='o', linestyle='-',
             color='teal', linewidth=2, markersize=8, label='Convergence')
plt.yscale('log')
plt.xlabel('Iteration', fontsize=12)
plt.ylabel('Error (|f(x)|)', fontsize=12)
plt.title('Secant Method Convergence', fontsize=14, pad=10)
plt.legend()
plt.show()

3.4 Results
The Secant method also shows convergence toward the root, although the rate
of convergence may be slower than that of the Newton-Raphson method. The
graph illustrates a gradual decrease in error with each iteration, reflecting the
effectiveness of this method.

Figure 2: Secant method
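
To make the comparison in the next section concrete, the two error histories can be drawn on one set of axes. This is a minimal sketch that assumes the NewtonRaphson and Secant functions defined above are available in the same session:

import matplotlib.pyplot as plt

newton_iters, newton_errs = NewtonRaphson(a=1, e=0.0001)
secant_iters, secant_errs = Secant(a=1, e=0.0001)

plt.plot(newton_iters, newton_errs, marker='o', label='Newton-Raphson')
plt.plot(secant_iters, secant_errs, marker='s', label='Secant')
plt.yscale('log')  # errors shrink over several orders of magnitude
plt.xlabel('Iteration')
plt.ylabel('Error (|f(x)|)')
plt.title('Newton-Raphson vs Secant Convergence')
plt.legend()
plt.grid(True)
plt.show()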

4 Discussion of Graphs
The graphs of the Newton-Raphson and Secant methods provide valuable insights into their convergence behavior and effectiveness in finding roots.

4.1 Newton-Raphson Method


In the graph illustrating the Newton-Raphson method, we observe the following:
• Tangent Lines: The method uses the tangent line at the current approximation to determine the next guess. Each tangent line intersects the x-axis closer to the actual root, demonstrating rapid convergence.
• Convergence: The graph typically shows a steep descent toward the root, indicating quadratic convergence (the number of correct digits roughly doubles with each iteration), especially when the initial guess is close to the actual root.
• Failures: In cases where the derivative is zero or the function is nearly flat, the iteration may diverge or oscillate, emphasizing the method's sensitivity to the initial guess (a short numerical illustration follows this list).
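
The sensitivity noted in the last bullet can be reproduced with a classic textbook example; the function below is not the one studied in this report and is used purely for illustration:

# Newton-Raphson failure mode: for f(x) = x^3 - 2x + 2 started at x0 = 0,
# the iterates cycle between 0 and 1 instead of converging.
def g(x):
    return x**3 - 2*x + 2

def g_prime(x):
    return 3*x**2 - 2

x = 0.0
for i in range(6):
    x = x - g(x) / g_prime(x)
    print(f"iteration {i + 1}: x = {x:.4f}")  # prints 1.0, 0.0, 1.0, 0.0, ...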

4.2 Secant Method


The graph for the Secant method highlights several key aspects:

• Secant Lines: Unlike the Newton-Raphson method, the Secant method uses secant lines between two points. The slope of these lines approximates the derivative, influencing the convergence rate.
• Convergence Rate: Convergence is generally slower than that of the Newton-Raphson method; for a simple root the Secant method converges superlinearly with order (1 + sqrt(5))/2 ≈ 1.618, versus order 2 for Newton-Raphson. This appears in the graph as a more gradual approach to the root. However, the method can still be effective, particularly when derivatives are difficult or expensive to compute.
• Initial Guesses: The choice of initial guesses significantly impacts the graph's trajectory. Poor initial guesses may lead to divergence or slow convergence.

5 Conclusion
Both the Newton-Raphson and Secant methods are effective techniques for solving nonlinear equations. The choice of method depends on the specific problem and on the information available about the function, in particular whether its derivative can be evaluated easily.
