
--Assignment No 3--

Numerical Computing
TO

Mr. Umer Hayat

BY

Hannan Khalid

(20014119-0026)

Submission Date: 24-June-2024
BS Computer Science (A)

Department of Computing and Information Technology

UNIVERSITY OF GUJRAT
Q1) Implement the bisection method to find a root of
f(x) = x^3 − 2x − … in the interval [1, 3].

SOLUTION:
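
A minimal Python sketch of the bisection method follows. The constant term of f is not shown in the question above, so the code assumes f(x) = x^3 - 2x - 5, which does change sign on [1, 3] (f(1) = -6, f(3) = 16); the tolerance and helper names are likewise illustrative.

# Bisection method -- a minimal sketch, not the official solution.
# Assumption: the missing constant term in f is taken as -5, i.e. f(x) = x^3 - 2x - 5.

def f(x):
    return x**3 - 2*x - 5  # assumed target function (constant term not given in the statement)

def bisection(func, a, b, tol=1e-6, max_iter=100):
    """Approximate a root of func in [a, b] by repeatedly halving the interval."""
    if func(a) * func(b) > 0:
        raise ValueError("func(a) and func(b) must have opposite signs")
    for i in range(1, max_iter + 1):
        mid = (a + b) / 2.0
        if func(mid) == 0.0 or (b - a) / 2.0 < tol:
            return mid, i
        # Keep the half-interval over which func still changes sign.
        if func(a) * func(mid) < 0:
            b = mid
        else:
            a = mid
    return (a + b) / 2.0, max_iter

root, iterations = bisection(f, 1.0, 3.0)
print(f"Bisection: root ~ {root:.6f} after {iterations} iterations")
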
Q2) Implement the Newton-Raphson method to find a
root of the same equation, starting from x0 = 2.
SOLUTION:
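
A minimal Python sketch of Newton-Raphson for the same assumed function, f(x) = x^3 - 2x - 5 with derivative f'(x) = 3x^2 - 2, starting from x0 = 2:

# Newton-Raphson method -- a minimal sketch for the assumed function above.

def f(x):
    return x**3 - 2*x - 5

def f_prime(x):
    return 3*x**2 - 2

def newton_raphson(func, dfunc, x0, tol=1e-10, max_iter=50):
    """Iterate x_{n+1} = x_n - f(x_n)/f'(x_n) until the step size falls below tol."""
    x = x0
    for i in range(1, max_iter + 1):
        dfx = dfunc(x)
        if dfx == 0:
            raise ZeroDivisionError("derivative is zero; Newton step is undefined")
        x_new = x - func(x) / dfx
        if abs(x_new - x) < tol:
            return x_new, i
        x = x_new
    return x, max_iter

root, iterations = newton_raphson(f, f_prime, 2.0)
print(f"Newton-Raphson: root ~ {root:.10f} after {iterations} iterations")
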
Q3) Compare the convergence rates of both methods and discuss their strengths and weaknesses.

ANSWER:

❖ Bisection Method

The Bisection Method is a root-finding algorithm that repeatedly divides an interval in half and
then selects the subinterval in which a root must lie. It is based on the Intermediate Value
Theorem, which states that if a continuous function changes sign over an interval, then it has a
root in that interval.

Convergence Rate:

• The convergence rate of the Bisection Method is linear.


• Given an initial interval [a, b] containing a root α, after n iterations the interval width is (b − a) / 2^n.
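
For example, shrinking an interval of width 2 (such as [1, 3]) below a tolerance of 10^-6 requires n ≥ log2(2 / 10^-6) ≈ 21 halvings, since each iteration removes only one bit of uncertainty.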

Strengths:

• Guaranteed convergence if the function is continuous and the initial interval is chosen
such that the function values at the endpoints have opposite signs.
• Simple to understand and implement.

Weaknesses:

• Slow convergence, especially compared to methods like Newton-Raphson.


• Requires a continuous function and an interval where the function changes sign.
❖ Newton-Raphson Method
The Newton-Raphson Method is an iterative root-finding algorithm that uses the function and its derivative to produce successively better approximations to a root of a real-valued function, via the update x_{n+1} = x_n − f(x_n) / f'(x_n).

Convergence Rate:

• The convergence rate of the Newton-Raphson Method is quadratic.


• If the initial guess x0 is sufficiently close to the root α, the error e_n = x_n − α satisfies e_{n+1} ≈ [f''(α) / (2 f'(α))] · e_n^2, so the number of correct digits roughly doubles with each iteration.

Strengths:

• Fast convergence when the initial guess is close to the root.


• Requires fewer iterations than the Bisection Method for well-behaved functions.

Weaknesses:

• Requires the computation of the derivative of the function.


• Convergence is not guaranteed if the initial guess is not close to the root.
• Can fail or converge to a different root if the function is not well-behaved near the root (e.g., inflection points, horizontal tangents), as illustrated by the sketch below.
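
For illustration (this example is assumed, not part of the assignment), applying Newton-Raphson to f(x) = x^(1/3) diverges from the root at x = 0, because the update collapses to x_{n+1} = -2·x_n:

# Illustrative failure case: f(x) = x^(1/3).
# The Newton step x - f(x)/f'(x) simplifies to -2*x, so iterates move away from the root.
import math

def cube_root(x):
    return math.copysign(abs(x) ** (1.0 / 3.0), x)

def cube_root_derivative(x):
    return (1.0 / 3.0) * abs(x) ** (-2.0 / 3.0)

x = 1.0  # starting guess
for n in range(1, 6):
    x = x - cube_root(x) / cube_root_derivative(x)
    print(n, x)   # prints -2.0, 4.0, -8.0, 16.0, -32.0 (up to rounding): divergence
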
Q4) Provide numerical results, convergence analysis, and
computational complexities of each method.

ANSWER:

❖ Numerical Results:
Consider finding the root of f(x) = x^2 - 2.

• Bisection Method: Starting from an interval where f changes sign, such as [0, 2] (f(0) = −2, f(2) = 2), each iteration halves the interval, so approximating the root sqrt(2) ≈ 1.414214 to within 10^-6 takes about 20 iterations.
• Newton-Raphson Method: Starting from an initial guess close to the root (e.g., x = 1), it converges in about five iterations thanks to its quadratic convergence; a short comparison script follows.
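
The following self-contained Python sketch (function names and tolerances are illustrative) reproduces this comparison; it computes the iteration counts at run time rather than hard-coding them.

# Comparison of both methods on f(x) = x^2 - 2 (positive root: sqrt(2)).
import math

def g(x):
    return x * x - 2

def g_prime(x):
    return 2 * x

def bisection_count(a, b, tol=1e-6):
    count = 0
    while (b - a) / 2 > tol:
        mid = (a + b) / 2
        if g(a) * g(mid) <= 0:
            b = mid      # root lies in the left half
        else:
            a = mid      # root lies in the right half
        count += 1
    return (a + b) / 2, count

def newton_count(x0, tol=1e-10, max_iter=50):
    x = x0
    for i in range(1, max_iter + 1):
        x_new = x - g(x) / g_prime(x)
        if abs(x_new - x) < tol:
            return x_new, i
        x = x_new
    return x, max_iter

root_b, n_b = bisection_count(0.0, 2.0)
root_n, n_n = newton_count(1.0)
print(f"Bisection      : root ~ {root_b:.8f} in {n_b} iterations")
print(f"Newton-Raphson : root ~ {root_n:.12f} in {n_n} iterations")
print(f"Reference sqrt(2) = {math.sqrt(2):.12f}")
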

❖ Convergence Analysis:

Theoretical analysis proves the convergence rates mentioned above.

• Bisection guarantees convergence based on the halving property.


• Newton-Raphson requires specific conditions for convergence, such as a non-zero derivative near the root and a sufficiently close initial guess.

❖ Computational Complexity:

• Bisection Method: Each iteration involves one function evaluation (at the midpoint) plus simple comparisons. Because each step adds roughly one bit of accuracy, reaching n correct bits takes about n iterations, i.e., O(log((b − a)/ε)) iterations for a tolerance ε.
• Newton-Raphson Method: Requires one function evaluation and one derivative evaluation per iteration. Under ideal conditions the number of correct digits roughly doubles each step, so only O(log n) iterations are needed for n bits of accuracy.
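
As a rough worked example (ignoring constant factors): 20 bisection steps shrink the bracketing interval by a factor of 2^20 ≈ 10^6, i.e., about six decimal digits, whereas five Newton steps starting from an error of about 10^-1 reduce it to roughly 10^-2, 10^-4, 10^-8, 10^-16, and then 10^-32 under quadratic convergence.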
