Fall 2004 Final CC-BY-NC


View the exam on docuum

Under construction.

1 Question 1

Newton's method

1.1 Solution

1.1.1 Part (a)

Use a Taylor expansion to derive Newton's method.

Expanding $f$ about $x_n$ and ignoring the error term, the Taylor expansion is

$$f(x) \approx f(x_n) + (x - x_n)f'(x_n)$$

We want the next iterate $x_{n+1}$ to be (approximately) a root of $f$, so we set the right-hand side to zero at $x = x_{n+1}$:

$$0 = f(x_n) + (x_{n+1} - x_n)f'(x_n)$$

Solving for $x_{n+1}$, we get:

$$x_{n+1} = x_n - \frac{f(x_n)}{f'(x_n)}$$

which is indeed Newton's method.

1.1.2 Part (b)

Write Newton's method as a fixed-point iteration

We need to find a $g$ such that we can write

$$x_{n+1} = g(x_n)$$

This is kind of a freebie ...

$$g(x) = x - \frac{f(x)}{f'(x)}$$

1.1.3 Part (c)

Show that $g'(x^*) = 0$ provided $f'(x^*) \neq 0$.

Take the derivative of $g$:

$$g'(x) = \frac{f(x)f''(x)}{(f'(x))^2}$$

As long as $f'(x^*) \neq 0$, the denominator is non-zero, so the expression is defined at $x^*$. Since $x^*$ is a root of $f$, we have $f(x^*) = 0$, so the numerator vanishes and $g'(x^*) = 0$.

This tells us that the fixed-point iterations will converge quadratically provided our initial guess $x_0$ is close enough to $x^*$ (where "close enough" means $x_0$ is in some $\delta$-neighbourhood of $x^*$).
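As a quick sanity check of this claim (not part of the exam), here's a minimal Python sketch on the hypothetical example $f(x) = x^2 - 2$, whose root $\sqrt{2}$ is simple so that $f'(x^*) \neq 0$ applies; the error roughly squares at every step, as expected for quadratic convergence.

```python
# Hypothetical example (not from the exam): f(x) = x^2 - 2, simple root x* = sqrt(2).
f = lambda x: x**2 - 2
df = lambda x: 2 * x

x, root = 1.5, 2 ** 0.5
for n in range(1, 5):
    x = x - f(x) / df(x)      # one Newton step, i.e. x = g(x)
    print(n, abs(x - root))   # errors: ~2.5e-3, 2.1e-6, 1.6e-12, then machine precision
```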

1.1.4 Part (d)

Find the limit of $g'(x)$ as $x \to x^*$.

We're given that $f'(x^*) = 0$ but $f''(x^*) \neq 0$, so $x^*$ is a double root of $f$. Since $f(x^*) = 0$ and $f'(x^*) = 0$, we can write $f(x) = (x-x^*)^2i(x)$ for some function $i$ with $i(x^*) \neq 0$. Differentiating,

$$f'(x) = 2(x-x^*)i(x) + (x-x^*)^2i'(x) = (x-x^*)\bigl(2i(x) + (x-x^*)i'(x)\bigr)$$

and

$$f''(x) = 2i(x) + 4(x-x^*)i'(x) + (x-x^*)^2i''(x)$$

Now let's find the limit of $g'(x)$:

$$\lim_{x \to x^*} g'(x) = \lim_{x \to x^*} \frac{f(x)f''(x)}{(f'(x))^2} = \lim_{x \to x^*} \frac{(x-x^*)^2i(x)\bigl[2i(x) + 4(x-x^*)i'(x) + (x-x^*)^2i''(x)\bigr]}{(x-x^*)^2\bigl[2i(x) + (x-x^*)i'(x)\bigr]^2} = \frac{2i(x^*)^2}{4i(x^*)^2} = \frac{1}{2}$$

This tells us that the fixed-point iteration will converge linearly to $x^*$ given $x_0$ is close enough to $x^*$.

1.1.5 Part (e) (i)

Use Newton's method to compute $x_5$ for the root of some function.

$f(x) = x^3 - 9x^2 + 15x + 25$. Since we know that $x=5$ is one of the roots, we can rewrite this as $f(x) = (x-5)(x^2-4x-5) = (x-5)(x-5)(x+1)$. Also, $f'(x) = 3x^2 - 18x + 15$. Let $x_0 = 3$ be our initial approximation. We can calculate subsequent approximations using the formula from part (a):

$$\begin{align} x_1 & = x_0 - \frac{f(x_0)}{f'(x_0)} = 3 - \frac{f(3)}{f'(3)} = 3 - \frac{16}{-12} = \frac{13}{3} \\ x_2 & = x_1 - \frac{f(x_1)}{f'(x_1)} = \frac{211}{45} \\ x_3 & \approx 4.84881749219099 \tag{got lazy} \\ x_4 & \approx 4.9253984938432955 \\ x_5 & \approx 4.9629355449998265 \end{align}$$
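These values are easy to reproduce; here's a minimal Python sketch (not part of the exam solution) that carries out the five Newton steps from $x_0 = 3$:

```python
# Newton's method for f(x) = x^3 - 9x^2 + 15x + 25, starting from x_0 = 3.
f = lambda x: x**3 - 9 * x**2 + 15 * x + 25
df = lambda x: 3 * x**2 - 18 * x + 15

x = 3.0
for n in range(1, 6):
    x = x - f(x) / df(x)
    print(f"x_{n} = {x}")
# x_1 = 4.333..., x_2 = 4.688..., ..., x_5 = 4.96293... (matching the values above)
```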

1.1.6 Part (e) (ii)

What's the apparent rate of convergence?

To find the apparent rate of convergence, we look at the following:

$$\lim_{k \to \infty} \frac{|x_{k+1} - 5|}{|x_{k} - 5|}$$

The ratios seem to settle around 0.5, so the order of convergence is 1 (linear). This is consistent with part (d): $x = 5$ is a double root of $f$, so $f'(5) = 0$ while $f''(5) = 12 \neq 0$.
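A quick numerical check of this (a sketch, reusing the iterates from part (e) (i)):

```python
# Successive error ratios |x_{k+1} - 5| / |x_k - 5| for the Newton iterates above.
f = lambda x: x**3 - 9 * x**2 + 15 * x + 25
df = lambda x: 3 * x**2 - 18 * x + 15

xs = [3.0]
for _ in range(5):
    xs.append(xs[-1] - f(xs[-1]) / df(xs[-1]))
errs = [abs(x - 5) for x in xs]
print([e1 / e0 for e0, e1 in zip(errs, errs[1:])])
# roughly [0.33, 0.47, 0.49, 0.49, 0.50] -- approaching 1/2, i.e. linear convergence
```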

1.2 Accuracy and discussion

See q6 in the instructor-provided solutions to the final review exercises, hosted on docuum. Part (e) in the exercises doesn't have the "compute $x_5$" bit, though.

2 Question 2

Fixed-point theorem & proving it. See HTSEFP for the proof.

3 Question 3

Divided differences

3.1 Solution

3.1.1 Part (a)

Define the zeroth divided differences for all $j$.

Just $f[x_j] = f(x_j)$.

3.1.2 Part (b)

Describe the interpolating polynomial in Newton form.

We are given the following formula for the interpolating polynomial in Newton form:

$$p_n(x) = \sum_{j=0}^n c_j w_j(x)$$

Our job is to define the $w$'s and $c$'s:

  • $w_0(x) = 1$
  • $\displaystyle w_j(x) = \prod_{i=0}^{j-1}(x-x_i) \quad j = 1, \ldots, n$
  • $c_j = f[x_0, x_1, \ldots, x_j] \quad j=0, 1, \ldots, n$

3.1.3 Part (c)

Assuming that $f$ is $n$-times diff'able, show there exists $\xi \in [x_0, x_n]$ such that

$$f[x_0, \ldots, x_n] = \frac{f^{(n)}(\xi)}{n!}$$

We define a new function $g$ as $g(x) = f(x) - p_n(x)$. By the definition of the interpolating polynomial, we know that $f(x_i) = p_n(x_i)$ for $i = 0, \ldots, n$, so $g$ has at least $n+1$ zeros (the nodes $x_0, \ldots, x_n$). By the generalised version of Rolle's theorem, $g^{(n)}$ then has a zero in $[x_0, x_n]$, which we'll call $\xi$. Thus there is a $\xi \in [x_0, x_n]$ such that $g^{(n)}(\xi) = f^{(n)}(\xi) - p_n^{(n)}(\xi) = 0$.

Let's look more closely at $p_n^{(n)}$. We know that the highest-order term in $p_n$ is the $x^n$ term, which has a coefficient of $f[x_0, \ldots, x_n]$ (from the Newton form in part (b)). Since all lower-order terms become 0 after $n$ differentiations, we can ignore those. Then, if we take the $n$th derivative of $p_n$, we get $p_n^{(n)}(x) = n!\,f[x_0, \ldots, x_n]$. If we substitute this into the formula for $g^{(n)}(\xi)$, we get:

$$g^{(n)}(\xi) = f^{(n)}(\xi) - n!f[x_0, \ldots, x_n] = 0$$

and if we rearrange this, we get

$$f[x_0, \ldots, x_n] = \frac{f^{(n)}(\xi)}{n!}$$

exactly as desired.

3.1.4 Part (d)

Construct the table of divided differences given some values for $x_i$ and $f(x_i)$.

| $x_i$ | $f[x_i]$ | $f[x_{i-1}, x_i]$ | $f[x_{i-2}, x_{i-1}, x_i]$ | $f[x_{i-3}, x_{i-2}, x_{i-1}, x_i]$ |
| --- | --- | --- | --- | --- |
| $x_0 = 0$ | 3 | | | |
| $x_1 = 1$ | 5 | $(5-3)/(1-0) = 2$ | | |
| $x_2 = 2$ | 9 | $(9-5)/(2-1) = 4$ | $(4-2)/(2-0) = 1$ | |
| $x_3 = 3$ | 17 | $(17-9)/(3-2) = 8$ | $(8-4)/(3-1) = 2$ | $(2-1)/(3-0) = 1/3$ |

State the polynomial of degree 3 which interpolates at $x_0, x_1, x_2, x_3$.

Take the coefficients from the diagonal of the table:

$$p_3(x) = 3 + 2(x-0) + 1(x-0)(x-1) + 1/3(x-0)(x-1)(x-2) = 3+2x + x(x-1) + 1/3x(x-1)(x-2)$$

State the polynomial of degree 2 which interpolates at $x_1, x_2, x_3$.

Take the coefficients from one level below the diagonal:

$$p_2(x) = 5 + 4(x-1) + 2(x-1)(x-2)$$
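Both the table and the two Newton-form polynomials are easy to verify with a short script; here's a minimal Python sketch (not part of the exam) using the data from part (d):

```python
# Divided-difference table and Newton-form evaluation for the data in part (d).
xs = [0.0, 1.0, 2.0, 3.0]
ys = [3.0, 5.0, 9.0, 17.0]

# table[k][j] = f[x_j, ..., x_{j+k}]; the first entries table[k][0] are the
# Newton coefficients c_k (the "diagonal" of the table above).
table = [ys[:]]
for k in range(1, len(xs)):
    prev = table[-1]
    table.append([(prev[j + 1] - prev[j]) / (xs[j + k] - xs[j])
                  for j in range(len(prev) - 1)])
coeffs = [table[k][0] for k in range(len(xs))]
print(coeffs)  # [3.0, 2.0, 1.0, 0.333...]

def newton_eval(x, nodes, cs):
    """Evaluate c_0 + c_1 (x - x_0) + c_2 (x - x_0)(x - x_1) + ..."""
    total, w = 0.0, 1.0
    for c, xi in zip(cs, nodes):
        total += c * w
        w *= x - xi
    return total

print([newton_eval(x, xs, coeffs) for x in xs])  # reproduces [3, 5, 9, 17]
```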

3.2 Accuracy and discussion

Same as question 2 for the Fall 2008 final. Solutions here. Note that part (d) has different numbers (but the method is identical).

4 Question 4

Forward difference approximations

4.1 Solution

4.1.1 Part (a)

Use the fwd diff formula with $h=0.1$ and $h=0.2$ to get 2 approximations for $f'(0)$.

Free points basically ...

$$\begin{align} h = 0.1: \quad & f'(0) \approx \frac{f(0+0.1) - f(0)}{0.1} = \frac{f(0.1) - f(0)}{0.1} = \frac{0.090483}{0.1} = 0.90483 \\ h = 0.2: \quad & f'(0) \approx \frac{f(0 + 0.2) - f(0)}{0.2} = \frac{f(0.2) - f(0)}{0.2} = \frac{0.163746}{0.2} = 0.81873 \end{align}$$

4.1.2 Part (b)

Using Taylor series or otherwise, find the error term.

Using Taylor series:

$$f(x_0 + h) = f(x_0) + hf'(x_0) + \frac{h^2}{2} f''(x_0) + \frac{h^3}{6} f'''(x_0) + \ldots$$

Rearranging to solve for $f'(x_0)$, we have:

$$f'(x_0) = \frac{f(x_0 + h) - f(x_0)}{h} - \underbrace{\left ( \frac{h}{2} f''(x_0) + \frac{h^2}{6} f'''(x_0) + \ldots \right )}_{\text{error terms}}$$

4.1.3 Part (c)

Use Richardson extrapolation to get a better approximation

Take $h = 0.2$ and $h/2 = 0.1$, and let $D(h)$ denote the forward-difference approximation with step $h$. Since the leading error term of the forward difference is $O(h)$, Richardson extrapolation gives

$$f'(0) \approx 2D(h/2) - D(h) = 2(0.90483) - 0.81873 = 0.99093$$

4.1.4 Part (d)

State a finite-difference formula for approximating $f''(x_0)$, and use it to approximate $f''(0.1)$.

$$f''(x) \approx \frac{f(x+h) - 2f(x) + f(x-h)}{h^2}$$

Let $h=0.1$. Then:

$$f''(0.1) \approx \frac{f(0.2) - 2f(0.1) + f(0)}{0.01} = \frac{\bigl(f(0.2) - f(0)\bigr) - 2\bigl(f(0.1) - f(0)\bigr)}{0.01} = \frac{0.163746 - 2(0.090483)}{0.01} = -1.722$$
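All three computations use only the two differences quoted in part (a), so they're easy to check; a minimal sketch (not part of the exam solution):

```python
# Differences quoted in part (a): f(0.1) - f(0) and f(0.2) - f(0).
d1, d2 = 0.090483, 0.163746

print(d1 / 0.1, d2 / 0.2)          # 0.90483, 0.81873 -- forward differences, part (a)
print(2 * (d1 / 0.1) - d2 / 0.2)   # 0.99093          -- Richardson extrapolation, part (c)
print((d2 - 2 * d1) / 0.1**2)      # -1.722           -- approximation to f''(0.1), part (d)
```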

4.2 Accuracy and discussion

Similar to q3 in the exercises

5 Question 5

Quadrature

5.1 Solution

5.1.1 Part (a)

Define degree of accuracy

see htsefp

5.1.2 Part (b)

Find deg of acc

see htsefp

5.1.3 Part (c)

Find $k$

see htsefp

5.1.4 Part (d)

Approximate $\ln(2)$

see htsefp

5.2 Accuracy and discussion

Similar to q4 in the exercises (slightly different though). Also similar to q4 for fall 2008.

6 Question 6

Simpson's rule

6.1 Solution

6.1.1 Part (a)

Derive composite rule

see htsefp

6.1.2 Part (b)

Prove error formula for composite rule

see htsefp

6.1.3 Part (c)

Show that $I(f) \leq I_h(f)$.

This whole question is so ridiculously easy. I can't believe I was actually worried for this exam[1]. To show that $I(f) \leq I_h(f)$, we can just use the formula from part (b) for $I(f) - I_h(f)$: as long as this difference is non-positive, $I(f) \leq I_h(f)$. Now, $b-a$ is obviously positive and $h^4$ is also obviously positive, so with the minus sign in front of the error formula, the difference is non-positive as long as $f^{(4)}$ is non-negative on the interval. Let's take a look at the fourth derivative:

$$f'(x) = 3x^2 + \pi\cos(\pi x) \quad f''(x) = 6x - \pi^2\sin(\pi x) \quad f'''(x) = 6 - \pi^3\cos(\pi x) \quad f^{(4)}(x) = \pi^4\sin(\pi x)$$

Consider the behaviour of $\sin(\pi x)$ over the interval $[0, 1]$ (equivalently, $\sin(x)$ over $[0, \pi]$): it is never negative at any point in this interval, so $f^{(4)}(x) = \pi^4\sin(\pi x) \geq 0$ on $[0, 1]$, and hence $I(f) \leq I_h(f)$. $\blacksquare$

Find an upper bound for the error when $h = 1/10$.

When $h=1/10$, since $0 \leq f^{(4)}(\xi) = \pi^4\sin(\pi\xi) \leq \pi^4$ for $\xi \in [0, 1]$ (the max of $\sin$ on the interval is obviously 1), the error is bounded by

$$|I(f) - I_h(f)| = \frac{b-a}{180}\,h^4\,\left|f^{(4)}(\xi)\right| \leq \frac{1-0}{180} \cdot \frac{1}{10^4} \cdot \pi^4 = \frac{\pi^4}{1\,800\,000} \approx 5.412 \times 10^{-5}$$

What value of $h$ is required to ensure that $|I(f) - I_h(f)| \leq 10^{-4}$?

We need $\frac{b-a}{180}\,h^4\,\pi^4 \leq 10^{-4}$, i.e.

$$h \leq \sqrt[4]{\frac{10^{-4} \cdot 180}{\pi^4}} \approx 0.11659$$
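Judging from the derivatives in part (c), the integrand appears to be $f(x) = x^3 + \sin(\pi x)$ (possibly up to an additive constant, which wouldn't affect the error). Assuming that, here's a minimal Python sketch (not part of the exam) checking both that $I_h(f) \geq I(f)$ and that the $h = 1/10$ error sits under the bound above:

```python
# Composite Simpson's rule on [0, 1] with h = 1/10 for the (inferred) integrand
# f(x) = x^3 + sin(pi x); the exact integral is 1/4 + 2/pi.
import math

f = lambda x: x**3 + math.sin(math.pi * x)

a, b, n = 0.0, 1.0, 10                  # n subintervals, so h = (b - a)/n = 1/10
h = (b - a) / n
xs = [a + i * h for i in range(n + 1)]
I_h = (h / 3) * (f(xs[0]) + f(xs[-1])
                 + 4 * sum(f(x) for x in xs[1:-1:2])    # odd-indexed nodes
                 + 2 * sum(f(x) for x in xs[2:-1:2]))   # even interior nodes
I_exact = 0.25 + 2 / math.pi

print(I_h - I_exact)   # positive (so I_h >= I) and below the ~5.4e-5 bound
```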

6.2 Accuracy and discussion

Identical to q5 for fall 2008.

7 Question 7

Runge-Kutta

7.1 Solution

7.1.1 Part (a)

Prove something

same as exercises

7.1.2 Part (b)

Prove something else

ditto

7.1.3 Part (c)

Conditions on $h$, etc

same

7.1.4 Part (d)

Define local truncation error, etc

Slightly different from the one in the exercises. Define local truncation error as

$$\tau_{i+1} = y(t_{i+1}) - w_{i+1}$$

and use a Taylor expansion on $y(t_{i+1}) = y(t_i + h)$. Recall that $y' = \lambda y$, $y'' = \lambda^2 y$, etc., and substitute those into the expansion. Should work out.
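The method itself isn't reproduced in these notes, so here's only a hedged sketch of the mechanics: suppose (for illustration) a second-order method whose update on $y' = \lambda y$ collapses to $w_{i+1} = w_i\left(1 + h\lambda + \tfrac{h^2\lambda^2}{2}\right)$, and take $w_i = y(t_i)$. Then

$$\tau_{i+1} = y(t_{i+1}) - w_{i+1} = y(t_i)\left(1 + h\lambda + \frac{h^2\lambda^2}{2} + \frac{h^3\lambda^3}{6} + \cdots\right) - y(t_i)\left(1 + h\lambda + \frac{h^2\lambda^2}{2}\right) = \frac{h^3\lambda^3}{6}\,y(t_i) + O(h^4)$$

using $y^{(k)} = \lambda^k y$ in the Taylor expansion of $y(t_i + h)$, so the local truncation error is $O(h^3)$ for this (assumed) method.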

7.2 Accuracy and discussion

Similar to q7 for the exercises and q6 for fall 2008.

8 Question 8

Linear shooting method. Skipped because that's not covered in fall 2013.

  1. This sentiment may change in 12 hours' time.