Here, `f` is the function whose root should be found, and `df` is its derivative (i.e. its Jacobian). `initial` is the initial guess; the method should return the first iterate for which the norm of the function value is smaller than some fixed bound.
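Problem 1's code is not reproduced here, but a minimal two-dimensional Newton iteration with this interface might look like the following sketch. The name `newton`, the tolerance `1e-12`, and the cap of 50 iterations are illustrative choices, not part of the problem statement:

```haskell
-- A sketch of Newton's method in R^2: iterate
-- x_{k+1} = x_k - df(x_k)^{-1} f(x_k) and return the first
-- iterate whose function value has norm below the tolerance,
-- or Nothing if the iteration does not converge in time.
newton :: ([Double] -> [Double])    -- f: the map whose root we seek
       -> ([Double] -> [[Double]])  -- df: its Jacobian
       -> [Double]                  -- initial guess
       -> Maybe [Double]
newton f df = go (50 :: Int)
  where
    tol = 1e-12
    go 0 _ = Nothing                 -- give up after 50 steps
    go n x
      | norm (f x) < tol = Just x
      | otherwise        = go (n - 1) (zipWith (-) x (solve2 (df x) (f x)))
    norm = sqrt . sum . map (^ (2 :: Int))
    -- Solve the 2x2 linear system  A s = b  by Cramer's rule.
    solve2 [[a, b], [c, d]] [u, v] =
      let det = a * d - b * c
      in  [(u * d - b * v) / det, (a * v - c * u) / det]
    solve2 _ _ = error "newton: expected a 2x2 system"
```

Returning `Maybe [Double]` rather than looping forever matters here: many starting points will not converge, and the caller needs to be able to discard those runs.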

%% Cell type:markdown id: tags:

# problem 2

(This is problem 4 of the SIAM 100-digit challenge.)

1. Plot the function using the `vis.py` script from the source repository.

2. Try to find the global minimum of the function by solving the equation

$$\nabla f(x, y) = 0$$

using Newton's method from problem 1.

Since the function oscillates very quickly and therefore has multiple local extrema, you should solve the equation for a sufficiently large number of initial values. Use a grid of $10 \times 10$ points in the square $(-0.5,\, 0.5)^2 \subset \mathbb{R}^2$ and run Newton's method on each of these points.

3. It may also be useful to try to refine the set of initial points until some heuristic criterion is fulfilled. Try to formulate such a criterion.

For reference, the global minimum is -3.306868647.
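The $10 \times 10$ grid of starting points can be generated with a list comprehension. Whether the points sit at cell centres or include the endpoints of the interval is a free choice; cell centres are used below:

```haskell
-- 10x10 grid of starting points in the square (-0.5, 0.5)^2,
-- placed at the centres of a uniform 10x10 subdivision.
grid :: [[Double]]
grid = [ [x, y] | x <- pts, y <- pts ]
  where
    pts = [ -0.5 + (fromIntegral i + 0.5) / 10 | i <- [0 .. 9 :: Int] ]
```

With this placement the coordinates run from $-0.45$ to $0.45$ in steps of $0.1$, so no starting point lies on the boundary of the square.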

The function, its gradient, and its Hessian are implemented by the following Haskell functions:


```haskell
f :: Double -> Double -> Double
f x y = exp (sin (50 * x))
      + sin (60 * exp y)
      + sin (70 * sin x)
      + sin (sin (80 * y))
      - sin (10 * (x + y))
      + (x ^ 2 + y ^ 2) / 4

gradf :: Double -> Double -> [Double]
gradf x y = [dfx, dfy]
  where
    dfx = 50 * cos (50 * x) * exp (sin (50 * x))
        + 70 * cos x * cos (70 * sin x)
        - 10 * cos (10 * (x + y))
        + x / 2
    dfy = 60 * exp y * cos (60 * exp y)
        + 80 * cos (80 * y) * cos (sin (80 * y))
        - 10 * cos (10 * (x + y))
        + y / 2

hessf :: Double -> Double -> [[Double]]
hessf x y = [[hfxx, hfxy], [hfxy, hfyy]]
  where
    hfxx = 2500 * cos (50 * x) ^ 2 * exp (sin (50 * x))
         - 2500 * sin (50 * x) * exp (sin (50 * x))
         - 70 * sin x * cos (70 * sin x)
         - 4900 * cos x ^ 2 * sin (70 * sin x)
         + 100 * sin (10 * (x + y))
         + 1 / 2
    hfyy = 60 * exp y * cos (60 * exp y)
         - 3600 * exp y ^ 2 * sin (60 * exp y)
         - 6400 * sin (80 * y) * cos (sin (80 * y))
         - 6400 * cos (80 * y) ^ 2 * sin (sin (80 * y))
         + 100 * sin (10 * (x + y))
         + 1 / 2
    hfxy = 100 * sin (10 * (x + y))
```
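Putting the pieces together, a search driver might run Newton's method on the gradient (with the Hessian as its Jacobian) from every starting point of the $10 \times 10$ grid, discard the runs that fail to converge, and keep the point with the smallest function value. The sketch below assumes a `newton` from problem 1 with the signature shown in the comment; that name and signature are placeholders for whatever problem 1 actually provides:

```haskell
import Data.List (minimumBy)
import Data.Ord  (comparing)

-- Assumes f, gradf, hessf as above, and from problem 1:
-- newton :: ([Double] -> [Double]) -> ([Double] -> [[Double]])
--        -> [Double] -> Maybe [Double]
-- Takes a list of starting points (e.g. the 10x10 grid from
-- step 2) and returns the critical point with the smallest
-- value of f among all converged runs, together with that value.
globalMin :: [[Double]] -> ([Double], Double)
globalMin starts =
  minimumBy (comparing snd)
    [ (p, f x y)
    | Just p@[x, y] <- map run starts ]
  where
    run [x0, y0] = newton (\[x, y] -> gradf x y)
                          (\[x, y] -> hessf x y)
                          [x0, y0]
    run _        = Nothing
```

Note that Newton's method converges to *any* critical point, maxima and saddles included; taking the minimum of `f` over all converged runs is what singles out the (candidate) global minimum.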
