I suddenly had a desire to calculate square roots. It’s been one of those things that just sits at the back of my mind lingering. I knew roughly that an iterative method is probably used, but I finally decided to actually write the code. In this article I do a quick introduction to Newton’s method then show how it is used to find a square root.

### Newton’s method

Newton’s method is an algorithm to find solutions, the roots, of a continuous function. It works by making a guess at the answer and then iteratively refining that guess. In symbol form we’re looking for:

`f(x) = 0`

The method is quite simple. Given our initial guess `x_0` we can calculate a new value on each iteration:

`x_{n+1} = x_n - f(x_n) / f'(x_n)`

The formula isn’t hard to understand. `f(x_n)` is the result of our current guess; our goal is to have this number be equal to zero. To do this we divide it by the slope of the curve at this point: the derivative `f'(x_n)`, the rate of change. `x` is the value we’re looking for.

Now we just pretend we have a straight line formula for the moment (the reason why I used the word “slope”). If we did, then the result of the division would be the exact difference between our guess and where the zero actually is. Of course, we don’t have an actual straight line, so instead we get a new estimate. We take this new value and repeat the process. We keep repeating until we have the accuracy we desire.
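To make the tangent-line intuition concrete, here is a single iteration worked in Python. The function and starting guess are my own illustrative choices, not from the article:

```python
# One Newton step on f(x) = x^2 - 2, whose positive root is sqrt(2) ~= 1.41421
f = lambda x: x * x - 2
fd = lambda x: 2 * x       # the derivative: the slope at x

x = 1.5                    # initial guess
step = f(x) / fd(x)        # 0.25 / 3.0: how far off the tangent line says we are
x = x - step               # ~1.41667, already much closer to the root
```

One step already cuts the error by more than an order of magnitude; each further step roughly squares the accuracy.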

I should note that this method does not work with all functions. The function needs to be well behaved and getting an answer often requires a very good initial guess. If the conditions are not met it may never converge on the answer, or it might get stuck on the wrong answer.
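A classic illustration of this failure (my example, not the author’s) is the real cube root: Newton’s update for `f(x) = x^(1/3)` works out to `x - 3x = -2x`, so each step doubles the distance from the root at zero, no matter how good the initial guess is:

```python
# Newton's method on the real cube root diverges from any nonzero guess:
# the update simplifies to x - 3x = -2x, so the iterates grow without bound.
def f(x):
    return (abs(x) ** (1 / 3)) * (1 if x >= 0 else -1)  # real cube root

def fd(x):
    return (1 / 3) * abs(x) ** (-2 / 3)  # derivative of the cube root

x = 0.01  # a very good guess: the root is at 0
for _ in range(8):
    x = x - f(x) / fd(x)
# |x| has grown by roughly 2^8 = 256 instead of converging
```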

#### Floating point accuracy

The code below is a generic implementation of Newton’s method in Leaf. The loop limits itself to a maximum of 10 iterations. I also put in a check to see whether we can stop early. It works by keeping the two previous results and comparing them against the new one. If our answer hasn’t changed it’s because we’ve reached the limit of floating point accuracy (which means we’ve found the most precise answer we can with floating point).

```
defn newton = (f, fd, guess) -> {
	var x = guess
	var o1 = x
	var o2 = x

	for at in std.range(0,10) ? {
		x = x - f(x) / fd(x)

		//stop early if no longer significant (check two due to oscillations)
		x == o1 or x == o2 then break

		(o1, o2, at) = (o2, x, at +1)
	}

	return x
}
```

Why do I keep two previous values? Sometimes the answer will lie nicely between two floating point values and the slope’s direction will keep switching signs. This results in the guess alternating between two successive floating point values.
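For readers following along in another language, here is a rough Python translation of the Leaf routine above, including the two-previous-values comparison. This is my sketch, not the author’s code:

```python
def newton(f, fd, guess, max_iter=10):
    x = guess
    o1 = o2 = x
    for _ in range(max_iter):
        x = x - f(x) / fd(x)
        # stop early if no longer significant (check two due to oscillations)
        if x == o1 or x == o2:
            break
        o1, o2 = o2, x
    return x
```

With `f(x) = x*x - 2` and a guess of `1.0`, this settles on a floating point value next to `sqrt(2)` in a handful of iterations.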

If I were writing a library function it would be required to deterministically pick one of the results. It would be wrong for a proper `sqrt` function to not have a defined result, regardless of the initial guess or the method chosen to find it.

### Square Root

This method to find the square root requires a function that has the desired form. The square root `x` of `y` is defined as:

`x^2 = y`

So it’s clear that:

`x^2 - y = 0`

Which is the form we need for Newton’s method:

`f(x) = x^2 - y`

We also need the derivative, which is simple in this case:

`f'(x) = 2x`

In Leaf I defined this as below (we’ll see where `y` comes from later in the full example):

```
defn f = (x:float) -> {
	return (x*x) - y
}
defn fd = (x:float) -> {
	return 2*x
}
```

#### Initial guess

Newton’s method only works well if the initial guess is somewhat reasonable. Otherwise it can oscillate wildly and take a lot of iterations before it stabilizes, assuming it ever does; the limits of floating point can get in the way here.

So how does one estimate the value for a square root? It turns out there is a simple solution for floating point. We just need to observe the exponential identity:

`(x^a)^b = x^(a·b)`

Or more interesting to us:

`sqrt(2^e) = (2^e)^(1/2) = 2^(e/2)`

Combine this with how floating point numbers are represented:

`v = s × 2^e`

Where `s` is the significand and `e` the exponent. Then we can create this rough approximation:

`sqrt(v) ≈ s × 2^(e/2)`

We can use `s × 2^(e/2)` as our initial guess, knowing that the square of this value will be nearly the same magnitude as the original value. The significand is obviously quite wrong, but having the exponent so close lets Newton’s method stabilize with a very low number of iterations: mostly after 5, with a few 6’s in my testing.
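In Python the same estimate can be sketched with `math.frexp` and `math.ldexp`. I am assuming, as the comments below suggest, that `std.split_float` behaves like C’s `frexp`; the integer division when halving is my choice, since `ldexp` wants an integer exponent:

```python
import math

y = 1e6  # the true square root is 1000.0
sig, exp = math.frexp(y)           # y == sig * 2**exp, with 0.5 <= sig < 1
guess = math.ldexp(sig, exp // 2)  # keep the significand, halve the exponent
# guess is 976.5625: the wrong digits, but the right order of magnitude
```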

#### The Code

Combined together we get this function for the square root:

```
defn sqrt = (y:float) -> {
	defn f = (x:float) -> {
		return (x*x) - y
	}
	defn fd = (x:float) -> {
		return 2*x
	}

	//rough estimate as initial guess
	var sig : float
	var exp : integer
	( sig, exp ) = std.split_float(y)
	var guess = std.combine_float( sig, exp/2 )

	return newton( f, fd, guess )
}
```
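For readers reproducing this in another language, here is a self-contained Python sketch of the whole thing. It is my port, under the assumption (confirmed in the comments) that `split_float` behaves like C’s `frexp`:

```python
import math

def newton(f, fd, guess, max_iter=10):
    x = guess
    o1 = o2 = x
    for _ in range(max_iter):
        x = x - f(x) / fd(x)
        # stop early if no longer significant (check two due to oscillations)
        if x == o1 or x == o2:
            break
        o1, o2 = o2, x
    return x

def sqrt(y):
    f = lambda x: x * x - y
    fd = lambda x: 2 * x

    # rough estimate as initial guess: keep the significand, halve the exponent
    sig, exp = math.frexp(y)
    guess = math.ldexp(sig, exp // 2)

    return newton(f, fd, guess)
```

In my testing this lands within a couple of ulps of `math.sqrt` across a range of inputs, with the oscillation check deciding which neighboring float is returned.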

That turned out to be a lot simpler than I anticipated. Still, I’m satisfied and can put my curiosity to rest for now. Maybe I’ll come back later and do the generic exponent function.

I’m also happy that I managed to do this all in Leaf. It’s far enough now that I can start solving more such problems with it.


Fascinating stuff happens when you try to apply this method to solve complex polynomials: https://en.wikipedia.org/wiki/Newton_fractal

I’m trying to reproduce your code in a different language, but it is hard without knowing what `std.split_float` returns… Does the exponent part come out with or without the bias?

Also, it seems the increment of `at` in `(o1, o2, at) = (o2, x, at +1)` is not needed. Am I right?

`split_float` is a call to the C library function `frexp`.

You’re correct that `at` being incremented is redundant. My code uses a `loop at < 10 ?` not a `for` statement — I converted to a more common syntax for the blog and made a mistake.

Sorry, it was a dumb question. I figured it out (obviously it is unbiased…)