
A floating point demon had attacked my rendering code. Instead of drawing a nice straight line I’d get a devil’s tail flurried with barbs. It preferred disrupting the basic tests, straight lines, leaving complex curves in peace. It was a tricky little bugger to find. The math in the code was logically sound; the demon was swimming in the details of floating point precision.

### Calculating a stroke

The goal of the code is quite easy to describe: draw a border on a polygon. In data terms the polygon is a series of points that form a path. I just need to calculate a second series of points that outlines that one. The diagram below shows the primary path and the desired border.

Given the three points `P0`, `P1`, and `P2`, I’m interested in finding the stroke corner at `C`.

The first approach was to calculate the two lines parallel to each segment and determine where they intersect. This is shown in the diagram below.

The little gray lines show the normals from that segment. They are calculated as follows.

```
directionP = unit( P1 - P0 )
normalP = [ -directionP.Y, directionP.X ]
```

This calculation will always yield the normal pointing “left” of the line (rotated 90 degrees counterclockwise). This is relevant since you need to form a continuous border along one side of the polygon.
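Translated into runnable Python (the helper names here are my own, not the article’s), the direction and left-normal calculation looks like this:

```python
import math

def unit(vx, vy):
    """Normalize a 2D vector. Assumes a non-zero length; the
    degenerate-segment case is handled later in the article."""
    length = math.hypot(vx, vy)
    return (vx / length, vy / length)

def left_normal(p0, p1):
    """Unit normal pointing 'left' of the segment p0->p1,
    i.e. the direction rotated 90 degrees counterclockwise."""
    dx, dy = unit(p1[0] - p0[0], p1[1] - p0[1])
    return (-dy, dx)

# A segment heading right (+X) has a left normal pointing up (+Y).
print(left_normal((0, 0), (5, 0)))
```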

With this normal it’s quite easy to get the parallel line.

```
paraP = P0 + normalP * strokeWidth
directionParaP = directionP
```

I do this for both lines and calculate the intersection of those lines to get `C`. It seems simple enough. What could go wrong?

### Parallel lines

What happens if the two stroke lines are parallel to each other? These don’t have a proper intersection point, yet the code is calculating an intersection nonetheless. I looked into the intersection code, and it has a fragment that looks like this:

```
delta = A1*B2 - A2*B1
return [ (B2*C1 - B1*C2)/delta, (A1*C2 - A2*C1)/delta ]
```

I’ll gloss over what those `A` and `B` are and just say that for parallel lines `delta` is `0`. Dividing by `0` in the following statement yields infinity. But if I were getting infinity I’d suspect my line wouldn’t draw at all, rather than being chaotic. Floating point precision is of course the culprit here. `delta` is rarely exactly `0`; instead it is very close to `0`. This yields non-infinite, but effectively random, results.
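To see the failure mode concretely, here is a small Python reproduction of that fragment fed two nearly parallel lines (the coefficients are my own contrived example, not from the article):

```python
# Lines in the form A*x + B*y = C. Two nearly parallel lines:
# the second slope differs from the first by a rounding-sized amount.
A1, B1, C1 = 1.0, -1.0, 0.0           # y = x
A2, B2, C2 = 1.0 + 1e-12, -1.0, 0.5   # y ~ x + 0.5, slope off by 1e-12

delta = A1 * B2 - A2 * B1
x = (B2 * C1 - B1 * C2) / delta
y = (A1 * C2 - A2 * C1) / delta
print(delta)  # on the order of 1e-12: tiny, but not exactly 0
print(x, y)   # coordinates on the order of 1e11: finite, but useless
```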

This type of code is unfortunately very common in examples. Do a search for line intersection and you’ll find many code samples that simply fail to consider the parallel case. Sometimes you get lucky and see a comparison with 0, but that’s almost as incorrect, since in practice you rarely get exact floating point values. Be very suspicious of any floating point algorithm you find.

The intersection code should detect this situation. I tend to use a zero tolerance value to make the `if` statement clear.

```
if( abs(delta) < zeroTolerance ) {
	//something
}
```

The big question is what “something” actually does. There is no correct, nor even useful, value that can be returned to the caller. Perhaps you’d like to throw an exception in this case. For now I return one of the input points, just to be numerically stable.

The caller of the intersection needs to detect the lines are parallel and do something else.
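A sketch of how this guarded version might look in Python (my own structuring; returning a success flag alongside the fallback point is an assumption, not the article’s actual API):

```python
ZERO_TOLERANCE = 1e-9  # arbitrary; pick a tolerance suited to your data

def intersect(a1, b1, c1, a2, b2, c2, fallback):
    """Intersection of A1*x + B1*y = C1 and A2*x + B2*y = C2.
    For (nearly) parallel lines there is no meaningful answer,
    so return the fallback point plus a flag the caller must check."""
    delta = a1 * b2 - a2 * b1
    if abs(delta) < ZERO_TOLERANCE:
        return fallback, False
    return ((b2 * c1 - b1 * c2) / delta,
            (a1 * c2 - a2 * c1) / delta), True
```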

### Different approach

In the typical polygons of vector graphics a lot of line segments are close enough to parallel to create numeric instability. This is the annoying part of floating point. We must always consider a range of values that are not useful.

I decided instead to take a slightly different approach. Instead of calculating the intersection of the two stroke lines, I calculate the intersection of one stroke line and the bisector of the polygon corner. I show this in the diagram below.

This should yield the same result but be more numerically stable for typical polygons, since it avoids the parallel line problem. Finding ways to avoid floating point precision issues is always preferable to explicitly dealing with them.
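The bisector intersection can be reduced to a closed form: the corner sits along the bisector of the two segment normals, at whatever distance keeps it `strokeWidth` away from the first segment. A Python sketch of that reduction (mine, not the article’s code; it assumes non-degenerate segments and no cusp):

```python
import math

def unit(x, y):
    length = math.hypot(x, y)
    return (x / length, y / length)

def miter_corner(p0, p1, p2, width):
    """Stroke corner at p1 via the corner bisector."""
    d1 = unit(p1[0] - p0[0], p1[1] - p0[1])
    d2 = unit(p2[0] - p1[0], p2[1] - p1[1])
    n1 = (-d1[1], d1[0])  # left normal of the first segment
    n2 = (-d2[1], d2[0])  # left normal of the second segment
    # Bisector of the two normals.
    b = unit(n1[0] + n2[0], n1[1] + n2[1])
    # The corner must sit at distance `width` from the first segment
    # measured along n1, so scale the bisector by width / dot(b, n1).
    # dot(b, n1) goes to zero at a cusp -- the case discussed below.
    s = width / (b[0] * n1[0] + b[1] * n1[1])
    return (p1[0] + s * b[0], p1[1] + s * b[1])

# Right-angle corner: the stroke corner sits diagonally off p1.
print(miter_corner((0, 0), (1, 0), (1, 1), 0.1))
```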

There is still one problem: cusps. If the input polygon has a very sharp corner the bisector becomes nearly parallel to the input lines. This again leads to chaotic results. It’s actually a legitimate problem with the type of miter join I’m doing. The rendering of such sharp corners is actually undefined, since the stroke can extend to infinity. For this reason a typical renderer will just limit the extent of the stroke, trimming to a bevel join instead. I don’t do this yet, instead doing a quick butt-cap on the line segments.

I detect a cusp by checking the angle between the bisector and the preceding line segment.

```
angle = Collision2D.AngleBetween( lv, bisectNormal )
if( /*cusp*/ angle < 0.1f || angle > Math.PIf - 0.1f ) {
	//use alternate corner cap
}
```

My `AngleBetween` function always returns a value in `0...π`. Both too small and too large an angle must be checked, since both indicate essentially parallel lines. Even with this check I should actually be trimming to a bevel, since an angle of `0.1` will still yield a very long corner stroke.
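The article doesn’t show `Collision2D.AngleBetween` itself; a plausible Python stand-in for it and the cusp check might look like this (hypothetical names, and note the clamp, which is itself a floating point defence):

```python
import math

def angle_between(u, v):
    """Angle between two 2D vectors, always in [0, pi].
    The cosine is clamped to [-1, 1] so that rounding on nearly
    (anti)parallel vectors can't push acos out of its domain."""
    cos_a = (u[0] * v[0] + u[1] * v[1]) / (math.hypot(*u) * math.hypot(*v))
    return math.acos(max(-1.0, min(1.0, cos_a)))

CUSP_TOLERANCE = 0.1

def is_cusp(lv, bisect_normal):
    # Both near-0 and near-pi angles mean essentially parallel lines.
    angle = angle_between(lv, bisect_normal)
    return angle < CUSP_TOLERANCE or angle > math.pi - CUSP_TOLERANCE
```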

### A last special case

One more special case came up during testing. Several polygons contain consecutive points which don’t have significant space between them. As we saw before, checking for `0` isn’t enough: any distance between them approaching zero leads to trouble. The problem is that I subtract the points to get a direction vector, and I normalize that vector. Attempting to normalize a vector of near-zero length doesn’t end well.

```
len = Vector.Length( current - last )
if( len < zeroTolerance ) {
	//use alternate corner cap
}
lv = Vector.Normalize( current - last )
//`lv` can be odd if len is close to zero
```

### Be leery of floating point algorithms

In this little bit of code I was beset with multiple floating point issues. Part of the problem was the original source material: a lot of examples from around the web don’t properly handle floating point. I can only presume there are a lot of projects with defects in them.

You must be on the lookout for any division. Anything of the form `a / b` needs a near-zero tolerance check on `b`; within that range you must do something else. There are obviously other cases where floating point precision causes trouble, but division appears to be the most common one.

In my programming journeys I’ve seen oddities and curiosities of all sorts. Follow me on Twitter to hear more such tales. I’m also available for presentations and seminars, just contact me.

**Sophistifunk** said: Couldn’t you just scale each normal by 1/2 `borderWidth`, then add them both to the intersection vertex?

**mortoray** said: This won’t result in strokes that are actually `borderWidth` thick. Consider the simple case of a 90-degree corner. The two normals are `[1,0]` and `[0,1]`. Scaling each by half the width and adding them results in `[borderWidth/2, borderWidth/2]`, but that’s clearly wrong: the thickness of both the horizontal and the vertical stroke must be `borderWidth`.

No doubt, however, I could optimize the intersection for my special case and come up with a much-reduced calculation. I just prefer to stay logically sound first and do the optimizations later.