Computational geometry is full of algorithms that demonstrate how randomization often leads to simplification. One example is the famous randomized incremental algorithm for finding the minimum enclosing ball of a set of points in d dimensions. The idea of the algorithm is very simple. All you need to do is look at the points in a random order and maintain the minimum enclosing ball (MEB) of the points seen so far. When a new point is considered, if it already lies inside the current MEB, then there is no need to do anything; but if it lies outside, the MEB is recomputed, using the fact that the new point must lie on the boundary of the updated ball. With some math it can be shown that the expected running time of this algorithm is linear. This is on par with the best deterministic algorithm known at the time of this algorithm's discovery. However, the deterministic algorithm was quite complex and not easily implementable.
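In two dimensions the idea (due to Welzl) can be sketched as below. This is a minimal Python sketch, not a production implementation: the function names are my own, and the degenerate collinear case is handled only crudely.

```python
import math
import random

def _circle_two(a, b):
    # Smallest circle through two points: centered at their midpoint.
    cx, cy = (a[0] + b[0]) / 2.0, (a[1] + b[1]) / 2.0
    return (cx, cy, math.hypot(a[0] - b[0], a[1] - b[1]) / 2.0)

def _circumcircle(a, b, c):
    # Circle through three points; None if they are (nearly) collinear.
    ax, ay = a; bx, by = b; cx, cy = c
    d = 2.0 * (ax * (by - cy) + bx * (cy - ay) + cx * (ay - by))
    if abs(d) < 1e-12:
        return None
    ux = ((ax**2 + ay**2) * (by - cy) + (bx**2 + by**2) * (cy - ay)
          + (cx**2 + cy**2) * (ay - by)) / d
    uy = ((ax**2 + ay**2) * (cx - bx) + (bx**2 + by**2) * (ax - cx)
          + (cx**2 + cy**2) * (bx - ax)) / d
    return (ux, uy, math.hypot(ax - ux, ay - uy))

def _trivial(boundary):
    # MEB determined by 0, 1, 2, or 3 boundary points.
    if not boundary:
        return (0.0, 0.0, 0.0)
    if len(boundary) == 1:
        return (boundary[0][0], boundary[0][1], 0.0)
    if len(boundary) == 2:
        return _circle_two(*boundary)
    c = _circumcircle(*boundary)
    if c is None:
        # Collinear fallback: widest two-point circle covers all three.
        pairs = [(0, 1), (0, 2), (1, 2)]
        return max((_circle_two(boundary[i], boundary[j]) for i, j in pairs),
                   key=lambda t: t[2])
    return c

def _contains(circle, p, eps=1e-9):
    cx, cy, r = circle
    return math.hypot(p[0] - cx, p[1] - cy) <= r + eps

def _with_boundary(pts, boundary):
    # Recompute the MEB of pts given points forced onto the boundary.
    disk = _trivial(boundary)
    if len(boundary) == 3:
        return disk
    for i, p in enumerate(pts):
        if not _contains(disk, p):
            disk = _with_boundary(pts[:i], boundary + [p])
    return disk

def welzl(points):
    # Randomized incremental MEB: shuffle, then insert points one by
    # one; a point outside the current disk must lie on the boundary
    # of the new disk, which triggers a constrained recomputation.
    pts = list(points)
    random.shuffle(pts)
    disk = (0.0, 0.0, 0.0)
    for i, p in enumerate(pts):
        if not _contains(disk, p):
            disk = _with_boundary(pts[:i], [p])
    return disk
```

The key fact making the recursion cheap in expectation is that, for a random insertion order, a newly inserted point lies outside the current disk only rarely, since at most three points determine the disk.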

Here’s a very nicely written paper that describes the algorithm – http://www.springerlink.com/content/ph273841264x5163/


For the two-dimensional case, finding a disk that encloses points in the plane, we can find the pair of points with the widest separation (in O(n log n)?). Using the midpoint of the line segment joining these two points as the center, we can construct a disk with that segment as its diameter. Is it true that this is the minimum enclosing disk?

No. It won’t necessarily be enclosing. There can still be points that lie outside.

But if you are just aiming for an O(n log n) algorithm, then that can be done using a farthest-point Voronoi diagram. Details can be found in "Closest-point problems" by Shamos and Hoey.

Ah. Now that you mention it, I can construct a counterexample: Take two points separated by a distance d. Construct a disk with this segment as the diameter. We can now choose a point outside of the circle, on the perpendicular bisector of the diameter, such that its distance from either point is less than d, so the original pair is still the widest-separated one, yet the chosen point lies outside the disk.

Yes, exactly…
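The counterexample from the thread above can be checked numerically. The concrete numbers here are my own choice: two points a distance d = 2 apart, plus a third point on the perpendicular bisector at height 1.5, which is above the diametral disk (radius 1) but still closer than d to both endpoints.

```python
import math

d = 2.0
p, q = (0.0, 0.0), (d, 0.0)
center, radius = ((p[0] + q[0]) / 2.0, 0.0), d / 2.0

# Any height h with d/2 < h < (sqrt(3)/2) * d works; take h = 1.5.
t = (center[0], 1.5)

dist_center = math.hypot(t[0] - center[0], t[1] - center[1])
dist_p = math.hypot(t[0] - p[0], t[1] - p[1])

print(dist_center > radius)  # True: t lies outside the diametral disk
print(dist_p < d)            # True: p, q remain the widest-separated pair
```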