Convergent fractals are fractals that result from iterating functions that
converge to a finite value, rather than diverging to infinity like the classic
Mandelbrot function z=z^{2}+c.

Many convergent fractals are derived from procedures for numerically finding solutions to equations of the form f(z) = 0. The main methods that generate interesting fractals are Newton's method, Halley's method, and especially the secant method. Some additional variations on those classic algorithms have been designed to give more interesting fractal art.

In addition, some fractals are "hybrid" in the sense that both divergent behavior and convergent behavior are seen in the iteration. The Magnet Fractals are great examples of this, although there are many function maps that exhibit this combined behavior.

Newton's method is one of the simplest (but still very effective) methods for numerically finding solutions to equations. This technique is also called Newton-Raphson, recognizing that Joseph Raphson actually developed the version of this method that is in use today.

The Newton-Raphson method is one of the core techniques of numerical analysis, and there's been a huge amount written about this method elsewhere. It's kind of hard to understand without a graph, which takes a lot of work to make, so I won't go into detail here. Basically this method takes a guess at the solution to f(z)=0 and uses the tangent line to the function to update the guess to a more accurate one. The mathematical formula turns out to be very simple:

z_{n+1} = z_{n} - f(z_{n})/f'(z_{n})

where z_{n} is the last guess, z_{n+1} is the updated guess, f(z) is the function to be solved, and f'(z) is the derivative of f with respect to z.

As with the Mandelbrot set function, z_{n+1} = z_{n}^{2} + c, this seemingly simple formula gives rise to
incredible complexity when it is iterated many times. All points near each root
are attracted to the root, forming what are called basins of attraction. At the
boundary between the basins of attraction, typically there is a very complicated
Julia-set-like fractal.
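These basins are easy to explore numerically. Here's a minimal Python sketch for f(z) = z^{3}-1, whose roots are the three cube roots of unity; the iteration cap and tolerance are my own illustrative choices. Coloring each starting point by the index of the root it lands on, shaded by the iteration count, gives the classic Newton fractal picture.

```python
import cmath

# The three cube roots of unity, the zeros of f(z) = z^3 - 1.
ROOTS = [cmath.exp(2j * cmath.pi * k / 3) for k in range(3)]

def newton_root(z, max_iter=50, tol=1e-9):
    """Iterate z -> z - f(z)/f'(z) for f(z) = z^3 - 1.

    Returns (root_index, iterations), or (None, max_iter) if the
    iteration failed to converge within max_iter steps."""
    for n in range(max_iter):
        fp = 3 * z**2
        if fp == 0:
            return None, n  # derivative vanished; the tangent line is flat
        z = z - (z**3 - 1) / fp
        for i, r in enumerate(ROOTS):
            if abs(z - r) < tol:
                return i, n
    return None, max_iter
```

Points near each root converge quickly; points near the basin boundaries bounce around first, which is where the fractal detail comes from.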

Any function f(z) that has a zero can be plugged into Newton's method. Some work better than others.

Newton's method has been modified a little, introducing a "relaxation" factor to control how the guesses converge. This was originally done in an attempt to improve the rate of convergence when using Newton's method for actually finding zeros of equations rather than making pretty pictures. The idea is to take the Rth root of f(z) and find the zeros of that (taking the root won't change the value of z that makes f(z)=0). This makes the method converge faster under certain conditions, and leads to this formula:

z_{n+1} = z_{n} - R f(z_{n}) / f'(z_{n})

Varying R changes the shape of the fractal.

The Nova fractal is derived from this equation. It has the all-important +c that changes this into something that can be rendered like a Mandelbrot set, rather than a Julia set.

z_{n+1} = z_{n} - R f(z_{n}) / f'(z_{n}) + c

Nova is really a large class of fractals since any function f(z)
can be used. The classic Nova fractal uses f(z) = z^{p}-1, and
therefore produces the following equation that is iterated:

z_{n+1} = z_{n} - R (z_{n}^{p} - 1) / (p z_{n}^{p-1}) + c

Most images use p=3, which represents the search for the cubic roots of unity, and yields:

z_{n+1} = z_{n} - R (z_{n}^{3} - 1) / (3 z_{n}^{2}) + c
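Here's a minimal Python sketch of that p=3 iteration, rendered Mandelbrot-style: c varies per pixel while z starts from a fixed value (z0 = 1 is the usual choice, since it's a root of z^{3}-1). The convergence tolerance and iteration cap are illustrative choices of mine.

```python
def nova_iterations(c, R=1.0, z0=1.0, max_iter=100, eps=1e-9):
    """Iterate z -> z - R (z^3 - 1) / (3 z^2) + c starting from z0.

    Returns the step count at which successive values stop moving
    (convergence), or max_iter if convergence was never detected.
    Coloring each pixel's c by this count draws the Nova fractal."""
    z = complex(z0)
    for n in range(max_iter):
        if z == 0:
            return n  # the Newton step is undefined at z = 0
        z_next = z - R * (z**3 - 1) / (3 * z**2) + c
        if abs(z_next - z) < eps:
            return n
        z = z_next
    return max_iter
```

With c = 0 this is just relaxed Newton's method and converges immediately from z0 = 1; nonzero c perturbs the iteration and produces the Nova shapes.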

Halley's method is the next more complicated member, after Newton's method, of a large class of iterative root-finding techniques that are collectively called Householder's method (or Householder methods, whichever you like). These are based on the Padé approximation, which is a way of writing an approximation to a function near a zero of that function if you know the derivatives of the function.

Halley's method and the higher-order Householder methods converge much faster at each iteration than Newton's method does, but each step is more computationally expensive, so there's a tradeoff. This faster convergence can be great if you're trying to find zeros of a function, but it turns out that the faster convergence makes the resulting fractals less interesting.

The formula for Halley's method is a bit more complicated than Newton's method:

z_{n+1} = z_{n} - 2 f(z_{n}) f'(z_{n}) / {2 [f'(z_{n})]^{2} - f(z_{n}) f''(z_{n})}
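For concreteness, here's a small Python sketch comparing Halley's method against Newton's method on f(z) = z^{3}-1. The tolerance, iteration cap, and starting point in the comparison are my own illustrative choices.

```python
def halley_steps(z, max_iter=50, tol=1e-12):
    """Return (root, iterations) for f(z) = z^3 - 1 under Halley's method."""
    for n in range(max_iter):
        f, fp, fpp = z**3 - 1, 3 * z**2, 6 * z
        denom = 2 * fp**2 - f * fpp
        if denom == 0:
            break  # Halley step undefined here
        z_next = z - 2 * f * fp / denom
        if abs(z_next - z) < tol:
            return z_next, n + 1
        z = z_next
    return z, max_iter

def newton_steps(z, max_iter=50, tol=1e-12):
    """Return (root, iterations) for f(z) = z^3 - 1 under Newton's method."""
    for n in range(max_iter):
        fp = 3 * z**2
        if fp == 0:
            break  # Newton step undefined here
        z_next = z - (z**3 - 1) / fp
        if abs(z_next - z) < tol:
            return z_next, n + 1
        z = z_next
    return z, max_iter
```

Starting from the same guess, Halley's cubic convergence reaches the root in fewer steps than Newton's quadratic convergence, which is exactly the property that flattens out the fractal boundary.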

More work for less fractal ... that doesn't seem like a good deal to me. After spending all the time to write the code for this, I was pretty disappointed.

After the disappointment of Halley's method, I realized that the problem is that the faster an iterative equation solving method finds a solution, the less interesting the fractal at the boundary between the basins of attraction is going to be. So I started wondering if maybe *less* powerful solving methods would make better fractals. Sure enough, the secant method is a great source of pictures.

The secant method is very similar to Newton's method. It uses a line derived from the function near the current estimate of the solution, but instead of using the line tangent to the function at the current estimate, the secant method uses the previous estimates to define a secant line near the current estimate. This has an advantage for numerical analysis because it allows you to solve equations whose derivatives are unknown or are difficult to calculate. The downside is that it converges more slowly than Newton's method. So for problems where the derivative can be calculated, the secant method is not favored.

The formula for the secant method is elegantly simple:

z_{n+1} = z_{n} - f(z_{n}) (z_{n} - z_{n-1}) / [f(z_{n}) - f(z_{n-1})]
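A minimal Python sketch of the iteration, again using f(z) = z^{3}-1 as the example function, with an illustrative tolerance and iteration cap. Note the two starting guesses the method requires:

```python
def secant_root(z0, z1, max_iter=100, tol=1e-10):
    """Find a zero of f(z) = z^3 - 1 by the secant method.

    Returns (root, iterations), or (None, max_iter) on failure.
    z0 and z1 are the two initial guesses the method needs."""
    f = lambda z: z**3 - 1
    f0, f1 = f(z0), f(z1)
    for n in range(max_iter):
        if f1 == f0:
            return None, n  # secant line is horizontal; step undefined
        z2 = z1 - f1 * (z1 - z0) / (f1 - f0)
        if abs(z2 - z1) < tol:
            return z2, n + 1
        z0, f0 = z1, f1
        z1, f1 = z2, f(z2)
    return None, max_iter
```

For a fractal rendering, one common setup is to take z1 as the pixel coordinate and derive z0 from it (for example a fixed offset), then color by the root reached and the iteration count; the choice of that second guess is the extra degree of freedom mentioned above.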

The slowness of convergence of this method makes the fractals that emerge from it much more interesting. In addition, the secant method needs two initial guesses, which introduces another set of options that can drastically affect how the resulting fractal looks. I see a lot of potential here, and I'm not sure why more hasn't been done with this method.

As far as I can tell, the first publication of the secant method for making
fractals was in a FractInt formula file called SECANT.FRM by Mike Wareman. That
file references Szyszkowicz M, "Computer art generated by the method of secants
in the complex plane." *COMPUTERS AND GRAPHICS,* volume 14,
number 3/4, p. 509 (1990). The file can be found in several collections of
FractInt formula files, but it doesn't seem to be part of what you get when you
download FractInt itself.

If you're an UltraFractal user, you can find the secant method realized in at least two places:

- mt.ufm by Mark Townsend, dated 16 Jul 1999
- pwc_convert.ufm by Paul Carlson, dated 19 Jul 1999.

Although from a software standpoint there are some significant differences between convergent and divergent fractals, from a mathematical perspective, there isn't really much difference.

The most succinct way to explain this is: consider the map z -> 1/z.

This map swaps zero and the point at infinity, so in the coordinate w = 1/z, an orbit that diverges to infinity becomes an orbit that converges to 0. In other words, divergent fractals are really convergent after all; they just converge to the so-called "point at infinity."
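A quick numerical way to see this, using nothing beyond the standard Mandelbrot iteration: for c = 1 the orbit of z -> z^{2} + c diverges, but the reciprocal orbit w = 1/z converges to 0, which is where w = 1/z sends the point at infinity. (The step count here is just small enough to avoid floating-point overflow.)

```python
def mandelbrot_orbit(c, steps=8):
    """Return the first few iterates of z -> z^2 + c starting from z = 0."""
    z, zs = 0j, []
    for _ in range(steps):
        z = z * z + c
        zs.append(z)
    return zs

zs = mandelbrot_orbit(1.0)    # 1, 2, 5, 26, 677, ... off toward infinity
ws = [1 / z for z in zs]      # 1, 0.5, 0.2, ... down toward 0
```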