iOS – Mandelbrot


Is it possible to fall in love with a mathematical object? If the object in question is the Mandelbrot set, then the answer is a definite yes. This post talks about an iPad app that helps us explore the strange, hypnotic and never-ending beauty of this well-known fractal.

[Image: a zoomed-in view of a region of the Mandelbrot set]

The image above shows a zoomed-in view of a certain part of the Mandelbrot set. The points in black are inside the set and those in blue are outside it. Much of the artistic appeal comes from the color scheme you design for the points at the borderline. I used the same color scheme as in my C++/OpenGL tutorial, and much of the Model code was ported directly from C++ to Objective-C.

The displayed image is divided into 4 quadrants by the white lines. You select where you want to zoom in by tapping the appropriate block with your finger. For example, here is a sequence of touch events starting from the full view of the set:

[Image: a sequence of four zoom steps, each tapping one quadrant of the previous view]

Naming the blocks using 1 (top-left), 2 (top-right), 3 (bottom-left) and 4 (bottom-right), the journey above may be abbreviated thus: 1-4-3-1. The large image at the beginning of this article was obtained using the following sequence: 1-4-4-1-3-3. The possible journeys are infinite and one can probably spend several lifetimes exploring every nook and cranny.


The overall design of this app follows the usual Model-View-Controller strategy. The model consists of the minimum and maximum (x, y) coordinates of the window in the complex plane, the number of divisions along x and y (the resolution), a 2D array that holds the parameter values used to color the set, and a method that determines whether a point belongs to the set. Here is the implementation of the Model class:

There are two views, both subclasses of UIView: one displays the set itself and the other draws the horizontal and vertical centerlines. The implementation of the class that draws the set is provided below for reference:

Note that we are using the Core Graphics API to render the 2D color data. This is not necessarily the best approach, but I believe it is the simplest to understand and implement. In the future, I plan to use OpenGL-ES and accelerate the calculation and graphics using GPU computing (CUDA/OpenCL).

Finally, the controller interprets the model for the views and updates the model data based on user touch events. Essentially, we figure out which quadrant the user touched, update the minimum and maximum x and y coordinates accordingly, ask the Model to recalculate the 2D array in the new window, and send the updated data to the View for rendering. Here is the Controller implementation:

Notice that we have a RESET button to go back to the original window. It simply restores the minimum and maximum x and y limits to their original values.

The entire source code can be cloned from

Happy Xcoding!
