This research report is still in progress.
In 2001 I found a book by mathematician Julien C. Sprott entitled “Strange Attractors: Creating Patterns in Chaos” (CPiC). The book describes the mathematics behind a class of fractals called strange attractors.
The bulk of CPiC deals with a sub-class of strange attractors called iterated maps. An iterated map is an equation, or often a system of equations, which is evaluated iteratively. Iteration in this case means that we take the result of evaluating the equations and feed it back into the same equations as the inputs for the next evaluation. With most randomly chosen equations, the set of results produced by successive iterations is not particularly interesting, perhaps rapidly climbing to infinity, or collapsing to 0 or some other fixed value. For a small number of equations, however, the set of results carves out a region of number space which exhibits interesting fractal properties. This region of number space is called an attractor. The complete term “strange attractor” refers also to the chaotic properties of these attractors (CPiC p10, CaTSA p127).
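The feedback at the heart of an iterated map can be sketched in a few lines of Python. The logistic map used here is a standard textbook example, not one of the systems from CPiC:

```python
def iterate(f, x0, n):
    """Return the orbit x0, f(x0), f(f(x0)), ...: each result is fed
    back into the same function as the input for the next evaluation."""
    xs = [x0]
    for _ in range(n):
        xs.append(f(xs[-1]))
    return xs

# The logistic map x -> r*x*(1 - x) with r = 4 behaves chaotically;
# many other choices of r collapse to a fixed value instead.
orbit = iterate(lambda x: 4.0 * x * (1.0 - x), 0.3, 5)
```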
In CPiC Sprott presents a computer program which searches for equations that produce strange attractors by trying many randomly generated equations until one is discovered which meets certain criteria. The results are clouds of points which form interesting shapes, many with an organic quality. Each set of equations creates a unique set of points.
The organic quality of the images is particularly intriguing as they are the result of a relatively simple mathematical process. In fact it is this complexity, emerging from apparently simple equations, which has attracted mathematicians.
The book describes a range of different classes of equations which exhibit the property of producing strange attractors. One class of equations which particularly caught my attention was the polynomial map. Polynomials are interesting as they are easy to generalise, meaning one can easily add or remove terms from the equation to experiment with systems of greater or lesser complexity. Additionally, this flexibility is easily parameterised, making it possible to automate. Finally, there are a number of properties of polynomials that make their evaluation on computers convenient.
Each class of strange-attractor-producing equations presented in the book has a number of parameters (coefficients, constants) that could be adjusted to produce different attractors. Often, only a small percentage of the possible constant values would give rise to a strange attractor. The program that Sprott develops in the course of the book tests many different randomly generated sets of parameters, and displays the strange attractors produced by those which exhibit favourable properties. Since the values are chosen randomly from the infinite range of possible values, this gives the impression of the different attractors lying spread out discretely across number space, isolated from each other like stars in the night sky.
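A heavily simplified sketch of this kind of search, using a hypothetical one-dimensional map and crude boundedness tests in place of the correlation-dimension and Lyapunov measurements that Sprott's program actually uses:

```python
import random

def search_candidate(make_map, n_params, iterations=1000, bound=1e6):
    """Try one random parameter set; return it only if the orbit stays
    bounded and does not collapse onto a few repeated values."""
    params = [random.uniform(-1.2, 1.2) for _ in range(n_params)]
    f = make_map(params)
    x, seen = 0.05, set()
    for _ in range(iterations):
        x = f(x)
        if abs(x) > bound:
            return None              # escaped towards infinity
        seen.add(round(x, 9))
    if len(seen) < iterations // 2:
        return None                  # settled onto a fixed point or cycle
    return params
```

In practice the overwhelming majority of candidates are rejected; the real program simply repeats this until a parameter set passes the tests.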
I was curious to see the outcome of making small changes to the chosen values: perhaps the attractors did not exist as pinpricks in number space, but as extended, possibly even connected, regions. I made a simple program which tested this, starting from one known strange attractor, making small changes to the parameters, and plotting the resulting attractor. The result was the familiar clouds of points, slowly morphing between many different shapes.
The simple program I made would move through the parameter space randomly and automatically. It showed that there was a continuity to the parameter space which the strange attractors occupied. It did not, however, provide any useful way of visualising the path it was taking through this space.
In this research I would like to explore this space in a more controlled manner, and possibly map it to some degree. The mapping of fractal spaces is not unheard of in mathematics. As a simple example, a region of the Lyapunov fractal has been dubbed “Zircon Zity” by the mathematician Alexander Dewdney.
I've always intended this exploration to be more aesthetic than mathematical. It is certainly informed by mathematics, but I feel that any great mathematical insights are unlikely to come from me. I maintain, however, the vaguely plausible fantasy that my unscientific endeavours might inspire some more useful research by a suitably qualified mathematician at some point in the future.
The specific aims of this research project are to gain some insight into the structure of the parameter space that these strange attractors occupy. The name of the project comes from what I hope is a correct name for the full class of equations which make up the parameter space I am interested in searching.
The specific system of equations which I used in my initial simple program is shown below:
Xn+1 = a1 + a2·Xn + a3·Xn² + a4·Xn·Yn + a5·Xn·Zn + a6·Yn + a7·Yn² + a8·Yn·Zn + a9·Zn + a10·Zn²
Yn+1 = a11 + a12·Xn + a13·Xn² + a14·Xn·Yn + a15·Xn·Zn + a16·Yn + a17·Yn² + a18·Yn·Zn + a19·Zn + a20·Zn²
Zn+1 = a21 + a22·Xn + a23·Xn² + a24·Xn·Yn + a25·Xn·Zn + a26·Yn + a27·Yn² + a28·Yn·Zn + a29·Zn + a30·Zn²
The terms a1 through to a30 are the parameters which define a particular attractor. When I refer to “parameter space”, I am referring collectively to these values. In the same way that you could take two numbers to be coordinates on a two-dimensional map, it can be useful to imagine a long list of parameters as coordinates into a space of considerably higher dimension, in this case, a 30-dimensional space. This is not to suggest that 30-dimensional space is easily comprehensible, but by extrapolation from a two-dimensional plane, through a three-dimensional cube and beyond, it is possible to imagine at least some of the properties of this space.
Perhaps the most important aspect to comprehend is that sets of parameters where the corresponding values in each set differ only slightly from each other are in some sense adjacent.
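One iteration of the system above can be written compactly by collecting the ten degree-2-or-less monomials into a vector and treating the 30 parameters as a 3×10 coefficient matrix. This is a NumPy sketch of the idea, not the code actually used in the project:

```python
import numpy as np

def step(state, a):
    """Apply one iteration of the degree-2 polynomial map.
    `state` is (Xn, Yn, Zn); `a` holds the 30 parameters a1..a30."""
    x, y, z = state
    # The ten monomials, in the order the terms appear in each equation.
    terms = np.array([1.0, x, x * x, x * y, x * z, y, y * y, y * z, z, z * z])
    coeffs = np.asarray(a).reshape(3, 10)  # one row of coefficients per equation
    return coeffs @ terms                  # (Xn+1, Yn+1, Zn+1)
```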
The right hand side of each equation is simply a polynomial of degree 2 in Xn, Yn and Zn. As mentioned in the previous section, it is easy to generalise these equations, replacing them with polynomials of higher or lower degree (greater or fewer numbers of terms). With higher degree polynomials, the number of coefficients (parameters) rapidly increases.
The level of complexity of the equations can be varied, but even in this simple (yet still interesting) case the equation is defined by 30 values. If you consider the 3 starting values of X, Y and Z, then it is defined by 33 values. When considered as a space, this is a 33 dimensional space. This raises some problems of how to sensibly navigate or visualise such high dimensional spaces.
As an aside, many of the plots in this report are of my first automatically discovered attractor, which uses degree 5 polynomials. This means that all terms are represented in which the exponents of the X, Y and Z variables sum to 5 or less. This results in 3 equations of 56 terms, making for 168 parameters (or 171 if you count the 3 starting values of X, Y and Z).
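These term counts follow from a standard combinatorial identity: the number of monomials of total degree at most d in three variables is C(d + 3, 3). A quick check in Python:

```python
from math import comb

def n_terms(degree, variables=3):
    """Number of monomials of total degree <= `degree`."""
    return comb(degree + variables, variables)

# degree 2: 10 terms per equation, 30 parameters for the 3-equation system
# degree 5: 56 terms per equation, 168 parameters
```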
Normally, attractors are found within this vast numerical space with a combination of brute force (trying many different random sets of parameters) and automated analysis to determine when an interesting attractor has been found. The two analysis methods employed in the program presented in CPiC are measurements of the correlation dimension and the Lyapunov exponent.
The correlation dimension is a particular way of measuring fractal dimension, which is a method of measuring the way in which fractal objects fill space. In the case of the dot plots of strange attractors, the correlation dimension can indicate whether the attractor is a collection of disconnected points (dimension close to 0), whether the points are arranged in the form of a line (dimension close to 1), whether the points are spread out into a flat plane (dimension close to 2) or whether the points form a voluminous cloud (dimension close to 3). Interesting attractors tend to have a dimension greater than 1. Measuring the correlation dimension exactly, over every point, requires too many calculations to be feasible. As an alternative, it is normally measured approximately using a random sample of points; the accuracy of the measurement increases with the number of points being tested. Additionally, the dimension of a fractal is often not consistent across the whole of the fractal, and the resulting value is only an average of the dimension across the tested points.
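A crude version of this sampled measurement can be sketched as follows. It estimates the correlation sum C(r), the fraction of point pairs closer than r, at two radii and takes the slope of log C(r) against log r. The radii and sample size are arbitrary choices here, and a serious implementation would fit the slope over many radii:

```python
import numpy as np

def correlation_dimension(points, r1, r2, sample=500, seed=0):
    """Approximate the correlation dimension from a random sample of points."""
    rng = np.random.default_rng(seed)
    pts = np.asarray(points)
    idx = rng.choice(len(pts), size=min(sample, len(pts)), replace=False)
    sub = pts[idx]
    # Pairwise distances between the sampled points (distinct pairs only).
    d = np.linalg.norm(sub[:, None, :] - sub[None, :, :], axis=-1)
    d = d[np.triu_indices_from(d, k=1)]
    c1, c2 = np.mean(d < r1), np.mean(d < r2)
    return (np.log(c2) - np.log(c1)) / (np.log(r2) - np.log(r1))
```

Points spread along a line give a value near 1, points filling a plane a value near 2, and so on.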
The Lyapunov exponent is a measure of the chaotic behaviour of the fractal. Chaos is concerned with the sensitivity of a complex system to small changes in initial conditions. The Lyapunov exponent measures the speed at which slightly different starting conditions diverge.
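For a one-dimensional map this divergence rate can be estimated by running two trajectories a tiny distance apart and renormalising their separation at every step. This is a textbook sketch; the program in CPiC performs an equivalent measurement on the full system:

```python
import math

def lyapunov(f, x0, eps=1e-9, steps=1000, discard=100):
    """Estimate the largest Lyapunov exponent of the 1D map f."""
    x, y = x0, x0 + eps
    total = 0.0
    for i in range(steps):
        x, y = f(x), f(y)
        d = abs(y - x)
        if d == 0.0:
            return float("-inf")          # trajectories merged: not chaotic
        if i >= discard:                  # skip the initial transient
            total += math.log(d / eps)
        y = x + eps * (y - x) / d         # pull y back to distance eps
    return total / (steps - discard)
```

A positive exponent indicates chaos; for the logistic map at r = 4 the known value is ln 2 ≈ 0.693.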
It is hoped that the ability to explore and derive a structural understanding of the number space may assist in searching for visually, or perhaps even mathematically, interesting regions of the space.
I expect there to be a tight feedback loop between initial observations and future explorations. It is very much like sending off a ship into uncharted waters: if you find an island, you will probably explore the island.
I'm expecting the parameter space to be a kind of meta-attractor, and to be interesting in similar ways to the attractors themselves. Similar relationships are already known to exist between better understood fractals. Each point in the well-known Mandelbrot Set fractal corresponds to a particular configuration of the similarly popular Julia Set fractal, with points inside the Mandelbrot Set marking the configurations for which the solution is bounded (Sprott p.353). The exact definition of boundedness in this case is beyond the scope of this report, but it is sufficient to understand that the Mandelbrot Set can be understood as a map describing the behaviour of the Julia Set over a range of different parameters. I would like to generate a similar map describing the presence of strange attractors across a range of parameters.
This exploration is driven by tools and technology. The discovery of fractals was catalysed by the availability of computers, as fractals require large numbers of calculations and their exploration was not feasible without computational assistance. Even at the time the Sprott book was written, generating one of these attractors would take several seconds, making the animations of my early exploratory program infeasibly time consuming. The mapping of the parameter space involves many times more calculations again, making efficient implementation a reasonably high priority.
In addition to the raw computational challenges, the other important problem to face was the construction of a suitable interface for conveniently navigating the high dimensional number spaces.
My early plans were to make a neat, self-contained application, displaying the navigation interface alongside a 3D rendering of the current strange attractor. An early problem I needed to solve was integrating the 3D rendering into the same window as the navigation interface.
The common methods for producing real-time 3D graphics on computers are biased towards applications in which everything takes place in the 3D render window, for example a 3D game, which takes over the complete display of your computer. I was to discover that relegating that 3D window to a region within another application window is not always a simple task. The problem was further complicated by the particular 3D library I had hoped to use (OGRE) and the language in which I had hoped to do the interface programming (Python).
After a number of trial-and-error approaches to the problem, I gave up on the dream of a neat, single-windowed application and settled on the less problematic approach of having the 3D render and the navigation interface appear as two distinct windows.
With this initial structural decision made, I moved on to implementing the code to evaluate the equations which generate the attractor. I decided to use SciPy, a Python library for performing scientific calculations. It emphasises the use of matrix operations, and so I arranged the evaluation so that the bulk of the calculation would be done as simple operations on large matrices. This method was chosen in the interests of efficiency. While I could not easily test how efficient the matrix operations in SciPy are, it is reasonable to assume that they are optimised to some degree.
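The arrangement can be illustrated with a sketch that iterates many parameter sets at once, so that each step is a handful of whole-matrix operations rather than a Python-level loop over individual points. The common starting point of 0.05 and the parameter layout here are illustrative assumptions, not the project's actual code:

```python
import numpy as np

def iterate_grid(A, n_iter=100):
    """Iterate the degree-2 map for many parameter sets simultaneously.
    A has shape (m, 3, 10): m parameter sets, 3 equations, 10 coefficients
    each. Returns the final (m, 3) states; diverged runs become inf/nan."""
    m = A.shape[0]
    state = np.full((m, 3), 0.05)
    with np.errstate(over="ignore", invalid="ignore"):
        for _ in range(n_iter):
            x, y, z = state[:, 0], state[:, 1], state[:, 2]
            # All ten monomials for every parameter set at once: (m, 10).
            terms = np.stack([np.ones(m), x, x * x, x * y, x * z,
                              y, y * y, y * z, z, z * z], axis=-1)
            state = np.einsum("mij,mj->mi", A, terms)
    return state
```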
Soon after completing an initial implementation of the evaluation code, I was somewhat surprised to discover that using functions provided by SciPy, generating maps of the parameter space would be much easier than I had imagined. In fact, I was able to render my first parameter maps long before I had a working renderer for the 3D dot plots of the attractors themselves.
The first plots were quite time consuming. For each point on the map I was calculating many iterations of the equations, only stopping if the values became incalculably large (infinite). During the ample opportunity for reflection afforded by the long rendering times, I was reminded of the way in which the popular images of the Mandelbrot set are typically generated.
Most popular images of the Mandelbrot set are “escape time” plots. Each pixel in the image represents a unique starting value which is fed into an iterative equation. The black region in the centre of the image represents those starting values for which the successive values produced by the iteration remain bounded, and defines the Mandelbrot set proper. The coloured bands surrounding this region represent starting values for which successive iteration causes the values to spin off towards infinity. The different colours represent the number of iterations required for the values to cross some arbitrary threshold.
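The scheme is simple enough to sketch in a few lines of NumPy; the bounds, resolution and escape threshold below are conventional choices, not anything specific to this project:

```python
import numpy as np

def escape_time(width=200, height=200, max_iter=50, threshold=2.0):
    """Escape-time counts for the Mandelbrot iteration z -> z^2 + c.
    Pixels that never cross the threshold keep the value max_iter."""
    xs = np.linspace(-2.5, 1.0, width)
    ys = np.linspace(-1.5, 1.5, height)
    c = xs[None, :] + 1j * ys[:, None]     # one starting value per pixel
    z = np.zeros_like(c)
    counts = np.full(c.shape, max_iter)
    for i in range(max_iter):
        live = counts == max_iter          # pixels still iterating
        z[live] = z[live] ** 2 + c[live]
        escaped = live & (np.abs(z) > threshold)
        counts[escaped] = i                # record the escape time
    return counts
```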
For my purposes, escape time plots had several benefits. Firstly, there is generally less calculation to do for each point, as the values will cross the chosen escape threshold long before my old limit of infinity. Secondly, while not technically representing attractors themselves, the shapes of the colour bands tend to give clues about the presence of nearby attractors, which are possibly not visible in the rendered region, acting as a kind of mathematical aura. The lack of any mathematical justification for this second property makes the allusion to pseudo-science particularly appropriate.
Until this point, my plots were essentially two-dimensional slices of the large number space. The two-dimensional nature of computer screens lends itself predictably to the direct representation of two-dimensional data. It is possible to stretch this situation to accommodate one more dimension by making animations, essentially mapping a new dimension (and hence a parameter) to time. The resulting animations are made from a sequence of parallel slices through the number space.
As computations became more ambitious, optimisation of the calculations became a more pressing concern. Animations were a particularly taxing development, as each frame of the animation would require as much computation as earlier plots, causing even short animations to require 50-100 times as much rendering time.
The first optimisation was to make use of Psyco, a specialising compiler for Python. This required some changes to my code to avoid particular structures that Psyco is unable to optimise. The main culprit in my code was the use of generators, as discussed in note 4 on that list. Fortunately it was reasonably simple to convert the unsupported generator into an iterator, a similar Python construct which Psyco does support. This change was rewarded with a halving of render time (a 100% speed increase). Some further modifications to the SciPy calls being used resulted in a further 25% speed increase.
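The flavour of that rewrite, in a hypothetical example written in modern Python (at the time, the iterator method was spelled `next()` rather than `__next__()`):

```python
# A generator, which Psyco could not compile ...
def pairs_gen(seq):
    for i in range(len(seq) - 1):
        yield seq[i], seq[i + 1]

# ... rewritten as an equivalent iterator class, which it could.
class PairsIter:
    def __init__(self, seq):
        self.seq = seq
        self.i = 0

    def __iter__(self):
        return self

    def __next__(self):
        if self.i >= len(self.seq) - 1:
            raise StopIteration
        pair = (self.seq[self.i], self.seq[self.i + 1])
        self.i += 1
        return pair
```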
Performance gains from each progressive optimisation were decreasing, and I was considering some other avenues for increasing performance. Initially I had chosen to use Python as the language of implementation as it has many language features that make it comfortable to use when sketching out the initial implementation of an algorithm. Now that the algorithm was becoming quite settled, I decided to re-implement it in C, to get an idea of the differences in efficiency between the two languages. I was assuming that the matrix functions provided by SciPy were reasonably efficient, and that the performance increases of a C implementation would not be too dramatic, but I was curious to make the comparison.
Creating a C implementation was also an excuse for me to try using a library I had discovered called liboil. LibOIL is a library of optimised functions for performing simple instructions across large sets of data. Most modern processor designs are able to efficiently perform these kinds of computations, but the specific approach varies widely between different processors. Liboil has several implementations of each of its functions and chooses the most efficient implementation depending on the current processor architecture.
Initial performance results from the libOIL implementation were very encouraging, running what appeared to be 492 times faster than my Python implementation, although with very slight but intriguing differences in the output [see fig XX]. Eventually I discovered that these differences were due to a programming error. The repaired code was slower, but still 24 times faster than the Python implementation.
While the efficiency of a C implementation has obvious benefits, I was reluctant to give up on the flexibility of Python. After some research I discovered Pyrex. Pyrex is a meta-language for writing Python modules. It is very similar to Python, with the exception that it can use functions and access data from C libraries. With Pyrex I was able to integrate my C evaluation code into my existing Python code. Following this integration, the result was slightly slower but still 13 times faster than the earlier pure-Python code.
As an experiment, I replaced the libOIL code in my C implementation with my own implementations of the respective functions. Surprisingly, the resulting program was slightly faster than the original libOIL code. If anything, this observation is a testament to the optimisation capabilities of the GNU C compiler.
[ dot-plot renderer, interface ]
Much of this work was performed without any graphical representation of the strange attractors themselves, only the maps of their parameter spaces. Development of the dot-plot renderer was slowed by the problems mentioned earlier. While those specific problems had workarounds, some lingering problems persisted, most of them relating to the interaction between C++ libraries (such as OGRE) and Python. I was growing impatient with this lack of progress, and eventually decided to implement the dot-plot renderer using a Python game-development library called Soya. I had persisted with the OGRE implementation until this time because I thought the efficiency of the graphical display would be one of the limiting factors.
[ explain fractal dimension / lyapunov somewhere before this paragraph ]
Implementing the Soya renderer was quite quick, but revealed the unfortunate fact that the strange attractor I had been mapping until this point was not particularly interesting to look at, being merely a wobbly 3D loop. This was possibly caused by using an automated search which only examined the fractal dimension as a selection criterion. In my original applet (and the program developed in CPiC) the Lyapunov exponent is also used. Rather than improve my automated searching algorithm, I decided to focus on providing an interface for manually navigating the parameter space, relying on human aesthetic judgements and instincts over mathematical analysis.
After experimenting with a number of methods for providing an interface to the potentially large number of parameters, I settled on making some small modifications to the Grid object provided by the wxWidgets library, which provides a rudimentary spread-sheet interface. I added the ability to incrementally modify the number in the current cell by scrolling with the mouse wheel. Different combinations of the Alt, Shift and Control keys change the amount by which the value is incremented.
[ highf-grid.png ]
[ basin of attraction 0,0,0 assumption ]
[ Sprott mention of robustness ]
[ initial plots of basin of attraction (accidentally) .. ]
For historical value, below is the first map I rendered; it was the result of somewhat misplaced ambition. It was intended to be a plot of the varying fractal dimension across a two-dimensional slice of the parameter space. Due to a conceptual error, it is actually a rendering of the varying fractal dimension across a two-dimensional slice of the basin of attraction. Upon realising this error, the slight variance in the values towards the centre became quite surprising.
[ desc of basin of attraction, why the fractal dimension shouldn't change much ]
highf-100.png (accidental basin plot)
[ first dot plot: highf-dotplot.png ]
[ low order ]
[ fractal dimension plot + composite ]
[ basin of attraction animations ]
[ render errors? ]
[ code ]
The features of the parameter space maps are as interesting as I had hoped.
I didn't expect to be making maps this early. I had not anticipated that the manual-searching part of the tool would be so difficult, and I had also not realised that it would actually be easier to do the mapping. I had expected to move on to the mapping after having an 'explorer' of sorts.
I've yet to speak to anyone with a maths background to ask if this is of any relevance, or, more realistically, to find out that it is very boring mathematically. I can imagine that it might be considered a little boring because I am working with big, complicated polynomials; most of the famous fractals are based on very simple equations.
[ benefits of flexible lib vs app ]
I still want to continue, making a usable application for exploring the space. At the moment I am still driving it quite manually.
[ basin plots are truly 3d (not slices), lend themselves to 3d printing ]
[ ability to click on an map and see the attractor ]
[ sonification ]
[ reactions from DE ]