The genetic algorithm is a great fit for optimization problems. However, it's not significantly more effective than simulated annealing or the lesser-known controlled random search (CRS) algorithm.
Each has its advantages and disadvantages, but getting too caught up in the metaphors these algorithms and techniques are based on will unnecessarily shackle your thinking. Of course, the opposite is also true.
– by Yet Another Smith (email@example.com) on Monday March 11, @03:51PM (#3144715)
The CRS is based on the 'Nelder-Mead' simplex method. Here's a better description of that. [enst-bretagne.fr]
It starts with n + 1 points, where n is the number of parameters you're optimizing: 3 points forming a triangle in 2D, or 4 points forming a tetrahedron in 3D (the space being the parameter values you're optimizing over). It's easiest to think of the 2D case with the triangle.
Each corner of the triangle is a set of parameter values, each of which has a different 'fitness'. The fitness is the value you're trying to minimize. Evaluate the fitness at each vertex of the triangle. Take the 'least fit' vertex (the one with the largest fitness, since you're minimizing) and 'step the triangle downhill' by reflecting it through the midpoint between the other two points.
This should produce a reflected triangle closer to the fitness minimum you're looking for. Repeat until you get so close to the minimum that you're going in circles.
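The reflection step above can be sketched in a few lines of Python. This is my own illustration, not anyone's reference implementation; the function name `reflect_worst` and the test function are made up for the example:

```python
import numpy as np

def reflect_worst(simplex, fitness):
    """One downhill step: reflect the least-fit vertex (the one with the
    largest fitness, since we're minimizing) through the centroid of the
    remaining vertices."""
    worst = max(range(len(simplex)), key=lambda i: fitness(simplex[i]))
    others = [p for i, p in enumerate(simplex) if i != worst]
    centroid = np.mean(others, axis=0)
    reflected = 2.0 * centroid - simplex[worst]
    new_simplex = list(simplex)
    new_simplex[worst] = reflected
    return new_simplex

# Minimize f(x, y) = x^2 + y^2, starting from a triangle (n = 2, so n+1 = 3 points)
f = lambda p: float(p[0] ** 2 + p[1] ** 2)
simplex = [np.array([2.0, 2.0]), np.array([2.5, 2.0]), np.array([2.0, 2.5])]
for _ in range(50):
    simplex = reflect_worst(simplex, f)
```

Note that pure reflection never shrinks the triangle, which is exactly why it ends up "going in circles" near the minimum; the full Nelder-Mead method adds expansion and contraction moves to deal with that.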
Now, with the CRS, you still use the simplex to take all your steps, but you start by creating a large pool of initial candidates at random, just as you do for the genetic or simulated annealing algorithms. Then you create a new simplex by selecting n + 1 elements from your pool. Step the simplex downhill and see if the new point is better. If so, throw away the worst element of the population and replace it with the new one. Then select a new simplex at random from your pool of candidates and repeat the procedure.
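Here's a minimal sketch of one CRS iteration, following the description above. Again this is my own toy version (pool size, bounds, and the name `crs_step` are arbitrary choices for the example), not the canonical algorithm:

```python
import random
import numpy as np

def crs_step(population, fitness, rng):
    """One CRS iteration: pick a random simplex of n+1 points from the pool,
    reflect its worst vertex through the centroid of the others, and if the
    trial point beats the worst member of the whole pool, replace that member."""
    n = len(population[0])                    # number of parameters
    simplex = rng.sample(population, n + 1)   # random simplex from the pool
    worst_vertex = max(simplex, key=fitness)
    others = [p for p in simplex if p is not worst_vertex]
    centroid = np.mean(others, axis=0)
    trial = 2.0 * centroid - worst_vertex     # step the simplex downhill
    worst_idx = max(range(len(population)), key=lambda i: fitness(population[i]))
    if fitness(trial) < fitness(population[worst_idx]):
        population[worst_idx] = trial         # replace the pool's worst element
    return population

rng = random.Random(0)
f = lambda p: float(p[0] ** 2 + p[1] ** 2)
# Random initial pool in [-5, 5]^2, like the random start described above
pool = [np.array([rng.uniform(-5, 5), rng.uniform(-5, 5)]) for _ in range(20)]
initial_best = min(f(p) for p in pool)
for _ in range(200):
    pool = crs_step(pool, f, rng)
best = min(pool, key=f)
```

Because a replacement only ever happens when the trial point beats the pool's worst member, the pool's best fitness can never get worse as the iterations run.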
Simulated annealing: http://www.sciam.com/0397issue/0397amsci.html