Tag: optimization

News, Science

News: Algorithm searches for models that best explain experimental data

From the original news article at PhysOrg:

An evolutionary computation approach developed by Franklin University’s Esmail Bonakdarian, Ph.D., was used to analyze data from two classical economics experiments. As can be seen in this figure, optimization of the search over subsets of the maximum model proceeds initially at a quick rate and then slowly continues to improve over time until it converges. The top curve (red) shows the optimum value found so far, while the lower, jagged line (green) shows the current average fitness value for the population in each generation.

(…)

Regression analysis has been the traditional tool for finding and establishing statistically significant relationships in research projects, such as for the economics examples Bonakdarian chose. As long as the number of independent variables is relatively small, or the experimenter has a fairly clear idea of the possible underlying relationship, it is feasible to derive the best model using standard software packages and methodologies.

However, Bonakdarian cautioned that if the number of independent variables is large, and there is no intuitive sense about the possible relationship between these variables and the dependent variable, “the experimenter may have to go on an automated ‘fishing expedition’ to discover the important and relevant independent variables.”
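
The article doesn't include the actual implementation, but the core idea of evolving subsets of a maximum regression model is easy to sketch. Below is a minimal, hypothetical illustration in plain NumPy (not Bonakdarian's code): each individual is a binary mask over the independent variables, and fitness is the adjusted R² of an ordinary least-squares fit restricted to the selected columns.

```python
import numpy as np

rng = np.random.default_rng(0)

def adjusted_r2(X, y, mask):
    """Fit OLS on the selected columns and return adjusted R^2 as fitness."""
    cols = np.flatnonzero(mask)
    n = len(y)
    if cols.size == 0 or cols.size >= n - 1:
        return -np.inf  # empty or degenerate model
    A = np.column_stack([np.ones(n), X[:, cols]])  # intercept + selected vars
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ beta
    r2 = 1.0 - (resid @ resid) / (((y - y.mean()) ** 2).sum())
    return 1.0 - (1.0 - r2) * (n - 1) / (n - cols.size - 1)

def evolve_subsets(X, y, pop_size=60, generations=200, p_mut=0.05):
    """Tiny GA over binary masks: tournament selection, uniform
    crossover, and bit-flip mutation."""
    n_vars = X.shape[1]
    pop = rng.integers(0, 2, size=(pop_size, n_vars))
    for _ in range(generations):
        fit = np.array([adjusted_r2(X, y, ind) for ind in pop])
        # Tournament selection: keep the better of two random individuals.
        a, b = rng.integers(0, pop_size, (2, pop_size))
        parents = pop[np.where(fit[a] > fit[b], a, b)]
        # Uniform crossover between consecutive parents.
        swap = rng.random((pop_size, n_vars)) < 0.5
        children = np.where(swap, parents, np.roll(parents, 1, axis=0))
        # Bit-flip mutation.
        pop = children ^ (rng.random((pop_size, n_vars)) < p_mut)
    fit = np.array([adjusted_r2(X, y, ind) for ind in pop])
    return pop[fit.argmax()], fit.max()
```

Calling evolve_subsets(X, y) on a design matrix X and response y returns the best variable mask found and its fitness; a real model-selection fitness would often penalize complexity more aggressively (e.g., AIC or BIC).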

You can see the original research paper here.

Genetic Algorithms, Pyevolve, Python

Rosenbrock’s banana function and friends

This post reports on an optimization test suite applied to the Pyevolve GA core.

I’m writing a paper about Pyevolve and I’ve done some tests on the GA quality of Pyevolve with must common optimization test functions: Schaffer, Rastringin, Sphere, Ackley and the absolute GA-hard Rosenbrock. Follows in the end of the post, the results of the trials, I’ve exported the latex formulae as images.

The only function the GA had some trouble with (it failed to find the global minimum in 3 of 10 trials; "fail" here means it had not found an optimal solution within 5,000 generations) was Rosenbrock's function with 20 variables, also known as Rosenbrock's banana function due to the distinctive shape of its contour lines; here is the 3D plot with 2 variables.

As Wikipedia says,

This function is often used to test performance of optimization algorithms. The global minimum is inside a long, narrow, parabolic shaped flat valley. To find the valley is trivial, however to converge to the global minimum is difficult.

However, this behavior was not unexpected, since the same can be observed in the research of [1], [2], [3], and [4], among many other papers.
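
For anyone who wants to reproduce the Rosenbrock trial, a minimal setup with Pyevolve looks roughly like this; the parameter values here (search range, mutation operator, statistics frequency) are illustrative guesses, not the exact settings used in the trials:

```python
from pyevolve import G1DList, GSimpleGA, Initializators, Mutators, Consts

def rosenbrock(genome):
    """Generalized Rosenbrock function over the chromosome's real values."""
    return sum(100.0 * (genome[i + 1] - genome[i] ** 2) ** 2
               + (1.0 - genome[i]) ** 2
               for i in range(len(genome) - 1))

genome = G1DList.G1DList(20)                       # 20 variables
genome.setParams(rangemin=-2.048, rangemax=2.048)  # a commonly used domain
genome.initializator.set(Initializators.G1DListInitializatorReal)
genome.mutator.set(Mutators.G1DListMutatorRealGaussian)
genome.evaluator.set(rosenbrock)

ga = GSimpleGA.GSimpleGA(genome)
ga.setMinimax(Consts.minimaxType["minimize"])  # minimize the raw score
ga.setGenerations(5000)                        # the "fail" cutoff above
ga.evolve(freq_stats=500)
print(ga.bestIndividual())
```

With the minimax type set to "minimize", Pyevolve drives the raw score toward the global minimum of 0 at x_i = 1.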

I think these trial results bring more reliability to the framework. When the Pyevolve paper is published, I'll post it here.

[1] Seredynski, Franciszek, et al. (2003). “Function Optimization with Coevolutionary Algorithms”, in International Intelligent Information Processing and Web Mining Conference.

[2] Hiroyasu, Tomoyuki, et al. (1999). “Distributed Genetic Algorithms with Randomized Migration Rate”, in IEEE Proceedings of Systems, Man and Cybernetics Conference SMC’99, Vol. 1, pp. 689-694.

[3] Bouvry, Pascal, et al. (1997). “Distributed evolutionary optimization in Manifold: the Rosenbrock’s function case study”, Duke University.

[4] Panait, Liviu (2002). “A comparison of two competitive fitness functions”, in GECCO 2002: Proceedings of the Genetic and Evolutionary Computation Conference.
