Genetic algorithms are modular metaheuristics that simulate the evolutionary process
over a set of candidate solutions. The optimization is highly adaptive but slow, which makes
statistical research difficult. In this paper we propose an algorithm in which different variants
race against each other while statistics are gathered. Our results show that
this algorithm is an efficient, standalone, and even more adaptive solution. Variants
that converge faster lead the race, but they can get stuck in local minima.
In these cases, the more agile combinations with slower convergence gain a higher probability
of being selected and find better solutions farther from the local minimum. The hybrid achieves
faster convergence with minimal additional runtime. We also provide complexity estimates
of the resource requirements.
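
To make the racing mechanism concrete, the following Python sketch shows one way such a scheme could be organised: several GA variants are interleaved, statistics on their recent improvement are gathered, and the probability of running each variant is re-weighted so that stagnating variants lose weight in favour of slower but more exploratory ones. The class and function names, the per-generation interface, and the specific re-weighting rule are illustrative assumptions, not the implementation proposed in the paper.

```python
import random

class Variant:
    """One GA variant taking part in the race (hypothetical interface)."""
    def __init__(self, name, run_one_generation):
        self.name = name
        # Callable that advances this variant's GA by one generation
        # and returns the best fitness found so far (minimization).
        self.run_one_generation = run_one_generation
        self.history = []   # best fitness after each generation it was granted
        self.weight = 1.0   # relative probability of being chosen next

def recent_improvement(variant, window=10):
    """Average fitness gain over the last `window` granted generations."""
    h = variant.history
    if len(h) < 2:
        return float("inf")  # untested variants get the benefit of the doubt
    tail = h[-window:]
    return max(tail[0] - tail[-1], 0.0)

def race(variants, budget, window=10, floor=0.05):
    """Interleave the variants for `budget` generations, re-weighting as we go."""
    for _ in range(budget):
        total = sum(v.weight for v in variants)
        chosen = random.choices(variants,
                                weights=[v.weight / total for v in variants])[0]
        chosen.history.append(chosen.run_one_generation())
        # Re-weight: variants that still improve keep leading the race,
        # while stagnating ones decay towards a floor, so slower, more
        # exploratory variants regain probability and can move the search
        # away from the local minimum.
        for v in variants:
            gain = recent_improvement(v, window)
            v.weight = 1.0 if gain == float("inf") else max(floor, gain)
    return min(variants,
               key=lambda v: v.history[-1] if v.history else float("inf"))
```

Under these assumptions, the racing loop itself adds only constant bookkeeping per generation, so the extra runtime over running a single variant is minimal, which is consistent with the behaviour described above.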