Let's consider the following simple problem:

\( \min_{x_1,\, x_2} \quad f = x_1^2 + x_2^2 \)

The solution is obviously \( x_1 = 0 \) and \( x_2 = 0 \). Let's use CMA-ES (Covariance Matrix Adaptation Evolution Strategy) to find it.

\( x_1^* = \)

\( x_2^* = \)

\( f_{min} = \)

\( gen = \)

- Click the "Run 1 generation" or "Keep running" button to see CMA-ES in action.
- Click "Reset" to restart from the beginning.

- CMA-ES is a population-based heuristic strategy that improves a population of candidate solutions through an evolutionary process to reach a near-optimal solution.
- It uses a multivariate Gaussian to represent the current population in a generation.
- Then, in each generation, CMA-ES:
  - generates sample points using the mean \( \mathbf{x}^{mean} \) and the covariance matrix \( \begin{bmatrix} \sigma^2_{x_1} & \sigma_{x_1,x_2}\\ \sigma_{x_2,x_1} & \sigma^2_{x_2} \end{bmatrix} \) of the multivariate Gaussian;
  - updates the covariance matrix using the fitness values (\( f \)) obtained at the sampled points;
  - updates the mean of the sampling distribution toward the direction of improvement.

- The process is repeated until a satisfactory result is achieved.
- In the example problem above, the initial guess is \( \mathbf{x}^{mean} = [1, 1]^T \), with an initial standard deviation of \( 0.3 \) for both \( x_1 \) and \( x_2 \) and zero covariance.
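To make the sampling step concrete, here is a minimal NumPy sketch (not the demo's actual code) that draws one generation from the initial distribution described above; the population size is chosen only for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Initial search distribution from the example above:
# mean [1, 1], standard deviation 0.3 in each coordinate, zero covariance.
x_mean = np.array([1.0, 1.0])
C = np.diag([0.3**2, 0.3**2])  # covariance matrix of the multivariate Gaussian

# Draw one generation of candidate solutions.
n_samples = 20  # population size (illustrative choice)
samples = rng.multivariate_normal(x_mean, C, size=n_samples)

# Evaluate the fitness f = x1^2 + x2^2 at every sampled point.
fitness = samples[:, 0]**2 + samples[:, 1]**2
```

These fitness values are what the algorithm then uses to rank the samples and update the distribution.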

Change the covariance matrix using the sliders to see how the sampled distribution changes.

Mean:

Covariance Matrix:

Number of sampled points:

- Changing the mean values will move the region where the points are sampled.
- Changing the variances \( C_{11} \) and \( C_{22} \) will adjust the spread of the sampled points along the corresponding directions.
- Changing the covariances \( C_{12} \) or \( C_{21} \) will adjust the interdependence (correlation) of the variables.
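The effect of the variance sliders can be checked numerically. This sketch (assuming NumPy; it is not tied to the demo's implementation) compares the spread of samples along \( x_2 \) for two values of \( C_{22} \):

```python
import numpy as np

rng = np.random.default_rng(0)
mean = np.array([0.0, 0.0])

# Two covariance matrices that differ only in C22, the variance of x2:
C_narrow = np.array([[0.09, 0.0], [0.0, 0.01]])
C_wide = np.array([[0.09, 0.0], [0.0, 0.25]])

narrow = rng.multivariate_normal(mean, C_narrow, size=5000)
wide = rng.multivariate_normal(mean, C_wide, size=5000)

# The empirical spread along x2 tracks sqrt(C22): about 0.1 vs about 0.5.
print(narrow[:, 1].std(), wide[:, 1].std())
```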

Note that the scale for the covariances \( C_{12} \) and \( C_{21} \) is set so that their range corresponds to a correlation between \(-1 \) and \( 1 \). This is because the covariance of two variables is bounded: \( |C_{12}| \le \sqrt{C_{11} C_{22}} \).
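This bound follows from the Cauchy–Schwarz inequality, and it is why a correlation slider on \([-1, 1]\) can always be mapped to a valid covariance. A small sketch with illustrative values:

```python
import numpy as np

# Cauchy-Schwarz bound on covariance: |C12| <= sqrt(C11 * C22).
C11, C22 = 0.09, 0.04
max_cov = np.sqrt(C11 * C22)  # 0.06

# A correlation slider value rho in [-1, 1] maps to a covariance via:
rho = 0.5
C12 = rho * max_cov  # 0.03

# The resulting matrix is positive semi-definite (all eigenvalues >= 0),
# so it is a valid covariance matrix for sampling.
C = np.array([[C11, C12], [C12, C22]])
eigvals = np.linalg.eigvalsh(C)
print(max_cov, C12, eigvals)
```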

By using the fitness values at the sampled points, the CMA-ES algorithm makes clever adaptations to the covariance matrix and mean values, and thereby performs effective optimisation. For more details on the adaptation procedures, one may refer to the following resources.

- Hansen, Nikolaus. "The CMA evolution strategy: A tutorial." arXiv preprint arXiv:1604.00772 (2016).
- Hansen, Nikolaus, Sibylle D. Müller, and Petros Koumoutsakos. "Reducing the time complexity of the derandomized evolution strategy with covariance matrix adaptation (CMA-ES)." Evolutionary Computation 11, no. 1 (2003): 1-18. (URL)
- CMA-ES official GitHub page with materials and source codes written in several programming languages.
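To illustrate the overall sample-select-update loop on the example problem, here is a deliberately simplified, CMA-ES-style sketch in NumPy. It re-estimates the mean and covariance from the best samples of each generation (deviations measured from the previous mean, loosely in the spirit of the rank-\(\mu\) update) and omits the evolution paths, step-size control, and recombination weights that the real algorithm uses; see the references above for the actual adaptation rules.

```python
import numpy as np

def simple_es(f, x_mean, sigma, n_gen=60, pop=20, n_elite=10, seed=1):
    """Highly simplified CMA-ES-style loop: sample, select the best,
    re-estimate the mean and covariance.  A sketch only, not real CMA-ES."""
    rng = np.random.default_rng(seed)
    n = len(x_mean)
    C = (sigma**2) * np.eye(n)
    for _ in range(n_gen):
        # Sample a population from the current distribution.
        samples = rng.multivariate_normal(x_mean, C, size=pop)
        # Rank by fitness and keep the best n_elite samples.
        order = np.argsort([f(x) for x in samples])
        elite = samples[order[:n_elite]]
        # Measure deviations from the *previous* mean so the new covariance
        # stretches along the direction of improvement.
        diffs = elite - x_mean
        x_mean = elite.mean(axis=0)
        C = diffs.T @ diffs / n_elite + 1e-12 * np.eye(n)  # keep C positive definite
    return x_mean

x_star = simple_es(lambda x: x[0]**2 + x[1]**2,
                   x_mean=np.array([1.0, 1.0]), sigma=0.3)
print(x_star)  # should approach the optimum [0, 0]
```

Even this stripped-down loop drives the distribution toward the minimum of the sphere function; the full CMA-ES machinery makes the adaptation far more robust on ill-conditioned and non-separable problems.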