09:00 – 09:30
Coffee Break

09:30 – 11:00
Tutorial:
Theory of Randomised Search Heuristics for Continuous Optimisation
Anne Auger and Nikolaus Hansen

11:00 – 11:30
Time for Questions and Discussion

11:30 – 12:00
Coffee Break

12:00 – 12:30
Noise, Complexity and Bandits
Olivier Teytaud
(joint work with
Guillaume Chaslot and
Philippe Rolet)
Abstract
We show nearly tight complexity bounds for noisy
optimization in frameworks not covered by previously
published papers.

12:30 – 13:00
Log-Linear Convergence and Divergence of the Scale-Invariant (1+1)-ES in Noisy Environments
Mohamed Jebalia
(joint work with
Anne Auger and
Nikolaus Hansen)
Abstract
Noise is present in many real-world continuous optimization problems.
Stochastic search algorithms such as Evolution Strategies (ESs) have
been proposed as effective search methods in such contexts. In this
work, we provide a mathematical analysis of the convergence of a
(1+1)-ES on unimodal spherical objective functions in the presence
of noise. First, we prove for a multiplicative noise model that, for a
positive expected value of the noisy objective function, convergence
or divergence happens depending on the infimum of the support of
the noise. Our results also show that previously obtained approximate
results (see [1]) do not carry over to finite search space dimensions
as one might have hoped. These results were obtained using the
Borel–Cantelli Lemma and were presented at the 10th PPSN Conference [2].
Second, we investigate convergence rates and show that log-linear
convergence is preserved in the presence of noise. This result is a strong
theoretical foundation for the robustness of ESs with respect to noise.
It was obtained using the Law of Large Numbers for Markov chains.
[1] D. V. Arnold and H.-G. Beyer. Local performance of the (1+1)-ES in a noisy environment. IEEE Transactions on Evolutionary Computation, 6(1):30–41, 2002.
[2] M. Jebalia and A. Auger. On multiplicative noise models for stochastic search. In G. Rudolph et al., editors, Proceedings of Parallel Problem Solving from Nature (PPSN X), pages 52–61, 2008.
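The scale-invariant (1+1)-ES under multiplicative noise is easy to experiment with. The following toy sketch (all parameter values are illustrative choices, not the talk's exact setting) keeps the mutation strength proportional to the parent's distance to the optimum and accepts on a noisy comparison; with the narrow noise support used here the run converges log-linearly, in line with the dependence on the noise support discussed in the abstract:

```python
import numpy as np

def scale_invariant_es(dim=10, sigma=0.1, noise=(0.9, 1.1), iters=3000, seed=0):
    """Scale-invariant (1+1)-ES on a sphere with multiplicative noise.

    Every evaluation of f(x) = ||x||^2 is multiplied by a fresh uniform
    noise factor whose support has a positive infimum (illustrative toy
    model, not the authors' exact framework).
    """
    rng = np.random.default_rng(seed)
    x = np.ones(dim)
    noisy_f = lambda z: np.dot(z, z) * rng.uniform(*noise)
    for _ in range(iters):
        # scale invariance: step length proportional to ||x||
        y = x + sigma * np.linalg.norm(x) * rng.standard_normal(dim)
        # noisy comparison; the parent is re-evaluated as well
        if noisy_f(y) <= noisy_f(x):
            x = y
    return np.linalg.norm(x)

final = scale_invariant_es()
```

For this noise support the distance to the optimum shrinks by many orders of magnitude; widening the support degrades the selection signal and can flip the behaviour to divergence.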

13:00 – 14:00
Lunch Break

14:00 – 14:30
Gaussian Adaptation revisited – an entropic view on Covariance Matrix Adaptation
Christian Müller
(joint work with Ivo F. Sbalzarini)
Abstract
High-dimensional, non-convex and noisy optimisation
problems are commonplace in physics, engineering and
biology. In most cases, such problems can only be tackled
by stochastic search or optimisation heuristics.
We revisit Gaussian Adaptation (GaA) [1–5], a
design-centering and optimisation method for discrete and
continuous problems that was developed in the late
1960s. This largely neglected search heuristic shares
several interesting features with the well-known
Covariance Matrix Adaptation Evolution Strategy (CMA-ES)
[6] and with Simulated Annealing (SA) [7]. During
exploration of the search space, GaA samples single
candidate solutions from a multivariate normal
distribution and continuously updates the first and second
moments (mean and covariance) of the sampling
distribution. While the selection mechanism and moment
adaptation of CMA-ES are intended to increase the
likelihood of sampling better candidate solutions, GaA
adapts the moments so as to maximise the entropy (or
dispersion) of the search distribution. Sample-point
selection is controlled by a monotonically decreasing
acceptance threshold, reminiscent of the cooling schedule
in SA. Like CMA-ES and SA, GaA also works on discontinuous
and noisy functions, where gradients or higher-order
derivatives may not exist.
We describe the theoretical foundations of GaA and analyze
some key features of the algorithm. Furthermore, we
compare the working mechanisms of GaA and CMA-ES and
analyze their convergence behaviour on selected test
functions. We conclude our study with an outlook on
possible extensions of CMA-ES by maximum-entropy ideas,
and on the inclusion of evolutionary concepts into GaA.
[1] Kjellström, G. Network Optimization by Random Variation of Component Values. Ericsson Technics, vol. 25, no. 3, pp. 133–151, 1969.
[2] Kjellström, G. and Taxén, L. Stochastic Optimization in System Design. IEEE Trans. on Circ. and Syst., vol. CAS-28, no. 7, July 1981.
[3] Kjellström, G., Taxén, L. and Lindberg, P.O. Discrete Optimization of Digital Filters Using Gaussian Adaptation and Quadratic Function Minimization. IEEE Trans. on Circ. and Syst., vol. CAS-34, no. 10, October 1987.
[4] Kjellström, G. On the Efficiency of Gaussian Adaptation. Journal of Optimization Theory and Applications, vol. 71, no. 3, December 1991.
[5] Kjellström, G. and Taxén, L. Gaussian Adaptation, an Evolution-Based Efficient Global Optimizer. In C. Brezinski and U. Kulisch, editors, Computational and Applied Mathematics, Elsevier Science Publishers B.V., pp. 267–276, 1992.
[6] Hansen, N. and Ostermeier, A. Completely Derandomized Self-Adaptation in Evolution Strategies. Evolutionary Computation, 9(2):159–195, 2001.
[7] Kirkpatrick, S., Gelatt, C.D. and Vecchi, M.P. Optimization by Simulated Annealing. Science, 220(4598):671–680, 1983.
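The mechanism sketched in the abstract can be illustrated with a minimal GaA-style loop in Python. All constants below (target hitting probability, expansion/contraction factors, adaptation rates) are illustrative choices for this sketch, not Kjellström's original tuning:

```python
import numpy as np

def gaussian_adaptation(f, m0, iters=3000, seed=0):
    """Minimal GaA-style sketch: sample from N(m, r^2 C), accept below a
    decreasing threshold, and adapt mean, covariance, step size and
    threshold on acceptance (all constants are illustrative)."""
    rng = np.random.default_rng(seed)
    n = len(m0)
    m = np.array(m0, dtype=float)
    C = np.eye(n)                  # normalised shape of the distribution
    r = 1.0                        # global step size: controls the entropy
    c_T = f(m)                     # monotonically decreasing threshold
    P = 0.27                       # target hitting probability
    expand, shrink = 1.0 + 0.05 * (1 - P), 1.0 - 0.05 * P
    for _ in range(iters):
        x = m + r * np.linalg.cholesky(C) @ rng.standard_normal(n)
        if f(x) < c_T:             # sample hits below the threshold
            r *= expand            # raise entropy: larger steps
            d = x - m
            m += 0.2 * d           # mean follows accepted samples
            C = 0.9 * C + 0.1 * np.outer(d, d) / (r * r)
            C /= np.linalg.det(C) ** (1.0 / n)   # keep det(C) = 1
            c_T = 0.95 * c_T + 0.05 * f(x)       # threshold decreases
        else:
            r *= shrink            # lower entropy: smaller steps
    return m

m = gaussian_adaptation(lambda x: float(np.dot(x, x)), [3.0, 2.0])
```

The expansion/contraction pair is balanced so that the step size is stationary exactly when the empirical acceptance rate equals P, which is how GaA keeps the hitting probability, and thereby the entropy of the search distribution, under control.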

14:30 – 15:00
Tightness of Parallel Complexity Bounds:
Automatic Parallelization and the λ-Correction
Fabien Teytaud
(joint work with Olivier Teytaud)
Abstract
It is usually considered that evolutionary
algorithms are highly parallel. In fact, the theoretical
speedups for parallel evolutionary algorithms shown in
[1] are far better than empirical results; this suggests
that evolutionary algorithms are not so efficient for
large numbers of processors. In this paper, we show
(i) that in many cases automatic parallelization provably
provides better results than simply increasing the
population size; (ii) that in some precisely delimited cases
automatic parallelization of evolutionary algorithms has
optimal speedup; and (iii) a simple correction (depending on
λ) which improves the speedup of genetic
algorithms, evolution strategies and estimation of
distribution algorithms.

[1] O. Teytaud and H. Fournier. Lower bounds for evolution strategies using VC-dimension. In G. Rudolph, T. Jansen, S. M. Lucas, C. Poloni, and N. Beume, editors, PPSN, volume 5199 of Lecture Notes in Computer Science, pages 102–111. Springer, 2008.

15:00 – 15:30
Selection Pressure of Distributed Genetic Algorithms
Gabriel Luque
(joint work with Enrique Alba)
Abstract
The increasing availability of clusters of machines and multicores
has allowed the fast development of parallel evolutionary algorithms
(PEAs). Most popular parallel EAs split the whole population
into separate subpopulations that are dealt with independently
(islands). A sparse exchange of information among the component
subalgorithms leads to a whole new class of algorithms.
In this article we concentrate on the dynamics of distributed
EAs (dEAs), in particular on developing a mathematical description
of the takeover time, i.e., the time for the best solution to
completely fill up all the subpopulations of the dEA. Since we only
focus on selection (i.e., no variation operators), we expect an easy
extension of the results to many other EAs.
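The takeover dynamics described above are straightforward to simulate. The sketch below (ring topology, binary tournament selection with elitism, no variation operators; all parameter values are illustrative, not from the talk) counts the generations until one copy of the best solution, injected into a single island, fills every subpopulation:

```python
import random

def takeover_time(islands=4, size=50, migration_period=5, seed=1):
    """Takeover time in a ring of islands under binary tournament
    selection only.  Individuals are 1 (global best) or 0."""
    rng = random.Random(seed)
    pops = [[0] * size for _ in range(islands)]
    pops[0][0] = 1                      # the best appears on island 0
    gen = 0
    while any(sum(p) < size for p in pops):
        gen += 1
        # binary tournament selection inside each island
        for i, p in enumerate(pops):
            new = [max(rng.choice(p), rng.choice(p)) for _ in range(size)]
            if max(p) == 1:
                new[0] = 1              # elitism: never lose the best
            pops[i] = new
        # periodically, each island's best migrates to its ring neighbour
        if gen % migration_period == 0:
            bests = [max(p) for p in pops]
            for i in range(islands):
                if bests[i - 1]:        # ring topology: island i-1 -> i
                    pops[i][rng.randrange(size)] = 1
    return gen

t = takeover_time()
```

With these settings the best solution cannot reach the last island before the third migration event, so the takeover time is bounded below by the migration schedule, the kind of effect a mathematical model of dEA takeover has to capture.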

15:30 – 16:00
Coffee Break

16:00 – 16:30
Symmetries of Combinatorial Optimisation Problem Classes and NFL
Jon Rowe
(joint work with Michael Vose)
Abstract
The "No Free Lunch" theorem shows that any two black box algorithms
have equal performance averaged over a set of functions if and only if
that set is closed under permutations (c.u.p.). However, traditional
combinatorial problem classes are not c.u.p. If we restrict our
attention to some specific algorithms (rather than ALL black box
algorithms), then they may have equal average performance over sets of
functions which are not c.u.p. (the so-called focussed NFL theorems).
This raises the question of when two algorithms may have the same
average performance over a traditional combinatorial problem class. By
using a duality theorem between algorithms and problem instances, this
question reduces to asking when permutations of the search space leave
a problem class invariant. For certain types of problem class called
"discrete linear subset" problems (which includes MAXSAT, TSP, GRAPH
COLOURING, MAXCUT) we can characterise these permutations as
automorphisms of a corresponding hypergraph. Previous results can then
be applied to describe precisely those operators (mutation and
crossover) which are invariant with respect to these automorphisms and
thereby give rise to algorithms which are robust (in a precisely
defined sense) for the corresponding problem class.
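The opening statement is easy to verify exhaustively on a tiny instance. The sketch below (an illustrative construction, not from the talk) averages the number of evaluations two fixed-order black-box searchers need to first hit a maximum, over the set of all functions from a 3-point domain to {0, 1}; this set is closed under permutations, so the averages coincide:

```python
import itertools

def steps_to_max(order, f):
    """Evaluations a fixed-order black-box search needs to first hit
    the maximum value of f (order visits every point, so it always
    terminates)."""
    best = max(f.values())
    for t, x in enumerate(order, start=1):
        if f[x] == best:
            return t

X = [0, 1, 2]
# all 2^3 functions X -> {0, 1}: a set closed under permutations of X
functions = [dict(zip(X, vals)) for vals in itertools.product((0, 1), repeat=3)]
orders = [(0, 1, 2), (2, 0, 1)]   # two different deterministic algorithms
avg = [sum(steps_to_max(o, f) for f in functions) / len(functions)
       for o in orders]
```

Both visit orders average exactly the same number of evaluations over the c.u.p. set, whereas restricting the set of functions (as traditional combinatorial problem classes do) breaks this guarantee.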

16:30 – 17:00
Local Optima Networks of NK Landscapes with and without Neutrality
Gabriela Ochoa
(joint work with
Sebastien Verel and
Marco Tomassini)
Abstract
We propose a network characterization of combinatorial fitness
landscapes by adapting the notion of inherent networks proposed for
energy surfaces. We use the well-known family of NK landscapes as an
example. In our case the inherent network is the graph whose vertices
represent the local maxima in the landscape, and the edges account for
the transition probabilities between their corresponding basins of
attraction. We exhaustively extracted such networks on representative
NK landscape instances, and performed a statistical characterization
of their properties. We found that most of these network properties
are related to the search difficulty on the underlying NK landscapes
with varying values of K. In further work, we extended this formalism
to neutral fitness landscapes, which are extremely common in difficult
combinatorial search spaces. By using two known neutral variants of
the NK family (i.e. NK_p and NK_q) in which the amount of neutrality
can be tuned by a parameter, we show that our new definitions of the
optima networks and the associated basins are coherent with the
previous definitions for the non-neutral case. Moreover, our empirical
study and statistical analysis show that the features of neutral
landscapes interpolate smoothly between landscapes with maximum
neutrality and non-neutral ones. We found previously unknown structural
differences between the two studied families of neutral
landscapes. But overall, the network features studied confirmed that
neutrality in a landscape may enhance heuristic search. The
methodologies introduced here should be useful in the future to design
new search heuristics or modifications to existing ones such that the
configuration space can be searched more effectively.
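For very small instances the exhaustive extraction described above can be reproduced directly. The following sketch is an illustrative reimplementation: where the talk's edges carry transition probabilities between basins, this version simply counts one-bit-flip transitions between basins of a random NK landscape with n = 8, k = 2:

```python
import itertools, random

def nk_landscape(n=8, k=2, seed=0):
    """Random NK fitness function (illustrative implementation):
    each locus depends on itself and k random other loci."""
    rng = random.Random(seed)
    deps = [tuple([i] + rng.sample([j for j in range(n) if j != i], k))
            for i in range(n)]
    tables = [{bits: rng.random()
               for bits in itertools.product((0, 1), repeat=k + 1)}
              for _ in range(n)]
    def f(s):
        return sum(tables[i][tuple(s[j] for j in deps[i])]
                   for i in range(n)) / n
    return f

def local_optima_network(n=8, k=2, seed=0):
    """Vertices: local maxima found by best-improvement hill climbing
    from every configuration; edges: counts of one-bit-flip moves that
    cross from one basin of attraction into another."""
    f = nk_landscape(n, k, seed)
    def climb(s):
        while True:
            nbrs = [s[:i] + (1 - s[i],) + s[i + 1:] for i in range(n)]
            best = max(nbrs, key=f)
            if f(best) <= f(s):
                return s
            s = best
    basin = {s: climb(s) for s in itertools.product((0, 1), repeat=n)}
    optima = sorted(set(basin.values()))
    edges = {}
    for s, o in basin.items():
        for i in range(n):
            o2 = basin[s[:i] + (1 - s[i],) + s[i + 1:]]
            if o2 != o:
                edges[(o, o2)] = edges.get((o, o2), 0) + 1
    return optima, edges

optima, edges = local_optima_network()
```

Normalising the edge counts by basin boundary sizes would recover transition probabilities; statistics of the resulting graph (degree distribution, basin sizes, weight disparity) are the kind of features the talk relates to search difficulty.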

17:00 – 18:30
Plenary and Individual Discussions

18:30
Dinner & Pub
