2.1. Consistency of algorithms
In the paper, the consistency of an algorithm is defined using convergence in probability. A stochastic iterative global optimization algorithm is said to be consistent over f if the maximum of the evaluations of the samples produced by the algorithm converges in probability to the maximum of f over the search space, i.e.,
\max_{0 \le i \le n} f(X_i) \xrightarrow{p} \max_{x \in \mathcal{X}} f(x),
where X_i is the i-th sample produced by the algorithm. Note that we consider here the case where an algorithm seeks to maximize a function; the definition can be adapted to the case of minimization.
The definition of consistency can be unfolded as follows:
\forall \varepsilon > 0, \; \mathbb{P}(|\max_{0 \le i \le n} f(X_i) - \max_{x \in \mathcal{X}} f(x)| > \varepsilon) \xrightarrow[n \to \infty]{} 0,
where \mathbb{P} denotes the probability measure on sequences of samples of size n + 1 produced by the algorithm. In our formalization, we showed that \mathbb{P} is equal to Algorithm.fin_measure.
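From its use below, A.fin_measure, applied to a proof that f is continuous, behaves as a measure on sequences of samples that can be evaluated on a set, following Mathlib's Measure API. As a purely illustrative sketch of its shape (assuming Mathlib's Measure type here; the actual definition is part of the formalization):
-- Shape of fin_measure as used in this section (an assumption, not its definition):
-- A.fin_measure hf.continuous : Measure (iter α n)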
To formally define consistency using Algorithm.fin_measure, we define the set of sequences of samples of size n + 1 for which the maximum of the evaluations is more than ε away from the maximum of f over the compact search space (denoted fmax).
/-- Sequences of `n + 1` samples whose best evaluation is more than `ε` away
from the maximum `fmax` of `f` over the search space. -/
def set_dist_max {f : α → β} (hf : Lipschitz f) {n : ℕ} (ε : ℝ) :=
  {u : iter α n | dist (Tuple.max (f ∘ u)) (fmax hf) > ε}
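As a sanity check, membership in this set unfolds definitionally to the distance condition (a minimal sketch, assuming the ambient context of the formalization):
-- Membership in `set_dist_max` is, by definition, the distance bound.
example {f : α → β} (hf : Lipschitz f) {n : ℕ} (ε : ℝ) (u : iter α n) :
    u ∈ set_dist_max hf ε ↔ dist (Tuple.max (f ∘ u)) (fmax hf) > ε :=
  Iff.rfl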
We can now define the measure of this set of sequences.
/-- The measure, under algorithm `A`, of the set of sample sequences whose best
evaluation is more than `ε` away from `fmax`, as a function of `ε` and `n`. -/
noncomputable def measure_dist_max (A : Algorithm α β) {f : α → β} (hf : Lipschitz f) :=
  fun ε n => A.fin_measure hf.continuous (set_dist_max hf (n := n) ε)
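Note that measure_dist_max A hf ε is a sequence indexed by n, which is exactly the shape expected by Tendsto below; its values are measures of sets and thus, assuming Mathlib's Measure API, live in ℝ≥0∞.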
Finally, an algorithm A is consistent over a Lipschitz function f if, for every ε > 0, measure_dist_max A hf ε tends to 0 as n tends to infinity.
/-- `A` is consistent over a Lipschitz function `f` if, for every `ε > 0`, the
measure of the bad sample sequences tends to `0` as the number of samples grows. -/
def is_consistent (A : Algorithm α β) {f : α → β} (hf : Lipschitz f) :=
  ∀ ⦃ε⦄, 0 < ε → Tendsto (measure_dist_max A hf ε) atTop (𝓝 0)
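Note the strict-implicit binder ⦃ε⦄: ε is only inferred once a positivity proof is supplied, following Mathlib conventions. As a purely illustrative usage sketch (all names as above), specializing consistency at a fixed ε recovers the limit statement of the unfolded definition:
-- Hypothetical usage: a consistency hypothesis yields the limit at any fixed ε.
example (A : Algorithm α β) {f : α → β} (hf : Lipschitz f)
    (h : is_consistent A hf) {ε : ℝ} (hε : 0 < ε) :
    Tendsto (measure_dist_max A hf ε) atTop (𝓝 0) :=
  h hε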