Article
Title Evolutionary expectation maximization
Author(s) Guiraud, Enrico (Oldenburg U. ; CERN) ; Drefs, Jakob (Oldenburg U.) ; Lücke, Jörg (Oldenburg U.)
Publication 2018
Number of pages 8
In: Genetic and Evolutionary Computation Conference (GECCO '18), Kyoto, Japan, 15 - 19 Jul 2018, pp.442-449
DOI 10.1145/3205455.3205588
Subject category Computing and Computers
Abstract We establish a link between evolutionary algorithms (EAs) and learning of probabilistic generative models with binary hidden variables. Learning is formulated as approximate maximum likelihood optimization using variational expectation maximization. When truncated posteriors are chosen as variational distributions, the variational parameters take the form of sets of latent states. By (A) identifying these sets with populations of genomes, and by (B) defining the fitness of individuals as the joint probability of the chosen generative model, the sets of latent states can be optimized using EAs. We obtain scalable learning algorithms that effectively improve the tractable free-energy objective of truncated posteriors. While this novel approach is applicable to any model with binary latents and a tractable joint probability, as a proof of concept we apply it here to the optimization of parameters of noisy-OR Bayes Nets (modeling binary data) and Binary Sparse Coding (modeling continuous data). We find that the data likelihood is efficiently improved by employing genetic algorithms with point mutations and single-point crossover as EAs. In general, we believe that, with the link established here, standard as well as recent results in evolutionary optimization can be leveraged to address the difficult problem of parameter optimization in generative models.
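
The core loop described in the abstract can be illustrated with a minimal sketch. Everything below is hypothetical, not the authors' implementation: it assumes a Binary Sparse Coding model (binary latents s_h ~ Bernoulli(pi), observations y ~ N(Ws, sigma^2 I)), and the names log_joint_bsc and evolve_states are invented for this illustration. The sketch evolves a set of latent states for one data point using point mutations and single-point crossover, with the log joint probability as fitness, and keeps the fittest states as the truncated variational set.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_joint_bsc(y, s, W, pi, sigma):
    # Log joint log p(y, s | Theta) of Binary Sparse Coding, up to constants:
    # prior s_h ~ Bernoulli(pi), likelihood y ~ N(W s, sigma^2 I).
    prior = np.sum(s * np.log(pi) + (1 - s) * np.log(1 - pi))
    resid = y - W @ s
    return prior - resid @ resid / (2.0 * sigma**2)

def evolve_states(y, states, W, pi, sigma, n_children=8):
    # One EA generation on the set of latent states for a single data point y:
    # children arise from single-point crossover of two random parents plus a
    # point mutation (one bit flip); the K fittest unique states survive,
    # where fitness is the log joint probability under the generative model.
    K, H = states.shape
    children = np.empty((n_children, H), dtype=int)
    for c in range(n_children):
        p1, p2 = states[rng.choice(K, size=2, replace=False)]
        cut = rng.integers(1, H)                      # crossover point
        child = np.concatenate([p1[:cut], p2[cut:]])
        child[rng.integers(H)] ^= 1                   # point mutation
        children[c] = child
    pool = np.unique(np.vstack([states, children]), axis=0)
    fitness = np.array([log_joint_bsc(y, s, W, pi, sigma) for s in pool])
    return pool[np.argsort(fitness)[-K:]]             # keep the K fittest

# Toy usage: H = 5 binary causes, D = 8 observed dimensions.
H, D = 5, 8
W = rng.normal(size=(D, H))
pi, sigma = 0.2, 0.5
s_true = (rng.random(H) < pi).astype(int)
y = W @ s_true + sigma * rng.normal(size=D)

states = rng.integers(0, 2, size=(6, H))              # initial population
for _ in range(20):
    states = evolve_states(y, states, W, pi, sigma)
print(states[-1], s_true)                             # fittest state vs. truth
```

In the full scheme described by the abstract, such a set of states would be maintained per data point and alternated with M-step updates of the model parameters Theta = (W, pi, sigma); only the E-step-like evolution of latent states is sketched here.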
Copyright/License © 2018-2024 Author(s) (License: Association for Computing Machinery)

Corresponding record in: Inspire
 Record created 2024-01-20, last modified 2024-01-21