🚀 Exciting advancement from our Natural Computing team! Our latest publication, "LLaMEA: A Large Language Model Evolutionary Algorithm for Automatically Generating Metaheuristics," is now live in IEEE TEVC. This framework leverages GPT models to automatically generate and refine metaheuristic algorithms, showing competitive performance against state-of-the-art optimization techniques. Congratulations to Niki van Stein and Thomas Bäck for this innovative work in automated algorithm generation! 👏 Read more: https://lnkd.in/eXmrdu5C
I am happy to announce that our paper, "LLaMEA: A Large Language Model Evolutionary Algorithm for Automatically Generating Metaheuristics", has been accepted for publication in the IEEE Transactions on Evolutionary Computation (#TEVC). 👉 See the preprint here: https://lnkd.in/e4weRqy5

In this work, we introduce LLaMEA, a framework that leverages large language models such as GPT-4 to automate the generation and refinement of metaheuristic optimization algorithms. By iteratively generating, mutating, and selecting algorithms based on performance metrics and runtime evaluations, LLaMEA offers a novel approach to creating optimized algorithms without extensive prior expertise.

Our experiments demonstrate that LLaMEA can produce algorithms that outperform state-of-the-art optimizers such as CMA-ES on the 5-dimensional black-box optimization benchmarking (BBOB) suite. Notably, these algorithms also remain competitive on 10- and 20-dimensional instances, even though those dimensionalities were not used during the automated generation process.

A big thank you to my co-author Thomas Bäck for his invaluable contributions.

#Research #EvolutionaryComputation #IEEE #AI #LLM #EC
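For readers curious how such a generate-evaluate-refine loop can look in practice, here is a minimal sketch in the spirit of the approach described above. It is not the authors' implementation: the functions `query_llm`, `evaluate`, and `sphere`, the prompts, and the (1+1)-style selection are illustrative assumptions, with the LLM call stubbed out so the loop actually runs.

```python
# Minimal sketch of an LLM-driven generate -> evaluate -> select -> refine loop.
# All names and prompts below are illustrative assumptions, not the paper's API.
import numpy as np

def sphere(x):
    """Toy 5-dimensional objective standing in for a BBOB problem (lower is better)."""
    return float(np.sum(np.asarray(x) ** 2))

def query_llm(prompt: str) -> str:
    """Placeholder for a call to a large language model (e.g. GPT-4).
    Here it simply returns a fixed random-search implementation so the loop runs."""
    return (
        "def solve(f, dim, budget):\n"
        "    import numpy as np\n"
        "    best_x, best_f = None, float('inf')\n"
        "    for _ in range(budget):\n"
        "        x = np.random.uniform(-5, 5, dim)\n"
        "        fx = f(x)\n"
        "        if fx < best_f:\n"
        "            best_x, best_f = x, fx\n"
        "    return best_x, best_f\n"
    )

def evaluate(code: str, dim: int = 5, budget: int = 1000) -> float:
    """Compile the generated algorithm and score it on the toy objective."""
    namespace = {}
    exec(code, namespace)  # in a real system, generated code should run sandboxed
    _, best_f = namespace["solve"](sphere, dim, budget)
    return best_f

# (1+1)-style loop: keep the best algorithm found so far and ask the LLM
# to refine it based on its observed performance.
best_code = query_llm("Write a Python metaheuristic `solve(f, dim, budget)`.")
best_score = evaluate(best_code)
for generation in range(5):
    candidate = query_llm(
        f"The following algorithm scored {best_score:.3g}; improve it:\n{best_code}"
    )
    score = evaluate(candidate)
    if score <= best_score:  # selection: keep the better of parent and offspring
        best_code, best_score = candidate, score
    print(f"generation {generation}: best score {best_score:.3g}")
```

In the actual framework the candidate algorithms are benchmarked on full BBOB problem suites rather than a toy objective, and the feedback prompt carries the performance metrics and runtime information mentioned in the post.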