Research in optimization algorithm design is often accompanied by benchmarking a new algorithm. Some benchmarking is done as a proof of concept, demonstrating that the new algorithm works on a small number of difficult test problems. Alternatively, some benchmarking is done to demonstrate that the new algorithm in some way outperforms previous methods. In this circumstance it is important to note that algorithm performance can depend heavily on the selection of a number of user-input parameters. In this paper we begin by demonstrating that if algorithms are compared using arbitrary parameter selections, the results do not just compare the algorithms, but also compare the authors' ability to select good parameters. We further present a novel technique for generating and reporting results using each algorithm's "optimal parameter selection". These optimal parameter selections can be computed using modern derivative-free optimization methods, and they provide a framework for fairer benchmarking of optimization algorithms.
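To make the idea concrete, a minimal sketch of the approach described above is given below in Python: an inner "algorithm under test" is scored on a benchmark suite, and an outer derivative-free optimizer searches over that algorithm's parameters so the reported score reflects (approximately) its best parameter selection rather than an arbitrary hand-picked one. Everything in the sketch is an illustrative assumption, not the authors' actual setup: the random-search solver, the quadratic test suite, the scoring function, and the choice of Nelder-Mead as the outer derivative-free method are all hypothetical stand-ins.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical algorithm being benchmarked: a simple random-search solver whose
# behaviour depends on two user-chosen parameters (initial radius, shrink factor).
def random_search_solver(objective, x0, radius, shrink, iters=200, seed=0):
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    best = objective(x)
    r = radius
    for _ in range(iters):
        candidate = x + r * rng.standard_normal(x.shape)
        value = objective(candidate)
        if value < best:
            x, best = candidate, value
        else:
            r *= shrink  # shrink the sampling radius after an unsuccessful step
    return best

# Stand-in test suite: shifted quadratics with different optima.
test_problems = [
    (lambda x, c=c: float(np.sum((x - c) ** 2)), np.zeros(3))
    for c in (1.0, -2.0, 0.5)
]

# Benchmark score for one parameter selection: mean final objective value over
# the suite (lower is better); non-finite results are heavily penalized.
def benchmark_score(params):
    radius, shrink = params
    scores = [random_search_solver(f, x0, radius, shrink) for f, x0 in test_problems]
    mean = float(np.mean(scores))
    return mean if np.isfinite(mean) else 1e12

# Outer derivative-free optimization over the parameters themselves, so the
# algorithm is compared at its (approximately) optimal parameter selection.
result = minimize(benchmark_score, x0=[1.0, 0.9], method="Nelder-Mead")
print("tuned (radius, shrink):", result.x)
print("benchmark score at tuned parameters:", result.fun)
```

Under this framing, each competing algorithm would be given the same outer tuning budget, so the final comparison measures the algorithms themselves rather than the quality of each author's hand-picked parameters.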