Funding: NWO Physical Sciences, TOP Module 2
Running Period: 2016-2020
Staff: Bodo Manthey
Ph.D. student: Stefan Klootwijk
Large-scale optimization problems show up in many domains, such as engineering, scheduling, and economics, but also in the sciences. Unfortunately, finding optimal solutions within reasonable time is often impossible because the problems that have to be solved are computationally intractable. Because of this, optimization problems are nowadays often attacked using ad-hoc heuristics. Many such heuristics perform remarkably well in practice, yet their theoretical (worst-case) performance is poor: worst-case analysis is often too pessimistic to reflect the performance observed. To explain the performance of heuristics, probabilistic analysis is the method of choice, where performance is analyzed with respect to random instances.
The instances of many optimization problems involve, implicitly or explicitly, a metric space. Its distances can be physical distances, but also, e.g., costs for travel or transportation. Up to now, however, probabilistic analysis of algorithms has been almost exclusively restricted to Euclidean instances or to instances whose distances are drawn independently, disregarding the metric structure. Both approaches fall short of explaining the average-case performance of heuristics on general metric instances.
Our goal is to develop and apply a framework for random metric spaces. We develop models for random metric spaces, study their properties, and apply these findings to explain the performance of heuristics for optimization problems.
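As an illustration of what a model for random metric spaces can look like, the sketch below (a hypothetical example, not necessarily the specific model studied in this project) draws independent random edge weights on a complete graph and then closes them under shortest paths. The closure step is what turns arbitrary independent weights into a genuine metric satisfying the triangle inequality:

```python
import random

def random_shortest_path_metric(n, seed=None):
    """Generate a random metric on n points: draw independent exponential
    edge weights on the complete graph and replace each distance by the
    corresponding shortest-path distance (Floyd-Warshall closure).
    The result is symmetric, has zero diagonal, and satisfies the
    triangle inequality."""
    rng = random.Random(seed)
    # Independent random weights; by themselves these need not form a metric.
    d = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1, n):
            w = rng.expovariate(1.0)
            d[i][j] = d[j][i] = w
    # Floyd-Warshall closure: shortcut every pair through every intermediate
    # point, which enforces d[i][j] <= d[i][k] + d[k][j] for all triples.
    for k in range(n):
        for i in range(n):
            for j in range(n):
                if d[i][k] + d[k][j] < d[i][j]:
                    d[i][j] = d[i][k] + d[k][j]
    return d
```

Note that the resulting distances are no longer independent: a single short edge can shorten many pairwise distances at once, which is exactly the kind of correlation that independent-distance models ignore.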