The Minimum Connectivity Inference (MCI) problem is an NP-hard generalisation of the well-known minimum spanning tree problem and has been studied independently in different fields of research. Let an undirected complete graph and finitely many subsets (clusters) of its vertex set be given. The MCI problem is then to find a minimum-cardinality subset of edges such that every cluster is connected with respect to this edge set. While existing approaches can, in general, only compute approximate solutions or optimal edge sets for rather small instances, methods for solving larger instances to optimality have not yet been proposed in the literature. For this reason, we present a new mixed integer linear programming formulation for the MCI problem and introduce new instance reduction methods that can be applied to reduce the size of a given instance prior to the optimisation. Based on theoretical and computational results, both contributions are shown to be beneficial for solving larger instances.
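To make the problem statement concrete, a generic cut-based model of MCI can be sketched as follows; this is a standard exponential-size formulation given here for illustration only, not necessarily the new formulation proposed in the paper. Here $G=(V,E)$ denotes the complete graph, $C_1,\dots,C_k$ the clusters, and $x_e$ a binary variable indicating whether edge $e$ is selected.

```latex
% Generic cut-based sketch of the MCI problem (illustrative assumption,
% not the paper's new formulation). E(C_i) denotes the edges of the
% complete graph with both endpoints in cluster C_i, and \delta_i(S)
% the edges of E(C_i) with exactly one endpoint in S. Requiring every
% proper nonempty cut inside a cluster to be crossed by a selected edge
% is equivalent to requiring that cluster to induce a connected subgraph.
\begin{align*}
  \min \quad & \sum_{e \in E} x_e \\
  \text{s.t.} \quad
    & \sum_{e \in \delta_i(S)} x_e \ge 1
      && \forall\, i \in \{1,\dots,k\},\ \forall\, \emptyset \ne S \subsetneq C_i, \\
    & x_e \in \{0,1\} && \forall\, e \in E.
\end{align*}
```

Since the number of cut constraints grows exponentially with the cluster sizes, such a model is usually handled by separating violated cuts on the fly or by switching to a compact flow-based reformulation; how the paper's own formulation and reduction methods improve on this is detailed in the full text.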
Citation
Optimization, https://doi.org/10.1080/02331934.2018.1465944
Technical Report MATH-NM-05-2017, Institute of Numerical Mathematics, Faculty of Mathematics, TU Dresden, Germany, October 2017