Maximizing the difference of two convex functions over a convex feasible set (the so-called
DC problem) is a hard problem. A large number of publications address this
problem, many of them variations of the widely used DCA algorithm [20]. The success of
this algorithm in reaching a good approximation of a global optimum depends crucially on the
choice of its starting point. In the algorithm developed in this paper, MDCF (Maximizing the
Difference of Convex Functions), a major effort is devoted to generating a good starting point. This point is
obtained by using the COMAX algorithm for maximizing a convex function [6]. The solution
found by COMAX serves as the basis for constructing the starting point of MDCF.
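For reference, the underlying problem and the classical DCA step can be written generically as follows; the notation below is illustrative and not necessarily that used in the paper. The problem is
\[
\max_{x \in C} \; f(x) = g(x) - h(x),
\]
where $g$ and $h$ are convex functions and $C$ is a convex feasible set. Starting from a point $x_0 \in C$, the classical DCA iteration replaces $g$ by an affine minorant at the current iterate and solves a convex subproblem:
\[
y_k \in \partial g(x_k), \qquad
x_{k+1} \in \arg\max_{x \in C} \bigl\{ \langle y_k, x \rangle - h(x) \bigr\}.
\]
This scheme produces a nondecreasing sequence of objective values $f(x_k)$, but it typically terminates at a critical point that need not be a global maximizer, which is why the choice of $x_0$ matters.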
Another contribution of the paper is an algorithm for solving problems with an indefinite
quadratic objective function over a compact convex feasible set. The problem is first
converted into the maximization of a difference of convex quadratic functions. The new algorithm,
QMDCF, is a specific adaptation of MDCF to this case.
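To illustrate the kind of conversion involved (the specific decomposition used by QMDCF may differ), a standard DC split of an indefinite quadratic uses a spectral shift. For $f(x) = \tfrac{1}{2} x^\top Q x + c^\top x$ with $Q$ indefinite, choosing any $\rho \ge \max\bigl(0, -\lambda_{\min}(Q)\bigr)$ gives
\[
f(x) \;=\; \underbrace{\tfrac{1}{2}\, x^\top (Q + \rho I)\, x + c^\top x}_{g(x)\ \text{convex}}
\;-\; \underbrace{\tfrac{\rho}{2}\, \|x\|^2}_{h(x)\ \text{convex}},
\]
so that maximizing $f$ over the compact convex feasible set becomes an instance of the DC maximization problem above.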
The performance of the two new algorithms developed in the paper is tested numerically,
and the results are compared with those of the classical DCA and several other algorithms.