A necessary condition for the guarantee of the superiorization method

We study a method whose principal task is convex feasibility-seeking and which makes secondary efforts to reduce objective function values. This is the well-known superiorization method (SM), in which the iterates of an asymptotically convergent iterative feasibility-seeking algorithm are perturbed by objective function nonascent steps. We investigate under what conditions a sequence generated by an SM algorithm converges asymptotically to a feasible point whose objective function value is superior (meaning smaller than or equal) to that of a feasible point reached by the corresponding unperturbed algorithm (i.e., the same feasibility-seeking algorithm that the SM algorithm employs, run without perturbations). This question has so far been only partially answered in the literature. We present a condition under which an SM algorithm that uses negative gradient descent steps in its perturbations fails to yield such a superior outcome. The significance of discovering this “negative condition” is that its inverse will have to be assumed to hold in any future guarantee result for the SM. The condition is also important for practitioners who use the SM, because it can be avoided in experimental work with the SM, thereby increasing the method's success rate in real-world applications.
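
To make the structure described above concrete, the following is a minimal illustrative sketch in Python of a superiorized feasibility-seeking loop; it is not the specific algorithm analyzed in the paper. The function names, the two-ball feasibility problem, the objective f(x) = ||x||², and the summable step sizes aᵏ are assumptions chosen only to show how negative-gradient nonascent perturbations are interleaved with projection sweeps.

```python
import numpy as np

def project_onto_ball(x, center, radius):
    """Euclidean projection of x onto the ball ||x - center|| <= radius."""
    d = x - center
    dist = np.linalg.norm(d)
    return x if dist <= radius else center + radius * d / dist

def superiorized_feasibility(x0, projections, f, grad_f, n_iters=200, a=0.5):
    """Illustrative superiorization sketch: each feasibility-seeking sweep is
    preceded by an objective function nonascent step taken along the negative
    gradient of f, with summable step sizes a**k (assumed here)."""
    x = np.asarray(x0, dtype=float)
    k = 0
    for _ in range(n_iters):
        # Perturbation phase: negative-gradient step, accepted only if nonascending.
        g = grad_f(x)
        norm_g = np.linalg.norm(g)
        if norm_g > 0:
            candidate = x - (a ** k) * g / norm_g
            if f(candidate) <= f(x):  # objective function nonascent check
                x = candidate
            k += 1
        # Feasibility-seeking phase: one sweep of successive projections.
        for proj in projections:
            x = proj(x)
    return x

# Toy feasibility problem (assumed for illustration): intersect two balls,
# while the perturbations try to reduce f(x) = ||x||^2.
projections = [
    lambda x: project_onto_ball(x, np.array([1.0, 0.0]), 1.5),
    lambda x: project_onto_ball(x, np.array([-1.0, 0.0]), 1.5),
]
f = lambda x: float(np.dot(x, x))
grad_f = lambda x: 2.0 * x

x_sup = superiorized_feasibility(np.array([3.0, 3.0]), projections, f, grad_f)
print("superiorized point:", x_sup, "f =", f(x_sup))
```

Running the same loop with the perturbation phase removed yields the “unperturbed” comparison point mentioned above; the paper's question is when the superiorized run is guaranteed to end at a feasible point with an objective value no larger than that one.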

Citation

Accepted for publication in the journal "Optimization Letters".
