A progressive decoupling algorithm for minimizing the difference of convex and weakly convex functions over a linear subspace

Decomposition and splitting techniques for optimization problems commonly depend strongly on convexity. Implementable splitting methods for nonconvex and nonsmooth optimization problems are scarce and often lack convergence guarantees. Among the few exceptions is the Progressive Decoupling Algorithm (PDA), which converges locally provided convexity can be elicited. In this work, we furnish PDA with a descent … Read more
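For context, a schematic form of a progressive decoupling step, as it is commonly stated for minimizing a function \(f\) over a linear subspace \(S\) with multiplier \(y^{k} \in S^{\perp}\), proximal parameter \(r\), and elicitation parameter \(e < r\); this is a generic sketch and need not coincide with the exact update analyzed in the paper.

\[
\hat{x}^{k} \in \operatorname*{arg\,min}_{x}\Big\{ f(x) - \langle y^{k}, x\rangle + \tfrac{r}{2}\,\|x - x^{k}\|^{2} \Big\},
\qquad
x^{k+1} = P_{S}\big(\hat{x}^{k}\big),
\qquad
y^{k+1} = y^{k} - (r - e)\,P_{S^{\perp}}\big(\hat{x}^{k}\big),
\]

where \(P_{S}\) and \(P_{S^{\perp}}\) denote the orthogonal projections onto \(S\) and its orthogonal complement.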

The proximal point method for locally Lipschitz functions in multiobjective optimization

This paper studies the constrained multiobjective optimization problem of finding Pareto critical points of vector-valued functions. The proximal point method considered by Bonnel et al. (SIAM J. Optim., 15 (2005), pp. 953-970) is extended to locally Lipschitz functions in the finite-dimensional multiobjective setting. To this end, a new approach for convergence analysis of the … Read more
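As a rough reminder of the scheme being extended, the vector proximal point iteration in the style of Bonnel et al. can be sketched as follows, for a vector-valued map \(F\), ordering cone \(C\) (e.g. the nonnegative orthant), parameters \(\alpha_{k} > 0\), and directions \(e_{k} \in \operatorname{int} C\); the locally Lipschitz extension developed in the paper will modify this.

\[
x^{k+1} \in \operatorname*{arg\,min}^{w}_{x \in \Omega_{k}} \Big\{ F(x) + \tfrac{\alpha_{k}}{2}\,\|x - x^{k}\|^{2}\, e_{k} \Big\},
\qquad
\Omega_{k} = \{\, x : F(x) \preceq_{C} F(x^{k}) \,\},
\]

where \(\operatorname*{arg\,min}^{w}\) denotes a weak Pareto solution of the vector-valued subproblem.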

A generalized proximal linearized algorithm for DC functions with application to the optimal size of the firm problem

A proximal linearized algorithm with a quasi-distance as the regularization term for minimizing a DC function (difference of two convex functions) is proposed. If the sequence generated by our algorithm is bounded, it is proved that every cluster point is a critical point of the function under consideration, even if minimizations are performed inexactly at … Read more
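To fix ideas, a proximal linearized DC step with a quasi-distance regularizer \(q\) can be written, for \(f = g - h\) with \(g, h\) convex and a subgradient \(w^{k} \in \partial h(x^{k})\), roughly as below; the exact (and inexact) subproblems of the paper may differ.

\[
x^{k+1} \in \operatorname*{arg\,min}_{x}\Big\{ g(x) - \langle w^{k}, x \rangle + \tfrac{\lambda_{k}}{2}\, q(x^{k}, x)^{2} \Big\},
\qquad w^{k} \in \partial h(x^{k}),\ \lambda_{k} > 0.
\]

Linearizing only the concave part \(-h\) keeps each subproblem convex apart from the quasi-distance term.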

A proximal point algorithm for DC functions on Hadamard manifolds

An extension of a proximal point algorithm for the difference of two convex functions is presented in the context of Riemannian manifolds of nonpositive sectional curvature. If the sequence generated by our algorithm is bounded, it is proved that every cluster point is a critical point of the function (not necessarily convex) under consideration, even if … Read more
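As an illustration only, on a Hadamard manifold \(M\) with Riemannian distance \(d\) and exponential map \(\exp\), a proximal step for a DC function \(f = g - h\) with \(w^{k} \in \partial h(x^{k})\) is often written as below; the update actually analyzed in the paper may take a different form.

\[
x^{k+1} \in \operatorname*{arg\,min}_{x \in M}\Big\{ g(x) - \big\langle w^{k}, \exp_{x^{k}}^{-1} x \big\rangle_{x^{k}} + \tfrac{\lambda_{k}}{2}\, d(x, x^{k})^{2} \Big\},
\qquad \lambda_{k} > 0,
\]

where the inverse exponential map \(\exp_{x^{k}}^{-1}\) plays the role of the linearization \(x - x^{k}\) used in the Euclidean setting.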