This study concerns first-order algorithms for solving structured unconstrained convex optimization problems in which the underlying objective involves high-dimensional data. The primary aim is to develop an implementable algorithmic framework, based on the optimal subgradient algorithm OSGA proposed by {\sc Neumaier} in \cite{NeuO}, for solving problems with multi-term composite objective functions involving linear mappings. To this end, we propose several prox-functions for which the corresponding OSGA subproblem can be solved in closed form. For various inverse problems arising in signal and image processing, machine learning, and statistics, we report extensive numerical results and comparisons with several state-of-the-art solvers, demonstrating the favourable performance of our algorithm. We also compare OSGA with the most widely used optimal first-order methods on smooth and nonsmooth convex problems. Surprisingly, when Nesterov-type optimal methods originally designed for smooth problems are adapted to nonsmooth problems by simply passing a subgradient in place of the gradient, the resulting subgradient-based algorithms remain competitive for solving nonsmooth problems.
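As a concrete illustration of such a closed-form subproblem solution, the sketch below maximizes the OSGA rational subproblem $E_{\gamma,h}(z) = -(\gamma + \langle h, z\rangle)/Q(z)$ for the standard quadratic prox-function $Q(z) = Q_0 + \tfrac{1}{2}\|z - z_0\|^2$ with $Q_0 > 0$. This is a minimal sketch under that particular choice of prox-function; the function name \texttt{osga\_subproblem} and the restriction to the quadratic case are our illustrative assumptions, not the paper's full framework.
\begin{verbatim}
import numpy as np

def osga_subproblem(gamma, h, z0, Q0):
    # Maximize E(z) = -(gamma + h.z) / Q(z) over z, where
    # Q(z) = Q0 + 0.5*||z - z0||^2 with Q0 > 0 (one possible
    # prox-function; an illustrative assumption here).
    # Stationarity gives the maximizer u = z0 - h/e, where e > 0
    # solves the quadratic Q0*e^2 + beta*e - 0.5*||h||^2 = 0
    # with beta = gamma + h.z0.
    beta = gamma + h @ z0
    e = (-beta + np.sqrt(beta**2 + 2.0 * Q0 * (h @ h))) / (2.0 * Q0)
    u = z0 - h / e
    return e, u

# Example: a two-dimensional instance
e, u = osga_subproblem(-1.0, np.array([1.0, 0.0]), np.zeros(2), 0.5)
\end{verbatim}
Because the maximizer is obtained from one inner product and one norm, each subproblem costs only $O(n)$ operations, which is what makes such prox-functions attractive for the high-dimensional problems considered here.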
Optimal subgradient algorithms with application to large-scale linear inverse problems. Faculty of Mathematics, University of Vienna.