General algorithmic frameworks for online problems

We study general algorithmic frameworks for online learning tasks, including binary classification, regression, multiclass problems, and cost-sensitive multiclass classification. The theorems we present give loss bounds on the behavior of our algorithms under general conditions on the iterative step sizes.
Citation: International Journal of Pure and Applied Mathematics, Vol. 46 (2008), …
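
The abstract ties the loss bounds to conditions on the iterative step sizes. As a generic, hedged illustration of that kind of setup (not the specific framework of the paper), the sketch below runs an online subgradient update for binary classification with the hinge loss and a user-supplied step-size schedule eta_t; the synthetic data, the 1/sqrt(t) schedule, and all function names are assumptions made for this example.

```python
import numpy as np

def online_hinge_updates(examples, step_size, dim):
    """Generic online learning loop: one subgradient step of the hinge loss
    per example, with step sizes eta_t supplied by step_size(t).
    Illustrative sketch only; not the specific framework of the paper."""
    w = np.zeros(dim)
    mistakes = 0
    for t, (x, y) in enumerate(examples, start=1):    # labels y in {-1, +1}
        margin = y * np.dot(w, x)
        if margin <= 0:
            mistakes += 1                             # prediction error
        if margin < 1:                                # hinge loss is positive
            w += step_size(t) * y * x                 # subgradient step
    return w, mistakes

# Usage with a diminishing step-size schedule eta_t = 1 / sqrt(t) (an assumption
# for this example, not a schedule prescribed by the paper).
rng = np.random.default_rng(0)
w_true = rng.normal(size=5)
data = [(x, float(np.sign(np.dot(w_true, x)) or 1.0))
        for x in rng.normal(size=(200, 5))]
w, mistakes = online_hinge_updates(data, lambda t: 1.0 / np.sqrt(t), dim=5)
print("mistakes made online:", mistakes)
```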

On the behavior of subgradient projections methods for convex feasibility problems in Euclidean spaces

We study subgradient projection methods for solving a convex feasibility problem with general (not necessarily hyperplanes or half-spaces) convex sets in the inconsistent case, and propose a strategy that controls the relaxation parameters in a specific self-adapting manner. This strategy leaves enough user flexibility but gives a mathematical guarantee for the algorithm's behavior in …
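
As a rough sketch of the kind of iteration described here, assuming each convex set is given as a sublevel set c_i(x) <= 0 with a subgradient oracle, one cyclic sweep of relaxed subgradient projections might look as follows. The self-adapting control of the relaxation parameters proposed in the paper is not reproduced; the two-ball toy problem and all names are illustrative.

```python
import numpy as np

def subgradient_projection_sweep(x, constraints, relaxation=1.0):
    """One cyclic sweep of relaxed subgradient projections for the feasibility
    problem c_i(x) <= 0, i = 1..m.  Each constraint is a pair (c_i, subgrad_i)
    returning the constraint value and a subgradient at x.  `relaxation` plays
    the role of the relaxation parameter; the self-adapting control of these
    parameters proposed in the paper is not reproduced here."""
    for c, subgrad in constraints:
        violation = c(x)
        if violation > 0:                       # step only on violated sets
            g = subgrad(x)
            x = x - relaxation * (violation / np.dot(g, g)) * g
    return x

# Toy problem: intersection of two Euclidean balls, each written as c_i(x) <= 0.
balls = [(np.array([0.0, 0.0]), 1.0), (np.array([1.5, 0.0]), 1.0)]
constraints = [
    (lambda x, c=c, r=r: float(np.dot(x - c, x - c) - r**2),  # c_i(x)
     lambda x, c=c: 2.0 * (x - c))                            # gradient of c_i
    for c, r in balls
]
x = np.array([3.0, 2.0])
for _ in range(100):
    x = subgradient_projection_sweep(x, constraints, relaxation=1.0)
print("approximate common point:", x)
```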

On diagonally-relaxed orthogonal projection methods

We propose and study a block-iterative projection method for solving linear equations and/or inequalities. The method allows diagonal, component-wise relaxation in conjunction with orthogonal projections onto the individual hyperplanes of the system, and is thus called diagonally-relaxed orthogonal projections (DROP). Diagonal relaxation has proven useful in accelerating the initial convergence of simultaneous and block-iterative projection …
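
Below is a minimal sketch of a fully simultaneous DROP-style step, under one reading of the abstract: orthogonal projections onto the individual hyperplanes of A x = b, combined through a diagonal matrix whose j-th entry is 1/s_j, with s_j the number of nonzero entries in column j. The block-iterative variant, the treatment of inequalities, and the paper's exact weighting and step-size analysis are not reproduced.

```python
import numpy as np

def drop_step(x, A, b, lam=1.0):
    """One fully simultaneous DROP-style step for A x = b (a single block of
    all rows): orthogonal projections onto the individual hyperplanes
    <a_i, x> = b_i, combined with diagonal, component-wise relaxation.
    The j-th diagonal weight is taken here as 1/s_j, with s_j the number of
    nonzero entries in column j; this is one reading of the abstract, and the
    paper's block-iterative version and analysis are not reproduced."""
    residuals = b - A @ x                               # b_i - <a_i, x>
    row_norms_sq = np.sum(A * A, axis=1)                # ||a_i||^2
    s = np.count_nonzero(A, axis=0).astype(float)       # s_j per column
    s[s == 0] = 1.0                                     # guard empty columns
    update = A.T @ (residuals / row_norms_sq)           # summed projection moves
    return x + lam * update / s                         # diagonal relaxation

# Toy usage on a small consistent dense system.
rng = np.random.default_rng(1)
A = rng.normal(size=(20, 5))
b = A @ rng.normal(size=5)
x = np.zeros(5)
for _ in range(500):
    x = drop_step(x, A, b, lam=1.0)
print("residual norm:", np.linalg.norm(A @ x - b))
```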

Block-Iterative Algorithms with Underrelaxed Bregman Projections

The notion of relaxation is well understood for orthogonal projections onto convex sets. For general Bregman projections it has been considered only for hyperplanes, and the question of how to relax Bregman projections onto convex sets that are not linear (i.e., not hyperplanes or half-spaces) has remained open. A definition of underrelaxation of Bregman projections onto …
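
For orientation only, here is the well-understood orthogonal case mentioned in the first sentence: a relaxed orthogonal projection x -> x + lambda (P_C(x) - x) with 0 < lambda < 2, where lambda < 1 is underrelaxation. The paper's definition of underrelaxed Bregman projections onto general convex sets is not reproduced here; the unit-ball example is an assumption made for illustration.

```python
import numpy as np

def relaxed_orthogonal_projection(x, project, lam):
    """Relaxed orthogonal projection onto a convex set C:
    x -> x + lam * (P_C(x) - x), with 0 < lam < 2; lam < 1 is underrelaxation,
    lam > 1 overrelaxation.  This is the well-understood orthogonal case the
    abstract refers to, not the underrelaxed Bregman projection defined there."""
    return x + lam * (project(x) - x)

def project_unit_ball(x):
    """Orthogonal (Euclidean) projection onto the closed unit ball."""
    n = np.linalg.norm(x)
    return x if n <= 1.0 else x / n

x = np.array([3.0, 4.0])
print(relaxed_orthogonal_projection(x, project_unit_ball, lam=0.5))  # underrelaxed step
```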