Can linear superiorization be useful for linear optimization problems?

Linear superiorization considers linear programming problems but, instead of attempting to solve them with linear optimization methods, it employs perturbation-resilient feasibility-seeking algorithms and steers them toward reduced (not necessarily minimal) target function values. The two questions that we set out to explore experimentally are (i) Does linear superiorization provide a feasible point whose linear … Read more
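
The sketch below illustrates the superiorization idea described in the abstract, under the assumption that the feasible set is given by linear inequalities $Ax \le b$ and the target function is $c^\top x$: a basic feasibility-seeking sweep (sequential orthogonal projections onto the half-spaces) is interleaved with summable, objective-reducing perturbations. Function names, parameters, and the geometric step-size schedule are illustrative choices, not taken from the paper.

```python
import numpy as np

def project_halfspace(x, a, b):
    """Orthogonal projection of x onto the half-space a.x <= b."""
    viol = a @ x - b
    if viol > 0:
        x = x - (viol / (a @ a)) * a
    return x

def linear_superiorization(A, b, c, x0, n_iter=500, alpha=0.99):
    """Illustrative sketch: sequential projections seeking A x <= b,
    perturbed toward smaller values of the linear objective c.x.
    The perturbation sizes shrink geometrically (hence are summable),
    which is the key requirement of the superiorization methodology."""
    x = x0.astype(float)
    d = -c / np.linalg.norm(c)           # non-ascent direction for c.x
    beta = 1.0
    for _ in range(n_iter):
        beta *= alpha                     # summable perturbation sizes
        x = x + beta * d                  # objective-reducing perturbation
        for a_i, b_i in zip(A, b):        # one feasibility-seeking sweep
            x = project_halfspace(x, a_i, b_i)
    return x

# Toy usage: reduce c.x over {x : A x <= b}
A = np.array([[1.0, 1.0], [-1.0, 0.0], [0.0, -1.0]])
b = np.array([1.0, 0.0, 0.0])
c = np.array([1.0, 2.0])
x = linear_superiorization(A, b, c, x0=np.array([0.5, 0.5]))
```

The returned point is feasible (or nearly so) but only "superior", i.e., its objective value is reduced relative to a plain feasibility-seeking run, not necessarily minimal.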

Examples with Decreasing Largest Inscribed Ball for Deterministic Rescaling Algorithms

Recently, Pena and Soheili presented a deterministic rescaling perceptron algorithm and proved that it solves a feasible perceptron problem in $O(m^2n^2\log(\rho^{-1}))$ perceptron update steps, where $\rho$ is the radius of the largest inscribed ball. The original stochastic rescaling perceptron algorithm of Dunagan and Vempala is based on a systematic increase of $\rho$, while the proof of … Read more
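
For orientation, the sketch below shows what a single "perceptron update step" means for the feasibility problem of finding $y$ with $A^\top y > 0$. It implements only the classical (non-rescaled) perceptron, whose update count grows like $O(1/\rho^2)$; the deterministic rescaling phase discussed in the abstract is not shown, and the names and iteration budget are illustrative.

```python
import numpy as np

def perceptron_feasibility(A, max_updates=10000):
    """Classical perceptron update steps for the feasibility problem:
    find y with a_j . y > 0 for every (unit-norm) column a_j of A.
    Rescaling schemes wrap such update steps with a rescaling phase;
    only the basic updates are sketched here."""
    m, n = A.shape
    A = A / np.linalg.norm(A, axis=0)    # normalize columns
    y = np.zeros(m)
    for _ in range(max_updates):
        margins = A.T @ y
        j = np.argmin(margins)
        if margins[j] > 0:               # all inner products positive: done
            return y
        y = y + A[:, j]                  # perceptron update on worst column
    return None                          # budget exhausted without a solution
```

The quantity $\rho$ controls how long this loop runs: the smaller the largest inscribed ball, the more update steps are needed, which is exactly what rescaling is designed to counteract.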

A Polynomial Column-wise Rescaling von Neumann Algorithm

Recently, Chubanov proposed a method that solves homogeneous linear equality systems with positive variables in polynomial time. Chubanov's method can be considered a column-wise rescaling procedure. We adapt Chubanov's method to the von Neumann problem, and thus design a polynomial-time column-wise rescaling von Neumann algorithm. This algorithm is the first variant of … Read more
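
As background, here is a minimal sketch of the classical von Neumann algorithm to which such column-wise rescaling is applied: given a matrix with unit-norm columns, it seeks a convex combination $x \ge 0$, $\sum_j x_j = 1$, with $\|Ax\|$ small. Only the plain algorithm is shown; the rescaling step itself is omitted, and all names and tolerances are illustrative.

```python
import numpy as np

def von_neumann(A, eps=1e-6, max_iter=100000):
    """Classical von Neumann algorithm: maintain a convex combination
    x of the (unit-norm) columns of A and shrink u = A x by an exact
    line search toward the column most opposed to u."""
    m, n = A.shape
    A = A / np.linalg.norm(A, axis=0)
    x = np.zeros(n)
    x[0] = 1.0
    u = A[:, 0]                          # u = A x, maintained incrementally
    for _ in range(max_iter):
        if np.linalg.norm(u) <= eps:
            return x                     # approximate solution found
        j = np.argmin(A.T @ u)           # column with most negative <a_j, u>
        aj = A[:, j]
        if aj @ u > 0:
            return None                  # u separates 0 from conv(columns): infeasible
        lam = aj @ (aj - u) / np.dot(aj - u, aj - u)   # exact line search
        x = lam * x
        x[j] += 1.0 - lam
        u = lam * u + (1.0 - lam) * aj
    return x
```

Each iteration touches a single column, which is why per-column (column-wise) rescaling is a natural fit for this algorithm.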

Block-iterative algorithms with diagonally scaled oblique projections for the linear feasibility problem

We formulate a block-iterative algorithmic scheme for the solution of systems of linear inequalities and/or equations and analyze its convergence. This study provides, as special cases, proofs of convergence of (i) the recently proposed Component Averaging (CAV) method of Censor, Gordon and Gordon (Parallel Computing, 27:777–808, 2001), (ii) the recently proposed Block-Iterative CAV (BICAV) … Read more
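
To make the diagonal scaling concrete, below is a sketch of a CAV-type fully simultaneous iteration for linear equations $Ax = b$, where the scaling weights use $s_j$, the number of nonzero entries in column $j$; the block-iterative (BICAV) variant applies the same step to one block of rows at a time. The single-block form, the relaxation parameter, and the assumption that no row of $A$ is identically zero are illustrative choices, not the paper's exact scheme.

```python
import numpy as np

def cav(A, b, x0=None, n_iter=200, lam=1.0):
    """Component Averaging (CAV)-type iteration for A x = b:
    a fully simultaneous projection step whose diagonal scaling uses
    s_j, the number of nonzero entries in column j of A.
    Assumes every row of A has at least one nonzero entry."""
    m, n = A.shape
    x = np.zeros(n) if x0 is None else x0.astype(float)
    s = np.count_nonzero(A, axis=0).astype(float)   # sparsity weights s_j
    denom = (A ** 2) @ s                            # sum_j s_j a_ij^2, per row
    for _ in range(n_iter):
        resid = b - A @ x                           # row residuals
        x = x + lam * (A.T @ (resid / denom))       # diagonally scaled step
    return x
```

For dense matrices the weights reduce to the ordinary simultaneous-projection scaling, while for sparse matrices they give the larger, sparsity-aware steps that motivate CAV.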