A primal-dual majorization-minimization method for large-scale linear programs

We present a primal-dual majorization-minimization method for solving large-scale linear programs. A smooth barrier augmented Lagrangian (SBAL) function that is strictly convex for the dual linear program is derived. The majorization-minimization approach is naturally introduced by exploiting the smoothness and convexity of the SBAL function. Our method depends only on a factorization of the constant matrix … Read more
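For orientation, the generic majorization-minimization (MM) step works as follows: at the current iterate, the objective is replaced by an upper-bounding surrogate that touches it there, and the surrogate is minimized instead. The display below is the textbook MM template, not the specific SBAL surrogate derived in the paper:

\[
x^{k+1} \in \arg\min_{x}\; u(x \mid x^k), \qquad u(x \mid x^k) \ge f(x)\ \ \forall x, \qquad u(x^k \mid x^k) = f(x^k),
\]

which yields monotone descent, since \(f(x^{k+1}) \le u(x^{k+1} \mid x^k) \le u(x^k \mid x^k) = f(x^k)\).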

Equipping Barzilai-Borwein method with two dimensional quadratic termination property

A new gradient stepsize is derived with the motivation of equipping the Barzilai-Borwein (BB) method with the two-dimensional quadratic termination property. A remarkable feature of the new stepsize is that its computation depends only on the BB stepsizes of previous iterations, without any exact line searches or Hessian evaluations, and hence it can easily … Read more
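As background (standard definitions, not part of the abstract), the two BB stepsizes that such rules are computed from are, with \(s_{k-1} = x_k - x_{k-1}\) and \(y_{k-1} = g_k - g_{k-1}\),

\[
\alpha_k^{\mathrm{BB1}} = \frac{s_{k-1}^\top s_{k-1}}{s_{k-1}^\top y_{k-1}}, \qquad
\alpha_k^{\mathrm{BB2}} = \frac{s_{k-1}^\top y_{k-1}}{y_{k-1}^\top y_{k-1}},
\]

the long and short stepsizes, respectively. Both involve only differences of iterates and gradients, which is why a rule built from previous BB stepsizes needs neither exact line searches nor Hessian information.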

A primal-dual interior-point relaxation method with adaptively updating barrier for nonlinear programs

Based on solving an equivalent parametric equality constrained min-max problem of the classic logarithmic-barrier subproblem, we present a novel primal-dual interior-point relaxation method for nonlinear programs. In the proposed method, the barrier parameter is updated in every step, as is done in interior-point methods for linear programs, which is markedly different from existing interior-point methods … Read more
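For reference, the classic logarithmic-barrier subproblem that the min-max reformulation starts from has the standard form (standard notation, assumed rather than quoted from the paper): for an NLP with equality constraints \(c(x) = 0\) and bounds \(x \ge 0\),

\[
\min_{x}\; f(x) - \mu \sum_{i=1}^{n} \ln x_i \quad \text{subject to} \quad c(x) = 0,
\]

with barrier parameter \(\mu > 0\) driven toward zero. The proposed method updates \(\mu\) adaptively at every step instead of holding it fixed while a subproblem is solved approximately.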

On the acceleration of the Barzilai-Borwein method

The Barzilai-Borwein (BB) gradient method is efficient for solving large-scale unconstrained problems to modest accuracy and has the great advantage of being easily extended to a wide class of constrained optimization problems. In this paper, we propose a new stepsize to accelerate the BB method by requiring finite termination for minimizing two-dimensional strongly … Read more
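To fix ideas, here is a minimal sketch of the plain (unaccelerated) BB1 gradient method on a strongly convex quadratic; it shows the baseline being accelerated, not the new stepsize proposed in the paper, and all names below are illustration choices.

```python
import numpy as np

def bb1_gradient(A, b, x, tol=1e-8, max_iter=500):
    """Standard BB1 gradient method for f(x) = 0.5 x'Ax - b'x with A SPD."""
    g = A @ x - b
    alpha = 1.0 / np.linalg.norm(g)      # conservative first stepsize
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        x_new = x - alpha * g
        g_new = A @ x_new - b
        s, y = x_new - x, g_new - g      # BB difference vectors
        alpha = (s @ s) / (s @ y)        # long BB stepsize (BB1)
        x, g = x_new, g_new
    return x

rng = np.random.default_rng(0)
Q = rng.standard_normal((50, 50))
A = Q.T @ Q + np.eye(50)                 # random SPD matrix
b = rng.standard_normal(50)
x_star = bb1_gradient(A, b, np.zeros(50))
print(np.linalg.norm(A @ x_star - b))    # residual near zero
```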

On the asymptotic convergence and acceleration of gradient methods

We consider the asymptotic behavior of a family of gradient methods that includes the steepest descent and minimal gradient methods as special instances. It is proved that each method in the family asymptotically zigzags between two directions. Asymptotic convergence results for the objective value, gradient norm, and stepsize are presented as well. To accelerate … Read more
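For a quadratic \(f(x) = \tfrac12 x^\top A x - b^\top x\) with gradient \(g_k = A x_k - b\), the two special instances use the standard stepsizes (textbook definitions, not quoted from the paper)

\[
\alpha_k^{\mathrm{SD}} = \frac{g_k^\top g_k}{g_k^\top A g_k}, \qquad
\alpha_k^{\mathrm{MG}} = \frac{g_k^\top A g_k}{g_k^\top A^2 g_k},
\]

where the steepest descent stepsize exactly minimizes \(f(x_k - \alpha g_k)\) and the minimal gradient stepsize exactly minimizes \(\|g(x_k - \alpha g_k)\|\).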

Solving Heated Oil Pipeline Problems Via Mixed Integer Nonlinear Programming Approach

How to heat oil so as to reduce the running cost of crude oil transport is a crucial problem. This paper rigorously formulates the heated oil pipeline problem as a mixed integer nonlinear programming model. Nonconvex and convex continuous relaxations of the model are proposed, which are proved to be equivalent under suitable conditions. Meanwhile, … Read more

Gradient methods exploiting spectral properties

We propose a new stepsize for the gradient method. It is shown that this new stepsize converges to the reciprocal of the largest eigenvalue of the Hessian when Dai-Yang's asymptotically optimal gradient method (Computational Optimization and Applications, 2006, 33(1): 73-88) is applied to minimize quadratic objective functions. Based on this spectral property, we develop … Read more
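A brief numerical illustration (the setup is mine, not from the paper) of why a stepsize near the reciprocal of the largest eigenvalue is valuable: on a quadratic, one gradient step satisfies \(g_{k+1} = (I - \alpha A) g_k\), so taking \(\alpha = 1/\lambda_{\max}\) annihilates the gradient component along the corresponding eigenvector.

```python
import numpy as np

rng = np.random.default_rng(1)
Q, _ = np.linalg.qr(rng.standard_normal((6, 6)))
lams = np.array([1.0, 2.0, 3.0, 5.0, 8.0, 10.0])
A = Q @ np.diag(lams) @ Q.T                 # SPD Hessian with known spectrum

g = rng.standard_normal(6)                  # some gradient vector
v_max = Q[:, np.argmax(lams)]               # eigenvector of the largest eigenvalue

g_next = g - (1.0 / lams.max()) * (A @ g)   # one step with alpha = 1/lambda_max
print(abs(v_max @ g))                       # nonzero component before the step
print(abs(v_max @ g_next))                  # ~0: component along v_max eliminated
```

Eliminating dominant eigencomponents in this way effectively shrinks the dimension of the problem the gradient method still has to solve, which is the usual motivation for stepsizes with such spectral limits.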

A family of spectral gradient methods for optimization

We propose a family of spectral gradient methods whose stepsize is determined by a convex combination of the short Barzilai-Borwein (BB) stepsize and the long BB stepsize. It is shown that each member of the family shares a certain quasi-Newton property in the least squares sense. The family also includes some other gradient methods as … Read more
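Reading the abstract literally (the precise parameterization is given in the paper), the family's stepsize can be written as

\[
\alpha_k(\gamma_k) = \gamma_k\, \alpha_k^{\mathrm{BB2}} + (1 - \gamma_k)\, \alpha_k^{\mathrm{BB1}}, \qquad \gamma_k \in [0, 1],
\]

with \(s_{k-1} = x_k - x_{k-1}\) and \(y_{k-1} = g_k - g_{k-1}\) entering through the usual BB formulas, so that \(\gamma_k = 1\) recovers the short stepsize and \(\gamma_k = 0\) the long one. The least squares reading of the quasi-Newton property is standard: \(\alpha_k^{\mathrm{BB1}}\) solves \(\min_\alpha \|\alpha^{-1} s_{k-1} - y_{k-1}\|\), while \(\alpha_k^{\mathrm{BB2}}\) solves \(\min_\alpha \|s_{k-1} - \alpha y_{k-1}\|\).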

Quadratic regularization projected alternating Barzilai–Borwein method for constrained optimization

In this paper, based on regularization techniques and projected gradient strategies, we present a quadratic regularization projected alternating Barzilai–Borwein (QRPABB) method for minimizing differentiable functions on closed convex sets. We show convergence of the QRPABB method to a constrained stationary point under a nonmonotone line search. When the objective function is convex, we … Read more
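To illustrate the projected alternating BB idea, here is a minimal sketch on a box-constrained quadratic, written without any line search or regularization; the paper's QRPABB method adds quadratic regularization and a nonmonotone line search to guarantee convergence, and every name below is an illustration choice.

```python
import numpy as np

def proj_box(x, lo, hi):
    """Euclidean projection onto the box [lo, hi]."""
    return np.clip(x, lo, hi)

def pabb_box(A, b, lo, hi, x, max_iter=300, tol=1e-8):
    """Projected alternating BB method for min 0.5 x'Ax - b'x on a box."""
    g = A @ x - b
    alpha = 1.0
    for k in range(max_iter):
        x_new = proj_box(x - alpha * g, lo, hi)
        if np.linalg.norm(x_new - x) < tol:
            break
        g_new = A @ x_new - b
        s, y = x_new - x, g_new - g
        # Alternate between the long (BB1) and short (BB2) stepsizes.
        alpha = (s @ s) / (s @ y) if k % 2 == 0 else (s @ y) / (y @ y)
        x, g = x_new, g_new
    return x

rng = np.random.default_rng(2)
Q = rng.standard_normal((30, 30))
A = Q.T @ Q + np.eye(30)                 # random SPD matrix
b = rng.standard_normal(30)
lo, hi = -0.5 * np.ones(30), 0.5 * np.ones(30)
x = pabb_box(A, b, lo, hi, np.zeros(30))
print(np.linalg.norm(x - proj_box(x - (A @ x - b), lo, hi)))  # stationarity residual
```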