Approximate Primal Solutions and Rate Analysis in Dual Subgradient Methods

We study primal solutions obtained as a by-product of subgradient methods when solving the Lagrangian dual of a primal convex constrained optimization problem (possibly nonsmooth). The existing literature on the use of subgradient methods for generating primal optimal solutions is limited to methods that produce such solutions only asymptotically (i.e., in the limit as the …
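
As a toy illustration of the recovery scheme sketched above (averaging the primal iterates generated along the dual subgradient iterations), consider the hypothetical one-dimensional problem of minimizing x² subject to x ≥ 1. None of the specifics below are from the paper; they only show the general mechanism:

```python
# Hypothetical toy problem (not from the paper): minimize x**2 subject to x >= 1.
# Lagrangian: L(x, mu) = x**2 + mu*(1 - x); its minimizer in x is x(mu) = mu/2,
# and 1 - x(mu) is a subgradient of the (concave) dual function at mu.
mu, step, iters = 0.0, 0.1, 500
x_sum = 0.0
for k in range(iters):
    x = mu / 2.0                          # primal minimizer of the Lagrangian
    x_sum += x
    mu = max(0.0, mu + step * (1.0 - x))  # projected dual subgradient ascent
x_bar = x_sum / iters                     # averaged primal iterate
# mu approaches the dual optimum 2; x_bar approaches the primal optimum 1
```

The individual primal iterates x(μ_k) need not be feasible; it is their running average x̄ that recovers an approximate primal solution.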

Derivative Free Optimization Methods for Optimizing Stirrer Configurations

In this paper a numerical approach for the optimization of stirrer configurations is presented. The methodology is based on a flow solver and a mathematical optimization tool, which are integrated into an automated procedure. The flow solver is based on the discretization of the incompressible Navier-Stokes equations by means of a fully conservative finite-volume method …

Survey of Derivative Free Optimization Methods based on Interpolation

In this survey article we give a basic description of interpolation-based derivative-free optimization methods and their variants. We review recent contributions dealing with maintaining the geometry of the interpolation set, the management of the trust-region radius, and the stopping criteria. Derivative-free algorithms developed for problems with some structure …

On large scale unconstrained optimization problems and higher order methods

Third order methods will in most cases use fewer iterations than second order methods to reach the same accuracy. However, the number of arithmetic operations per iteration is higher for third order methods than for second order methods. Newton's method is the most commonly used second order method, and Halley's method is the most …
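
To make the iteration-count trade-off concrete, here is a small illustrative comparison (not from the paper) of Newton's and Halley's methods on the root-finding problem f(x) = x³ − 2 = 0. Halley's update additionally uses the second derivative and converges cubically, at a higher cost per step:

```python
# Newton: x <- x - f/f'        (quadratic convergence)
# Halley: x <- x - 2ff'/(2f'^2 - ff'')   (cubic convergence, needs f'')
def newton(f, df, x, tol=1e-13):
    for k in range(1, 100):
        x = x - f(x) / df(x)
        if abs(f(x)) < tol:
            return x, k
    return x, 99

def halley(f, df, d2f, x, tol=1e-13):
    for k in range(1, 100):
        fx, dfx = f(x), df(x)
        x = x - 2.0 * fx * dfx / (2.0 * dfx**2 - fx * d2f(x))
        if abs(f(x)) < tol:
            return x, k
    return x, 99

f   = lambda x: x**3 - 2.0
df  = lambda x: 3.0 * x**2
d2f = lambda x: 6.0 * x
r_n, it_n = newton(f, df, 1.5)
r_h, it_h = halley(f, df, d2f, 1.5)
```

Both reach the root 2^(1/3), with Halley needing no more iterations than Newton but evaluating one extra derivative per step.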

A conic duality Frank–Wolfe type theorem via exact penalization in quadratic optimization

The famous Frank–Wolfe theorem ensures attainability of the optimal value for quadratic objective functions over a (possibly unbounded) polyhedron if the feasible values are bounded. This theorem does not hold in general for conic programs where linear constraints are replaced by more general convex constraints like positive-semidefiniteness or copositivity conditions, despite the fact that the …

Data Assimilation in Weather Forecasting: A Case Study in PDE-Constrained Optimization

Variational data assimilation is used at major weather prediction centers to produce the initial conditions for 7- to 10-day weather forecasts. This technique requires the solution of a very large data-fitting problem in which the major element is a set of partial differential equations that models the evolution of the atmosphere over a time window …

Computable representations for convex hulls of low-dimensional quadratic forms

Let C be the convex hull of the points {(1;x)(1,x') | x \in F \subset R^n}. Representing or approximating C is a fundamental problem for global optimization algorithms based on convex relaxations of products of variables. If n …

New Adaptive Stepsize Selections in Gradient Methods

This paper deals with gradient methods for minimizing n-dimensional strictly convex quadratic functions. Two new adaptive stepsize selection rules are presented and some key properties are proved. Practical insights on the effectiveness of the proposed techniques are given by a numerical comparison with the Barzilai-Borwein (BB) method, the cyclic/adaptive BB methods and two recent monotone …
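
For orientation, here is a minimal sketch of the classical Barzilai-Borwein (BB1) stepsize, the baseline the paper compares against, on a 2-dimensional strictly convex quadratic; the paper's new adaptive rules are not reproduced here:

```python
# Illustrative only: BB1 stepsize on the diagonal quadratic
# f(x) = 0.5*(A[0]*x[0]**2 + A[1]*x[1]**2) - b[0]*x[0] - b[1]*x[1].
A = [1.0, 10.0]          # Hessian diagonal (strictly convex)
b = [1.0, 1.0]
grad = lambda x: [A[i] * x[i] - b[i] for i in range(2)]

x = [0.0, 0.0]
g = grad(x)
alpha = 0.1              # initial stepsize, before any BB information exists
for k in range(100):
    x_new = [x[i] - alpha * g[i] for i in range(2)]
    g_new = grad(x_new)
    s = [x_new[i] - x[i] for i in range(2)]
    y = [g_new[i] - g[i] for i in range(2)]
    sy = sum(si * yi for si, yi in zip(s, y))
    if sy > 0:           # BB1 stepsize: s's / s'y
        alpha = sum(si * si for si in s) / sy
    x, g = x_new, g_new
    if max(abs(gi) for gi in g) < 1e-12:
        break
# minimizer is x* = (b[0]/A[0], b[1]/A[1]) = (1.0, 0.1)
```

The BB stepsize captures curvature information from consecutive gradients, which is why it dramatically outperforms a fixed-step gradient method on ill-conditioned quadratics.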

On diagonally-relaxed orthogonal projection methods

We propose and study a block-iterative projections method for solving linear equations and/or inequalities. The method allows diagonal component-wise relaxation in conjunction with orthogonal projections onto the individual hyperplanes of the system, and is thus called diagonally-relaxed orthogonal projections (DROP). Diagonal relaxation has proven useful in accelerating the initial convergence of simultaneous and block-iterative projection …
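
The following sketch conveys the general flavour of simultaneous orthogonal projections with a diagonal, component-wise relaxation: the summed correction for each variable is scaled by the number of equations in which that variable appears. This is an illustrative simplification on a toy system, not the exact DROP algorithm from the paper:

```python
# Toy consistent system: x + y = 2, x - y = 0; solution (1, 1).
A = [[1.0, 1.0], [1.0, -1.0]]
b = [2.0, 0.0]
m, n = len(A), len(A[0])
# s[j] = number of equations in which variable j appears (component-wise scaling)
s = [sum(1 for i in range(m) if A[i][j] != 0.0) or 1 for j in range(n)]

x = [0.0, 0.0]
for sweep in range(100):
    corr = [0.0] * n
    for i in range(m):
        # orthogonal projection of x onto hyperplane i, written as a correction
        resid = b[i] - sum(A[i][j] * x[j] for j in range(n))
        resid /= sum(a * a for a in A[i])
        for j in range(n):
            corr[j] += resid * A[i][j]
    x = [x[j] + corr[j] / s[j] for j in range(n)]   # diagonal relaxation
# x approaches the solution (1, 1)
```

Scaling each component by 1/s[j] rather than by the global row count 1/m keeps corrections from being overly damped for variables that appear in few equations, which is the motivation for diagonal relaxation in sparse systems.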

New class of limited-memory variationally-derived variable metric methods

A new family of limited-memory variationally-derived variable metric or quasi-Newton methods for unconstrained minimization is given. The methods have the quadratic termination property and use updates that are invariant under linear transformations. Some encouraging numerical experience is reported.

Citation: Technical Report V-973. Prague, ICS AS CR, 2006.
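
As background on the limited-memory variable metric class, the sketch below runs the standard L-BFGS two-loop recursion (a classical member of this class, not the new family from the report) with exact line searches on a small diagonal quadratic, where such methods exhibit the quadratic termination property:

```python
# Illustrative L-BFGS two-loop recursion; names and problem are hypothetical.
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def two_loop(g, S, Y):
    """Return H*g, where H is the implicit L-BFGS inverse Hessian (H0 = I)."""
    q = list(g)
    rhos = [1.0 / dot(y, s) for s, y in zip(S, Y)]
    alphas = []
    for s, y, rho in reversed(list(zip(S, Y, rhos))):   # newest pair first
        a = rho * dot(s, q)
        alphas.append(a)
        q = [qi - a * yi for qi, yi in zip(q, y)]
    r = q                                               # H0 = identity
    for (s, y, rho), a in zip(zip(S, Y, rhos), reversed(alphas)):
        bcoef = rho * dot(y, r)
        r = [ri + (a - bcoef) * si for ri, si in zip(r, s)]
    return r

# f(x) = 0.5 * sum(diag[j] * x[j]**2), gradient diag[j]*x[j], minimizer x = 0
diag = [2.0, 5.0, 10.0]
x = [1.0, 1.0, 1.0]
S, Y, mem = [], [], 5
g = [d * xi for d, xi in zip(diag, x)]
for k in range(20):
    if max(abs(gi) for gi in g) < 1e-10:
        break
    d = [-ri for ri in two_loop(g, S, Y)]
    # exact line search on the quadratic: alpha = -g'd / d'Ad
    alpha = -dot(g, d) / dot(d, [di * dj for di, dj in zip(diag, d)])
    x_new = [xi + alpha * di for xi, di in zip(x, d)]
    g_new = [dj * xi for dj, xi in zip(diag, x_new)]
    S.append([a - b for a, b in zip(x_new, x)])
    Y.append([a - b for a, b in zip(g_new, g)])
    S, Y = S[-mem:], Y[-mem:]
    x, g = x_new, g_new
```

With exact line searches and enough memory, the iterates on a quadratic coincide with conjugate gradients, so the gradient vanishes after at most n steps up to rounding.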