Second-order convergence properties of trust-region methods using incomplete curvature information, with an application to multigrid optimization

Convergence properties of trust-region methods for unconstrained nonconvex optimization are considered in the case where information on the objective function’s local curvature is incomplete, in the sense that it may be restricted to a fixed set of “test directions” and may not be available at every iteration. It is shown that convergence to local “weak” …
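For orientation, the trust-region subproblem underlying such methods can be written as follows (generic notation, assumed here rather than quoted from the paper):

\[
\min_{s \in \mathbb{R}^n} \; m_k(s) = f(x_k) + g_k^\top s + \tfrac{1}{2}\, s^\top B_k s
\quad \text{subject to} \quad \|s\| \le \Delta_k ,
\]

where g_k = \nabla f(x_k), B_k is a symmetric model of the Hessian, and \Delta_k is the trust-region radius. The incomplete curvature information discussed above can then be read as knowledge of B_k only along a fixed set of test directions, and possibly not at every iteration.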

Continuous Optimization Methods for Structure Alignments

Structural alignment is an important tool for fold identification of proteins, structural screening of ligand databases, pharmacophore identification, and other applications. In the general case, the optimization problem of superimposing two structures is nonsmooth and nonconvex, so the most popular methods are heuristic and do not employ derivative information. Usually, these methods do not admit …

The Rate of Convergence of the Augmented Lagrangian Method for Nonlinear Semidefinite Programming

We analyze the rate of local convergence of the augmented Lagrangian method for nonlinear semidefinite optimization. The presence of the positive semidefinite cone constraint requires extensive use of tools such as the singular value decomposition of matrices, an implicit function theorem for semismooth functions, and certain variational analysis of the projection operator in the symmetric-matrix space. Without …
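As background for the projection operator mentioned above (standard material, not a contribution of the paper): the metric projection of a symmetric matrix A onto the positive semidefinite cone has the spectral form

\[
\Pi_{\mathcal{S}^n_+}(A) \;=\; Q\,\mathrm{Diag}\big(\max(\lambda_1,0),\dots,\max(\lambda_n,0)\big)\,Q^\top,
\qquad
A = Q\,\mathrm{Diag}(\lambda_1,\dots,\lambda_n)\,Q^\top,
\]

where the \lambda_i are the eigenvalues of A. This projection is known to be strongly semismooth, which is why an implicit function theorem for semismooth functions enters the analysis.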

Kantorovich’s Majorants Principle for Newton’s Method

We prove Kantorovich’s theorem on Newton’s method using a convergence analysis that makes clear, with respect to Newton’s method, the relationship between the majorant function and the nonlinear operator under consideration. This approach enables us to drop the assumption of the existence of a second root for the majorant function, while still guaranteeing a Q-quadratic convergence rate …
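For readers who want the classical setting in front of them (standard notation, assumed here rather than taken from the paper): if \|F'(x_0)^{-1}F(x_0)\| \le \eta and F'(x_0)^{-1}F' is Lipschitz continuous with constant K on a suitable ball, then Newton's iteration and the associated quadratic majorant function read

\[
x_{k+1} = x_k - F'(x_k)^{-1}F(x_k),
\qquad
f(t) = \frac{K}{2}\,t^2 - t + \eta ,
\]

and the classical hypothesis 2K\eta \le 1 is exactly what guarantees that f has nonnegative roots; the analysis described above dispenses with the assumption that a second root exists.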

A Note on Sparse SOS and SDP Relaxations for Polynomial Optimization Problems over Symmetric Cones

This short note extends the sparse SOS (sum of squares) and SDP (semidefinite programming) relaxation proposed by Waki, Kim, Kojima and Muramatsu for normal POPs (polynomial optimization problems) to POPs over symmetric cones, and establishes its theoretical convergence based on the recent convergence result by Lasserre on the sparse SOS and SDP relaxation for normal …
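For context, the dense relaxation that the sparse variant refines can be sketched as follows (standard formulation, assumed here rather than quoted from the note): for a POP \min\{f(x) : g_j(x) \ge 0,\ j = 1,\dots,m\}, one maximizes a lower bound \lambda admitting an SOS certificate

\[
f(x) - \lambda \;=\; \sigma_0(x) + \sum_{j=1}^{m} \sigma_j(x)\, g_j(x),
\]

where each \sigma_j is a sum-of-squares polynomial of bounded degree. Fixing the degree bound turns the search for the \sigma_j into an SDP, and the sparse variant restricts the monomials appearing in the \sigma_j according to a correlative sparsity pattern of the problem data.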

On the behavior of the conjugate-gradient method on ill-conditioned problems

We study the behavior of the conjugate-gradient method for solving a set of linear equations where the matrix is symmetric positive definite, with one set of eigenvalues that are large while the remaining ones are small. We characterize the behavior of the residuals associated with the large eigenvalues throughout the iterations, and also characterize the …
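A minimal numerical sketch of the setting described above (the matrix size, eigenvalue clusters, and iteration count are illustrative assumptions, not taken from the paper):

    import numpy as np

    # Synthetic SPD matrix with a few large eigenvalues and the rest small.
    rng = np.random.default_rng(0)
    n, n_large = 200, 5
    eigvals = np.concatenate([np.full(n_large, 1.0e4),
                              rng.uniform(0.1, 1.0, n - n_large)])
    Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
    A = (Q * eigvals) @ Q.T
    b = rng.standard_normal(n)

    # Plain conjugate-gradient iteration, recording residual norms.
    x = np.zeros(n)
    r = b - A @ x
    p = r.copy()
    history = [np.linalg.norm(r)]
    for _ in range(60):
        Ap = A @ p
        alpha = (r @ r) / (p @ Ap)
        x = x + alpha * p
        r_new = r - alpha * Ap
        beta = (r_new @ r_new) / (r @ r)
        p = r_new + beta * p
        r = r_new
        history.append(np.linalg.norm(r))

    print(history[:10])

Typically the residual components associated with the large eigenvalues are reduced within the first few iterations, after which convergence is governed by the small-eigenvalue cluster.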

An interior Newton-like method for nonnegative least-squares problems with degenerate solution

An interior-point approach for medium- and large-scale nonnegative linear least-squares problems is proposed. Global convergence and local quadratic convergence are shown even when a degenerate solution is approached. Viable approaches for implementation are discussed and numerical results are provided. Citation: Technical Report 1/2005, Dipartimento di Energetica ‘S. Stecco’, Università di Firenze, Italia.
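To make the problem setting concrete, the task is \min_x \|Ax - b\|_2 subject to x \ge 0. The sketch below solves a small synthetic instance with SciPy's active-set solver purely as a reference point; it is not the interior Newton-like method of the report, and the data and sizes are made up:

    import numpy as np
    from scipy.optimize import nnls

    # Small synthetic instance of min ||Ax - b||_2 subject to x >= 0.
    rng = np.random.default_rng(1)
    A = rng.standard_normal((100, 30))
    x_true = np.maximum(rng.standard_normal(30), 0.0)  # several components are exactly zero
    b = A @ x_true + 0.01 * rng.standard_normal(100)

    x, rnorm = nnls(A, b)
    print("components at the bound x_i = 0:", int(np.sum(x == 0)))
    print("residual norm:", rnorm)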

On Self-Regulated Swarms, Societal Memory, Speed and Dynamics

Wasps, bees, ants and termites all make effective use of their environment and resources by displaying collective “swarm” intelligence. Termite colonies – for instance – build nests with a complexity far beyond the comprehension of the individual termite, while ant colonies dynamically allocate labor to various vital tasks such as foraging or defense without any …

An efficient conjugate direction method with orthogonalization for large-scale quadratic optimization problems

A new conjugate direction method is proposed, which is based on an orthogonalization procedure and does not make use of line searches for constructing the set of conjugate vectors. This procedure prevents the set of conjugate vectors from degenerating and eliminates the high sensitivity to computational errors typical of conjugate direction methods, resulting in an efficient …
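As a generic illustration of building a conjugate (A-orthogonal) vector set without line searches, here is a Gram-Schmidt-style conjugation in the A-inner product; it is a sketch of the underlying idea only, not the orthogonalization procedure proposed in the paper, and plain conjugation of this kind is the sort of construction whose sensitivity to rounding errors the abstract alludes to:

    import numpy as np

    def a_conjugate_basis(A, V):
        """Gram-Schmidt in the A-inner product: returns vectors d_1, ..., d_k
        with d_i^T A d_j = 0 for i != j (A symmetric positive definite)."""
        directions = []
        for v in V.T:
            d = np.array(v, dtype=float)
            for dj in directions:
                d -= (dj @ (A @ v)) / (dj @ (A @ dj)) * dj
            directions.append(d)
        return np.column_stack(directions)

    rng = np.random.default_rng(2)
    n = 6
    M = rng.standard_normal((n, n))
    A = M @ M.T + n * np.eye(n)           # SPD test matrix
    D = a_conjugate_basis(A, np.eye(n))   # conjugate directions from the standard basis
    print(np.round(D.T @ A @ D, 8))       # off-diagonal entries vanish up to round-off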

Societal Implicit Memory and Its Speed in Tracking Extrema over Dynamic Environments Using Self-Regulatory Swarms

In order to overcome difficult dynamic optimization and environmental extrema-tracking problems, we propose a Self-Regulated Swarm (SRS) algorithm that hybridizes the advantageous characteristics of swarm intelligence, such as the emergence of a societal environmental memory or cognitive map via collective pheromone laying in the landscape (properly balancing the exploration/exploitation nature of the search strategy), with …