A new exact penalty function

For constrained smooth or nonsmooth optimization problems, new continuously differentiable penalty functions are derived. They are proved exact in the sense that, under some nondegeneracy assumption, local optimizers of a nonlinear program are precisely the optimizers of the associated penalty function. This is achieved by augmenting the dimension of the program by a variable that …
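
For context only (the paper's own construction is truncated above): the classical l1 penalty for min f(x) subject to g_i(x) <= 0, i = 1, ..., m, is already exact,

\[
P_c(x) \;=\; f(x) + c \sum_{i=1}^{m} \max\bigl(0,\, g_i(x)\bigr),
\]

in the sense that, under standard constraint qualifications and for c larger than the largest Lagrange multiplier at a local solution, that solution is also a local minimizer of P_c. The max terms, however, make P_c merely Lipschitz continuous; the point of the penalty functions derived here is to retain this exactness while being continuously differentiable.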

A Pattern Search Filter Method for Nonlinear Programming without Derivatives

This paper presents and analyzes a pattern search method for general constrained optimization based on filter methods for step acceptance. Roughly, a filter method accepts a step that either improves the objective function value or the value of some function that measures the constraint violation. The new algorithm does not compute or approximate any derivatives, …
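
The acceptance rule just described can be sketched generically. The following minimal Python illustration assumes the filter is stored as a list of pairs (h, f), where h is a constraint-violation measure and f the objective, with a small margin gamma enforcing sufficient improvement; the class and method names are invented for this sketch and do not come from the paper.

```python
# Generic filter mechanism for step acceptance (illustrative sketch only).
class Filter:
    def __init__(self, gamma=1e-5):
        self.entries = []          # accepted (h, f) pairs
        self.gamma = gamma         # margin enforcing sufficient improvement

    def acceptable(self, h, f):
        """A trial point is acceptable if, against every stored pair,
        it improves either the violation h or the objective f."""
        return all(h <= (1 - self.gamma) * h_j or f <= f_j - self.gamma * h_j
                   for h_j, f_j in self.entries)

    def add(self, h, f):
        """Store (h, f) and discard any pairs it dominates."""
        self.entries = [(h_j, f_j) for h_j, f_j in self.entries
                        if h_j < h or f_j < f]
        self.entries.append((h, f))
```

In a derivative-free pattern search, each polled point would be tested with acceptable(h, f) before being kept, and the filter updated with add(h, f) afterwards.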

Feasible Interior Methods Using Slacks for Nonlinear Optimization

A slack-based feasible interior point method is described that can be derived as a modification of infeasible methods. The modification is minor for most line search methods, but trust region methods require special attention. It is shown how the Cauchy point, which is often computed in trust region methods, must be modified so that the …
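
The slack reformulation behind such methods can be stated generically (this is the standard form, not necessarily the precise formulation used in the paper): for min f(x) subject to c(x) >= 0, one approximately solves a sequence of barrier problems

\[
\min_{x,\,s}\; f(x) - \mu \sum_{i=1}^{m} \ln s_i
\quad \text{subject to} \quad c(x) - s = 0,\; s > 0,
\]

with \mu \to 0. An infeasible method allows c(x) - s \ne 0 along the way, whereas a feasible variant keeps every iterate strictly feasible, for example by resetting the slacks to s = c(x); it is in this setting that the Cauchy point modification mentioned above becomes necessary.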

A BFGS-IP algorithm for solving strongly convex optimization problems with feasibility enforced by an exact penalty approach

This paper introduces and analyses a new algorithm for minimizing a convex function subject to a finite number of convex inequality constraints. It is assumed that the Lagrangian of the problem is strongly convex. The algorithm combines interior point methods for dealing with the inequality constraints and quasi-Newton techniques for accelerating the convergence. Feasibility of …
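
As a rough illustration of how the two ingredients fit together, here is a minimal log-barrier loop whose subproblems are solved with BFGS. It is a sketch under stated assumptions only: the function names, the fixed barrier schedule and the simple backtracking rule are invented for illustration, and it omits the primal-dual updates and the exact penalty device for enforcing feasibility that the paper actually analyses.

```python
# Illustrative barrier/BFGS sketch (not the paper's BFGS-IP algorithm).
import numpy as np

def barrier_bfgs_sketch(f, grad_f, c, jac_c, x0, mu=1.0, tol=1e-6):
    """Minimize convex f(x) subject to c(x) > 0, starting from a strictly feasible x0."""
    x = np.asarray(x0, dtype=float)
    n = x.size
    H = np.eye(n)                                  # inverse-Hessian approximation

    while mu > tol:
        def phi(z):                                # log-barrier function for the current mu
            cz = c(z)
            return np.inf if np.any(cz <= 0) else f(z) - mu * np.sum(np.log(cz))

        def phi_grad(z):                           # gradient of the barrier function
            return grad_f(z) - mu * jac_c(z).T @ (1.0 / c(z))

        g = phi_grad(x)
        for _ in range(100):                       # approximately solve the subproblem
            p = -H @ g                             # quasi-Newton search direction
            t, slope = 1.0, float(g @ p)
            while phi(x + t * p) > phi(x) + 1e-4 * t * slope:
                t *= 0.5                           # backtrack until feasible and Armijo holds
            x_new = x + t * p
            g_new = phi_grad(x_new)
            s, y = x_new - x, g_new - g
            if s @ y > 1e-12:                      # curvature condition: safe to update H
                rho = 1.0 / (s @ y)
                V = np.eye(n) - rho * np.outer(s, y)
                H = V @ H @ V.T + rho * np.outer(s, s)
            x, g = x_new, g_new
            if np.linalg.norm(g) <= mu:            # subproblem solved to O(mu) accuracy
                break
        mu *= 0.1                                  # tighten the barrier and resolve
    return x
```

The curvature test s @ y > 0 is the standard safeguard that keeps H positive definite; when the function being modelled is strongly convex it holds automatically, which is the role the paper's strong convexity assumption plays for its quasi-Newton updates.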