Metal Artefact Reduction by Least-Squares Penalized-Likelihood Reconstruction with a Fast Polychromatic Projection Model

We consider penalized-likelihood reconstruction for X-ray computed tomography of objects that contain small metal structures. To reduce the beam-hardening artefacts induced by these structures, we derive the reconstruction algorithm from a projection model that takes into account the photon emission spectrum and the nonlinear variation of attenuation with photon energy. This algorithm requires excessively long …
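The abstract does not give the model's formulas, but a polychromatic forward projection of the kind described (detector counts obtained by weighting the Beer-Lambert attenuation at each sampled photon energy by the emission spectrum) can be sketched as follows; the array shapes and names are assumptions for illustration, not the authors' implementation.

    import numpy as np

    def polychromatic_projection(A, mu, spectrum):
        # A        : (n_rays, n_voxels) intersection lengths of each ray with each voxel
        # mu       : (n_voxels, n_energies) linear attenuation coefficient at each sampled energy
        # spectrum : (n_energies,) relative photon counts emitted at each sampled energy
        line_integrals = A @ mu                    # energy-resolved line integrals, (n_rays, n_energies)
        return np.exp(-line_integrals) @ spectrum  # expected detector counts, summed over the spectrum

A monochromatic model corresponds to a single column of mu; the mismatch between the monochromatic assumption and polychromatic data is what produces beam-hardening artefacts around metal.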

A three-term conjugate gradient method with sufficient descent property for unconstrained optimization

Conjugate gradient methods are widely used for solving large-scale unconstrained optimization problems because they do not require the storage of matrices. In this paper, we propose a general form of three-term conjugate gradient methods that always generate a sufficient descent direction. We give a sufficient condition for the global convergence of the proposed general method. …
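As a purely illustrative example (a hypothetical three-term form, not necessarily the general family proposed in the paper), the direction below satisfies g_{k+1}^T d_{k+1} = -||g_{k+1}||^2 for any conjugate gradient parameter beta_k, which is the sufficient descent property the abstract refers to.

    import numpy as np

    def three_term_direction(g_new, d_old, beta):
        # The third term cancels the component of beta*d_old along g_new, so that
        # g_new . d_new = -||g_new||^2 holds for every beta (sufficient descent).
        theta = beta * (g_new @ d_old) / (g_new @ g_new)
        return -g_new + beta * d_old - theta * g_new

Because the descent property holds independently of beta, a global convergence argument can focus on the line search and on bounding beta rather than on the sign of g^T d.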

A Proximal Algorithm with Quasi Distance. Application to Habit’s Formation

We consider a proximal algorithm with a quasi-distance applied to nonconvex and nonsmooth functions involving analytic properties for an unconstrained minimization problem. We show the behavioral importance of this proximal point model for habit's formation in the decision-making sciences.
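A minimal sketch of the iteration, under assumed names and with SciPy's general-purpose minimizer standing in for the inner subproblem solver: each step minimizes the objective plus a proximal term built from a quasi-distance q, which, unlike a metric, may charge different costs for moving in opposite directions.

    import numpy as np
    from scipy.optimize import minimize

    def proximal_point_quasi(f, q, x0, lam=1.0, n_iter=30):
        # x_{k+1} minimizes f(x) + lam * q(x_k, x)**2; q need not be symmetric.
        x = np.asarray(x0, dtype=float)
        for _ in range(n_iter):
            x = minimize(lambda z, xk=x: f(z) + lam * q(xk, z) ** 2, x).x
        return x

    def q(x, y, up=2.0, down=1.0):
        # Example quasi-distance: increasing a component costs more than decreasing it,
        # the asymmetry that models how hard it is to change an acquired habit.
        d = y - x
        return np.sum(up * np.maximum(d, 0.0) + down * np.maximum(-d, 0.0))

This is only a schematic of the proximal point scheme with a quasi-distance; the paper's convergence analysis relies on the analytic properties of f mentioned in the abstract, which the sketch does not exploit.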

A Limited Memory Steepest Descent Method

The possibilities inherent in steepest descent methods have been considerably amplified by the introduction of the Barzilai-Borwein choice of step-size and other related ideas. These methods have proved to be competitive with conjugate gradient methods for large-dimensional unconstrained minimization problems. This paper suggests a method which is able to take advantage …
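For reference, the Barzilai-Borwein choice that the abstract builds on can be stated in a few lines; this is the classical step, not the limited memory method proposed in the paper, and the variable names are illustrative.

    import numpy as np

    def bb_steepest_descent(grad, x0, n_iter=100, alpha0=1e-3):
        # First BB step: alpha_k = (s.s)/(s.y) with s = x_k - x_{k-1}, y = g_k - g_{k-1}.
        x = np.asarray(x0, dtype=float)
        g = grad(x)
        alpha = alpha0
        for _ in range(n_iter):
            x_new = x - alpha * g
            g_new = grad(x_new)
            s, y = x_new - x, g_new - g
            sty = s @ y
            alpha = (s @ s) / sty if sty > 0 else alpha0  # safeguard nonpositive curvature
            x, g = x_new, g_new
        return x

The step (s.s)/(s.y) is the reciprocal of a Rayleigh quotient of the average Hessian along s, which gives plain steepest descent some of the flavour of a quasi-Newton method.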

Using approximate secant equations in limited memory methods for multilevel unconstrained optimization

The properties of multilevel optimization problems defined on a hierarchy of discretization grids can be used to define approximate secant equations, which describe the second-order behaviour of the objective function. Following earlier work by Gratton and Toint (2009), we introduce a quasi-Newton method (with a linesearch) and a nonlinear conjugate gradient method that both take …
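For reference, the exact secant equation that these approximations emulate relates consecutive iterates and gradients,

\[
B_{k+1} s_k = y_k, \qquad s_k = x_{k+1} - x_k, \qquad y_k = \nabla f(x_{k+1}) - \nabla f(x_k),
\]

and the multilevel idea, as described in the abstract, is to build additional approximate pairs $(s_k, y_k)$ from information available on the coarser discretization grids.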

On nonlinear optimization since 1959

This view of the development of algorithms for nonlinear optimization is based on the research that has been of particular interest to the author since 1959, including several of his own contributions. After a brief survey of classical methods, which may require good starting points in order to converge successfully, the huge impact of variable …

A sufficiently exact inexact Newton step based on reusing matrix information

Newton’s method is a classical method for solving a nonlinear equation $F(z)=0$. We derive inexact Newton steps that lead to an inexact Newton method, applicable near a solution. The method is based on reusing a particular $F'(z_{k'})$ during $p$ consecutive iterations $k=k',k'+1,\dots,k'+p-1$. One such $p$-cycle requires $2^p-1$ solves with the matrix $F'(z_{k'})$. If matrix …
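The abstract's $2^p-1$ solves per cycle indicate a more elaborate correction scheme than plain Jacobian freezing; the sketch below only illustrates the basic idea of reusing one factorization of $F'(z_{k'})$ for $p$ consecutive steps, with assumed names.

    import numpy as np
    from scipy.linalg import lu_factor, lu_solve

    def frozen_jacobian_newton(F, J, z0, p=3, n_cycles=10):
        # Factor the Jacobian once per p-cycle and reuse the LU factors for p steps.
        z = np.asarray(z0, dtype=float)
        for _ in range(n_cycles):
            lu = lu_factor(J(z))            # factor F'(z_{k'}) at the start of the cycle
            for _ in range(p):
                z = z - lu_solve(lu, F(z))  # each step reuses the stored factorization
        return z

Reusing the factorization trades some local convergence speed for a much cheaper cost per iteration; a trade-off of this kind is what motivates reusing matrix information.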

A Collection of 1,300 Dynamical Systems for Testing Data Fitting, Optimal Control, Experimental Design, Identification, Simulation or Similar Software – User’s Guide

We describe a collection of test problems which have been used to develop and test data fitting software for identifying parameters in explicit model functions, dynamical systems of equations, Laplace transformations, systems of ordinary differential equations, differential algebraic equations, or systems of one-dimensional time-dependent partial differential equations with or without algebraic equations. The test cases …

MathOptimizer: A nonlinear optimization package for Mathematica users

Mathematica is an advanced software system that enables symbolic computing, numerics, program code development, model visualization and professional documentation in a unified framework. Our MathOptimizer software package serves to solve global and local optimization models developed using Mathematica. We introduce MathOptimizer’s key features and discuss its usage options that support a range of operational modes. …

Alternating direction algorithms for total variation deconvolution in image reconstruction

Image restoration and reconstruction from blurry and noisy observations is known to be ill-posed. To stabilize the recovery, total variation (TV) regularization was introduced by Rudin, Osher and Fatemi in \cite{LIR92}; it has demonstrated superiority in preserving image edges. However, the nondifferentiability of TV makes the underlying optimization problems difficult to solve. In this paper, …
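A sketch of the standard splitting on which alternating direction methods for TV deconvolution are built (the generic formulation, not necessarily the paper's exact algorithm): with $K$ the blur operator, $f$ the observation and $D_i u$ the discrete gradient of $u$ at pixel $i$, one solves

\[
\min_{u,\,w}\ \sum_i \|w_i\| + \frac{\mu}{2}\,\|Ku - f\|_2^2
\qquad \text{subject to} \qquad w_i = D_i u \ \ \text{for all } i,
\]

and minimizes the augmented Lagrangian alternately: the $w$-subproblem decouples into closed-form shrinkage operations, the $u$-subproblem is a linear system that is diagonalized by FFTs under periodic boundary conditions, and the multipliers are then updated. The splitting moves the nondifferentiable TV term into subproblems that are cheap to solve exactly.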