The relaxed projected gradient method is a well-known scheme for the minimization of a nonlinear cost functional under convex constraints; its analysis is classically performed in a Hilbert space. We generalize this method to functionals that are differentiable in a Banach space. The search direction is computed from a quadratic approximation of the cost functional, following the idea of the projected gradient. This makes it possible, for instance, to perform an $L^2$ gradient method when the cost functional is differentiable only in $L^\infty$. We prove global convergence using Armijo backtracking for the step length selection and allow both the underlying inner product and the scaling of the derivative to change in every iteration. As an application we present a structural topology optimization problem based on a phase field model, where the reduced cost functional is differentiable in $H^1\cap L^\infty$. The presented numerical results using the $H^1$ inner product and a pointwise chosen metric incorporating second order information show the expected mesh independence of the iteration numbers; the latter metric additionally yields a drastic decrease in both iteration numbers and computation time. Moreover, we present numerical results using a BFGS update of the $H^1$ inner product for further optimization problems based on phase field models.
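The basic iteration described above (a projected gradient step globalized by Armijo backtracking) can be sketched as follows in the classical Hilbert-space setting with the fixed Euclidean inner product; the variable inner products and derivative scalings of the paper are not modeled here, and the function names and the box-constrained quadratic test problem are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def projected_gradient(f, grad, proj, x0, alpha0=1.0, sigma=1e-4, beta=0.5,
                       tol=1e-8, max_iter=200):
    """Projected gradient method with Armijo backtracking.

    Minimal Euclidean sketch: the step is x_new = proj(x - alpha * grad(x)),
    and alpha is halved until the Armijo sufficient-decrease condition holds
    along the projected direction d = x_new - x.
    """
    x = x0
    for _ in range(max_iter):
        g = grad(x)
        alpha = alpha0
        while True:
            x_new = proj(x - alpha * g)
            d = x_new - x
            # Armijo condition along the projected direction
            if f(x_new) <= f(x) + sigma * g.dot(d) or alpha < 1e-12:
                break
            alpha *= beta
        if np.linalg.norm(x_new - x) < tol:
            return x_new
        x = x_new
    return x

# Illustrative test problem: a separable quadratic on the box [0, 1]^2,
# where projection is simple componentwise clipping.
A = np.array([[2.0, 0.0], [0.0, 4.0]])
b = np.array([1.0, 6.0])
f = lambda x: 0.5 * x.dot(A).dot(x) - b.dot(x)
grad = lambda x: A.dot(x) - b
proj = lambda x: np.clip(x, 0.0, 1.0)

x_star = projected_gradient(f, grad, proj, np.zeros(2))
# x_star is approximately [0.5, 1.0]: the unconstrained minimizer
# A^{-1} b = [0.5, 1.5] clipped to the box.
```

In the paper's Banach-space generalization, the Euclidean inner product implicit in `x - alpha * g` would be replaced by an iteration-dependent inner product and scaling of the derivative.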
Preprint Nr. 4/2015, Universität Regensburg, Mathematik, (2015)
An extension of the projected gradient method to a Banach space setting with application in structural topology optimization