On Feasibility, Descent and Superlinear Convergence in Inequality Constrained Optimization.
Abstract
Extension of quasi-Newton techniques from unconstrained to constrained optimization via Sequential Quadratic Programming (SQP) presents several difficulties. Among these are the possible inconsistency, away from the solution, of first order approximations to the constraints, resulting in infeasibility of the quadratic programs; and the task of selecting a suitable merit function to induce global convergence. In the case of inequality constrained optimization, both of these difficulties disappear if the algorithm is forced to generate iterates that all satisfy the constraints and that yield monotonically decreasing objective function values. It has recently been shown that this can be achieved while preserving local superlinear convergence. In this note, the essential ingredients for an SQP-based method exhibiting the desired properties are highlighted. Correspondingly, a class of such algorithms is described and analyzed.
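
For context, the quadratic program solved at each SQP iteration typically takes the following standard form (a generic sketch of the usual subproblem, not necessarily the exact formulation analyzed in this note):

\[
\min_{d \in \mathbb{R}^n}\ \nabla f(x_k)^T d + \tfrac{1}{2}\, d^T H_k d
\qquad \text{s.t.} \qquad g_j(x_k) + \nabla g_j(x_k)^T d \le 0, \quad j = 1,\dots,m,
\]

where \(f\) is the objective, \(g_j\) are the inequality constraints, and \(H_k\) is a quasi-Newton approximation to the Hessian of the Lagrangian. Away from the solution, the linearized constraints may admit no solution \(d\), which is the infeasibility referred to above. If, however, every iterate \(x_k\) satisfies \(g_j(x_k) \le 0\), then \(d = 0\) is always feasible for the subproblem, so this difficulty does not arise.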