Browsing by Author "Panier, E.R."
Now showing 1 - 9 of 9
Item: An Active Set Method for Solving Linearly Constrained Nonsmooth Optimization Problems (1987)
Panier, E.R.; ISR
An algorithm for solving linearly constrained optimization problems is proposed. The search direction is computed by a bundle principle and the constraints are treated through an active set strategy. Difficulties that arise when the objective function is nonsmooth require a careful choice of which constraint to relax. A certain nondegeneracy assumption is necessary to obtain convergence.

Item: Aspects of Optimization-Based CADCS (1988)
Tits, A.L.; Fan, Michael K-H.; Panier, E.R.; ISR
With the recent dramatic increase in available computing power, numerical optimization has become an attractive tool for the design of complex engineering systems. Yet widespread use of numerical optimization techniques in design has been hindered by (i) the difficulty of faithfully translating the actual design problem into any kind of rigid mathematical optimization problem, (ii) the inability of classical optimization tools to efficiently take into account the many distinctive features of optimization problems arising in a design context, and (iii) the unavailability of software tools offering the designer a powerful as well as congenial environment supporting such capabilities. In this paper, some aspects of these questions are touched upon and avenues are suggested to address them. In particular, a recently proposed interaction-driven design methodology is briefly described, and numerical optimization schemes satisfying two specific requirements of many design problems are sketched. As an example, the design of a controller for a copolymerization reactor using the Maryland-developed CONSOLE system is considered.

Item: Avoiding the Maratos Effect by Means of a Nonmonotone Line Search: I. General Constrained Problems (1989)
Panier, E.R.; Tits, A.L.; ISR
An essential condition for quasi-Newton optimization methods to converge superlinearly is that a full step of one be taken close to the solution. It is well known that, when dealing with constrained optimization problems, line search schemes ensuring global convergence of such methods may prevent this from occurring (the so-called "Maratos effect"). Two types of techniques have been used to circumvent this difficulty. In the watchdog technique, the full step of one is occasionally accepted even when the line search criterion is violated; subsequent backtracking is used if global convergence appears to be lost. In a "bending" technique proposed by Mayne and Polak, backtracking is avoided by performing a search along an arc whose construction requires evaluation of constraint functions at an auxiliary point; along this arc, the full step of one is accepted close to a solution. The main idea in the present paper is to combine Mayne and Polak's technique with a nonmonotone line search proposed by Grippo, Lampariello and Lucidi in the context of unconstrained optimization, in such a way that, asymptotically, function evaluations are no longer performed at auxiliary points. In a companion paper (part II), it is shown that a refinement of this scheme can be used in the context of recently proposed SQP-based methods generating feasible iterates.

Item: A Globally Convergent Algorithm with Adaptively Refined Discretization for Semi-Infinite Optimization Problems Arising in Engineering Design (1988)
Panier, E.R.; Tits, A.L.; ISR
Optimization problems arising in engineering design often exhibit specific features which, in the interest of computational efficiency, ought to be exploited. Such is the possible presence of 'functional' specifications, i.e., specifications that are to be met over an interval of values of an independent parameter such as time or frequency.
Such problems pertain to semi-infinite optimization. While most of the algorithms that have been proposed for the solution of these problems make use, at each iteration, of a set of local maximizers over the range of the independent parameter, the question of suitably approximating such maximizers is generally left aside. It has been suggested that this issue can be addressed by means of an adaptively refined discretization of the interval of variation of the independent parameter. The algorithm proposed in this paper makes use of such a technique and, by means of a certain memory mechanism, avoids the potential lack of convergence suffered by an existing algorithm.

Item: Globally Convergent Algorithms for Semi-Infinite Optimization Problems Arising in Engineering Design (1987)
Panier, E.R.; Tits, A.L.; ISR
Optimization problems arising in engineering design often exhibit specific features which, in the interest of computational efficiency, ought to be exploited. Such is the possible presence of 'functional' specifications, i.e., specifications that are to be met over an interval of values of an independent parameter such as time or frequency. While problems involving such specifications could be handled by general-purpose nondifferentiable optimization algorithms, the particular structure of functional constraints calls for specific techniques. Suitable schemes have been proposed in the literature. Global convergence is typically achieved by making use of some kind of adaptively refined discretization of the interval of variation of the independent parameter. One previously proposed algorithm exploits the regularity properties of the functions involved to dramatically reduce the computational overhead incurred once the discretization mesh becomes small. In this paper, examples are given showing, however, that if the initial discretization is coarse, convergence to a nonstationary point may occur.
The cause of such failure is investigated, and a class of algorithms is proposed that circumvents this difficulty.

Item: On Feasibility, Descent and Superlinear Convergence in Inequality Constrained Optimization (1989)
Panier, E.R.; Tits, A.L.; ISR
Extension of quasi-Newton techniques from unconstrained to constrained optimization via Sequential Quadratic Programming (SQP) presents several difficulties. Among these are the possible inconsistency, away from the solution, of first-order approximations to the constraints, resulting in infeasibility of the quadratic programs; and the task of selecting a suitable merit function to induce global convergence. In the case of inequality constrained optimization, both of these difficulties disappear if the algorithm is forced to generate iterates that all satisfy the constraints and that yield monotonically decreasing objective function values. It has recently been shown that this can be achieved while preserving local superlinear convergence. In this note, the essential ingredients for an SQP-based method exhibiting the desired properties are highlighted. Correspondingly, a class of such algorithms is described and analyzed.

Item: On the Stability of Polynomials with Uncoupled Perturbations in the Coefficients of Even and Odd Powers (1987)
Panier, E.R.; Fan, Michael K-H.; Tits, A.L.; ISR
In this note, we present some results concerning the stability (in Hurwitz's sense) of a family of polynomials with even and odd coefficients subject to uncoupled perturbations. It is shown that the stability of an appropriate small subset of extreme polynomials guarantees the stability of the entire family. In particular, a polytope of polynomials with the even-odd uncoupling property is stable provided a certain small subset of its vertices is.
For the case of an arbitrary subset of polynomials, this result gives a less conservative sufficient condition than that provided by Kharitonov's theorem.

Item: A Superlinearly Convergent Feasible Method for the Solution of Inequality Constrained Optimization Problems (1985)
Panier, E.R.; Tits, A.; ISR
When iteratively solving optimization problems arising from engineering design applications, it is sometimes crucial that all iterates satisfy a given set of 'hard' inequality constraints, and generally desirable that the objective function value improve at each iteration. In this paper, we propose an algorithm of the successive quadratic programming (SQP) type which, unlike other algorithms of this type, does enjoy such properties. Under mild assumptions, the new algorithm is shown to converge from any initial point, locally superlinearly. In numerical tests, it has proven competitive with the most successful currently available nonlinear programming algorithms, while the latter do not exhibit the desired properties.

Item: A Superlinearly Convergent Method of Feasible Directions for Optimization Problems Arising in the Design of Engineering Systems (1985)
Panier, E.R.; Tits, A.L.; ISR
Optimization problems arising from engineering design often involve the solution of one or several constrained minimax optimization problems. It is sometimes crucial that all iterates constructed when solving such problems satisfy a given set of 'hard' inequality constraints, and generally desirable that the (maximum) objective function value improve at each iteration. In this paper, we propose an algorithm of the sequential quadratic programming (SQP) type that enjoys such properties. This algorithm is inspired by an algorithm recently proposed for the solution of single-objective constrained optimization problems. Preliminary numerical results are very promising.
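The nonmonotone line search of Grippo, Lampariello and Lucidi, combined with Mayne and Polak's bending technique in the Maratos-effect item above, compares each trial point against the worst of the last few objective values instead of the current one, so the full step of one passes the test more often near a solution. A minimal one-dimensional sketch (the test function, memory length, and constants below are illustrative choices, not taken from the paper):

```python
from collections import deque

def nonmonotone_armijo(f, x, d, slope, history, c1=1e-4, shrink=0.5, max_backtracks=30):
    """Grippo-Lampariello-Lucidi acceptance rule: the trial point is
    compared against the maximum of the last few objective values,
    rather than against f(x) alone, so backtracking is triggered
    less often and the Maratos effect can be sidestepped."""
    f_ref = max(history)          # worst recent value is the reference
    alpha = 1.0                   # always try the full step first
    for _ in range(max_backtracks):
        if f(x + alpha * d) <= f_ref + c1 * alpha * slope:
            return alpha
        alpha *= shrink
    return alpha

# Illustrative use on f(x) = x^2 with steepest descent steps.
f = lambda x: x * x
x = 3.0
history = deque([f(x)], maxlen=4)     # remember the last 4 values
for _ in range(5):
    g = 2.0 * x                       # gradient of x^2
    d = -g                            # descent direction
    alpha = nonmonotone_armijo(f, x, d, g * d, history)
    x += alpha * d
    history.append(f(x))
```

Note that with `maxlen=1` the rule collapses to the classical monotone Armijo condition; the nonmonotonicity lives entirely in the length of the memory window.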
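The adaptively refined discretization discussed in the two semi-infinite optimization items replaces a functional constraint phi(x, t) <= 0 over an interval of t by finitely many grid points, refining the grid only near local maximizers of phi in t. A minimal sketch of one refinement step (the constraint phi and the bisection rule below are illustrative assumptions, not the papers' actual scheme, which also involves a memory mechanism):

```python
def refine_near_maximizers(phi, x, grid):
    """One adaptive refinement step for a functional constraint
    phi(x, t) <= 0 over t in [grid[0], grid[-1]]: find grid points
    that are discrete local maximizers of phi in t and bisect the
    intervals on both sides of each, so the mesh becomes fine only
    where the maximizers live."""
    vals = [phi(x, t) for t in grid]
    refined = set(grid)
    for i, v in enumerate(vals):
        left = vals[i - 1] if i > 0 else float("-inf")
        right = vals[i + 1] if i + 1 < len(vals) else float("-inf")
        if v >= left and v >= right:              # discrete local maximizer
            if i > 0:
                refined.add(0.5 * (grid[i - 1] + grid[i]))
            if i + 1 < len(grid):
                refined.add(0.5 * (grid[i] + grid[i + 1]))
    return sorted(refined)

# Illustrative constraint whose maximizer over t sits at t = 0.3,
# strictly between the initial grid points.
phi = lambda x, t: -(t - 0.3) ** 2
grid = [0.0, 0.25, 0.5, 0.75, 1.0]
for _ in range(10):
    grid = refine_near_maximizers(phi, None, grid)
```

After a few rounds the grid clusters around t = 0.3 while staying coarse elsewhere, which is the efficiency argument the abstracts make against uniformly fine discretizations.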
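The vertex results in the polynomial-stability item are of the same flavor as Kharitonov's theorem, which the abstract uses as its benchmark: Hurwitz stability of a few extreme polynomials certifies the whole interval family. A sketch using a pure-Python Routh-Hurwitz test and the four classical Kharitonov vertices (the paper's even-odd vertex subset is smaller and is not reproduced here):

```python
def is_hurwitz(coeffs):
    """Routh-Hurwitz test: True iff the real polynomial with
    descending coefficients `coeffs` (positive leading coefficient,
    no degenerate zero pivot) has all roots in the open left half plane."""
    if any(c <= 0 for c in coeffs):
        return False                      # all coefficients must be positive
    r0, r1 = list(coeffs[0::2]), list(coeffs[1::2])
    while r1:
        if r1[0] <= 0:                    # sign change in the first column
            return False
        nxt = []
        for i in range(len(r0) - 1):      # standard Routh array recurrence
            b = r1[i + 1] if i + 1 < len(r1) else 0.0
            nxt.append((r1[0] * r0[i + 1] - r0[0] * b) / r1[0])
        r0, r1 = r1, nxt
    return True

def kharitonov_vertices(lo, hi):
    """The four Kharitonov vertex polynomials of an interval family,
    built from lower/upper coefficient bounds in ascending powers."""
    patterns = [(0, 0, 1, 1), (1, 1, 0, 0), (0, 1, 1, 0), (1, 0, 0, 1)]
    bounds = (lo, hi)
    return [[bounds[p[i % 4]][i] for i in range(len(lo))] for p in patterns]

# Interval family around (s + 1)^3, coefficients in ascending powers:
# every polynomial in the box is Hurwitz iff the four vertices are.
lo = [0.9, 2.9, 2.9, 0.9]
hi = [1.1, 3.1, 3.1, 1.1]
family_stable = all(is_hurwitz(v[::-1]) for v in kharitonov_vertices(lo, hi))
```

The paper's contribution, per the abstract, is that when the even-power and odd-power coefficients are perturbed independently, an even smaller set of extreme polynomials suffices, giving a less conservative condition than this four-vertex check.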