The derivative-free methods Coordinate search, Nelder–Mead, and COBYLA can internally handle simple bounds and general constraints on global scalar expressions. In practice, this means that in addition to the simple bounds and constraints defined in the Optimization study step, Integral Inequality Constraint and Global Inequality Constraint nodes in Optimization interfaces are accounted for.
The constraint-handling algorithm used in Coordinate search and Nelder–Mead is in principle based on filtering out candidate points in the control variable space that fall outside the feasible region, and to some extent adjusting search directions accordingly. This procedure is not guaranteed to find a constrained local minimum fulfilling the KKT conditions.
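The filtering idea can be sketched as follows (an illustrative toy implementation, not the actual solver code; the function names and single-pass structure are assumptions made for the example):

```python
# Illustrative sketch of constraint handling by filtering: candidate
# points that violate any inequality constraint g(x) <= 0 are simply
# discarded, so the search never leaves the feasible region.

def feasible(x, constraints):
    """True if all inequality constraints g(x) <= 0 hold at x."""
    return all(g(x) <= 0.0 for g in constraints)

def coordinate_search_step(f, x, step, constraints):
    """Try +/- step along each coordinate; accept the best feasible improvement."""
    best_x, best_f = x, f(x)
    for i in range(len(x)):
        for s in (+step, -step):
            cand = list(x)
            cand[i] += s
            # Infeasible candidates are filtered out before comparison.
            if feasible(cand, constraints) and f(cand) < best_f:
                best_x, best_f = cand, f(cand)
    return best_x, best_f

# Minimize f(x) = x0^2 + x1^2 subject to x0 >= 1, i.e. g(x) = 1 - x0 <= 0.
f = lambda x: x[0]**2 + x[1]**2
g = [lambda x: 1.0 - x[0]]
x, fx = coordinate_search_step(f, [2.0, 1.0], 0.5, g)
```

A real solver would repeat such steps with a shrinking step length; the sketch only shows why pure filtering need not satisfy the KKT conditions at termination.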
COBYLA, in contrast, approximates the objective function and constraints in a uniform way. Therefore, provided all functions are sufficiently smooth, it will in general find an approximate constrained local minimum. The returned solution may, however, lie slightly outside the feasible set, in particular when the constraints are nonlinear.
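SciPy ships an implementation of the same COBYLA algorithm, which makes this behavior easy to observe outside COMSOL (the problem below is an illustrative example, not taken from the module):

```python
# COBYLA via SciPy: minimize x^2 + y^2 subject to x + y >= 1.
# The analytical optimum is (0.5, 0.5).
from scipy.optimize import minimize

objective = lambda x: x[0]**2 + x[1]**2
# SciPy convention: an 'ineq' constraint is feasible when fun(x) >= 0.
cons = [{"type": "ineq", "fun": lambda x: x[0] + x[1] - 1.0}]

res = minimize(objective, x0=[2.0, 2.0], method="COBYLA", constraints=cons)

# As noted above, the returned point may violate the constraint by a
# small amount; check the residual explicitly if feasibility matters.
violation = max(0.0, 1.0 - (res.x[0] + res.x[1]))
```

For this linear constraint any violation is tiny; with strongly nonlinear constraints the final residual can be more noticeable.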
BOBYQA handles simple bounds internally, but general constraints only via an external iterative procedure based on repeated minimization of an augmented Lagrangian. This augmented Lagrangian method can also be used as an alternative to the internal penalty methods in the Coordinate search and Nelder–Mead solvers, but it is not selected by default.
In the Settings window for the Optimization study step, you can choose between an Automatic and a Manual definition of the initial Penalty parameter ρ. The automatic setting is the default and computes an initial value based on the objective and constraint function values at the initial point. You can also limit the Maximum number of augmented iterations and select a strategy for updating δ, the tolerance for the subsolver. The options Dynamic I and Dynamic II both tighten the subsolver tolerance from iteration to iteration, the latter providing some additional control; there is also a Manual option. Finally, specify the Constraint tolerance, that is, the maximum allowable constraint violation in the final solution.
The SNOPT algorithm handles constraints of all types efficiently. Constraint handling in this SQP method is based on linearizing the constraints in the outer, major, iteration and using an active-set QP solver in the inner loop to decide which constraints are active, bounding the solution at the current iterate. This process requires accurate evaluation of the gradient of the constraints, also known as the constraint Jacobian.
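SNOPT itself is a commercial code, but SciPy's SLSQP method is also an SQP algorithm and illustrates how a user-supplied constraint Jacobian enters the problem formulation (the problem below is an illustrative example):

```python
# SQP with an analytic constraint Jacobian, using SciPy's SLSQP:
# minimize (x-1)^2 + (y-2.5)^2 subject to x - 2y + 2 >= 0 and x, y >= 0.
import numpy as np
from scipy.optimize import minimize

objective = lambda x: (x[0] - 1.0)**2 + (x[1] - 2.5)**2
cons = [{
    "type": "ineq",
    "fun": lambda x: x[0] - 2.0 * x[1] + 2.0,   # g(x) >= 0
    "jac": lambda x: np.array([1.0, -2.0]),     # analytic constraint Jacobian
}]

res = minimize(objective, x0=[2.0, 0.0], method="SLSQP",
               # Analytic objective gradient; without 'jac' entries the
               # solver would fall back on finite differences.
               jac=lambda x: np.array([2.0 * (x[0] - 1.0),
                                       2.0 * (x[1] - 2.5)]),
               constraints=cons, bounds=[(0, None), (0, None)])
```

The constrained optimum lies on the linearized constraint at (1.4, 1.7). Supplying accurate Jacobians matters for SQP methods because the QP subproblem is built directly from these gradients.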
The MMA algorithm accepts constraints of the same general type as SNOPT, requiring an accurate constraint Jacobian, but handles them differently. In each outer, major, iteration, linear and linearized constraints are combined with a linearized objective function into a convex smooth approximation whose unique optimum is always feasible unless the feasible set is empty. The globally convergent version of MMA implemented in the Optimization Module is conservative in a way that ensures that each major iterate is feasible not only with respect to the linearized constraints, but also with respect to the fully nonlinear constraints.
The Levenberg–Marquardt solver does not support any type of constraints.