archimedes.minimize
- archimedes.minimize(obj: Callable, x0: T, args: Sequence[Any] = (), static_argnames: str | Sequence[str] | None = None, constr: Callable | None = None, bounds: T | None = None, constr_bounds: ArrayLike | None = None, method: str = 'ipopt', **options) → T
Minimize a scalar function with optional constraints.
Solve a nonlinear programming problem of the form
    minimize        f(x, p)
    subject to      lbx <= x <= ubx
                    lbg <= g(x, p) <= ubg
This function provides a simplified interface to nonlinear optimization solvers like IPOPT for solving a single optimization problem.
- Parameters:
  - obj (callable) – Objective function to minimize, with signature `obj(x, *args)`. Must return a scalar value.
  - x0 (array_like) – Initial guess for the optimization.
  - args (tuple, optional) – Extra arguments passed to the objective and constraint functions.
  - static_argnames (tuple of str, optional) – Names of arguments that should be treated as static (non-symbolic) parameters. Static arguments are not differentiated through.
  - constr (callable, optional) – Constraint function with the same signature as `obj`. Must return an array of constraint values, where the constraints are interpreted as `lbg <= constr(x, *args) <= ubg`.
  - bounds (tuple of (array_like, array_like), optional) – Bounds on the decision variables, given as a tuple `(lb, ub)`. Each bound can be either a scalar or an array matching the shape of `x0`. Use `-np.inf` and `np.inf` to specify no bound.
  - constr_bounds (tuple of (array_like, array_like), optional) – Bounds on the constraint values, given as a tuple `(lbg, ubg)`. Each bound can be a scalar or an array matching the shape of the constraint function output. If None and `constr` is provided, defaults to `(0, 0)` for equality constraints.
  - method (str, optional) – The optimization method to use. Default is "ipopt". Other options may be available depending on the installed solver. See the CasADi documentation for available methods.
  - **options (dict) – Additional options passed to the optimization solver through `nlp_solver()`. Available options depend on the solver method. See notes for examples.
- Returns:
  x_opt – The optimal solution found by the solver, with the same shape as `x0`.
- Return type:
  ndarray
Notes
This function dispatches to different optimization methods depending on the `method` argument. By default, it uses the IPOPT interior point optimizer, which is effective for large-scale constrained nonlinear problems. IPOPT requires derivatives of the objective and constraints, which are computed automatically using automatic differentiation.

For IPOPT, see the [CasADi plugin documentation](https://web.casadi.org/python-api/#ipopt) for options that may be passed directly as keyword arguments. However, most IPOPT configuration options documented in the [IPOPT manual](https://coin-or.github.io/Ipopt/OPTIONS.html) must be passed as an `ipopt` dictionary in the `options` argument. For example, typical options for IPOPT can be passed as follows:

```python
ipopt_options = {
    "print_level": 0,
    "max_iter": 100,
    "tol": 1e-6,
}
minimize(obj, x0, ..., method="ipopt", options={"ipopt": ipopt_options})
```
Another common solver is the sequential quadratic programming (SQP) method, available via the "sqpmethod", "blocksqp", or "feasiblesqpmethod" method names. SQP methods require a quadratic programming (QP) solver to solve the QP subproblems, specified via a `qpsol` keyword argument. The default QP solver is "qpoases", but other options include "osqp", "proxqp", and plugins for solvers like "cplex" and "gurobi".

Typical configuration options for "sqpmethod" which may be passed directly as keyword arguments include:

- `hessian_approximation`: "exact" (default, uses automatic differentiation) or "limited-memory" (uses a limited-memory BFGS approximation).
- `max_iter`: Maximum number of SQP iterations (default is 25).
- `qpsol`: QP solver to use (default is "qpoases").
- `qpsol_options`: Options for the QP solver, passed as a dictionary.
- `tol_du`: Stopping tolerance for dual feasibility (default is 1e-6).
- `tol_pr`: Stopping tolerance for primal feasibility (default is 1e-6).
For other configuration options, see the CasADi documentation for [“sqpmethod”](https://web.casadi.org/python-api/#sqpmethod), [“blocksqp”](https://web.casadi.org/python-api/#blocksqp), and [“feasiblesqpmethod”](https://web.casadi.org/python-api/#feasiblesqpmethod). For QP solver options, see the documentation for the specific QP solver (e.g., [OSQP](https://osqp.org/docs/release-0.6.3/interfaces/solver_settings.html) or [qpOASES](https://coin-or.github.io/qpOASES/doc/3.0/doxygen/classOptions.html)).
Note that the “sqpmethod” solver with OSQP is the only combination that supports C code generation.
For repeated optimization with different parameters, use `nlp_solver()` directly to avoid recompilation overhead.
Examples

```python
>>> import numpy as np
>>> import archimedes as arc
>>>
>>> # Unconstrained Rosenbrock function
>>> def f(x):
...     return 100 * (x[1] - x[0]**2)**2 + (1 - x[0])**2
>>>
>>> # Initial guess
>>> x0 = np.array([-1.2, 1.0])
>>>
>>> # Solve unconstrained problem
>>> x_opt = arc.minimize(f, x0)
>>> print(x_opt)
[1. 1.]
>>>
>>> # Constrained optimization
>>> def g(x):
...     g1 = (x[0] - 1)**3 - x[1] + 1
...     g2 = x[0] + x[1] - 2
...     return np.array([g1, g2], like=x)
>>>
>>> # Solve with inequality constraints: g(x) <= 0
>>> x_opt = arc.minimize(
...     f,
...     x0=np.array([2.0, 0.0]),
...     constr=g,
...     constr_bounds=(-np.inf, 0),
... )
>>> print(x_opt)
[0.99998266 1.00000688]
>>>
>>> # Optimization with variable bounds
>>> x_opt = arc.minimize(
...     f,
...     x0=np.array([0.0, 0.0]),
...     bounds=(np.array([0.0, 0.0]), np.array([0.5, 1.5])),
... )
>>> print(x_opt)
[0.50000001 0.25000001]
>>>
>>> # Solving with sequential quadratic programming using OSQP
>>> x_opt = arc.minimize(
...     f,
...     x0=np.array([0.0, 0.0]),
...     constr=g,
...     constr_bounds=(-np.inf, 0),
...     method="sqpmethod",
...     qpsol="osqp",
...     qpsol_options={"err_abs": 1e-6, "err_rel": 1e-6},
...     tol_du=1e-3,
...     tol_pr=1e-3,
...     max_iter=10,
...     hessian_approximation="limited-memory",
... )
```
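As an added illustration (not part of the original examples), an equality constraint can rely on the documented default `constr_bounds=(0, 0)`, which enforces `constr(x) = 0`:

```python
>>> # Equality constraint x[0] + x[1] = 1 via the default constr_bounds=(0, 0)
>>> def h(x):
...     return np.array([x[0] + x[1] - 1], like=x)
>>>
>>> x_opt = arc.minimize(f, x0=np.array([0.0, 0.0]), constr=h)
```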
See also
- nlp_solver – Create a reusable solver for nonlinear optimization
- root – Find the roots of a nonlinear function
- implicit – Create a function that solves `F(x, p) = 0` for `x`
- scipy.optimize.minimize – SciPy's optimization interface