archimedes.minimize
- archimedes.minimize(obj: Callable, x0: T, args: Sequence[Any] = (), static_argnames: str | Sequence[str] | None = None, constr: Callable | None = None, bounds: tuple[T, T] | None = None, constr_bounds: ArrayLike | None = None, method: str = 'ipopt', options: dict | None = None) → OptimizeResult
Minimize a scalar function with optional constraints and PyTree support.
Solve a nonlinear programming problem of the form:

```
minimize      f(x, p)
subject to    lbx <= x <= ubx
              lbg <= g(x, p) <= ubg
```
This function provides a unified interface to multiple optimization methods, including CasADi-based solvers (IPOPT, SQP) and SciPy optimizers (BFGS, L-BFGS-B), with automatic differentiation and native support for PyTree parameter structures.
- Key Features:
  - PyTree Support: Parameters can be nested dictionaries, dataclasses, or any PyTree structure
  - Automatic Differentiation: Exact gradients and Hessians (as needed) computed efficiently and automatically
- Parameters:
  obj (callable) – Objective function to minimize, with signature obj(x, *args). Must return a scalar value. The x argument can be a PyTree matching the structure of x0.
  x0 (PyTree) – Initial guess for the optimization. Can be a flat array, nested dictionary, dataclass, or any PyTree structure. The solution will preserve this structure.
Examples:

```python
# Flat array
x0 = np.array([1.0, 2.0])

# Nested dictionary (preserved in solution)
x0 = {"mass": 1.0, "damping": {"c1": 0.1, "c2": 0.2}}

# Dataclass-like nested structure
@archimedes.struct.pytree_node
class Params:
    mass: float
    stiffness: float

x0 = Params(mass=1.0, stiffness=100.0)
```
args (tuple, optional) – Extra arguments passed to the objective and constraint functions.
static_argnames (tuple of str, optional) – Names of arguments that should be treated as static (non-symbolic) parameters. Static arguments are not differentiated through and trigger solver recompilation when changed. Useful for discrete parameters.
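For example, an integer exponent might be treated as static (a minimal sketch; the objective and argument name here are hypothetical, not part of the API):

```python
import numpy as np
import archimedes as arc

# Hypothetical objective with a discrete exponent "order".
# Marking "order" static keeps it out of the symbolic trace,
# so it is not differentiated through.
def obj(x, order):
    return np.sum(x ** order)

result = arc.optimize.minimize(
    obj,
    x0=np.array([1.0, 2.0]),
    args=(2,),                   # value for the extra argument
    static_argnames=("order",),  # changing it triggers recompilation
)
```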
constr (callable, optional) – Constraint function with the same signature as obj. Must return an array of constraint values, where the constraints are interpreted as lbg <= constr(x, *args) <= ubg. Note: SciPy methods (BFGS, L-BFGS-B) do not support general constraints.
bounds (tuple, optional) – Bounds on the decision variables, given as a tuple (lb, ub). Each bound must have the same PyTree structure as x0. Use -np.inf and np.inf for unbounded variables. Examples:
```python
# Array bounds
bounds = (np.array([0.0, -1.0]), np.array([10.0, 1.0]))

# PyTree bounds (matching x0 structure)
bounds = (
    {"mass": 0.1, "damping": {"c1": 0.0, "c2": 0.0}},   # lower bounds
    {"mass": 10.0, "damping": {"c1": 1.0, "c2": 1.0}},  # upper bounds
)
```
constr_bounds (tuple of (array_like, array_like), optional) – Bounds on the constraint values, given as a tuple (lbg, ubg). If None and constr is provided, defaults to (0, 0) for equality constraints.
method (str, optional) – The optimization method to use. Default is "ipopt". Available methods:
- CasADi Methods (support constraints):
  - "ipopt" (default): Interior point method, excellent for large constrained problems
  - "sqpmethod": Sequential quadratic programming
  - "blocksqp": Block-structured SQP for large problems
  - "feasiblesqpmethod": SQP with feasibility restoration
- SciPy Methods (unconstrained or box-constrained only):
  - "BFGS": Quasi-Newton method with automatic exact gradients
  - "L-BFGS-B": Limited-memory BFGS with box constraints
options (dict, optional) – Method-specific options. See Notes section for details.
- Returns:
  result – Optimization result with PyTree structure preserved:
  - x: Solution parameters (same PyTree structure as x0)
  - success: Whether optimization succeeded
  - message: Descriptive termination message
  - fun: Final objective value
  - nfev: Number of function evaluations
  - Additional method-specific fields
- Return type:
OptimizeResult
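For example (a sketch using only the fields listed above, with obj and x0 as placeholders):

```python
result = arc.optimize.minimize(obj, x0)
if result.success:
    print(f"{result.message}: f = {result.fun} ({result.nfev} evaluations)")
else:
    print(f"Optimization failed: {result.message}")
```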
Notes
Method Selection Guide:
- Constrained problems: Use "ipopt" (default) or SQP methods
- Unconstrained problems: Use "BFGS" for fast convergence
- Box constraints only: Use "L-BFGS-B" for memory efficiency
- Least-squares problems: Use least_squares() with method="hess-lm" for the specialized Levenberg-Marquardt algorithm (see the sketch below)
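For the least-squares case, a call might look like the following sketch (the residual function and data are hypothetical, and least_squares is assumed to share minimize's basic calling convention; see its documentation for the exact signature):

```python
# Hypothetical residuals: drive x toward fixed target values
targets = np.array([1.0, 2.0, 3.0])

def residuals(x):
    return x - targets  # one residual per target

result = arc.optimize.least_squares(residuals, x0=np.zeros(3), method="hess-lm")
```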
Automatic Differentiation:
All methods use exact gradients computed via automatic differentiation. For CasADi methods, both gradients and Hessians are computed exactly. For SciPy methods, gradients are exact and Hessians are approximated using BFGS.
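In practice this means no derivative callbacks are ever supplied; the solver derives them from the objective's definition. A minimal sketch:

```python
def obj(x):
    return (x[0] - 1.0) ** 2 + 0.5 * (x[1] + 2.0) ** 2

# Gradients (and Hessians, for CasADi methods) are generated
# automatically; there is no jac= or hess= argument to pass.
result = arc.optimize.minimize(obj, x0=np.array([0.0, 0.0]), method="BFGS")
```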
PyTree Structure Preservation:
The solution result.x maintains the exact same nested structure as the initial guess x0. This enables natural parameter organization for complex models:

```python
# Initial parameters
params = {"dynamics": {"mass": 1.0, "damping": 0.1}, "controller": {"kp": 1.0}}

# After optimization - same structure
result = minimize(objective, params)
final_params = result.x  # Has same nested structure
print(final_params["dynamics"]["mass"])  # Optimized mass value
```
IPOPT Configuration (method="ipopt"):
Common IPOPT options are passed via options={"ipopt": {...}}:

```python
ipopt_opts = {
    "print_level": 0,        # Suppress output
    "max_iter": 100,         # Maximum iterations
    "tol": 1e-6,             # Convergence tolerance
    "acceptable_tol": 1e-4,  # Acceptable tolerance
}
result = minimize(obj, x0, method="ipopt", options={"ipopt": ipopt_opts})
```
SQP Configuration (method="sqpmethod"):
Common SQP options are passed directly via options:

```python
options = {
    "qpsol": "osqp",                   # QP solver
    "hessian_approximation": "exact",  # Use exact Hessian
    "max_iter": 50,                    # SQP iterations
    "tol_pr": 1e-6,                    # Primal feasibility tolerance
    "tol_du": 1e-6,                    # Dual feasibility tolerance
}
result = minimize(obj, x0, method="sqpmethod", options=options)
```
SciPy Integration (method="BFGS" or "L-BFGS-B"):
SciPy methods receive exact gradients via autodiff and support standard SciPy options:

```python
result = minimize(
    obj, x0,
    method="BFGS",
    options={"gtol": 1e-8, "maxiter": 100},
)
```
Examples
Basic Usage:

>>> import numpy as np
>>> import archimedes as arc
>>>
>>> # Rosenbrock function with PyTree parameters
>>> def rosenbrock(params):
...     x, y = params["x"], params["y"]
...     return 100 * (y - x ** 2) ** 2 + (1 - x) ** 2
>>>
>>> # PyTree initial guess
>>> x0 = {"x": 2.0, "y": 1.0}
>>>
>>> # Unconstrained optimization
>>> result = arc.optimize.minimize(rosenbrock, x0, method="BFGS")
>>> print(result.x)  # Preserves PyTree structure
{'x': 0.9999999999999994, 'y': 0.9999999999999989}
Constrained Optimization:

>>> # Add constraints
>>> def constraint(params):
...     x, y = params["x"], params["y"]
...     return x + y - 1.5  # x + y >= 1.5
>>>
>>> # Solve with inequality constraint
>>> result = arc.optimize.minimize(
...     rosenbrock, x0,
...     constr=constraint,
...     constr_bounds=(0.0, np.inf),  # g(x) >= 0.0
...     method="ipopt",
... )
Box Constraints with PyTree:

>>> # PyTree bounds matching x0 structure
>>> bounds = (
...     {"x": 0.0, "y": 0.0},  # Lower bounds
...     {"x": 2.0, "y": 2.0},  # Upper bounds
... )
>>> result = arc.optimize.minimize(rosenbrock, x0, bounds=bounds)
See also

least_squares
    Specialized Levenberg-Marquardt solver for residual minimization
nlp_solver
    Create a reusable solver for repeated optimization
root
    Find roots of nonlinear equations
scipy.optimize.minimize
    SciPy's optimization interface