simsopt.solve package
- simsopt.solve.least_squares_mpi_solve(prob: LeastSquaresProblem, mpi: MpiPartition, grad: bool = False, abs_step: float = 1e-07, rel_step: float = 0.0, diff_method: str = 'forward', **kwargs)
Solve a nonlinear-least-squares minimization problem using MPI. All MPI processes (including group leaders and workers) should call this function.
- Parameters
prob – Optimizable object defining the objective function(s) and parameter space.
mpi – A MpiPartition object, storing the information about how the pool of MPI processes is divided into worker groups.
grad – Whether to use a gradient-based optimization algorithm, as opposed to a gradient-free algorithm. If unspecified, a gradient-free algorithm will be used by default. If you set grad=True, finite-difference gradients will be used.
abs_step – Absolute step size for finite-difference Jacobian evaluation.
rel_step – Relative step size for finite-difference Jacobian evaluation.
diff_method – Differentiation strategy. Options are "centered" and "forward". If "centered", centered finite differences will be used; if "forward", one-sided finite differences will be used. Otherwise an error is raised.
kwargs – Any arguments to pass to scipy.optimize.least_squares. For instance, you can supply max_nfev=100 to set the maximum number of function evaluations (not counting finite-difference gradient evaluations) to 100. Or, you can supply method to choose the optimization algorithm.
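Since abs_step, rel_step, and diff_method together control how finite-difference gradients are formed, a small self-contained sketch can make their roles concrete. The step-combination rule used here (abs_step + rel_step·|x|) is an illustrative assumption, not necessarily simsopt's internal formula; fd_jacobian is a hypothetical helper, not part of the simsopt API.

```python
def fd_jacobian(f, x, abs_step=1e-7, rel_step=0.0, diff_method="forward"):
    """Finite-difference Jacobian of a residual function f at point x.

    The per-variable step combines abs_step and rel_step as
    abs_step + rel_step * |x_j| (an assumed, conventional rule).
    diff_method selects one-sided ("forward") or centered differences.
    """
    n = len(x)
    f0 = f(x)
    m = len(f0)
    jac = [[0.0] * n for _ in range(m)]
    for j in range(n):
        h = abs_step + rel_step * abs(x[j])
        xp = list(x)
        xp[j] += h
        fp = f(xp)
        if diff_method == "forward":
            # One-sided difference: one extra residual evaluation per variable.
            for i in range(m):
                jac[i][j] = (fp[i] - f0[i]) / h
        elif diff_method == "centered":
            # Centered difference: two evaluations per variable, higher accuracy.
            xm = list(x)
            xm[j] -= h
            fm = f(xm)
            for i in range(m):
                jac[i][j] = (fp[i] - fm[i]) / (2 * h)
        else:
            raise ValueError(f"Unknown diff_method: {diff_method}")
    return jac

# Toy residuals r(x) = [x0**2 - 1, x0*x1]; the exact Jacobian at (2, 3)
# is [[4, 0], [3, 2]].
resid = lambda x: [x[0] ** 2 - 1, x[0] * x[1]]
J = fd_jacobian(resid, [2.0, 3.0], diff_method="centered")
```

Note the trade-off this illustrates: "centered" costs twice as many residual evaluations per variable but has smaller truncation error, which is why both options are exposed.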
- simsopt.solve.least_squares_serial_solve(prob: LeastSquaresProblem, grad: Optional[bool] = None, abs_step: float = 1e-07, rel_step: float = 0.0, diff_method: str = 'forward', **kwargs)
Solve a nonlinear-least-squares minimization problem using scipy.optimize, and without using any parallelization.
- Parameters
prob – LeastSquaresProblem object defining the objective function(s) and parameter space.
grad – Whether to use a gradient-based optimization algorithm, as opposed to a gradient-free algorithm. If unspecified, a gradient-free algorithm will be used by default. If you set grad=True for a problem, finite-difference gradients will be used.
abs_step – Absolute step size for finite-difference Jacobian evaluation.
rel_step – Relative step size for finite-difference Jacobian evaluation.
diff_method – Differentiation strategy. Options are "centered" and "forward". If "centered", centered finite differences will be used; if "forward", one-sided finite differences will be used. Otherwise an error is raised.
kwargs – Any arguments to pass to scipy.optimize.least_squares. For instance, you can supply max_nfev=100 to set the maximum number of function evaluations (not counting finite-difference gradient evaluations) to 100. Or, you can supply method to choose the optimization algorithm.
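Because kwargs are forwarded straight to scipy.optimize.least_squares, the same options can be explored on a plain scipy call. A minimal sketch, assuming scipy is installed; the residual function and data here are illustrative, not a simsopt example:

```python
import numpy as np
from scipy.optimize import least_squares

# Fit y = a*x + b to exact data; the least-squares minimum is a=2, b=1.
xdata = np.array([0.0, 1.0, 2.0, 3.0])
ydata = 2.0 * xdata + 1.0

def residuals(p):
    a, b = p
    return a * xdata + b - ydata

# max_nfev and method are exactly the kind of keyword arguments that
# least_squares_serial_solve passes through **kwargs.
result = least_squares(residuals, x0=[0.0, 0.0], max_nfev=100, method="lm")
```

Here result.x recovers the fit parameters; any other scipy.optimize.least_squares keyword (tolerances, bounds for the default "trf" method, etc.) can be forwarded the same way.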
- simsopt.solve.serial_solve(prob: Union[Optimizable, Callable], grad: Optional[bool] = None, abs_step: float = 1e-07, rel_step: float = 0.0, diff_method: str = 'centered', **kwargs)
Solve a general minimization problem (i.e. one that need not be of least-squares form) using scipy.optimize.minimize, and without using any parallelization.
- Parameters
prob – Optimizable object defining the objective function(s) and parameter space.
grad – Whether to use a gradient-based optimization algorithm, as opposed to a gradient-free algorithm. If unspecified, a gradient-based algorithm will be used if prob has gradient information available, otherwise a gradient-free algorithm will be used by default. If you set grad=True for a problem in which gradient information is not available, finite-difference gradients will be used.
abs_step – Absolute step size for finite-difference Jacobian evaluation.
rel_step – Relative step size for finite-difference Jacobian evaluation.
diff_method – Differentiation strategy. Options are "centered" and "forward". If "centered", centered finite differences will be used; if "forward", one-sided finite differences will be used. Otherwise an error is raised.
kwargs – Any arguments to pass to scipy.optimize.minimize. For instance, you can supply method to choose the optimization algorithm.
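Since serial_solve forwards kwargs to scipy.optimize.minimize, the method keyword is where the gradient-based vs gradient-free choice surfaces. A minimal sketch, assuming scipy is installed; the Rosenbrock objective is illustrative, not a simsopt problem:

```python
from scipy.optimize import minimize

# A general (non-least-squares) objective: the Rosenbrock function,
# whose minimum is at (1, 1).
def rosen(p):
    x, y = p
    return (1 - x) ** 2 + 100 * (y - x ** 2) ** 2

# `method` selects the algorithm: "Nelder-Mead" is gradient-free, while
# "BFGS" uses gradients (finite-difference ones when no jac is supplied).
res_free = minimize(rosen, [0.0, 0.0], method="Nelder-Mead")
res_grad = minimize(rosen, [0.0, 0.0], method="BFGS")
```

Both runs converge near (1, 1); the gradient-based run typically needs far fewer objective evaluations, which is the motivation for the grad parameter above.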