simsopt.solve package

Submodules

simsopt.solve.mpi module

This module provides the main function least_squares_mpi_solve, along with some helper functions that support its operation.

simsopt.solve.mpi._mpi_workers_task(mpi: simsopt.util.mpi.MpiPartition, prob: simsopt._core.optimizable.Optimizable)

This function is called by worker processes when MpiPartition.workers_loop() receives a signal to do something.

Parameters
  • mpi – A simsopt.util.mpi.MpiPartition object, storing the information about how the pool of MPI processes is divided into worker groups.

  • prob – Optimizable object whose functions the worker processes will evaluate.

simsopt.solve.mpi.least_squares_mpi_solve(prob: simsopt.objectives.least_squares.LeastSquaresProblem, mpi: simsopt.util.mpi.MpiPartition, grad: bool = False, abs_step: float = 1e-07, rel_step: float = 0.0, diff_method: str = 'forward', **kwargs)

Solve a nonlinear-least-squares minimization problem using MPI. All MPI processes (including group leaders and workers) should call this function.

Parameters
  • prob – Optimizable object defining the objective function(s) and parameter space.

  • mpi – A MpiPartition object, storing the information about how the pool of MPI processes is divided into worker groups.

  • grad – Whether to use a gradient-based optimization algorithm, as opposed to a gradient-free algorithm. If unspecified, a gradient-free algorithm will be used by default. If you set grad=True, finite-difference gradients will be used.

  • abs_step – Absolute step size for finite-difference Jacobian evaluation.

  • rel_step – Relative step size for finite-difference Jacobian evaluation.

  • diff_method – Differentiation strategy. Options are "centered" and "forward". If "centered", centered finite differences will be used; if "forward", one-sided finite differences will be used. Any other value raises an error.

  • kwargs – Any arguments to pass to scipy.optimize.least_squares. For instance, you can supply max_nfev=100 to set the maximum number of function evaluations (not counting finite-difference gradient evaluations) to 100, or you can supply method to choose the optimization algorithm. A usage sketch follows this parameter list.
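
For orientation, here is a minimal sketch of a typical call, run under mpiexec with several processes. The Rosenbrock test objective with its term1/term2 residual methods, and the LeastSquaresProblem.from_tuples constructor, are assumed here and may differ across simsopt versions:

    from simsopt.util.mpi import MpiPartition
    from simsopt.objectives.least_squares import LeastSquaresProblem
    from simsopt.objectives.functions import Rosenbrock  # assumed test objective
    from simsopt.solve.mpi import least_squares_mpi_solve

    # Split the MPI processes into two worker groups; finite-difference
    # evaluations are farmed out across the groups.
    mpi = MpiPartition(ngroups=2)

    # Build the least-squares objective from (function, goal, weight) tuples.
    rosen = Rosenbrock()
    prob = LeastSquaresProblem.from_tuples([(rosen.term1, 0, 1),
                                            (rosen.term2, 0, 1)])

    # Every MPI process calls the solver. diff_method="centered" uses
    # (f(x + h) - f(x - h)) / (2h) per component; "forward" would use
    # (f(x + h) - f(x)) / h.
    least_squares_mpi_solve(prob, mpi, grad=True, diff_method="centered")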

simsopt.solve.serial module

This module provides the least_squares_serial_solve and serial_solve functions for solving optimization problems without parallelization.

simsopt.solve.serial.least_squares_serial_solve(prob: simsopt.objectives.least_squares.LeastSquaresProblem, grad: Optional[bool] = None, abs_step: float = 1e-07, rel_step: float = 0.0, diff_method: str = 'forward', **kwargs)

Solve a nonlinear-least-squares minimization problem using scipy.optimize, without any parallelization.

Parameters
  • prob – LeastSquaresProblem object defining the objective function(s) and parameter space.

  • grad – Whether to use a gradient-based optimization algorithm, as opposed to a gradient-free algorithm. If unspecified, a gradient-free algorithm will be used by default. If you set grad=True, finite-difference gradients will be used.

  • abs_step – Absolute step size for finite-difference Jacobian evaluation.

  • rel_step – Relative step size for finite-difference Jacobian evaluation.

  • diff_method – Differentiation strategy. Options are "centered" and "forward". If "centered", centered finite differences will be used; if "forward", one-sided finite differences will be used. Any other value raises an error.

  • kwargs – Any arguments to pass to scipy.optimize.least_squares. For instance, you can supply max_nfev=100 to set the maximum number of function evaluations (not counting finite-difference gradient evaluations) to 100, or you can supply method to choose the optimization algorithm. A usage sketch follows this parameter list.
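
A corresponding serial sketch, under the same assumptions about the Rosenbrock test objective as in the MPI example above:

    from simsopt.objectives.least_squares import LeastSquaresProblem
    from simsopt.objectives.functions import Rosenbrock  # assumed test objective
    from simsopt.solve.serial import least_squares_serial_solve

    rosen = Rosenbrock()
    prob = LeastSquaresProblem.from_tuples([(rosen.term1, 0, 1),
                                            (rosen.term2, 0, 1)])

    # grad=True requests finite-difference gradients; max_nfev is passed
    # through to scipy.optimize.least_squares.
    least_squares_serial_solve(prob, grad=True, max_nfev=100)
    print(prob.x)  # degrees of freedom after optimization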

simsopt.solve.serial.serial_solve(prob: Union[simsopt._core.optimizable.Optimizable, Callable], grad: Optional[bool] = None, abs_step: float = 1e-07, rel_step: float = 0.0, diff_method: str = 'centered', **kwargs)

Solve a general minimization problem (i.e. one that need not be of least-squares form) using scipy.optimize.minimize, without any parallelization.

Parameters
  • prob – Optimizable object defining the objective function(s) and parameter space.

  • grad – Whether to use a gradient-based optimization algorithm, as opposed to a gradient-free algorithm. If unspecified, a gradient-based algorithm will be used if prob has gradient information available; otherwise a gradient-free algorithm will be used by default. If you set grad=True for a problem in which gradient information is not available, finite-difference gradients will be used.

  • abs_step – Absolute step size for finite-difference Jacobian evaluation.

  • rel_step – Relative step size for finite-difference Jacobian evaluation.

  • diff_method – Differentiation strategy. Options are "centered" and "forward". If "centered", centered finite differences will be used; if "forward", one-sided finite differences will be used. Any other value raises an error.

  • kwargs – Any arguments to pass to scipy.optimize.minimize. For instance, you can supply method to choose the optimization algorithm, or options={'maxiter': 100} to limit the number of iterations to 100. A usage sketch follows this parameter list.
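
A sketch for the general solver. The Quadratic class below is hypothetical (not part of simsopt), and it assumes the common simsopt conventions that an Optimizable exposes its degrees of freedom as prob.x and its scalar objective as J(); check your simsopt version for the exact subclassing API:

    import numpy as np
    from simsopt._core.optimizable import Optimizable
    from simsopt.solve.serial import serial_solve

    class Quadratic(Optimizable):
        """Hypothetical objective: J(x) = (x0 - 1)^2 + (x1 - 1)^2."""

        def __init__(self):
            super().__init__(x0=np.zeros(2))

        def J(self):
            return float(np.sum((self.x - 1.0) ** 2))

        # Some simsopt versions evaluate the problem by calling it directly.
        def __call__(self):
            return self.J()

    prob = Quadratic()
    # Gradient-free by default; method is forwarded to scipy.optimize.minimize.
    serial_solve(prob, method="Nelder-Mead")
    print(prob.x)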

Module contents

simsopt.solve.least_squares_serial_solve(prob: simsopt.objectives.least_squares.LeastSquaresProblem, grad: Optional[bool] = None, abs_step: float = 1e-07, rel_step: float = 0.0, diff_method: str = 'forward', **kwargs)

Solve a nonlinear-least-squares minimization problem using scipy.optimize, without any parallelization.

Parameters
  • prob – LeastSquaresProblem object defining the objective function(s) and parameter space.

  • grad – Whether to use a gradient-based optimization algorithm, as opposed to a gradient-free algorithm. If unspecified, a gradient-free algorithm will be used by default. If you set grad=True, finite-difference gradients will be used.

  • abs_step – Absolute step size for finite-difference Jacobian evaluation.

  • rel_step – Relative step size for finite-difference Jacobian evaluation.

  • diff_method – Differentiation strategy. Options are "centered" and "forward". If "centered", centered finite differences will be used; if "forward", one-sided finite differences will be used. Any other value raises an error.

  • kwargs – Any arguments to pass to scipy.optimize.least_squares. For instance, you can supply max_nfev=100 to set the maximum number of function evaluations (not counting finite-difference gradient evaluations) to 100, or you can supply method to choose the optimization algorithm.

simsopt.solve.serial_solve(prob: Union[simsopt._core.optimizable.Optimizable, Callable], grad: Optional[bool] = None, abs_step: float = 1e-07, rel_step: float = 0.0, diff_method: str = 'centered', **kwargs)

Solve a general minimization problem (i.e. one that need not be of least-squares form) using scipy.optimize.minimize, without any parallelization.

Parameters
  • prob – Optimizable object defining the objective function(s) and parameter space.

  • grad – Whether to use a gradient-based optimization algorithm, as opposed to a gradient-free algorithm. If unspecified, a gradient-based algorithm will be used if prob has gradient information available; otherwise a gradient-free algorithm will be used by default. If you set grad=True for a problem in which gradient information is not available, finite-difference gradients will be used.

  • abs_step – Absolute step size for finite-difference Jacobian evaluation.

  • rel_step – Relative step size for finite-difference Jacobian evaluation.

  • diff_method – Differentiation strategy. Options are "centered" and "forward". If "centered", centered finite differences will be used; if "forward", one-sided finite differences will be used. Any other value raises an error.

  • kwargs – Any arguments to pass to scipy.optimize.minimize. For instance, you can supply method to choose the optimization algorithm, or options={'maxiter': 100} to limit the number of iterations to 100.