simsopt.objectives package
Submodules
simsopt.objectives.fluxobjective module
- class simsopt.objectives.fluxobjective.CoilOptObjective(Jfluxs, Jcls=[], alpha=0.0, Jdist=None, beta=0.0)
Bases:
simsopt._core.graph_optimizable.Optimizable
Objective combining a single simsopt.objectives.fluxobjective.SquaredFlux, or a list of them, with a list of curve objectives and a distance objective, forming the basis of a classic stage II optimization problem. The objective functions are combined into a single scalar function using the weights alpha and beta.
If a single simsopt.objectives.fluxobjective.SquaredFlux is given, the objective is
\[J = \mathrm{Jflux} + \alpha \sum_k \mathrm{Jcls}_k + \beta \mathrm{Jdist}.\]
If a list of n simsopt.objectives.fluxobjective.SquaredFlux objects is given, the objective is
\[J = \frac{1}{n} \sum_{i=1}^n \mathrm{Jflux}_i + \alpha \sum_k \mathrm{Jcls}_k + \beta \mathrm{Jdist}.\]
The latter case is useful for stochastic optimization.
- Parameters
Jfluxs – A single simsopt.objectives.fluxobjective.SquaredFlux or a list of them.
Jcls – Typically a list of simsopt.geo.curveobjectives.CurveLength, though any list of objectives that have J() and dJ() functions is fine.
alpha – The scalar weight in front of the objectives in Jcls.
Jdist – Typically a simsopt.geo.curveobjectives.MinimumDistance, though any objective that has J() and dJ() functions is fine.
beta – The scalar weight in front of the objective in Jdist.
- J()
- dJ(*args, partials=False, **kwargs)
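The weighted combination above can be sketched in plain Python. This is a minimal mock that does not use simsopt itself; MockObjective and combined_objective are hypothetical stand-ins for any objects exposing a J() method, chosen only to illustrate the formula J = (1/n) Σ Jflux_i + α Σ Jcls_k + β Jdist.

```python
# Minimal sketch of the combined stage II objective; MockObjective is a
# hypothetical stand-in for any objective that exposes a J() method.

class MockObjective:
    """Stand-in for a SquaredFlux, CurveLength, or MinimumDistance term."""
    def __init__(self, value):
        self._value = value

    def J(self):
        return self._value


def combined_objective(Jfluxs, Jcls=(), alpha=0.0, Jdist=None, beta=0.0):
    """Mimics the documented combination: mean flux term plus weighted penalties."""
    if not isinstance(Jfluxs, (list, tuple)):
        Jfluxs = [Jfluxs]
    J = sum(o.J() for o in Jfluxs) / len(Jfluxs)  # (1/n) sum_i Jflux_i
    J += alpha * sum(o.J() for o in Jcls)         # alpha * sum_k Jcls_k
    if Jdist is not None:
        J += beta * Jdist.J()                     # beta * Jdist
    return J


fluxes = [MockObjective(1.0), MockObjective(3.0)]   # mean flux term = 2.0
lengths = [MockObjective(10.0)]                     # alpha term = 0.1 * 10 = 1.0
dist = MockObjective(4.0)                           # beta term = 0.5 * 4 = 2.0
print(combined_objective(fluxes, lengths, alpha=0.1, Jdist=dist, beta=0.5))  # 5.0
```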
- class simsopt.objectives.fluxobjective.SquaredFlux(surface, field, target=None)
Bases:
simsopt._core.graph_optimizable.Optimizable
Objective representing the quadratic flux of a field on a surface, that is
\[\frac12 \int_{S} (\mathbf{B}\cdot \mathbf{n} - B_T)^2 ds\]where \(\mathbf{n}\) is the surface unit normal vector and \(B_T\) is an optional (zero by default) target value for the magnetic field.
- Parameters
surface – A simsopt.geo.surface.Surface object on which to compute the flux.
field – A simsopt.field.magneticfield.MagneticField for which to compute the flux.
target – An nphi x ntheta numpy array containing target values for the flux. Here nphi and ntheta correspond to the number of quadrature points on the surface in the phi and theta directions.
- J()
- dJ(*args, partials=False, **kwargs)
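The surface integral above can be sketched as a simple quadrature sum over the nphi x ntheta grid. This is a hedged illustration, not the simsopt implementation: Bdotn, area, and target are hypothetical precomputed grids, with area carrying the surface element ds at each quadrature point.

```python
# Minimal sketch of the quadratic-flux quadrature
# (1/2) \int_S (B.n - B_T)^2 ds, approximated on an nphi x ntheta grid.
# Bdotn (values of B.n) and area (ds weights) are assumed precomputed.

def squared_flux(Bdotn, area, target=None):
    """Sum (1/2)(B.n - B_T)^2 over quadrature points, weighted by ds."""
    nphi, ntheta = len(Bdotn), len(Bdotn[0])
    total = 0.0
    for i in range(nphi):
        for j in range(ntheta):
            BT = target[i][j] if target is not None else 0.0  # target defaults to zero
            total += 0.5 * (Bdotn[i][j] - BT) ** 2 * area[i][j]
    return total


# 2 x 2 grid with unit area elements and zero target:
Bdotn = [[1.0, 2.0], [0.0, -1.0]]
area = [[1.0, 1.0], [1.0, 1.0]]
print(squared_flux(Bdotn, area))  # 0.5 * (1 + 4 + 0 + 1) = 3.0
```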
simsopt.objectives.functions module
This module provides a few minimal optimizable objects, each representing a function. These functions are mostly used for testing.
- class simsopt.objectives.functions.Adder(n=3)
Bases:
simsopt._core.optimizable.Optimizable
This class defines a minimal object that can be optimized. It has n degrees of freedom, and has a function that just returns the sum of these dofs. This class is used for testing.
- J()
Returns the sum of the degrees of freedom.
- dJ()
- property df
Same as the function dJ(), but a property instead of a function.
- property f
Same as the function J(), but a property instead of a function.
- get_dofs()
This base Optimizable object has no degrees of freedom, so return an empty array
- set_dofs(xin)
This base Optimizable object has no degrees of freedom, so do nothing.
- class simsopt.objectives.functions.Affine(nparams, nvals)
Bases:
simsopt._core.optimizable.Optimizable
This class represents a random affine (i.e. linear plus constant) transformation from R^n to R^m.
- J()
- __init__(nparams, nvals)
nparams = number of independent variables. nvals = number of dependent variables.
- dJ()
- get_dofs()
This base Optimizable object has no degrees of freedom, so return an empty array
- set_dofs(x)
This base Optimizable object has no degrees of freedom, so do nothing.
- class simsopt.objectives.functions.Beale
Bases:
simsopt._core.optimizable.Optimizable
This is a test function which does not supply derivatives. It is taken from https://en.wikipedia.org/wiki/Test_functions_for_optimization
- J()
- get_dofs()
This base Optimizable object has no degrees of freedom, so return an empty array
- set_dofs(x)
This base Optimizable object has no degrees of freedom, so do nothing.
- class simsopt.objectives.functions.Failer(nparams: int = 2, nvals: int = 3, fail_index: int = 2)
Bases:
simsopt._core.optimizable.Optimizable
This class is used for testing failures of the objective function. This function always returns a vector with entries all 1.0, except that ObjectiveFailure will be raised on a specified evaluation.
- Parameters
nparams – Number of input values.
nvals – Number of entries in the return vector.
fail_index – Which function evaluation to fail on.
- J()
- get_dofs()
This base Optimizable object has no degrees of freedom, so return an empty array
- set_dofs(x)
This base Optimizable object has no degrees of freedom, so do nothing.
- class simsopt.objectives.functions.Identity(x=0.0)
Bases:
simsopt._core.optimizable.Optimizable
This class represents a term in an objective function which is just the identity. It has one degree of freedom, and the output of the function is equal to this degree of freedom.
- J()
- dJ()
- property df
Same as the function dJ(), but a property instead of a function.
- property f
Same as the function J(), but a property instead of a function.
- get_dofs()
This base Optimizable object has no degrees of freedom, so return an empty array
- set_dofs(xin)
This base Optimizable object has no degrees of freedom, so do nothing.
- class simsopt.objectives.functions.Rosenbrock(b=100.0, x=0.0, y=0.0)
Bases:
simsopt._core.optimizable.Optimizable
This class defines a minimal object that can be optimized.
- dterm1()
Returns the gradient of term1
- property dterm1prop
Same as dterm1, but a property rather than a callable function.
- dterm2()
Returns the gradient of term2
- property dterm2prop
Same as dterm2, but a property rather than a callable function.
- dterms()
Returns the 2x2 Jacobian for term1 and term2.
- f()
Returns the total function, squaring and summing the two terms.
- get_dofs()
This base Optimizable object has no degrees of freedom, so return an empty array
- set_dofs(xin)
This base Optimizable object has no degrees of freedom, so do nothing.
- term1()
Returns the first of the two quantities that is squared and summed.
- property term1prop
Same as term1, but a property rather than a callable function.
- term2()
Returns the second of the two quantities that is squared and summed.
- property term2prop
Same as term2, but a property rather than a callable function.
- terms()
Returns term1 and term2 together as a 2-element numpy vector.
- class simsopt.objectives.functions.RosenbrockWithFailures(*args, fail_interval=8, **kwargs)
Bases:
simsopt.objectives.functions.Rosenbrock
This class is similar to the Rosenbrock class, except that it fails (raising ObjectiveFailure) at regular intervals. This is useful for testing that the simsopt infrastructure handles failures in the expected way.
- term1()
Returns the first of the two quantities that is squared and summed.
- class simsopt.objectives.functions.TestObject1(val)
Bases:
simsopt._core.optimizable.Optimizable
This is an optimizable object used for testing. It depends on two sub-objects, both of type Adder.
- J()
- dJ()
- property df
Same as dJ() but a property instead of a function.
- property f
Same as J() but a property instead of a function.
- get_dofs()
This base Optimizable object has no degrees of freedom, so return an empty array
- set_dofs(x)
This base Optimizable object has no degrees of freedom, so do nothing.
- class simsopt.objectives.functions.TestObject2(val1, val2)
Bases:
simsopt._core.optimizable.Optimizable
This is an optimizable object used for testing. It depends on two sub-objects, both of type Adder.
- J()
- dJ()
- property df
Same as dJ() but a property instead of a function.
- property f
Same as J() but a property instead of a function.
- get_dofs()
This base Optimizable object has no degrees of freedom, so return an empty array
- set_dofs(x)
This base Optimizable object has no degrees of freedom, so do nothing.
simsopt.objectives.graph_functions module
This module provides a few minimal optimizable objects, each representing a function. These functions are mostly used for testing.
- class simsopt.objectives.graph_functions.Adder(n=3, x0=None, dof_names=None)
Bases:
simsopt._core.graph_optimizable.Optimizable
Defines a minimal graph-based Optimizable object that can be optimized. It has n degrees of freedom.
The method sum returns the sum of these dofs. The call hook internally calls the sum method.
- Parameters
n – Number of degrees of freedom (DOFs)
x0 – Initial values of the DOFs. If not given, they default to zero
dof_names – Identifiers for the DOFs
- dJ()
- property df
Same as the function dJ(), but a property instead of a function.
- return_fn_map: Dict[str, Callable] = {'sum': <function Adder.sum>}
- sum()
Sums the DOFs
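The return_fn_map and call-hook pattern above can be mimicked in a few lines of plain Python. This is a hypothetical sketch of the dispatch mechanism only, not the simsopt Optimizable base class:

```python
# Minimal mimic of the graph-based Adder: n dofs, a sum() return function
# registered in return_fn_map, and a call hook that dispatches to it.

class MiniAdder:
    # Maps return-function names to callables, as in the documented class.
    return_fn_map = {'sum': lambda self: self.sum()}

    def __init__(self, n=3, x0=None):
        self.x = list(x0) if x0 is not None else [0.0] * n

    def sum(self):
        """Sums the DOFs."""
        return sum(self.x)

    def __call__(self):
        # The call hook internally invokes the registered return function.
        return self.return_fn_map['sum'](self)


a = MiniAdder(n=3, x0=[1.0, 2.0, 3.0])
print(a())  # 6.0
```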
- class simsopt.objectives.graph_functions.Affine(nparams, nvals)
Bases:
simsopt._core.graph_optimizable.Optimizable
Implements a random affine (i.e. linear plus constant) transformation from R^n to R^m. The n inputs to the transformation are initially set to zeroes.
- Parameters
nparams – number of independent variables.
nvals – number of dependent variables.
- dJ()
- f()
- return_fn_map: Dict[str, Callable] = {'f': <function Affine.f>}
- class simsopt.objectives.graph_functions.Beale
Bases:
simsopt._core.graph_optimizable.Optimizable
This is a test function which does not supply derivatives. It is taken from https://en.wikipedia.org/wiki/Test_functions_for_optimization
- J()
- class simsopt.objectives.graph_functions.Failer(nparams: int = 2, nvals: int = 3, fail_index: int = 2)
Bases:
simsopt._core.graph_optimizable.Optimizable
This class is used for testing failures of the objective function. This function always returns a vector with entries all 1.0, except that ObjectiveFailure will be raised on a specified evaluation.
- Parameters
nparams – Number of input values.
nvals – Number of entries in the return vector.
fail_index – Which function evaluation to fail on.
- J()
- get_dofs()
- set_dofs(x)
- class simsopt.objectives.graph_functions.Identity(x: numbers.Real = 0.0, dof_name: Optional[str] = None, dof_fixed: bool = False)
Bases:
simsopt._core.graph_optimizable.Optimizable
Represents a term in an objective function which is just the identity. It has one degree of freedom. Conforms to the experimental graph based Optimizable framework.
The output of the method f is equal to this degree of freedom. The call hook internally calls method f. It does not have any parent Optimizable nodes.
- Parameters
x – Value of the DOF
dof_name – Identifier for the DOF
dof_fixed – To specify if the dof is fixed
- dJ(x: Optional[Union[Sequence[numbers.Real], nptyping.types._ndarray.NDArray[None, nptyping.types._number.Float[float, numpy.floating]]]] = None)
- f()
Returns the value of the DOF
- return_fn_map: Dict[str, Callable] = {'f': <function Identity.f>}
- class simsopt.objectives.graph_functions.Rosenbrock(b=100.0, x=0.0, y=0.0)
Bases:
simsopt._core.graph_optimizable.Optimizable
Implements the Rosenbrock function using the graph based optimization framework. The Rosenbrock function is defined as
\[f(x,y) = (a-x)^2 + b(y-x^2)^2\]
The parameter a is fixed to 1, while b can be given as input.
- Parameters
b – The b parameter of Rosenbrock function
x – x coordinate
y – y coordinate
- property dterm1
Returns the gradient of term1
- property dterm2
Returns the gradient of term2
- dterms()
Returns the 2x2 Jacobian for term1 and term2.
- f(x=None)
Returns the total function, squaring and summing the two terms.
- return_fn_map: Dict[str, Callable] = {'f': <function Rosenbrock.f>}
- property term1
Returns the first of the two quantities that is squared and summed.
- property term2
Returns the second of the two quantities that is squared and summed.
- property terms
Returns term1 and term2 together as a 2-element numpy vector.
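The two-term decomposition above can be sketched directly from the formula. The split of term2 as sqrt(b)·(y - x²) is an assumption chosen so that squaring and summing the two terms recovers f(x, y) = (a - x)² + b(y - x²)²; the documented class may partition the terms differently.

```python
import math

# Sketch of the Rosenbrock two-term decomposition with a = 1:
# term1 = a - x, term2 = sqrt(b) * (y - x^2), so that
# f = term1^2 + term2^2 = (a - x)^2 + b * (y - x^2)^2.

def terms(x, y, b=100.0, a=1.0):
    """Return the two quantities that are squared and summed."""
    return (a - x, math.sqrt(b) * (y - x * x))

def f(x, y, b=100.0, a=1.0):
    """Total function: square and sum the two terms."""
    t1, t2 = terms(x, y, b, a)
    return t1 * t1 + t2 * t2


print(f(1.0, 1.0))  # 0.0 at the global minimum (x, y) = (1, 1)
print(f(0.0, 0.0))  # 1.0
```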
- class simsopt.objectives.graph_functions.TestObject1(val: numbers.Real, opts: Optional[Sequence[simsopt._core.graph_optimizable.Optimizable]] = None)
Bases:
simsopt._core.graph_optimizable.Optimizable
Implements a graph-based Optimizable with a single degree of freedom and parent optimizable nodes. Mainly used for testing.
The output method is named f. Call hook internally calls method f.
- Parameters
val – Degree of freedom
opts – Parent optimizable objects. If not given, two Adder objects are added as parents
- dJ()
- f()
Implements an objective function
- return_fn_map: Dict[str, Callable] = {'f': <function TestObject1.f>}
- class simsopt.objectives.graph_functions.TestObject2(val1, val2)
Bases:
simsopt._core.graph_optimizable.Optimizable
Implements a graph-based Optimizable with two degrees of freedom and two parent optimizable nodes. Mainly used for testing.
The output method is named f. Call hook internally calls method f.
- Parameters
val1 – First degree of freedom
val2 – Second degree of freedom
- dJ()
- f()
- return_fn_map: Dict[str, Callable] = {'f': <function TestObject2.f>}
simsopt.objectives.graph_least_squares module
Provides the LeastSquaresProblem class implemented using the new graph based optimization framework.
- class simsopt.objectives.graph_least_squares.LeastSquaresProblem(goals: Union[numbers.Real, Sequence[numbers.Real], nptyping.types._ndarray.NDArray[None, nptyping.types._number.Float[float, numpy.floating]]], weights: Union[numbers.Real, Sequence[numbers.Real], nptyping.types._ndarray.NDArray[None, nptyping.types._number.Float[float, numpy.floating]]], funcs_in: Optional[Sequence[Callable]] = None, depends_on: Optional[Union[simsopt._core.graph_optimizable.Optimizable, Sequence[simsopt._core.graph_optimizable.Optimizable]]] = None, opt_return_fns: Optional[Union[Sequence, Sequence[Sequence[str]]]] = None, fail: Union[None, float] = 1000000000000.0)
Bases:
simsopt._core.graph_optimizable.Optimizable
Represents a nonlinear-least-squares problem implemented using the new graph based optimization framework. A LeastSquaresProblem instance has 3 basic attributes: a set of functions (f_in), target values for each of the functions (goal), and weights. The residual (f_out) for each of the f_in is defined as:
\[f_{out} = weight \cdot (f_{in} - goal)^2\]
- Parameters
goals – Targets for the residuals in the optimization
weights – Weight associated with each of the residuals
funcs_in – Input functions (generally one of the output functions of the Optimizable instances)
depends_on – (Alternative initialization) Instead of specifying funcs_in, one could specify the Optimizable objects
opt_return_fns – (Alternative initialization) If using depends_on, specify the return functions associated with each Optimizable object
- classmethod from_sigma(goals: Union[numbers.Real, Sequence[numbers.Real], nptyping.types._ndarray.NDArray[None, nptyping.types._number.Float[float, numpy.floating]]], sigma: Union[numbers.Real, Sequence[numbers.Real], nptyping.types._ndarray.NDArray[None, nptyping.types._number.Float[float, numpy.floating]]], funcs_in: Optional[Sequence[Callable]] = None, depends_on: Optional[Union[simsopt._core.graph_optimizable.Optimizable, Sequence[simsopt._core.graph_optimizable.Optimizable]]] = None, opt_return_fns: Optional[Union[Sequence, Sequence[Sequence[str]]]] = None, fail: Union[None, float] = 1000000000000.0) simsopt.objectives.graph_least_squares.LeastSquaresProblem
Define the LeastSquaresProblem with
\[\begin{split}\sigma = 1/\sqrt{weight}, \text{ so } \\ f_{out} = \left(\frac{f_{in} - goal}{\sigma}\right)^2.\end{split}\]
- Parameters
goals – Targets for the residuals in the optimization
sigma – Inverse of the square root of the weight associated with each of the residuals
funcs_in – Input functions (generally one of the output functions of the Optimizable instances)
depends_on – (Alternative initialization) Instead of specifying funcs_in, one could specify the Optimizable objects
opt_return_fns – (Alternative initialization) If using depends_on, specify the return functions associated with each Optimizable object
- classmethod from_tuples(tuples: Sequence[Tuple[Callable, numbers.Real, numbers.Real]], fail: Union[None, float] = 1000000000000.0) simsopt.objectives.graph_least_squares.LeastSquaresProblem
Initializes a graph-based LeastSquaresProblem from a sequence of tuples containing f_in, goal, and weight.
- Parameters
tuples – A sequence of tuples containing (f_in, goal, weight) in each tuple (the specified order matters).
- objective(x=None, *args, **kwargs)
Return the least squares sum
- Parameters
x – Degrees of freedom or state
args – Any additional arguments
kwargs – Keyword arguments
- residuals(x=None, *args, **kwargs)
Return the weighted residuals
- Parameters
x – Degrees of freedom or state
args – Any additional arguments
kwargs – Keyword arguments
- return_fn_map: Dict[str, Callable] = {'objective': <function LeastSquaresProblem.objective>, 'residuals': <function LeastSquaresProblem.residuals>}
- unweighted_residuals(x=None, *args, **kwargs)
Return the unweighted residuals (f_in - goal)
- Parameters
x – Degrees of freedom or state
args – Any additional arguments
kwargs – Keyword arguments
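The bookkeeping behind objective, residuals, and unweighted_residuals can be sketched for a list of (f_in, goal, weight) tuples as accepted by from_tuples. The helper names are hypothetical, and the weighted-residual convention sqrt(weight)·(f_in - goal) is an assumption chosen so that the sum of squared residuals reproduces the documented objective Σ weight·(f_in - goal)²:

```python
# Sketch of the least-squares bookkeeping for (f_in, goal, weight) tuples.
# Only the formulas follow the documentation; the helpers are illustrative.

def unweighted_residuals(tuples):
    """f_in - goal for each term."""
    return [f() - goal for f, goal, _ in tuples]

def residuals(tuples):
    """Weighted residuals r_i = sqrt(weight_i) * (f_i - goal_i), so that
    sum(r_i**2) equals the objective sum_i weight_i * (f_i - goal_i)**2."""
    return [(w ** 0.5) * (f() - goal) for f, goal, w in tuples]

def objective(tuples):
    """Least-squares sum: sum_i weight_i * (f_i - goal_i)**2."""
    return sum(w * (f() - goal) ** 2 for f, goal, w in tuples)


tuples = [(lambda: 3.0, 1.0, 2.0),   # weight 2, mismatch 2 -> 2 * 4 = 8
          (lambda: 0.5, 0.5, 4.0)]   # mismatch 0           -> 0
print(objective(tuples))             # 8.0
print(unweighted_residuals(tuples))  # [2.0, 0.0]
```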
simsopt.objectives.least_squares module
This module provides the LeastSquaresProblem class, as well as the associated class LeastSquaresTerm.
- class simsopt.objectives.least_squares.LeastSquaresTerm(f_in, goal, weight)
Bases:
object
This class represents one term in a nonlinear-least-squares problem. A LeastSquaresTerm instance has 3 basic attributes: a function (called f_in), a goal value (called goal), and a weight (called weight). The overall value of the term is:
f_out = weight * (f_in - goal) ** 2.
- f_out()
Return the overall value of this least-squares term.
- classmethod from_sigma(f_in, goal, sigma)
Define the LeastSquaresTerm with sigma = 1 / sqrt(weight), so
f_out = ((f_in - goal) / sigma) ** 2.
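The sigma/weight relation above is easy to verify numerically: with sigma = 1/sqrt(weight), the two formulas for f_out agree. A minimal sketch (the function names are illustrative, not the simsopt API):

```python
# Check the documented equivalence: with sigma = 1/sqrt(weight),
# weight * (f_in - goal)**2 == ((f_in - goal) / sigma)**2.

def f_out_from_weight(f_in, goal, weight):
    return weight * (f_in - goal) ** 2

def f_out_from_sigma(f_in, goal, sigma):
    return ((f_in - goal) / sigma) ** 2


weight = 4.0
sigma = 1.0 / weight ** 0.5               # sigma = 0.5
print(f_out_from_weight(3.0, 1.0, weight))  # 4 * 2**2 = 16.0
print(f_out_from_sigma(3.0, 1.0, sigma))    # (2 / 0.5)**2 = 16.0
```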
Module contents
- class simsopt.objectives.LeastSquaresProblem(goals: Union[numbers.Real, Sequence[numbers.Real], nptyping.types._ndarray.NDArray[None, nptyping.types._number.Float[float, numpy.floating]]], weights: Union[numbers.Real, Sequence[numbers.Real], nptyping.types._ndarray.NDArray[None, nptyping.types._number.Float[float, numpy.floating]]], funcs_in: Optional[Sequence[Callable]] = None, depends_on: Optional[Union[simsopt._core.graph_optimizable.Optimizable, Sequence[simsopt._core.graph_optimizable.Optimizable]]] = None, opt_return_fns: Optional[Union[Sequence, Sequence[Sequence[str]]]] = None, fail: Union[None, float] = 1000000000000.0)
Bases:
simsopt._core.graph_optimizable.Optimizable
Represents a nonlinear-least-squares problem implemented using the new graph based optimization framework. A LeastSquaresProblem instance has 3 basic attributes: a set of functions (f_in), target values for each of the functions (goal), and weights. The residual (f_out) for each of the f_in is defined as:
\[f_{out} = weight \cdot (f_{in} - goal)^2\]
- Parameters
goals – Targets for the residuals in the optimization
weights – Weight associated with each of the residuals
funcs_in – Input functions (generally one of the output functions of the Optimizable instances)
depends_on – (Alternative initialization) Instead of specifying funcs_in, one could specify the Optimizable objects
opt_return_fns – (Alternative initialization) If using depends_on, specify the return functions associated with each Optimizable object
- classmethod from_sigma(goals: Union[numbers.Real, Sequence[numbers.Real], nptyping.types._ndarray.NDArray[None, nptyping.types._number.Float[float, numpy.floating]]], sigma: Union[numbers.Real, Sequence[numbers.Real], nptyping.types._ndarray.NDArray[None, nptyping.types._number.Float[float, numpy.floating]]], funcs_in: Optional[Sequence[Callable]] = None, depends_on: Optional[Union[simsopt._core.graph_optimizable.Optimizable, Sequence[simsopt._core.graph_optimizable.Optimizable]]] = None, opt_return_fns: Optional[Union[Sequence, Sequence[Sequence[str]]]] = None, fail: Union[None, float] = 1000000000000.0) simsopt.objectives.graph_least_squares.LeastSquaresProblem
Define the LeastSquaresProblem with
\[\begin{split}\sigma = 1/\sqrt{weight}, \text{ so } \\ f_{out} = \left(\frac{f_{in} - goal}{\sigma}\right)^2.\end{split}\]
- Parameters
goals – Targets for the residuals in the optimization
sigma – Inverse of the square root of the weight associated with each of the residuals
funcs_in – Input functions (generally one of the output functions of the Optimizable instances)
depends_on – (Alternative initialization) Instead of specifying funcs_in, one could specify the Optimizable objects
opt_return_fns – (Alternative initialization) If using depends_on, specify the return functions associated with each Optimizable object
- classmethod from_tuples(tuples: Sequence[Tuple[Callable, numbers.Real, numbers.Real]], fail: Union[None, float] = 1000000000000.0) simsopt.objectives.graph_least_squares.LeastSquaresProblem
Initializes a graph-based LeastSquaresProblem from a sequence of tuples containing f_in, goal, and weight.
- Parameters
tuples – A sequence of tuples containing (f_in, goal, weight) in each tuple (the specified order matters).
- objective(x=None, *args, **kwargs)
Return the least squares sum
- Parameters
x – Degrees of freedom or state
args – Any additional arguments
kwargs – Keyword arguments
- residuals(x=None, *args, **kwargs)
Return the weighted residuals
- Parameters
x – Degrees of freedom or state
args – Any additional arguments
kwargs – Keyword arguments
- return_fn_map: Dict[str, Callable] = {'objective': <function LeastSquaresProblem.objective>, 'residuals': <function LeastSquaresProblem.residuals>}
- unweighted_residuals(x=None, *args, **kwargs)
Return the unweighted residuals (f_in - goal)
- Parameters
x – Degrees of freedom or state
args – Any additional arguments
kwargs – Keyword arguments