Optimization¶
Nonlinear Programming Formulation¶
Three methods in the EpsilonConstraintOptimizer class, opt_slsqp, opt_shgo, and opt_diffev, are wrappers for optimization algorithm calls. (The tyche.EpsilonConstraints section provides full parameter and return value information for these methods.) The optimization methods define the optimization problem according to each algorithm’s requirements, call the algorithm, and return either optimized results in a standard format for postprocessing or an error message if the optimization did not complete successfully. The SLSQP algorithm, which is not a global optimizer, is provided to assess problem feasibility and to provide reasonable upper and lower bounds on the metrics being optimized. Because the technology models within an R&D decision context may be arbitrarily complex, two global optimization algorithms were also implemented. The global algorithms were chosen according to the following criteria:
- Ability to perform constrained optimization with inequality constraints, for instance on metric values or investment amounts.
- Ability to optimize without specified Jacobian or Hessian functions (derivative-free optimization).
- Ability to specify bounds on individual decision variables, which allows constraints on single research areas.
- Ability to work on a variety of potentially non-convex and otherwise complex problems.
Algorithm Testing¶
As a point of comparison between algorithms, the Simple Residential Photovoltaics decision context was optimized for minimum levelized cost of electricity (LCOE) subject to an investment constraint and a metric constraint on greenhouse gas emissions (GHG). The solutions are given in the table below. The solve times listed are in addition to the time required to set up the problem and solve for the optimum metric values; this setup procedure currently uses the SLSQP algorithm by default and takes between 10 and 15 seconds.
| Algorithm | Objective Function Value | GHG Constraint Value | Solve Time (s) |
|---|---|---|---|
| Differential evolution | 0.037567 | 41.699885 | 145 |
| Differential evolution | 0.037547 | 41.632867 | 589 |
| SLSQP | 0.037712 | 41.969348 | ~2 |
| SHGO | None found | None found | None found |
Additional details for each solution are given below under the section for the corresponding algorithm.
Sequential Least Squares Programming (SLSQP)¶
The Sequential Least Squares Programming algorithm uses a gradient search method to locate a possibly local optimum. [6] A complete list of parameters and options for the fmin_slsqp algorithm is available in the scipy.optimize.fmin_slsqp documentation.

Constraints for fmin_slsqp are defined as a single function that takes a vector of decision variable values as input and returns an array containing the values of all constraints of a given type simultaneously. Both equality and inequality constraints can be defined, although they must be written as separate functions and are passed to the fmin_slsqp algorithm through separate arguments (f_eqcons and f_ieqcons, respectively).
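For illustration, a minimal sketch of this constraint interface is shown below. The objective function, bounds, and constraint values are placeholders invented for this example; they are not the Tyche model functions.

```python
import numpy as np
from scipy.optimize import fmin_slsqp

# Placeholder problem (not a Tyche model): three R&D investment amounts in $M,
# with a made-up smooth objective to minimize.
def objective(x):
    return 10.0 / (1.0 + x[0]) + 5.0 / (1.0 + x[1]) + 2.0 / (1.0 + x[2])

# A single function returns ALL inequality constraint values at once; each entry
# must be >= 0 at a feasible point.
def inequality_constraints(x):
    return np.array([
        3.0 - x.sum(),      # total budget: investments sum to at most $3M (assumed)
        x[2] - 0.1,         # require at least $0.1M in the third category (assumed)
    ])

x0 = np.array([0.5, 0.5, 0.5])                 # initial guess ($M)
x_opt = fmin_slsqp(
    objective,
    x0,
    f_ieqcons=inequality_constraints,          # vector-valued inequality constraints
    bounds=[(0.0, 3.0)] * 3,                   # per-variable bounds on each category
    iprint=0,
)
print(x_opt)
```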
SLSQP Solution to Simple Residential Photovoltaics¶
Solve time: 1.5 s
| Decision Variable | Optimized Value |
|---|---|
| BoS R&D | 1.25E-04 |
| Inverter R&D | 3.64E-08 |
| Module R&D | 3.00E+06 |

| System Metric | Optimized Value |
|---|---|
| GHG | 41.97 |
| LCOE | 0.038 |
| Labor | 0.032 |
Differential Evolution¶
Differential evolution is a type of evolutionary algorithm that iteratively improves on an initial population, or set of potential solutions. [5] Differential evolution is well-suited to searching large solution spaces with multiple local minima, but does not guarantee convergence to the global minimum. Moreover, users may need to adjust the default solver parameters and options in order to obtain a solution and reduce solve time. A complete list of parameters and options for the differential_evolution algorithm is available in the scipy.optimize.differential_evolution documentation.
Constraints for differential_evolution are defined by passing the same multi-valued constraint function used in opt_slsqp to the scipy.optimize.NonlinearConstraint class.
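A minimal sketch of this pattern is shown below, reusing the placeholder objective and constraint function from the SLSQP sketch above (again, not the actual Tyche functions). The seed argument fixes the stochastic initial population discussed in the next subsection.

```python
import numpy as np
from scipy.optimize import differential_evolution, NonlinearConstraint

# Same placeholder objective as the SLSQP sketch above.
def objective(x):
    return 10.0 / (1.0 + x[0]) + 5.0 / (1.0 + x[1]) + 2.0 / (1.0 + x[2])

# Same multi-valued constraint function; every component must lie in [0, inf)
# at a feasible point.
def constraint_values(x):
    return np.array([3.0 - np.sum(x), x[2] - 0.1])

nlc = NonlinearConstraint(constraint_values, 0.0, np.inf)

result = differential_evolution(
    objective,
    bounds=[(0.0, 3.0)] * 3,     # per-variable bounds on each investment category
    constraints=(nlc,),
    seed=2,                      # fixes the random initial population
    tol=1e-6,
)
print(result.x, result.fun)
```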
Differential Evolution Solutions to Simple Residential Photovoltaics¶
Differential evolution stochastically populates the initial set of potential solutions, and so the optimal solution and solve time may vary with the random seed used.
Solution 1¶
Starting with a random seed of 2, the solution time was 145 seconds.
| Decision Variable | Optimized Value |
|---|---|
| BoS R&D | 9.62E+02 |
| Inverter R&D | 5.33E+02 |
| Module R&D | 2.99E+06 |

| System Metric | Optimized Value |
|---|---|
| GHG | 41.70 |
| LCOE | 0.038 |
| Labor | -0.456 |
Solution 2¶
Starting with a random seed of 1, the solution time was 589 seconds.
| Decision Variable | Optimized Value |
|---|---|
| BoS R&D | 4.70E+03 |
| Inverter R&D | 3.71E+02 |
| Module R&D | 2.99E+06 |

| System Metric | Optimized Value |
|---|---|
| GHG | 41.63 |
| LCOE | 0.037 |
| Labor | -2.29 |
Simplicial Homology Global Optimization¶
The Simplicial Homology Global Optimization (SHGO) algorithm applies simplicial homology to general non-linear, low-dimensional optimization problems. [4] SHGO provides fast solutions using default parameters and options, but the optimum found may not be as precise as that found by the differential evolution algorithm. Constraints for shgo must be provided as a dictionary or sequence of dictionaries with the following format:
constraints = [ {'type': 'ineq', 'fun': g1},
                {'type': 'ineq', 'fun': g2},
                ...
                {'type': 'eq', 'fun': h1},
                {'type': 'eq', 'fun': h2},
                ... ]
Each of the constraint functions g1, g2, h1, h2, and so on takes decision variable values as input and returns the value of a single constraint. Inequality constraints (g1 and g2 above) are formulated as \(g(x) \geq 0\) and equality constraints (h1 and h2 above) are formulated as \(h(x) = 0\). Each constraint in the optimization problem is defined as a separate function, with a separate dictionary giving the constraint type. With shgo it is not possible to use one function that returns a vector of constraint values.
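A minimal sketch of this dictionary-based interface is shown below, continuing the same placeholder problem used in the SLSQP and differential evolution sketches (these are not the Tyche constraint functions).

```python
import numpy as np
from scipy.optimize import shgo

# Same placeholder objective as the earlier sketches.
def objective(x):
    return 10.0 / (1.0 + x[0]) + 5.0 / (1.0 + x[1]) + 2.0 / (1.0 + x[2])

# Each constraint is a separate scalar-valued function, formulated as g(x) >= 0.
def budget_constraint(x):
    return 3.0 - np.sum(x)       # total investment of at most $3M (assumed)

def min_third_category(x):
    return x[2] - 0.1            # at least $0.1M in the third category (assumed)

constraints = [
    {'type': 'ineq', 'fun': budget_constraint},
    {'type': 'ineq', 'fun': min_third_category},
]

result = shgo(
    objective,
    bounds=[(0.0, 3.0)] * 3,
    constraints=constraints,
)
print(result.x, result.fun)
```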
Piecewise Linear (MILP) Formulation¶
Notation¶
| Index | Description |
|---|---|
| \(I\) | Number of elicited data points (investment levels and metrics) |
| \(J\) | Number of investment categories |
| \(K\) | Number of metrics |
| Data | Notation | Information |
|---|---|---|
| Investment amounts | \(c_{ij}, i \in \{1, ..., I\}\) | \(c_i\) is a point in \(J\)-dimensional space |
| Metric value | \(q_{ik}, i \in \{1, ..., I \}, k \in \{1, ..., K \}\) | One metric will form the objective function, leaving up to \(K-1\) metrics for constraints |
| Variable | Notation | Information |
|---|---|---|
| Binary variables | \(y_{ii'}, i, i' \in \{1, ..., I\}, i' > i\) | Number of linear intervals between elicited data points |
| Combination variables | \(\lambda_{i}, i \in \{1, ..., I\}\) | Used to construct linear combinations of elicited data points. \(\lambda_{i} \geq 0 \; \forall i\) |
Each metric and investment amount can be written as a linear combination of the elicited data points using the newly introduced variables \(\lambda_{i}\) and \(y_{ii'}\). Additional constraints on \(y_{ii'}\) and \(\lambda_{i}\) enforce the piecewise linearity by ensuring that the corner points used to calculate \(q_k\) correspond to the interval containing the chosen investment amount. There will be a total of \(\binom{I}{2} = \frac{I(I-1)}{2}\) binary \(y\) variables.
One-Investment-Category, One-Metric Example¶
Suppose we have an elicited data set for one metric (\(K = 1\)) and one investment category (\(J = 1\)) with three possible investment levels (\(I = 3\)). We can write the total investment amount as a linear combination of the three investment levels \(c_{i1}, i \in \{1, 2, 3\}\), using the \(\lambda\) variables:
\(\lambda_{1}c_{11} + \lambda_{2}c_{21} + \lambda_{3}c_{31} = \sum_{i} \lambda_{i}c_{i1}\)
We can likewise write the metric as a linear combination of the metric values \(q_{i1}\) and the \(\lambda\) variables:
\(\lambda_{1}q_{11} + \lambda_{2}q_{21} + \lambda_{3}q_{31} = \sum_{i} \lambda_{i}q_{i1}\)
We have the additional constraint on the \(\lambda\) variables that
\(\sum_{i} \lambda_{i} = 1\)
These equations, combined with the integer variables \(y_{ii'} = \{ y_{12}, y_{13}, y_{23} \}\), can be used to construct a mixed-integer linear optimization problem.
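As a concrete illustration with made-up numbers: suppose the three elicited investment levels are \(c_{11} = 0\), \(c_{21} = 1\), and \(c_{31} = 2\) (in millions of dollars). If the solver selects \(y_{23} = 1\) with \(\lambda_1 = 0\), \(\lambda_2 = 0.4\), and \(\lambda_3 = 0.6\), then the total investment is \(0.4 \cdot 1 + 0.6 \cdot 2 = 1.6\) million dollars and the metric takes the interpolated value \(0.4\,q_{21} + 0.6\,q_{31}\), a point on the line segment between the second and third elicited points.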
The MILP that uses this formulation to minimize a technology metric subject to an investment budget \(B\) is as follows:
\(\min_{y, \lambda} \lambda_{1}q_{11} + \lambda_{2}q_{21} + \lambda_{3}q_{31}\)
subject to
\(\lambda_{1}c_{11} + \lambda_{2}c_{21} + \lambda_{3}c_{31} \leq B\) , (1) Total budget constraint
\(\lambda_1 + \lambda_2 + \lambda_3 = 1\) , (2)
\(y_{12} + y_{23} + y_{13} = 1\) , (3)
\(y_{12} \leq \lambda_1 + \lambda_2\) , (4)
\(y_{23} \leq \lambda_2 + \lambda_3\) , (5)
\(y_{13} \leq \lambda_1 + \lambda_3\) , (6)
\(0 \leq \lambda_1, \lambda_2, \lambda_3 \leq 1\) , (7)
\(y_{12}, y_{23}, y_{13} \in \{ 0, 1 \}\) , (8)
(We’ve effectively removed the investments and the metrics as variables, replacing them with the elicited data points and the new \(\lambda\) and \(y\) variables.)
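The three-point example above can be written out directly with a generic MILP solver. The sketch below uses scipy.optimize.milp (available in SciPy 1.9 and later, so newer than the v1.5.4 release cited in the references) with made-up investment levels, metric values, and budget; it illustrates the formulation and is not part of the Tyche code.

```python
import numpy as np
from scipy.optimize import milp, LinearConstraint, Bounds

# Made-up elicited data for one investment category (J = 1) and one metric (K = 1).
c = np.array([0.0, 1.5, 3.0])        # investment levels c_i1, $M (assumed)
q = np.array([0.050, 0.042, 0.038])  # metric values q_i1 (assumed)
B = 2.0                              # total budget, $M (assumed)

# Variable order: [lambda_1, lambda_2, lambda_3, y_12, y_13, y_23]
objective = np.concatenate([q, np.zeros(3)])

A = np.array([
    [c[0], c[1], c[2], 0, 0, 0],   # (1) sum_i lambda_i c_i1 <= B
    [1.0,  1.0,  1.0,  0, 0, 0],   # (2) lambda_1 + lambda_2 + lambda_3 = 1
    [0,    0,    0,    1, 1, 1],   # (3) y_12 + y_13 + y_23 = 1
    [-1,  -1,    0,    1, 0, 0],   # (4) y_12 <= lambda_1 + lambda_2
    [-1,   0,   -1,    0, 1, 0],   # (6) y_13 <= lambda_1 + lambda_3
    [0,   -1,   -1,    0, 0, 1],   # (5) y_23 <= lambda_2 + lambda_3
])
lower = np.array([-np.inf, 1.0, 1.0, -np.inf, -np.inf, -np.inf])
upper = np.array([B,       1.0, 1.0, 0.0,     0.0,     0.0])

result = milp(
    c=objective,
    constraints=LinearConstraint(A, lower, upper),
    bounds=Bounds(lb=np.zeros(6), ub=np.ones(6)),   # (7) 0 <= lambda <= 1, y in [0, 1]
    integrality=np.array([0, 0, 0, 1, 1, 1]),       # (8) the y variables are binary
)
print(result.x[:3], result.x[3:], result.fun)
```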
Extension to N x N Problem¶
Note: \(k'\) indicates the metric being constrained, \(k^*\) indicates the metric being optimized, and \(J'\) indicates the set of investment categories that have a budget limit (there may be more than one budget-constrained category in a problem).
No metric constraint or investment category-specific budget constraint
\(\min_{y, \lambda} \sum_i \lambda_{i}q_{ik^*}\)
subject to
\(\sum_i \sum_j \lambda_{i}c_{ij} \leq B\) , (1) Total budget constraint
\(\sum_i \lambda_i = 1\) , (2)
\(\sum_{i,i'} y_{ii'} = 1\) , (3)
\(y_{ii'} \leq \lambda_i + \lambda_{i'} \forall i, i'\) , (4)
\(0 \leq \lambda_i \leq 1 \forall i\) , (5)
\(y_{ii'} \in \{ 0, 1 \} \forall i, i'\) , (6)
With investment category-specific budget constraint
\(\min_{y, \lambda} \sum_i \lambda_{i}q_{ik^*}\)
subject to
\(\sum_i \sum_j \lambda_{i}c_{ij} \leq B\) , (1) Total budget constraint
\(\sum_i \lambda_{i}c_{ij'} \leq B_{j'} \forall j' \in J'\), (2) Investment category budget constraint(s)
\(\sum_i \lambda_i = 1\) , (3)
\(\sum_{i,i'} y_{ii'} = 1\) , (4)
\(y_{ii'} \leq \lambda_i + \lambda_{i'} \forall i, i'\) , (5)
\(0 \leq \lambda_i \leq 1 \forall i\) , (6)
\(y_{ii'} \in \{ 0, 1 \} \forall i, i'\) , (7)
With metric constraint and investment category-specific budget constraint
\(\min_{y, \lambda} \sum_i \lambda_{i}q_{ik^*}\)
subject to
\(\sum_i \sum_j \lambda_{i}c_{ij} \leq B\), (1) Total budget constraint
\(\sum_i \lambda_{i}c_{ij'} \leq B_{j'} \forall j' \in J'\) , (2) Investment category budget constraint(s)
\(\sum_i \lambda_{i}q_{ik'} \leq M_{k'}\) , (3) Metric constraint
\(\sum_i \lambda_i = 1\) , (4)
\(\sum_{i,i'} y_{ii'} = 1\) , (5)
\(y_{ii'} \leq \lambda_i + \lambda_{i'} \forall i, i'\) , (6)
\(0 \leq \lambda_i \leq 1 \forall i\) , (7)
\(y_{ii'} \in \{ 0, 1 \} \forall i, i'\) , (8)
Problem Size
In general, \(I\) is the number of rows in the dataset of elicited data. In the case that all investment categories have elicited data at the same number of levels (not necessarily the same levels themselves), \(I\) can also be calculated as \(l^J\) where \(l\) is the number of investment levels.
The problem will involve \(\frac{I(I-1)}{2}\) binary variables and \(I\) continuous (\(\lambda\)) variables.
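For example, if each of \(J = 3\) investment categories is elicited at \(l = 3\) levels (an illustrative case), then \(I = 3^3 = 27\) and the MILP contains \(\frac{27 \cdot 26}{2} = 351\) binary variables and 27 continuous \(\lambda\) variables.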
References¶
1. scipy.optimize.shgo. SciPy v1.5.4 Reference Guide: Optimization and root finding (scipy.optimize). URL: https://docs.scipy.org/doc/scipy/reference/generated/scipy.optimize.shgo.html#rb2e152d227b3-1 Last accessed 12/28/2020.
2. scipy.optimize.differential_evolution. SciPy v1.5.4 Reference Guide: Optimization and root finding (scipy.optimize). URL: https://docs.scipy.org/doc/scipy/reference/generated/scipy.optimize.differential_evolution.html Last accessed 12/28/2020.
3. scipy.optimize.fmin_slsqp. SciPy v1.5.4 Reference Guide: Optimization and root finding (scipy.optimize). URL: https://docs.scipy.org/doc/scipy/reference/generated/scipy.optimize.fmin_slsqp.html Last accessed 12/28/2020.
4. Endres, S.C., Sandrock, C., and Focke, W.W. (2018) "A simplicial homology algorithm for Lipschitz optimisation", Journal of Global Optimization (72): 181-217. URL: https://link.springer.com/article/10.1007/s10898-018-0645-y
5. Storn, R. and Price, K. (1997) "Differential Evolution - a Simple and Efficient Heuristic for Global Optimization over Continuous Spaces", Journal of Global Optimization (11): 341-359. URL: https://link.springer.com/article/10.1023/A:1008202821328
6. Kraft, D. (1988) A software package for sequential quadratic programming. Tech. Rep. DFVLR-FB 88-28, DLR German Aerospace Center, Institute for Flight Mechanics, Koln, Germany.
7. scipy.optimize.NonlinearConstraint. SciPy v1.5.4 Reference Guide: Optimization and root finding (scipy.optimize). URL: https://docs.scipy.org/doc/scipy/reference/generated/scipy.optimize.NonlinearConstraint.html Last accessed 12/29/2020.