biogeme.optimization

Examples of use of several functions.

This page is designed for programmers who need examples of how to use the functions of this module. The examples are designed to illustrate the syntax; they do not correspond to any meaningful model.

author:

Michel Bierlaire

date:

Thu Nov 23 22:25:13 2023

import pandas as pd
from biogeme.version import get_text
import biogeme.biogeme as bio
import biogeme.database as db
from biogeme import models
from biogeme.expressions import Beta, Variable
import biogeme.biogeme_logging as blog

Version of Biogeme.

print(get_text())
biogeme 3.2.14 [2024-08-05]
Home page: http://biogeme.epfl.ch
Submit questions to https://groups.google.com/d/forum/biogeme
Michel Bierlaire, Transport and Mobility Laboratory, Ecole Polytechnique Fédérale de Lausanne (EPFL)
logger = blog.get_screen_logger(blog.INFO)

Data.

df = pd.DataFrame(
    {
        'Person': [1, 1, 1, 2, 2],
        'Exclude': [0, 0, 1, 0, 1],
        'Variable1': [1, 2, 3, 4, 5],
        'Variable2': [10, 20, 30, 40, 50],
        'Choice': [1, 2, 3, 1, 2],
        'Av1': [0, 1, 1, 1, 1],
        'Av2': [1, 1, 1, 1, 1],
        'Av3': [0, 1, 1, 1, 1],
    }
)
df
Person Exclude Variable1 Variable2 Choice Av1 Av2 Av3
0 1 0 1 10 1 0 1 0
1 1 0 2 20 2 1 1 1
2 1 1 3 30 3 1 1 1
3 2 0 4 40 1 1 1 1
4 2 1 5 50 2 1 1 1


my_data = db.Database('test', df)

Variables.

Choice = Variable('Choice')
Variable1 = Variable('Variable1')
Variable2 = Variable('Variable2')
beta1 = Beta('beta1', 0, None, None, 0)
beta2 = Beta('beta2', 0, None, None, 0)
V1 = beta1 * Variable1
V2 = beta2 * Variable2
V3 = 0
V = {1: V1, 2: V2, 3: V3}
likelihood = models.loglogit(V, av=None, i=Choice)
my_biogeme = bio.BIOGEME(my_data, likelihood)
my_biogeme.modelName = 'simpleExample'
my_biogeme.save_iterations = False
my_biogeme.generate_html = False
my_biogeme.generate_pickle = False
print(my_biogeme)
Biogeme parameters read from biogeme.toml.
simpleExample: database [test]{'log_like': _bioLogLogitFullChoiceSet[choice=Choice]U=(1:(Beta('beta1', 0, None, None, 0) * Variable1), 2:(Beta('beta2', 0, None, None, 0) * Variable2), 3:`0.0`)av=(1:`1.0`, 2:`1.0`, 3:`1.0`)}
f, g, h, gdiff, hdiff = my_biogeme.check_derivatives(
    beta=my_biogeme.beta_values_dict_to_list(), verbose=True
)
x               Gradient        FinDiff         Difference
beta1           +2.220446E-16   -6.039613E-07   +6.039613E-07
beta2           +2.000000E+01   +1.999994E+01   +6.110388E-05
Row             Col             Hessian FinDiff         Difference
beta1           beta1           -1.222222E+01   -1.222222E+01   +8.314151E-07
beta1           beta2           +6.111111E+01   +6.111115E+01   -4.166707E-05
beta2           beta1           +6.111111E+01   +6.111112E+01   -4.274759E-06
beta2           beta2           -1.222222E+03   -1.222223E+03   +8.332704E-04
pd.DataFrame(gdiff)
0
0 6.039613e-07
1 6.110388e-05


pd.DataFrame(hdiff)
0 1
0 8.314151e-07 -0.000042
1 -4.274759e-06 0.000833
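
The finite-difference comparison performed by check_derivatives can be sketched generically as follows. This is a minimal illustration with NumPy, not Biogeme's internal implementation; the function check_gradient and the quadratic test function are made up for this example:

```python
import numpy as np

def check_gradient(fct, grad, x, epsilon=1e-6):
    """Compare an analytical gradient with a forward finite-difference
    approximation, in the spirit of the report printed above."""
    g_analytical = np.asarray(grad(x), dtype=float)
    g_findiff = np.empty_like(g_analytical)
    f0 = fct(x)
    for i in range(len(x)):
        step = np.zeros_like(g_analytical)
        step[i] = epsilon
        # Forward difference: (f(x + eps e_i) - f(x)) / eps
        g_findiff[i] = (fct(x + step) - f0) / epsilon
    return g_analytical, g_findiff, g_analytical - g_findiff

# Hypothetical test function f(x) = x0**2 + 3*x0*x1, with known gradient.
f = lambda x: x[0] ** 2 + 3 * x[0] * x[1]
g = lambda x: np.array([2 * x[0] + 3 * x[1], 3 * x[0]])
analytical, findiff, diff = check_gradient(f, g, np.array([1.0, 2.0]))
```

As in the output above, a small difference between the analytical and finite-difference columns indicates that the derivatives are coded consistently.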


scipy: the optimization algorithm from the SciPy library.

my_biogeme.algorithm_name = 'scipy'
results = my_biogeme.estimate()
results.get_estimated_parameters()
As the model is not too complex, we activate the calculation of second derivatives. If you want to change it, change the name of the algorithm in the TOML file from "automatic" to "simple_bounds"
Optimization algorithm: hybrid Newton/BFGS with simple bounds [simple_bounds]
** Optimization: Newton with trust region for simple bounds
Iter.           beta1           beta2     Function    Relgrad   Radius      Rho
    0            0.11           0.022          5.3       0.13       10        1   ++
    1            0.14           0.023          5.3      0.015    1e+02        1   ++
    2            0.14           0.023          5.3    3.3e-07    1e+02        1   ++
Value Rob. Std err Rob. t-test Rob. p-value
beta1 0.144545 0.366198 0.394720 0.693050
beta2 0.023502 0.034280 0.685573 0.492982


for k, v in results.data.optimizationMessages.items():
    print(f'{k}:\t{v}')
Relative gradient:      3.2893676183728316e-07
Cause of termination:   Relative gradient = 3.3e-07 <= 0.00012
Number of function evaluations: 4
Number of gradient evaluations: 4
Number of hessian evaluations:  3
Algorithm:      Newton with trust region for simple bound constraints
Number of iterations:   3
Proportion of Hessian calculation:      3/3 = 100.0%
Optimization time:      0:00:00.003657
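
The "relative gradient" reported in the termination messages is a scale-invariant measure of stationarity: iterations stop once it falls below the tolerance. A minimal sketch of one common definition (the exact formula Biogeme uses internally may differ):

```python
import numpy as np

def relative_gradient(f_value, gradient, x):
    """One common scale-invariant stationarity measure: the largest
    gradient component, scaled by the parameter magnitudes and the
    function value (an illustrative definition, not Biogeme's code)."""
    gradient = np.asarray(gradient, dtype=float)
    x = np.asarray(x, dtype=float)
    scale = np.maximum(np.abs(x), 1.0)
    return np.max(np.abs(gradient) * scale / max(abs(f_value), 1.0))

# At an exact stationary point, the gradient is zero and so is the measure.
value_at_optimum = relative_gradient(5.3, [0.0, 0.0], [0.14, 0.02])
```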

Newton with line search

my_biogeme.algorithm_name = 'LS-newton'
results = my_biogeme.estimate()
results.get_estimated_parameters()
As the model is not too complex, we activate the calculation of second derivatives. If you want to change it, change the name of the algorithm in the TOML file from "automatic" to "simple_bounds"
Optimization algorithm: hybrid Newton/BFGS with simple bounds [simple_bounds]
** Optimization: Newton with trust region for simple bounds
Iter.           beta1           beta2     Function    Relgrad   Radius      Rho
    0            0.11           0.022          5.3       0.13       10        1   ++
    1            0.14           0.023          5.3      0.015    1e+02        1   ++
    2            0.14           0.023          5.3    3.3e-07    1e+02        1   ++
Value Rob. Std err Rob. t-test Rob. p-value
beta1 0.144545 0.366198 0.394720 0.693050
beta2 0.023502 0.034280 0.685573 0.492982


for k, v in results.data.optimizationMessages.items():
    print(f'{k}:\t{v}')
Relative gradient:      3.2893676183728316e-07
Cause of termination:   Relative gradient = 3.3e-07 <= 0.00012
Number of function evaluations: 4
Number of gradient evaluations: 4
Number of hessian evaluations:  3
Algorithm:      Newton with trust region for simple bound constraints
Number of iterations:   3
Proportion of Hessian calculation:      3/3 = 100.0%
Optimization time:      0:00:00.003638

Changing the requested precision

my_biogeme.tolerance = 0.1
results = my_biogeme.estimate()
results.get_estimated_parameters()
As the model is not too complex, we activate the calculation of second derivatives. If you want to change it, change the name of the algorithm in the TOML file from "automatic" to "simple_bounds"
Optimization algorithm: hybrid Newton/BFGS with simple bounds [simple_bounds]
** Optimization: Newton with trust region for simple bounds
Iter.           beta1           beta2     Function    Relgrad   Radius      Rho
    0            0.11           0.022          5.3       0.13       10        1   ++
    1            0.11           0.022          5.3      0.015       10        1   ++
Value Rob. Std err Rob. t-test Rob. p-value
beta1 0.144317 0.365691 0.394641 0.693108
beta2 0.023428 0.034256 0.683887 0.494047


for k, v in results.data.optimizationMessages.items():
    print(f'{k}:\t{v}')
Relative gradient:      0.01474652490560303
Cause of termination:   Relative gradient = 0.015 <= 0.1
Number of function evaluations: 3
Number of gradient evaluations: 3
Number of hessian evaluations:  2
Algorithm:      Newton with trust region for simple bound constraints
Number of iterations:   2
Proportion of Hessian calculation:      2/2 = 100.0%
Optimization time:      0:00:00.002998

Newton with trust region

my_biogeme.algorithm_name = 'TR-newton'
results = my_biogeme.estimate()
results.get_estimated_parameters()
As the model is not too complex, we activate the calculation of second derivatives. If you want to change it, change the name of the algorithm in the TOML file from "automatic" to "simple_bounds"
Optimization algorithm: hybrid Newton/BFGS with simple bounds [simple_bounds]
** Optimization: Newton with trust region for simple bounds
Iter.           beta1           beta2     Function    Relgrad   Radius      Rho
    0            0.11           0.022          5.3       0.13       10        1   ++
    1            0.11           0.022          5.3      0.015       10        1   ++
Value Rob. Std err Rob. t-test Rob. p-value
beta1 0.144317 0.365691 0.394641 0.693108
beta2 0.023428 0.034256 0.683887 0.494047


for k, v in results.data.optimizationMessages.items():
    print(f'{k}:\t{v}')
Relative gradient:      0.01474652490560303
Cause of termination:   Relative gradient = 0.015 <= 0.1
Number of function evaluations: 3
Number of gradient evaluations: 3
Number of hessian evaluations:  2
Algorithm:      Newton with trust region for simple bound constraints
Number of iterations:   2
Proportion of Hessian calculation:      2/2 = 100.0%
Optimization time:      0:00:00.002991

We now illustrate some of the algorithm's parameters: we use the truncated conjugate gradient instead of dogleg for the trust-region subproblem, start with a small trust region of radius 0.001, and allow a maximum of 3 iterations.

my_biogeme.dogleg = False
my_biogeme.initial_radius = 0.001
my_biogeme.maxiter = 3
results = my_biogeme.estimate()
results.get_estimated_parameters()
As the model is not too complex, we activate the calculation of second derivatives. If you want to change it, change the name of the algorithm in the TOML file from "automatic" to "simple_bounds"
Optimization algorithm: hybrid Newton/BFGS with simple bounds [simple_bounds]
** Optimization: Newton with trust region for simple bounds
Iter.           beta1           beta2     Function    Relgrad   Radius      Rho
    0               0           0.001          5.5        3.4     0.01        1   ++
    1            0.01           0.011          5.3        1.2      0.1     0.99   ++
    2            0.11           0.021          5.3       0.11        1        1   ++
    3            0.11           0.021          5.3      0.013        1        1   ++
Value Rob. Std err Rob. t-test Rob. p-value
beta1 0.143674 0.36550 0.393088 0.694254
beta2 0.023392 0.03422 0.683568 0.494248


for k, v in results.data.optimizationMessages.items():
    print(f'{k}:\t{v}')
Relative gradient:      0.013414091872560863
Cause of termination:   Relative gradient = 0.013 <= 0.1
Number of function evaluations: 5
Number of gradient evaluations: 5
Number of hessian evaluations:  4
Algorithm:      Newton with trust region for simple bound constraints
Number of iterations:   4
Proportion of Hessian calculation:      4/4 = 100.0%
Optimization time:      0:00:00.004514
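
The Radius and Rho columns in the iteration logs reflect the classical trust-region mechanism: Rho is the ratio of the actual reduction of the objective to the reduction predicted by the quadratic model, and it drives the update of the radius. A minimal sketch of a standard update rule (the thresholds eta1 and eta2 are illustrative values, not Biogeme's internal ones):

```python
def update_radius(rho, radius, eta1=0.01, eta2=0.9, max_radius=1e10):
    """Standard trust-region radius update driven by rho, the ratio of
    actual to predicted reduction. Returns (step_accepted, new_radius)."""
    if rho < eta1:
        # Poor agreement with the model: reject the step, shrink the region.
        return False, radius / 2.0
    if rho >= eta2:
        # Very good agreement: accept the step, enlarge the region.
        return True, min(10.0 * radius, max_radius)
    # Acceptable agreement: accept the step, keep the radius.
    return True, radius

accepted, new_radius = update_radius(rho=1.0, radius=10.0)
```

With rho equal to 1, the step is accepted and the radius grows by a factor of 10, which matches the pattern visible in the Radius column above.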

Changing the requested precision

my_biogeme.tolerance = 0.1
my_biogeme.maxiter = 1000
results = my_biogeme.estimate()
results.get_estimated_parameters()
As the model is not too complex, we activate the calculation of second derivatives. If you want to change it, change the name of the algorithm in the TOML file from "automatic" to "simple_bounds"
Optimization algorithm: hybrid Newton/BFGS with simple bounds [simple_bounds]
** Optimization: Newton with trust region for simple bounds
Iter.           beta1           beta2     Function    Relgrad   Radius      Rho
    0               0           0.001          5.5        3.4     0.01        1   ++
    1            0.01           0.011          5.3        1.2      0.1     0.99   ++
    2            0.11           0.021          5.3       0.11        1        1   ++
    3            0.11           0.021          5.3      0.013        1        1   ++
Value Rob. Std err Rob. t-test Rob. p-value
beta1 0.143674 0.36550 0.393088 0.694254
beta2 0.023392 0.03422 0.683568 0.494248


for k, v in results.data.optimizationMessages.items():
    print(f'{k}:\t{v}')
Relative gradient:      0.013414091872560863
Cause of termination:   Relative gradient = 0.013 <= 0.1
Number of function evaluations: 5
Number of gradient evaluations: 5
Number of hessian evaluations:  4
Algorithm:      Newton with trust region for simple bound constraints
Number of iterations:   4
Proportion of Hessian calculation:      4/4 = 100.0%
Optimization time:      0:00:00.005515

BFGS with line search

my_biogeme.algorithm_name = 'LS-BFGS'
my_biogeme.tolerance = 1.0e-6
my_biogeme.maxiter = 1000
results = my_biogeme.estimate()
results.get_estimated_parameters()
As the model is not too complex, we activate the calculation of second derivatives. If you want to change it, change the name of the algorithm in the TOML file from "automatic" to "simple_bounds"
Optimization algorithm: hybrid Newton/BFGS with simple bounds [simple_bounds]
** Optimization: Newton with trust region for simple bounds
Iter.           beta1           beta2     Function    Relgrad   Radius      Rho
    0               0           0.001          5.5        3.4     0.01        1   ++
    1            0.01           0.011          5.3        1.2      0.1     0.99   ++
    2            0.11           0.021          5.3       0.11        1        1   ++
    3            0.14           0.023          5.3      0.013       10        1   ++
    4            0.14           0.024          5.3    8.4e-06    1e+02        1   ++
    5            0.14           0.024          5.3      2e-11    1e+02        1   ++
Value Rob. Std err Rob. t-test Rob. p-value
beta1 0.144546 0.366198 0.394720 0.693049
beta2 0.023502 0.034280 0.685574 0.492982


for k, v in results.data.optimizationMessages.items():
    print(f'{k}:\t{v}')
Relative gradient:      2.0482365008011036e-11
Cause of termination:   Relative gradient = 2e-11 <= 1e-06
Number of function evaluations: 7
Number of gradient evaluations: 7
Number of hessian evaluations:  6
Algorithm:      Newton with trust region for simple bound constraints
Number of iterations:   6
Proportion of Hessian calculation:      6/6 = 100.0%
Optimization time:      0:00:00.007228

BFGS with trust region

my_biogeme.algorithm_name = 'TR-BFGS'
results = my_biogeme.estimate()
results.get_estimated_parameters()
As the model is not too complex, we activate the calculation of second derivatives. If you want to change it, change the name of the algorithm in the TOML file from "automatic" to "simple_bounds"
Optimization algorithm: hybrid Newton/BFGS with simple bounds [simple_bounds]
** Optimization: Newton with trust region for simple bounds
Iter.           beta1           beta2     Function    Relgrad   Radius      Rho
    0               0           0.001          5.5        3.4     0.01        1   ++
    1            0.01           0.011          5.3        1.2      0.1     0.99   ++
    2            0.11           0.021          5.3       0.11        1        1   ++
    3            0.14           0.023          5.3      0.013       10        1   ++
    4            0.14           0.024          5.3    8.4e-06    1e+02        1   ++
    5            0.14           0.024          5.3      2e-11    1e+02        1   ++
Value Rob. Std err Rob. t-test Rob. p-value
beta1 0.144546 0.366198 0.394720 0.693049
beta2 0.023502 0.034280 0.685574 0.492982


for k, v in results.data.optimizationMessages.items():
    print(f'{k}:\t{v}')
Relative gradient:      2.0482365008011036e-11
Cause of termination:   Relative gradient = 2e-11 <= 1e-06
Number of function evaluations: 7
Number of gradient evaluations: 7
Number of hessian evaluations:  6
Algorithm:      Newton with trust region for simple bound constraints
Number of iterations:   6
Proportion of Hessian calculation:      6/6 = 100.0%
Optimization time:      0:00:00.005795

Newton/BFGS with trust region for simple bounds

This is the default algorithm used by Biogeme. It is the implementation of the algorithm proposed by Conn et al. (1988).

my_biogeme.algorithm_name = 'simple_bounds'
results = my_biogeme.estimate()
results.get_estimated_parameters()
As the model is not too complex, we activate the calculation of second derivatives. If you want to change it, change the name of the algorithm in the TOML file from "automatic" to "simple_bounds"
Optimization algorithm: hybrid Newton/BFGS with simple bounds [simple_bounds]
** Optimization: Newton with trust region for simple bounds
Iter.           beta1           beta2     Function    Relgrad   Radius      Rho
    0               0           0.001          5.5        3.4     0.01        1   ++
    1            0.01           0.011          5.3        1.2      0.1     0.99   ++
    2            0.11           0.021          5.3       0.11        1        1   ++
    3            0.14           0.023          5.3      0.013       10        1   ++
    4            0.14           0.024          5.3    8.4e-06    1e+02        1   ++
    5            0.14           0.024          5.3      2e-11    1e+02        1   ++
Value Rob. Std err Rob. t-test Rob. p-value
beta1 0.144546 0.366198 0.394720 0.693049
beta2 0.023502 0.034280 0.685574 0.492982


for k, v in results.data.optimizationMessages.items():
    print(f'{k}:\t{v}')
Relative gradient:      2.0482365008011036e-11
Cause of termination:   Relative gradient = 2e-11 <= 1e-06
Number of function evaluations: 7
Number of gradient evaluations: 7
Number of hessian evaluations:  6
Algorithm:      Newton with trust region for simple bound constraints
Number of iterations:   6
Proportion of Hessian calculation:      6/6 = 100.0%
Optimization time:      0:00:00.005825

When the second derivatives are too computationally expensive to calculate, it is possible to avoid calculating them at each successful iteration. The parameter second_derivatives controls the proportion of iterations at which they are computed.

my_biogeme.second_derivatives = 0.5
results = my_biogeme.estimate()
results.get_estimated_parameters()
As the model is not too complex, we activate the calculation of second derivatives. If you want to change it, change the name of the algorithm in the TOML file from "automatic" to "simple_bounds"
Optimization algorithm: hybrid Newton/BFGS with simple bounds [simple_bounds]
** Optimization: Newton with trust region for simple bounds
Iter.           beta1           beta2     Function    Relgrad   Radius      Rho
    0               0           0.001          5.5        3.4     0.01        1   ++
    1            0.01           0.011          5.3        1.2      0.1     0.99   ++
    2            0.11           0.021          5.3       0.11        1        1   ++
    3            0.14           0.023          5.3      0.013       10        1   ++
    4            0.14           0.024          5.3    8.4e-06    1e+02        1   ++
    5            0.14           0.024          5.3      2e-11    1e+02        1   ++
Value Rob. Std err Rob. t-test Rob. p-value
beta1 0.144546 0.366198 0.394720 0.693049
beta2 0.023502 0.034280 0.685574 0.492982


for k, v in results.data.optimizationMessages.items():
    print(f'{k}:\t{v}')
Relative gradient:      2.0482365008011036e-11
Cause of termination:   Relative gradient = 2e-11 <= 1e-06
Number of function evaluations: 7
Number of gradient evaluations: 7
Number of hessian evaluations:  6
Algorithm:      Newton with trust region for simple bound constraints
Number of iterations:   6
Proportion of Hessian calculation:      6/6 = 100.0%
Optimization time:      0:00:00.005796

If the parameter is set to zero, the second derivatives are not used at all, and the algorithm relies only on the BFGS update.

my_biogeme.second_derivatives = 0.0
results = my_biogeme.estimate()
results.get_estimated_parameters()
As the model is not too complex, we activate the calculation of second derivatives. If you want to change it, change the name of the algorithm in the TOML file from "automatic" to "simple_bounds"
Optimization algorithm: hybrid Newton/BFGS with simple bounds [simple_bounds]
** Optimization: Newton with trust region for simple bounds
Iter.           beta1           beta2     Function    Relgrad   Radius      Rho
    0               0           0.001          5.5        3.4     0.01        1   ++
    1            0.01           0.011          5.3        1.2      0.1     0.99   ++
    2            0.11           0.021          5.3       0.11        1        1   ++
    3            0.14           0.023          5.3      0.013       10        1   ++
    4            0.14           0.024          5.3    8.4e-06    1e+02        1   ++
    5            0.14           0.024          5.3      2e-11    1e+02        1   ++
Value Rob. Std err Rob. t-test Rob. p-value
beta1 0.144546 0.366198 0.394720 0.693049
beta2 0.023502 0.034280 0.685574 0.492982


for k, v in results.data.optimizationMessages.items():
    print(f'{k}:\t{v}')
Relative gradient:      2.0482365008011036e-11
Cause of termination:   Relative gradient = 2e-11 <= 1e-06
Number of function evaluations: 7
Number of gradient evaluations: 7
Number of hessian evaluations:  6
Algorithm:      Newton with trust region for simple bound constraints
Number of iterations:   6
Proportion of Hessian calculation:      6/6 = 100.0%
Optimization time:      0:00:00.005807
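
When the analytical Hessian is skipped, curvature information is maintained by the BFGS update instead. A minimal sketch of the standard update formula (a generic illustration with NumPy, not Biogeme's internal code):

```python
import numpy as np

def bfgs_update(h_approx, d, y):
    """Standard BFGS update of a Hessian approximation.

    d = x_new - x_old (the step), y = g_new - g_old (the gradient change).
    The update is skipped when the curvature condition d'y > 0 fails,
    which preserves positive definiteness of the approximation.
    """
    dy = d @ y
    if dy <= 0:
        return h_approx
    hd = h_approx @ d
    return h_approx - np.outer(hd, hd) / (d @ hd) + np.outer(y, y) / dy

# One update from the identity on a quadratic f(x) = x'Ax/2, where the
# gradient change along a step d is exactly A d.
a = np.array([[2.0, 0.0], [0.0, 4.0]])
d = np.array([1.0, 1.0])
y = a @ d
h1 = bfgs_update(np.eye(2), d, y)
```

By construction, the updated matrix satisfies the secant condition: applied to the step d, it reproduces the observed gradient change y.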

There are shortcuts to call the pure Newton and the pure BFGS versions of the algorithm.

my_biogeme.algorithm_name = 'simple_bounds_newton'
results = my_biogeme.estimate()
results.get_estimated_parameters()
As the model is not too complex, we activate the calculation of second derivatives. If you want to change it, change the name of the algorithm in the TOML file from "automatic" to "simple_bounds"
Optimization algorithm: hybrid Newton/BFGS with simple bounds [simple_bounds]
** Optimization: Newton with trust region for simple bounds
Iter.           beta1           beta2     Function    Relgrad   Radius      Rho
    0               0           0.001          5.5        3.4     0.01        1   ++
    1            0.01           0.011          5.3        1.2      0.1     0.99   ++
    2            0.11           0.021          5.3       0.11        1        1   ++
    3            0.14           0.023          5.3      0.013       10        1   ++
    4            0.14           0.024          5.3    8.4e-06    1e+02        1   ++
    5            0.14           0.024          5.3      2e-11    1e+02        1   ++
Value Rob. Std err Rob. t-test Rob. p-value
beta1 0.144546 0.366198 0.394720 0.693049
beta2 0.023502 0.034280 0.685574 0.492982


for k, v in results.data.optimizationMessages.items():
    print(f'{k}:\t{v}')
Relative gradient:      2.0482365008011036e-11
Cause of termination:   Relative gradient = 2e-11 <= 1e-06
Number of function evaluations: 7
Number of gradient evaluations: 7
Number of hessian evaluations:  6
Algorithm:      Newton with trust region for simple bound constraints
Number of iterations:   6
Proportion of Hessian calculation:      6/6 = 100.0%
Optimization time:      0:00:00.005826
my_biogeme.algorithm_name = 'simple_bounds_BFGS'
results = my_biogeme.estimate()
results.get_estimated_parameters()
As the model is not too complex, we activate the calculation of second derivatives. If you want to change it, change the name of the algorithm in the TOML file from "automatic" to "simple_bounds"
Optimization algorithm: hybrid Newton/BFGS with simple bounds [simple_bounds]
** Optimization: Newton with trust region for simple bounds
Iter.           beta1           beta2     Function    Relgrad   Radius      Rho
    0               0           0.001          5.5        3.4     0.01        1   ++
    1            0.01           0.011          5.3        1.2      0.1     0.99   ++
    2            0.11           0.021          5.3       0.11        1        1   ++
    3            0.14           0.023          5.3      0.013       10        1   ++
    4            0.14           0.024          5.3    8.4e-06    1e+02        1   ++
    5            0.14           0.024          5.3      2e-11    1e+02        1   ++
Value Rob. Std err Rob. t-test Rob. p-value
beta1 0.144546 0.366198 0.394720 0.693049
beta2 0.023502 0.034280 0.685574 0.492982


for k, v in results.data.optimizationMessages.items():
    print(f'{k}:\t{v}')
Relative gradient:      2.0482365008011036e-11
Cause of termination:   Relative gradient = 2e-11 <= 1e-06
Number of function evaluations: 7
Number of gradient evaluations: 7
Number of hessian evaluations:  6
Algorithm:      Newton with trust region for simple bound constraints
Number of iterations:   6
Proportion of Hessian calculation:      6/6 = 100.0%
Optimization time:      0:00:00.006087

Total running time of the script: (0 minutes 0.111 seconds)
