.. DO NOT EDIT.
.. THIS FILE WAS AUTOMATICALLY GENERATED BY SPHINX-GALLERY.
.. TO MAKE CHANGES, EDIT THE SOURCE PYTHON FILE:
.. "auto_examples/programmers/plot_optimization.py"
.. LINE NUMBERS ARE GIVEN BELOW.

.. only:: html

    .. note::
        :class: sphx-glr-download-link-note

        :ref:`Go to the end <sphx_glr_download_auto_examples_programmers_plot_optimization.py>`
        to download the full example code.

.. rst-class:: sphx-glr-example-title

.. _sphx_glr_auto_examples_programmers_plot_optimization.py:


biogeme.optimization
====================

Examples of use of several functions.

This is designed for programmers who need examples of use of the
functions of the module. The examples are designed to illustrate the
syntax. They do not correspond to any meaningful model.

:author: Michel Bierlaire
:date: Thu Nov 23 22:25:13 2023

.. GENERATED FROM PYTHON SOURCE LINES 15-25

.. code-block:: Python

    import pandas as pd

    from biogeme.version import get_text
    import biogeme.biogeme as bio
    import biogeme.database as db
    from biogeme import models
    from biogeme.expressions import Beta, Variable
    import biogeme.biogeme_logging as blog

.. GENERATED FROM PYTHON SOURCE LINES 26-27

Version of Biogeme.

.. GENERATED FROM PYTHON SOURCE LINES 27-29

.. code-block:: Python

    print(get_text())

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    biogeme 3.2.14 [2024-08-05]
    Home page: http://biogeme.epfl.ch
    Submit questions to https://groups.google.com/d/forum/biogeme
    Michel Bierlaire, Transport and Mobility Laboratory, Ecole Polytechnique Fédérale de Lausanne (EPFL)

.. GENERATED FROM PYTHON SOURCE LINES 30-32

.. code-block:: Python

    logger = blog.get_screen_logger(blog.INFO)

.. GENERATED FROM PYTHON SOURCE LINES 33-34

Data.

.. GENERATED FROM PYTHON SOURCE LINES 34-48

.. code-block:: Python

    df = pd.DataFrame(
        {
            'Person': [1, 1, 1, 2, 2],
            'Exclude': [0, 0, 1, 0, 1],
            'Variable1': [1, 2, 3, 4, 5],
            'Variable2': [10, 20, 30, 40, 50],
            'Choice': [1, 2, 3, 1, 2],
            'Av1': [0, 1, 1, 1, 1],
            'Av2': [1, 1, 1, 1, 1],
            'Av3': [0, 1, 1, 1, 1],
        }
    )
    df

=====  ======  =======  =========  =========  ======  ===  ===  ===
Index  Person  Exclude  Variable1  Variable2  Choice  Av1  Av2  Av3
=====  ======  =======  =========  =========  ======  ===  ===  ===
0      1       0        1          10         1       0    1    0
1      1       0        2          20         2       1    1    1
2      1       1        3          30         3       1    1    1
3      2       0        4          40         1       1    1    1
4      2       1        5          50         2       1    1    1
=====  ======  =======  =========  =========  ======  ===  ===  ===
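The model estimated below is a simple logit built with `models.loglogit`. As a reminder of what that expression evaluates, here is a minimal sketch of the log of a logit choice probability over a full choice set (plain Python, independent of Biogeme; the function name is ours):

```python
import math


def loglogit_sketch(utilities, chosen):
    """Log of the logit probability of `chosen`, given a dict mapping each
    alternative to its utility (full availability, as with av=None)."""
    log_denominator = math.log(sum(math.exp(u) for u in utilities.values()))
    return utilities[chosen] - log_denominator


# With all utilities equal, each of the 3 alternatives has probability 1/3.
value = loglogit_sketch({1: 0.0, 2: 0.0, 3: 0.0}, 1)
```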
.. GENERATED FROM PYTHON SOURCE LINES 49-51

.. code-block:: Python

    my_data = db.Database('test', df)

.. GENERATED FROM PYTHON SOURCE LINES 52-53

Variables.

.. GENERATED FROM PYTHON SOURCE LINES 53-63

.. code-block:: Python

    Choice = Variable('Choice')
    Variable1 = Variable('Variable1')
    Variable2 = Variable('Variable2')
    beta1 = Beta('beta1', 0, None, None, 0)
    beta2 = Beta('beta2', 0, None, None, 0)
    V1 = beta1 * Variable1
    V2 = beta2 * Variable2
    V3 = 0
    V = {1: V1, 2: V2, 3: V3}

.. GENERATED FROM PYTHON SOURCE LINES 64-72

.. code-block:: Python

    likelihood = models.loglogit(V, av=None, i=Choice)
    my_biogeme = bio.BIOGEME(my_data, likelihood)
    my_biogeme.modelName = 'simpleExample'
    my_biogeme.save_iterations = False
    my_biogeme.generate_html = False
    my_biogeme.generate_pickle = False
    print(my_biogeme)

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    Biogeme parameters read from biogeme.toml.
    simpleExample: database [test]{'log_like': _bioLogLogitFullChoiceSet[choice=Choice]U=(1:(Beta('beta1', 0, None, None, 0) * Variable1), 2:(Beta('beta2', 0, None, None, 0) * Variable2), 3:`0.0`)av=(1:`1.0`, 2:`1.0`, 3:`1.0`)}

.. GENERATED FROM PYTHON SOURCE LINES 73-77

.. code-block:: Python

    f, g, h, gdiff, hdiff = my_biogeme.check_derivatives(
        beta=my_biogeme.beta_values_dict_to_list(), verbose=True
    )

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    x              Gradient         FinDiff          Difference
    beta1          +2.220446E-16    -6.039613E-07    +6.039613E-07
    beta2          +2.000000E+01    +1.999994E+01    +6.110388E-05
    Row    Col     Hessian          FinDiff          Difference
    beta1  beta1   -1.222222E+01    -1.222222E+01    +8.314151E-07
    beta1  beta2   +6.111111E+01    +6.111115E+01    -4.166707E-05
    beta2  beta1   +6.111111E+01    +6.111112E+01    -4.274759E-06
    beta2  beta2   -1.222222E+03    -1.222223E+03    +8.332704E-04

.. GENERATED FROM PYTHON SOURCE LINES 78-80

.. code-block:: Python

    pd.DataFrame(gdiff)

=====  ============
Index  0
=====  ============
0      6.039613e-07
1      6.110388e-05
=====  ============
.. GENERATED FROM PYTHON SOURCE LINES 81-83

.. code-block:: Python

    pd.DataFrame(hdiff)

=====  =============  =========
Index  0              1
=====  =============  =========
0      8.314151e-07   -0.000042
1      -4.274759e-06  0.000833
=====  =============  =========
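`check_derivatives` compares the analytical gradient and Hessian with finite-difference approximations, as in the tables above. The underlying idea can be sketched in a few lines of plain NumPy (a generic illustration, not Biogeme's implementation):

```python
import numpy as np


def gradient_check(f, grad, x, eps=1e-6):
    """Difference between an analytical gradient and a central
    finite-difference approximation, component by component."""
    g_fd = np.zeros_like(x)
    for i in range(x.size):
        step = np.zeros_like(x)
        step[i] = eps
        # Central difference: (f(x + e_i*eps) - f(x - e_i*eps)) / (2*eps)
        g_fd[i] = (f(x + step) - f(x - step)) / (2 * eps)
    return grad(x) - g_fd


# f(x) = x0**2 + 3*x1 has gradient (2*x0, 3): the check should be ~0.
f = lambda x: x[0] ** 2 + 3 * x[1]
grad = lambda x: np.array([2 * x[0], 3.0])
diff = gradient_check(f, grad, np.array([1.0, 2.0]))
```

A small difference (as in the output above) indicates that the analytical derivatives are coherent with the function values.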
.. GENERATED FROM PYTHON SOURCE LINES 84-85

**scipy**: this is the optimization algorithm from scipy.

.. GENERATED FROM PYTHON SOURCE LINES 85-89

.. code-block:: Python

    my_biogeme.algorithm_name = 'scipy'
    results = my_biogeme.estimate()
    results.get_estimated_parameters()

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    As the model is not too complex, we activate the calculation of second derivatives. If you want to change it, change the name of the algorithm in the TOML file from "automatic" to "simple_bounds"
    Optimization algorithm: hybrid Newton/BFGS with simple bounds [simple_bounds]
    ** Optimization: Newton with trust region for simple bounds
    Iter.        beta1        beta2     Function    Relgrad   Radius      Rho
        0         0.11        0.022          5.3       0.13       10        1   ++
        1         0.14        0.023          5.3      0.015    1e+02        1   ++
        2         0.14        0.023          5.3    3.3e-07    1e+02        1   ++

=========  ========  =============  ===========  ============
Parameter  Value     Rob. Std err   Rob. t-test  Rob. p-value
=========  ========  =============  ===========  ============
beta1      0.144545  0.366198       0.394720     0.693050
beta2      0.023502  0.034280       0.685573     0.492982
=========  ========  =============  ===========  ============
.. GENERATED FROM PYTHON SOURCE LINES 90-93

.. code-block:: Python

    for k, v in results.data.optimizationMessages.items():
        print(f'{k}:\t{v}')

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    Relative gradient:  3.2893676183728316e-07
    Cause of termination:  Relative gradient = 3.3e-07 <= 0.00012
    Number of function evaluations:  4
    Number of gradient evaluations:  4
    Number of hessian evaluations:  3
    Algorithm:  Newton with trust region for simple bound constraints
    Number of iterations:  3
    Proportion of Hessian calculation:  3/3 = 100.0%
    Optimization time:  0:00:00.003657

.. GENERATED FROM PYTHON SOURCE LINES 94-95

**Newton with linesearch**

.. GENERATED FROM PYTHON SOURCE LINES 95-99

.. code-block:: Python

    my_biogeme.algorithm_name = 'LS-newton'
    results = my_biogeme.estimate()
    results.get_estimated_parameters()

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    As the model is not too complex, we activate the calculation of second derivatives. If you want to change it, change the name of the algorithm in the TOML file from "automatic" to "simple_bounds"
    Optimization algorithm: hybrid Newton/BFGS with simple bounds [simple_bounds]
    ** Optimization: Newton with trust region for simple bounds
    Iter.        beta1        beta2     Function    Relgrad   Radius      Rho
        0         0.11        0.022          5.3       0.13       10        1   ++
        1         0.14        0.023          5.3      0.015    1e+02        1   ++
        2         0.14        0.023          5.3    3.3e-07    1e+02        1   ++

=========  ========  =============  ===========  ============
Parameter  Value     Rob. Std err   Rob. t-test  Rob. p-value
=========  ========  =============  ===========  ============
beta1      0.144545  0.366198       0.394720     0.693050
beta2      0.023502  0.034280       0.685573     0.492982
=========  ========  =============  ===========  ============
.. GENERATED FROM PYTHON SOURCE LINES 100-103

.. code-block:: Python

    for k, v in results.data.optimizationMessages.items():
        print(f'{k}:\t{v}')

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    Relative gradient:  3.2893676183728316e-07
    Cause of termination:  Relative gradient = 3.3e-07 <= 0.00012
    Number of function evaluations:  4
    Number of gradient evaluations:  4
    Number of hessian evaluations:  3
    Algorithm:  Newton with trust region for simple bound constraints
    Number of iterations:  3
    Proportion of Hessian calculation:  3/3 = 100.0%
    Optimization time:  0:00:00.003638

.. GENERATED FROM PYTHON SOURCE LINES 104-105

Changing the requested precision

.. GENERATED FROM PYTHON SOURCE LINES 105-109

.. code-block:: Python

    my_biogeme.tolerance = 0.1
    results = my_biogeme.estimate()
    results.get_estimated_parameters()

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    As the model is not too complex, we activate the calculation of second derivatives. If you want to change it, change the name of the algorithm in the TOML file from "automatic" to "simple_bounds"
    Optimization algorithm: hybrid Newton/BFGS with simple bounds [simple_bounds]
    ** Optimization: Newton with trust region for simple bounds
    Iter.        beta1        beta2     Function    Relgrad   Radius      Rho
        0         0.11        0.022          5.3       0.13       10        1   ++
        1         0.11        0.022          5.3      0.015       10        1   ++

=========  ========  =============  ===========  ============
Parameter  Value     Rob. Std err   Rob. t-test  Rob. p-value
=========  ========  =============  ===========  ============
beta1      0.144317  0.365691       0.394641     0.693108
beta2      0.023428  0.034256       0.683887     0.494047
=========  ========  =============  ===========  ============
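The stopping test reported in the log is based on the relative gradient, compared with the `tolerance` parameter. One common scale-invariant definition of the relative gradient can be sketched as follows (the exact normalization used by Biogeme may differ):

```python
import numpy as np


def relative_gradient(x, f, g):
    """A scale-invariant gradient measure: max_i |g_i * x_i| / max(|f|, 1).
    This particular formula is an assumption, shown for illustration."""
    return float(np.max(np.abs(g * x)) / max(abs(f), 1.0))


def converged(x, f, g, tolerance):
    """Stop when the relative gradient falls below the tolerance."""
    return relative_gradient(x, f, g) <= tolerance
```

With a loose tolerance such as 0.1, the algorithm stops after fewer iterations, at a slightly less precise solution, as the tables above illustrate.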
.. GENERATED FROM PYTHON SOURCE LINES 110-113

.. code-block:: Python

    for k, v in results.data.optimizationMessages.items():
        print(f'{k}:\t{v}')

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    Relative gradient:  0.01474652490560303
    Cause of termination:  Relative gradient = 0.015 <= 0.1
    Number of function evaluations:  3
    Number of gradient evaluations:  3
    Number of hessian evaluations:  2
    Algorithm:  Newton with trust region for simple bound constraints
    Number of iterations:  2
    Proportion of Hessian calculation:  2/2 = 100.0%
    Optimization time:  0:00:00.002998

.. GENERATED FROM PYTHON SOURCE LINES 114-115

**Newton with trust region**

.. GENERATED FROM PYTHON SOURCE LINES 115-119

.. code-block:: Python

    my_biogeme.algorithm_name = 'TR-newton'
    results = my_biogeme.estimate()
    results.get_estimated_parameters()

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    As the model is not too complex, we activate the calculation of second derivatives. If you want to change it, change the name of the algorithm in the TOML file from "automatic" to "simple_bounds"
    Optimization algorithm: hybrid Newton/BFGS with simple bounds [simple_bounds]
    ** Optimization: Newton with trust region for simple bounds
    Iter.        beta1        beta2     Function    Relgrad   Radius      Rho
        0         0.11        0.022          5.3       0.13       10        1   ++
        1         0.11        0.022          5.3      0.015       10        1   ++

=========  ========  =============  ===========  ============
Parameter  Value     Rob. Std err   Rob. t-test  Rob. p-value
=========  ========  =============  ===========  ============
beta1      0.144317  0.365691       0.394641     0.693108
beta2      0.023428  0.034256       0.683887     0.494047
=========  ========  =============  ===========  ============
.. GENERATED FROM PYTHON SOURCE LINES 120-123

.. code-block:: Python

    for k, v in results.data.optimizationMessages.items():
        print(f'{k}:\t{v}')

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    Relative gradient:  0.01474652490560303
    Cause of termination:  Relative gradient = 0.015 <= 0.1
    Number of function evaluations:  3
    Number of gradient evaluations:  3
    Number of hessian evaluations:  2
    Algorithm:  Newton with trust region for simple bound constraints
    Number of iterations:  2
    Proportion of Hessian calculation:  2/2 = 100.0%
    Optimization time:  0:00:00.002991

.. GENERATED FROM PYTHON SOURCE LINES 124-128

We illustrate the parameters. We use the truncated conjugate gradient
instead of dogleg for the trust region subproblem, starting with a small
trust region of radius 0.001, and a maximum of 3 iterations.

.. GENERATED FROM PYTHON SOURCE LINES 130-136

.. code-block:: Python

    my_biogeme.dogleg = False
    my_biogeme.initial_radius = 0.001
    my_biogeme.maxiter = 3
    results = my_biogeme.estimate()
    results.get_estimated_parameters()

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    As the model is not too complex, we activate the calculation of second derivatives. If you want to change it, change the name of the algorithm in the TOML file from "automatic" to "simple_bounds"
    Optimization algorithm: hybrid Newton/BFGS with simple bounds [simple_bounds]
    ** Optimization: Newton with trust region for simple bounds
    Iter.        beta1        beta2     Function    Relgrad   Radius      Rho
        0            0        0.001          5.5        3.4     0.01        1   ++
        1         0.01        0.011          5.3        1.2      0.1     0.99   ++
        2         0.11        0.021          5.3       0.11        1        1   ++
        3         0.11        0.021          5.3      0.013        1        1   ++

=========  ========  =============  ===========  ============
Parameter  Value     Rob. Std err   Rob. t-test  Rob. p-value
=========  ========  =============  ===========  ============
beta1      0.143674  0.36550        0.393088     0.694254
beta2      0.023392  0.03422        0.683568     0.494248
=========  ========  =============  ===========  ============
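The `initial_radius` parameter sets the starting size of the trust region, which the algorithm then adapts from the ratio `Rho` of actual to predicted reduction (the last numeric column of the log above). A generic radius-update rule looks like this (the thresholds and factors are illustrative, not Biogeme's exact values):

```python
def update_radius(radius, rho, eta1=0.01, eta2=0.9, factor=10.0, max_radius=1e2):
    """Classic trust-region update: shrink the region when the quadratic
    model agrees poorly with the function, enlarge it when it agrees well."""
    if rho < eta1:
        return radius / factor      # poor agreement: be more conservative
    if rho > eta2:
        return min(radius * factor, max_radius)  # good agreement: be bolder
    return radius                   # acceptable: keep the current radius


# Successful steps (rho close to 1) let the radius grow, as in the log
# above where it goes from 0.01 to 0.1 to 1.
```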
.. GENERATED FROM PYTHON SOURCE LINES 137-140

.. code-block:: Python

    for k, v in results.data.optimizationMessages.items():
        print(f'{k}:\t{v}')

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    Relative gradient:  0.013414091872560863
    Cause of termination:  Relative gradient = 0.013 <= 0.1
    Number of function evaluations:  5
    Number of gradient evaluations:  5
    Number of hessian evaluations:  4
    Algorithm:  Newton with trust region for simple bound constraints
    Number of iterations:  4
    Proportion of Hessian calculation:  4/4 = 100.0%
    Optimization time:  0:00:00.004514

.. GENERATED FROM PYTHON SOURCE LINES 141-142

Changing the requested precision

.. GENERATED FROM PYTHON SOURCE LINES 142-147

.. code-block:: Python

    my_biogeme.tolerance = 0.1
    my_biogeme.maxiter = 1000
    results = my_biogeme.estimate()
    results.get_estimated_parameters()

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    As the model is not too complex, we activate the calculation of second derivatives. If you want to change it, change the name of the algorithm in the TOML file from "automatic" to "simple_bounds"
    Optimization algorithm: hybrid Newton/BFGS with simple bounds [simple_bounds]
    ** Optimization: Newton with trust region for simple bounds
    Iter.        beta1        beta2     Function    Relgrad   Radius      Rho
        0            0        0.001          5.5        3.4     0.01        1   ++
        1         0.01        0.011          5.3        1.2      0.1     0.99   ++
        2         0.11        0.021          5.3       0.11        1        1   ++
        3         0.11        0.021          5.3      0.013        1        1   ++

=========  ========  =============  ===========  ============
Parameter  Value     Rob. Std err   Rob. t-test  Rob. p-value
=========  ========  =============  ===========  ============
beta1      0.143674  0.36550        0.393088     0.694254
beta2      0.023392  0.03422        0.683568     0.494248
=========  ========  =============  ===========  ============
.. GENERATED FROM PYTHON SOURCE LINES 148-151

.. code-block:: Python

    for k, v in results.data.optimizationMessages.items():
        print(f'{k}:\t{v}')

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    Relative gradient:  0.013414091872560863
    Cause of termination:  Relative gradient = 0.013 <= 0.1
    Number of function evaluations:  5
    Number of gradient evaluations:  5
    Number of hessian evaluations:  4
    Algorithm:  Newton with trust region for simple bound constraints
    Number of iterations:  4
    Proportion of Hessian calculation:  4/4 = 100.0%
    Optimization time:  0:00:00.005515

.. GENERATED FROM PYTHON SOURCE LINES 152-153

**BFGS with line search**

.. GENERATED FROM PYTHON SOURCE LINES 153-159

.. code-block:: Python

    my_biogeme.algorithm_name = 'LS-BFGS'
    my_biogeme.tolerance = 1.0e-6
    my_biogeme.maxiter = 1000
    results = my_biogeme.estimate()
    results.get_estimated_parameters()

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    As the model is not too complex, we activate the calculation of second derivatives. If you want to change it, change the name of the algorithm in the TOML file from "automatic" to "simple_bounds"
    Optimization algorithm: hybrid Newton/BFGS with simple bounds [simple_bounds]
    ** Optimization: Newton with trust region for simple bounds
    Iter.        beta1        beta2     Function    Relgrad   Radius      Rho
        0            0        0.001          5.5        3.4     0.01        1   ++
        1         0.01        0.011          5.3        1.2      0.1     0.99   ++
        2         0.11        0.021          5.3       0.11        1        1   ++
        3         0.14        0.023          5.3      0.013       10        1   ++
        4         0.14        0.024          5.3    8.4e-06    1e+02        1   ++
        5         0.14        0.024          5.3      2e-11    1e+02        1   ++

=========  ========  =============  ===========  ============
Parameter  Value     Rob. Std err   Rob. t-test  Rob. p-value
=========  ========  =============  ===========  ============
beta1      0.144546  0.366198       0.394720     0.693049
beta2      0.023502  0.034280       0.685574     0.492982
=========  ========  =============  ===========  ============
.. GENERATED FROM PYTHON SOURCE LINES 160-163

.. code-block:: Python

    for k, v in results.data.optimizationMessages.items():
        print(f'{k}:\t{v}')

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    Relative gradient:  2.0482365008011036e-11
    Cause of termination:  Relative gradient = 2e-11 <= 1e-06
    Number of function evaluations:  7
    Number of gradient evaluations:  7
    Number of hessian evaluations:  6
    Algorithm:  Newton with trust region for simple bound constraints
    Number of iterations:  6
    Proportion of Hessian calculation:  6/6 = 100.0%
    Optimization time:  0:00:00.007228

.. GENERATED FROM PYTHON SOURCE LINES 164-165

**BFGS with trust region**

.. GENERATED FROM PYTHON SOURCE LINES 165-169

.. code-block:: Python

    my_biogeme.algorithm_name = 'TR-BFGS'
    results = my_biogeme.estimate()
    results.get_estimated_parameters()

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    As the model is not too complex, we activate the calculation of second derivatives. If you want to change it, change the name of the algorithm in the TOML file from "automatic" to "simple_bounds"
    Optimization algorithm: hybrid Newton/BFGS with simple bounds [simple_bounds]
    ** Optimization: Newton with trust region for simple bounds
    Iter.        beta1        beta2     Function    Relgrad   Radius      Rho
        0            0        0.001          5.5        3.4     0.01        1   ++
        1         0.01        0.011          5.3        1.2      0.1     0.99   ++
        2         0.11        0.021          5.3       0.11        1        1   ++
        3         0.14        0.023          5.3      0.013       10        1   ++
        4         0.14        0.024          5.3    8.4e-06    1e+02        1   ++
        5         0.14        0.024          5.3      2e-11    1e+02        1   ++

=========  ========  =============  ===========  ============
Parameter  Value     Rob. Std err   Rob. t-test  Rob. p-value
=========  ========  =============  ===========  ============
beta1      0.144546  0.366198       0.394720     0.693049
beta2      0.023502  0.034280       0.685574     0.492982
=========  ========  =============  ===========  ============
.. GENERATED FROM PYTHON SOURCE LINES 170-173

.. code-block:: Python

    for k, v in results.data.optimizationMessages.items():
        print(f'{k}:\t{v}')

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    Relative gradient:  2.0482365008011036e-11
    Cause of termination:  Relative gradient = 2e-11 <= 1e-06
    Number of function evaluations:  7
    Number of gradient evaluations:  7
    Number of hessian evaluations:  6
    Algorithm:  Newton with trust region for simple bound constraints
    Number of iterations:  6
    Proportion of Hessian calculation:  6/6 = 100.0%
    Optimization time:  0:00:00.005795

.. GENERATED FROM PYTHON SOURCE LINES 174-175

**Newton/BFGS with trust region for simple bounds**

.. GENERATED FROM PYTHON SOURCE LINES 177-180

This is the default algorithm used by Biogeme. It is the implementation
of the algorithm proposed by Conn et al. (1988).

.. GENERATED FROM PYTHON SOURCE LINES 180-184

.. code-block:: Python

    my_biogeme.algorithm_name = 'simple_bounds'
    results = my_biogeme.estimate()
    results.get_estimated_parameters()

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    As the model is not too complex, we activate the calculation of second derivatives. If you want to change it, change the name of the algorithm in the TOML file from "automatic" to "simple_bounds"
    Optimization algorithm: hybrid Newton/BFGS with simple bounds [simple_bounds]
    ** Optimization: Newton with trust region for simple bounds
    Iter.        beta1        beta2     Function    Relgrad   Radius      Rho
        0            0        0.001          5.5        3.4     0.01        1   ++
        1         0.01        0.011          5.3        1.2      0.1     0.99   ++
        2         0.11        0.021          5.3       0.11        1        1   ++
        3         0.14        0.023          5.3      0.013       10        1   ++
        4         0.14        0.024          5.3    8.4e-06    1e+02        1   ++
        5         0.14        0.024          5.3      2e-11    1e+02        1   ++

=========  ========  =============  ===========  ============
Parameter  Value     Rob. Std err   Rob. t-test  Rob. p-value
=========  ========  =============  ===========  ============
beta1      0.144546  0.366198       0.394720     0.693049
beta2      0.023502  0.034280       0.685574     0.492982
=========  ========  =============  ===========  ============
.. GENERATED FROM PYTHON SOURCE LINES 185-188

.. code-block:: Python

    for k, v in results.data.optimizationMessages.items():
        print(f'{k}:\t{v}')

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    Relative gradient:  2.0482365008011036e-11
    Cause of termination:  Relative gradient = 2e-11 <= 1e-06
    Number of function evaluations:  7
    Number of gradient evaluations:  7
    Number of hessian evaluations:  6
    Algorithm:  Newton with trust region for simple bound constraints
    Number of iterations:  6
    Proportion of Hessian calculation:  6/6 = 100.0%
    Optimization time:  0:00:00.005825

.. GENERATED FROM PYTHON SOURCE LINES 189-193

When the second derivatives are too computationally expensive to
calculate, it is possible to avoid calculating them at each successful
iteration. The parameter `second_derivatives` controls the proportion of
iterations at which they are calculated.

.. GENERATED FROM PYTHON SOURCE LINES 195-199

.. code-block:: Python

    my_biogeme.second_derivatives = 0.5
    results = my_biogeme.estimate()
    results.get_estimated_parameters()

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    As the model is not too complex, we activate the calculation of second derivatives. If you want to change it, change the name of the algorithm in the TOML file from "automatic" to "simple_bounds"
    Optimization algorithm: hybrid Newton/BFGS with simple bounds [simple_bounds]
    ** Optimization: Newton with trust region for simple bounds
    Iter.        beta1        beta2     Function    Relgrad   Radius      Rho
        0            0        0.001          5.5        3.4     0.01        1   ++
        1         0.01        0.011          5.3        1.2      0.1     0.99   ++
        2         0.11        0.021          5.3       0.11        1        1   ++
        3         0.14        0.023          5.3      0.013       10        1   ++
        4         0.14        0.024          5.3    8.4e-06    1e+02        1   ++
        5         0.14        0.024          5.3      2e-11    1e+02        1   ++

=========  ========  =============  ===========  ============
Parameter  Value     Rob. Std err   Rob. t-test  Rob. p-value
=========  ========  =============  ===========  ============
beta1      0.144546  0.366198       0.394720     0.693049
beta2      0.023502  0.034280       0.685574     0.492982
=========  ========  =============  ===========  ============
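One simple way to obtain a given proportion of exact-Hessian iterations is to compute the true second derivatives only every few successful iterations and rely on the BFGS update in between. A sketch of such a schedule (the rule is hypothetical; Biogeme's internal logic may differ):

```python
def hessian_iterations(n_successful, proportion):
    """Indices of successful iterations at which the exact Hessian would be
    computed, so that roughly `proportion` of them use it; the remaining
    iterations would fall back on the BFGS approximation."""
    if proportion <= 0.0:
        return []  # never compute the exact Hessian
    stride = max(1, round(1.0 / proportion))
    return [i for i in range(n_successful) if i % stride == 0]
```

With `proportion=0.5`, every other successful iteration pays the cost of the exact Hessian; with `proportion=1.0`, all of them do, which is what the "Proportion of Hessian calculation: 100.0%" message above reports.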
.. GENERATED FROM PYTHON SOURCE LINES 200-203

.. code-block:: Python

    for k, v in results.data.optimizationMessages.items():
        print(f'{k}:\t{v}')

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    Relative gradient:  2.0482365008011036e-11
    Cause of termination:  Relative gradient = 2e-11 <= 1e-06
    Number of function evaluations:  7
    Number of gradient evaluations:  7
    Number of hessian evaluations:  6
    Algorithm:  Newton with trust region for simple bound constraints
    Number of iterations:  6
    Proportion of Hessian calculation:  6/6 = 100.0%
    Optimization time:  0:00:00.005796

.. GENERATED FROM PYTHON SOURCE LINES 204-206

If the parameter is set to zero, the second derivatives are not used at
all, and the algorithm relies only on the BFGS update.

.. GENERATED FROM PYTHON SOURCE LINES 208-212

.. code-block:: Python

    my_biogeme.second_derivatives = 0.0
    results = my_biogeme.estimate()
    results.get_estimated_parameters()

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    As the model is not too complex, we activate the calculation of second derivatives. If you want to change it, change the name of the algorithm in the TOML file from "automatic" to "simple_bounds"
    Optimization algorithm: hybrid Newton/BFGS with simple bounds [simple_bounds]
    ** Optimization: Newton with trust region for simple bounds
    Iter.        beta1        beta2     Function    Relgrad   Radius      Rho
        0            0        0.001          5.5        3.4     0.01        1   ++
        1         0.01        0.011          5.3        1.2      0.1     0.99   ++
        2         0.11        0.021          5.3       0.11        1        1   ++
        3         0.14        0.023          5.3      0.013       10        1   ++
        4         0.14        0.024          5.3    8.4e-06    1e+02        1   ++
        5         0.14        0.024          5.3      2e-11    1e+02        1   ++

=========  ========  =============  ===========  ============
Parameter  Value     Rob. Std err   Rob. t-test  Rob. p-value
=========  ========  =============  ===========  ============
beta1      0.144546  0.366198       0.394720     0.693049
beta2      0.023502  0.034280       0.685574     0.492982
=========  ========  =============  ===========  ============
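The BFGS update mentioned here maintains an approximation of the Hessian from gradient differences only, without evaluating second derivatives. A minimal sketch of the standard update:

```python
import numpy as np


def bfgs_update(B, s, y):
    """BFGS update of the Hessian approximation B, given the step s and the
    gradient difference y; skipped when the curvature condition fails, so
    that B stays positive definite."""
    sy = s @ y
    if sy <= 1e-12:
        return B
    Bs = B @ s
    return B - np.outer(Bs, Bs) / (s @ Bs) + np.outer(y, y) / sy


# The updated matrix satisfies the secant condition: B_new @ s == y.
B = np.eye(2)
s = np.array([1.0, 0.5])
y = np.array([2.0, 1.5])
B_new = bfgs_update(B, s, y)
```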
.. GENERATED FROM PYTHON SOURCE LINES 213-216

.. code-block:: Python

    for k, v in results.data.optimizationMessages.items():
        print(f'{k}:\t{v}')

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    Relative gradient:  2.0482365008011036e-11
    Cause of termination:  Relative gradient = 2e-11 <= 1e-06
    Number of function evaluations:  7
    Number of gradient evaluations:  7
    Number of hessian evaluations:  6
    Algorithm:  Newton with trust region for simple bound constraints
    Number of iterations:  6
    Proportion of Hessian calculation:  6/6 = 100.0%
    Optimization time:  0:00:00.005807

.. GENERATED FROM PYTHON SOURCE LINES 217-218

There are shortcuts to call the Newton and BFGS versions.

.. GENERATED FROM PYTHON SOURCE LINES 218-222

.. code-block:: Python

    my_biogeme.algorithm_name = 'simple_bounds_newton'
    results = my_biogeme.estimate()
    results.get_estimated_parameters()

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    As the model is not too complex, we activate the calculation of second derivatives. If you want to change it, change the name of the algorithm in the TOML file from "automatic" to "simple_bounds"
    Optimization algorithm: hybrid Newton/BFGS with simple bounds [simple_bounds]
    ** Optimization: Newton with trust region for simple bounds
    Iter.        beta1        beta2     Function    Relgrad   Radius      Rho
        0            0        0.001          5.5        3.4     0.01        1   ++
        1         0.01        0.011          5.3        1.2      0.1     0.99   ++
        2         0.11        0.021          5.3       0.11        1        1   ++
        3         0.14        0.023          5.3      0.013       10        1   ++
        4         0.14        0.024          5.3    8.4e-06    1e+02        1   ++
        5         0.14        0.024          5.3      2e-11    1e+02        1   ++

=========  ========  =============  ===========  ============
Parameter  Value     Rob. Std err   Rob. t-test  Rob. p-value
=========  ========  =============  ===========  ============
beta1      0.144546  0.366198       0.394720     0.693049
beta2      0.023502  0.034280       0.685574     0.492982
=========  ========  =============  ===========  ============
.. GENERATED FROM PYTHON SOURCE LINES 223-226

.. code-block:: Python

    for k, v in results.data.optimizationMessages.items():
        print(f'{k}:\t{v}')

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    Relative gradient:  2.0482365008011036e-11
    Cause of termination:  Relative gradient = 2e-11 <= 1e-06
    Number of function evaluations:  7
    Number of gradient evaluations:  7
    Number of hessian evaluations:  6
    Algorithm:  Newton with trust region for simple bound constraints
    Number of iterations:  6
    Proportion of Hessian calculation:  6/6 = 100.0%
    Optimization time:  0:00:00.005826

.. GENERATED FROM PYTHON SOURCE LINES 227-231

.. code-block:: Python

    my_biogeme.algorithm_name = 'simple_bounds_BFGS'
    results = my_biogeme.estimate()
    results.get_estimated_parameters()

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    As the model is not too complex, we activate the calculation of second derivatives. If you want to change it, change the name of the algorithm in the TOML file from "automatic" to "simple_bounds"
    Optimization algorithm: hybrid Newton/BFGS with simple bounds [simple_bounds]
    ** Optimization: Newton with trust region for simple bounds
    Iter.        beta1        beta2     Function    Relgrad   Radius      Rho
        0            0        0.001          5.5        3.4     0.01        1   ++
        1         0.01        0.011          5.3        1.2      0.1     0.99   ++
        2         0.11        0.021          5.3       0.11        1        1   ++
        3         0.14        0.023          5.3      0.013       10        1   ++
        4         0.14        0.024          5.3    8.4e-06    1e+02        1   ++
        5         0.14        0.024          5.3      2e-11    1e+02        1   ++

=========  ========  =============  ===========  ============
Parameter  Value     Rob. Std err   Rob. t-test  Rob. p-value
=========  ========  =============  ===========  ============
beta1      0.144546  0.366198       0.394720     0.693049
beta2      0.023502  0.034280       0.685574     0.492982
=========  ========  =============  ===========  ============
.. GENERATED FROM PYTHON SOURCE LINES 232-234

.. code-block:: Python

    for k, v in results.data.optimizationMessages.items():
        print(f'{k}:\t{v}')

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    Relative gradient:  2.0482365008011036e-11
    Cause of termination:  Relative gradient = 2e-11 <= 1e-06
    Number of function evaluations:  7
    Number of gradient evaluations:  7
    Number of hessian evaluations:  6
    Algorithm:  Newton with trust region for simple bound constraints
    Number of iterations:  6
    Proportion of Hessian calculation:  6/6 = 100.0%
    Optimization time:  0:00:00.006087

.. rst-class:: sphx-glr-timing

   **Total running time of the script:** (0 minutes 0.111 seconds)


.. _sphx_glr_download_auto_examples_programmers_plot_optimization.py:

.. only:: html

  .. container:: sphx-glr-footer sphx-glr-footer-example

    .. container:: sphx-glr-download sphx-glr-download-jupyter

      :download:`Download Jupyter notebook: plot_optimization.ipynb <plot_optimization.ipynb>`

    .. container:: sphx-glr-download sphx-glr-download-python

      :download:`Download Python source code: plot_optimization.py <plot_optimization.py>`

    .. container:: sphx-glr-download sphx-glr-download-zip

      :download:`Download zipped: plot_optimization.zip <plot_optimization.zip>`

.. only:: html

 .. rst-class:: sphx-glr-signature

    `Gallery generated by Sphinx-Gallery <https://sphinx-gallery.github.io>`_