.. DO NOT EDIT.
.. THIS FILE WAS AUTOMATICALLY GENERATED BY SPHINX-GALLERY.
.. TO MAKE CHANGES, EDIT THE SOURCE PYTHON FILE:
.. "auto_examples/swissmetro/plot_b06b_unif_mixture_MHLS.py"
.. LINE NUMBERS ARE GIVEN BELOW.

.. only:: html

    .. note::
        :class: sphx-glr-download-link-note

        :ref:`Go to the end <sphx_glr_download_auto_examples_swissmetro_plot_b06b_unif_mixture_MHLS.py>`
        to download the full example code.

.. rst-class:: sphx-glr-example-title

.. _sphx_glr_auto_examples_swissmetro_plot_b06b_unif_mixture_MHLS.py:


6b. Mixture of logit models with uniform MLHS draws
=====================================================

Example of a mixture of logit models, estimated with Monte-Carlo integration.
The mixing distribution is uniform, and the draws are generated with the
Modified Latin Hypercube Sampling (MLHS) method.

Michel Bierlaire, EPFL
Fri Jun 20 2025, 11:24:34

.. GENERATED FROM PYTHON SOURCE LINES 13-25

.. code-block:: Python

    from IPython.core.display_functions import display

    import biogeme.biogeme_logging as blog
    from biogeme.biogeme import BIOGEME
    from biogeme.expressions import Beta, Draws, MonteCarlo, log
    from biogeme.models import logit
    from biogeme.results_processing import (
        EstimationResults,
        get_pandas_estimated_parameters,
    )

.. GENERATED FROM PYTHON SOURCE LINES 26-27

See the data processing script: :ref:`swissmetro_data`.

.. GENERATED FROM PYTHON SOURCE LINES 27-44

.. code-block:: Python

    from swissmetro_data import (
        CAR_AV_SP,
        CAR_CO_SCALED,
        CAR_TT_SCALED,
        CHOICE,
        SM_AV,
        SM_COST_SCALED,
        SM_TT_SCALED,
        TRAIN_AV_SP,
        TRAIN_COST_SCALED,
        TRAIN_TT_SCALED,
        database,
    )

    logger = blog.get_screen_logger(level=blog.INFO)
    logger.info('Example b06b_unif_mixture_MHLS')

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    Example b06b_unif_mixture_MHLS

.. GENERATED FROM PYTHON SOURCE LINES 45-46

Parameters to be estimated.

.. GENERATED FROM PYTHON SOURCE LINES 46-51

.. code-block:: Python

    asc_car = Beta('asc_car', 0, None, None, 0)
    asc_train = Beta('asc_train', 0, None, None, 0)
    asc_sm = Beta('asc_sm', 0, None, None, 1)
    b_cost = Beta('b_cost', 0, None, None, 0)

.. GENERATED FROM PYTHON SOURCE LINES 52-54

Define a random parameter to be used for Monte-Carlo simulation. We start with
its location parameter.

.. GENERATED FROM PYTHON SOURCE LINES 54-56

.. code-block:: Python

    b_time = Beta('b_time', 0, None, None, 0)

.. GENERATED FROM PYTHON SOURCE LINES 57-58

It is advised not to use 0 as the starting value for the following scale parameter.

.. GENERATED FROM PYTHON SOURCE LINES 58-60

.. code-block:: Python

    b_time_s = Beta('b_time_s', 1, None, None, 0)

.. GENERATED FROM PYTHON SOURCE LINES 61-63

The random parameter itself combines the location, the scale, and the draws.
The type of draws is set to ``NORMAL_MLHS``, that is, draws generated with the
Modified Latin Hypercube Sampling method.

.. GENERATED FROM PYTHON SOURCE LINES 63-65

.. code-block:: Python

    b_time_rnd = b_time + b_time_s * Draws('b_time_rnd', 'NORMAL_MLHS')
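To build intuition for what the draws look like, here is a minimal sketch of
MLHS-style draws written with NumPy: the unit interval is split into as many
strata as there are draws, one uniform value is drawn inside each stratum, and
the sequence is then shuffled. The helper ``mlhs_uniform_draws`` is purely
illustrative and is not the code used internally by Biogeme.

.. code-block:: Python

    import numpy as np


    def mlhs_uniform_draws(number_of_draws: int, rng: np.random.Generator) -> np.ndarray:
        """Illustrative MLHS-style draws on (0, 1): one draw per stratum, then shuffled."""
        strata = np.arange(number_of_draws)  # indices of the strata
        shifts = rng.uniform(size=number_of_draws)  # random position within each stratum
        draws = (strata + shifts) / number_of_draws  # one point in each stratum of width 1/R
        rng.shuffle(draws)  # remove the ordering induced by the strata
        return draws


    print(mlhs_uniform_draws(5, np.random.default_rng(1223)))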
.. GENERATED FROM PYTHON SOURCE LINES 66-67

Definition of the utility functions.

.. GENERATED FROM PYTHON SOURCE LINES 67-71

.. code-block:: Python

    v_train = asc_train + b_time_rnd * TRAIN_TT_SCALED + b_cost * TRAIN_COST_SCALED
    v_swissmetro = asc_sm + b_time_rnd * SM_TT_SCALED + b_cost * SM_COST_SCALED
    v_car = asc_car + b_time_rnd * CAR_TT_SCALED + b_cost * CAR_CO_SCALED

.. GENERATED FROM PYTHON SOURCE LINES 72-73

Associate the utility functions with the numbering of alternatives.

.. GENERATED FROM PYTHON SOURCE LINES 73-75

.. code-block:: Python

    v = {1: v_train, 2: v_swissmetro, 3: v_car}

.. GENERATED FROM PYTHON SOURCE LINES 76-77

Associate the availability conditions with the alternatives.

.. GENERATED FROM PYTHON SOURCE LINES 77-79

.. code-block:: Python

    av = {1: TRAIN_AV_SP, 2: SM_AV, 3: CAR_AV_SP}

.. GENERATED FROM PYTHON SOURCE LINES 80-81

Conditional on ``b_time_rnd``, we have a logit model (called the kernel).

.. GENERATED FROM PYTHON SOURCE LINES 81-83

.. code-block:: Python

    conditional_probability = logit(v, av, CHOICE)

.. GENERATED FROM PYTHON SOURCE LINES 84-85

We integrate over ``b_time_rnd`` using Monte-Carlo integration.

.. GENERATED FROM PYTHON SOURCE LINES 85-88

.. code-block:: Python

    log_probability = log(MonteCarlo(conditional_probability))
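The expression ``log(MonteCarlo(conditional_probability))`` is the logarithm of
the kernel probability averaged over the draws of ``b_time_rnd``. The sketch
below reproduces that computation by hand for a single, artificial observation;
the utilities, attributes and plain normal draws are placeholders chosen only
for illustration, not quantities taken from this example or from Biogeme itself.

.. code-block:: Python

    import numpy as np

    # Hand-made Monte-Carlo integration for one artificial observation (illustration only).
    rng = np.random.default_rng(1223)
    number_of_draws = 10_000
    xi = rng.standard_normal(number_of_draws)  # placeholder draws; the model above uses MLHS draws
    b_time_draws = -2.3 + 1.7 * xi  # location + scale * draw, mimicking b_time_rnd

    # Toy utilities of the three alternatives, one value per draw (attributes are made up).
    v_draws = np.stack(
        [
            -0.4 + b_time_draws * 1.1 - 1.3 * 0.5,  # train
            0.0 + b_time_draws * 0.6 - 1.3 * 0.8,  # Swissmetro
            0.14 + b_time_draws * 1.0 - 1.3 * 0.6,  # car
        ]
    )
    kernel = np.exp(v_draws) / np.exp(v_draws).sum(axis=0)  # logit probabilities, draw by draw
    chosen = 1  # index of the observed alternative in v_draws
    print(np.log(kernel[chosen].mean()))  # average the kernel over the draws, then take the log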
.. GENERATED FROM PYTHON SOURCE LINES 89-90

Create the Biogeme object.

.. GENERATED FROM PYTHON SOURCE LINES 90-93

.. code-block:: Python

    the_biogeme = BIOGEME(database, log_probability, number_of_draws=10000, seed=1223)
    the_biogeme.model_name = 'b06b_unif_mixture_MHLS'

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    Biogeme parameters read from biogeme.toml.

.. GENERATED FROM PYTHON SOURCE LINES 94-95

Estimate the parameters.

.. GENERATED FROM PYTHON SOURCE LINES 95-102

.. code-block:: Python

    try:
        results = EstimationResults.from_yaml_file(
            filename=f'saved_results/{the_biogeme.model_name}.yaml'
        )
    except FileNotFoundError:
        results = the_biogeme.estimate()

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    *** Initial values of the parameters are obtained from the file __b06b_unif_mixture_MHLS.iter
    Cannot read file __b06b_unif_mixture_MHLS.iter. Statement is ignored.
    Starting values for the algorithm: {}
    As the model is rather complex, we cancel the calculation of second derivatives. If you want to control the parameters, change the algorithm from "automatic" to "simple_bounds" in the TOML file.
    Optimization algorithm: hybrid Newton/BFGS with simple bounds [simple_bounds]
    ** Optimization: BFGS with trust region for simple bounds
    Iter.   asc_train    b_time  b_time_s    b_cost   asc_car  Function   Relgrad   Radius      Rho
        0          -1        -1         2        -1         1   6.1e+03      0.16        1     0.25   +
        1       -0.73        -2         3      -0.4         0   5.5e+03     0.049        1     0.36   +
        2       -0.95      -2.3       2.6      -1.4      0.51   5.4e+03     0.054        1     0.39   +
        3       -0.95      -2.3       2.6      -1.4      0.51   5.4e+03     0.054      0.5    -0.15   -
        4       -0.45      -2.8       2.6      -1.1     0.006   5.3e+03      0.03      0.5      0.5   +
        5      -0.091      -2.6       2.5      -1.6      0.33   5.3e+03     0.047      0.5     0.13   +
        6      -0.091      -2.6       2.5      -1.6      0.33   5.3e+03     0.047     0.25    -0.18   -
        7       -0.34      -2.9       2.3      -1.4      0.26   5.2e+03     0.022     0.25     0.65   +
        8       -0.29      -2.6       2.2      -1.2      0.22   5.2e+03    0.0084     0.25      0.5   +
        9       -0.29      -2.6       2.2      -1.2      0.22   5.2e+03    0.0084     0.12     -2.8   -
       10       -0.29      -2.6       2.2      -1.2      0.22   5.2e+03    0.0084    0.062    -0.23   -
       11       -0.36      -2.6       2.1      -1.3      0.28   5.2e+03     0.006    0.062     0.46   +
       12       -0.29      -2.6       2.1      -1.4      0.22   5.2e+03    0.0065    0.062     0.52   +
       13       -0.35      -2.6         2      -1.3      0.21   5.2e+03    0.0092    0.062     0.51   +
       14       -0.33      -2.5         2      -1.3      0.21   5.2e+03    0.0034    0.062     0.89   +
       15       -0.35      -2.5       1.9      -1.3      0.18   5.2e+03    0.0059    0.062     0.84   +
       16       -0.36      -2.4       1.9      -1.3      0.21   5.2e+03    0.0029    0.062     0.62   +
       17       -0.37      -2.4       1.8      -1.3      0.16   5.2e+03    0.0018    0.062     0.84   +
       18        -0.4      -2.3       1.7      -1.3      0.17   5.2e+03    0.0027    0.062     0.31   +
       19       -0.39      -2.3       1.7      -1.3      0.13   5.2e+03     0.002    0.062     0.48   +
       20       -0.39      -2.3       1.7      -1.3      0.13   5.2e+03     0.002    0.031     -0.8   -
       21       -0.39      -2.3       1.7      -1.3      0.14   5.2e+03    0.0016    0.031     0.28   +
       22       -0.39      -2.3       1.7      -1.3      0.14   5.2e+03    0.0016    0.016     0.08   -
       23        -0.4      -2.3       1.7      -1.3      0.14   5.2e+03    0.0009    0.016     0.79   +
       24        -0.4      -2.3       1.7      -1.3      0.14   5.2e+03    0.0009   0.0078    -0.47   -
       25        -0.4      -2.3       1.7      -1.3      0.14   5.2e+03   0.00056   0.0078     0.33   +
       26        -0.4      -2.3       1.7      -1.3      0.14   5.2e+03   0.00026   0.0078     0.13   +
       27        -0.4      -2.3       1.7      -1.3      0.14   5.2e+03   0.00026   0.0039     -1.3   -
       28        -0.4      -2.3       1.7      -1.3      0.14   5.2e+03   0.00026    0.002   0.0035   -
       29        -0.4      -2.3       1.7      -1.3      0.14   5.2e+03   0.00028    0.002     0.61   +
       30        -0.4      -2.3       1.7      -1.3      0.14   5.2e+03    0.0001    0.002      0.8   +
       31        -0.4      -2.3       1.7      -1.3      0.14   5.2e+03   5.8e-05    0.002     0.89   +
       32        -0.4      -2.3       1.7      -1.3      0.14   5.2e+03   3.3e-05    0.002     0.76   +
       33        -0.4      -2.3       1.7      -1.3      0.14   5.2e+03   3.2e-05    0.002     0.37   +
       34        -0.4      -2.3       1.7      -1.3      0.14   5.2e+03   3.2e-05  0.00098     -2.3   -
       35        -0.4      -2.3       1.7      -1.3      0.14   5.2e+03   3.2e-05  0.00049   0.0087   -
       36        -0.4      -2.3       1.7      -1.3      0.14   5.2e+03   9.4e-06   0.0049     0.94   ++
       37        -0.4      -2.3       1.7      -1.3      0.14   5.2e+03   9.4e-06   0.0024      -52   -
       38        -0.4      -2.3       1.7      -1.3      0.14   5.2e+03   9.4e-06   0.0012      -27   -
       39        -0.4      -2.3       1.7      -1.3      0.14   5.2e+03   9.4e-06  0.00061      -10   -
       40        -0.4      -2.3       1.7      -1.3      0.14   5.2e+03   9.4e-06  0.00031     -3.5   -
       41        -0.4      -2.3       1.7      -1.3      0.14   5.2e+03   9.4e-06  0.00015       -1   -
       42        -0.4      -2.3       1.7      -1.3      0.14   5.2e+03   1.6e-06  0.00015     0.86   -
    Optimization algorithm has converged.
    Relative gradient: 1.632630973351514e-06
    Cause of termination: Relative gradient = 1.6e-06 <= 6.1e-06
    Number of function evaluations: 98
    Number of gradient evaluations: 55
    Number of hessian evaluations: 0
    Algorithm: BFGS with trust region for simple bound constraints
    Number of iterations: 43
    Proportion of Hessian calculation: 0/27 = 0.0%
    Optimization time: 0:01:37.390006
    Calculate second derivatives and BHHH
    File b06b_unif_mixture_MHLS.html has been generated.
    File b06b_unif_mixture_MHLS.yaml has been generated.

.. GENERATED FROM PYTHON SOURCE LINES 103-105

.. code-block:: Python

    print(results.short_summary())

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    Results for model b06b_unif_mixture_MHLS
    Nbr of parameters:              5
    Sample size:                    6768
    Excluded data:                  3960
    Final log likelihood:           -5214.947
    Akaike Information Criterion:   10439.89
    Bayesian Information Criterion: 10473.99

.. GENERATED FROM PYTHON SOURCE LINES 106-108

.. code-block:: Python

    pandas_results = get_pandas_estimated_parameters(estimation_results=results)
    display(pandas_results)

.. rst-class:: sphx-glr-script-out

.. code-block:: none

            Name     Value  Robust std err.  Robust t-stat.  Robust p-value
    0  asc_train -0.401859         0.065945       -6.093814    1.102520e-09
    1     b_time -2.259754         0.117179      -19.284630    0.000000e+00
    2   b_time_s  1.657023         0.132669       12.489949    0.000000e+00
    3     b_cost -1.285443         0.086294      -14.896028    0.000000e+00
    4    asc_car  0.137021         0.051739        2.648291    8.089997e-03
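The DataFrame returned by ``get_pandas_estimated_parameters`` can be
post-processed with regular pandas operations. As a small illustrative sketch
(relying only on the column names displayed above; this computation is not part
of the original example), the estimates can be indexed by name, for instance to
form the ratio of the mean time coefficient to the cost coefficient:

.. code-block:: Python

    # Illustration only: index the point estimates by parameter name.
    estimates = pandas_results.set_index('Name')['Value']

    # Ratio of the mean time coefficient to the cost coefficient, in the scaled units of the data.
    print(estimates['b_time'] / estimates['b_cost'])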
.. rst-class:: sphx-glr-timing

   **Total running time of the script:** (3 minutes 27.092 seconds)

.. _sphx_glr_download_auto_examples_swissmetro_plot_b06b_unif_mixture_MHLS.py:

.. only:: html

  .. container:: sphx-glr-footer sphx-glr-footer-example

    .. container:: sphx-glr-download sphx-glr-download-jupyter

      :download:`Download Jupyter notebook: plot_b06b_unif_mixture_MHLS.ipynb <plot_b06b_unif_mixture_MHLS.ipynb>`

    .. container:: sphx-glr-download sphx-glr-download-python

      :download:`Download Python source code: plot_b06b_unif_mixture_MHLS.py <plot_b06b_unif_mixture_MHLS.py>`

    .. container:: sphx-glr-download sphx-glr-download-zip

      :download:`Download zipped: plot_b06b_unif_mixture_MHLS.zip <plot_b06b_unif_mixture_MHLS.zip>`

.. only:: html

 .. rst-class:: sphx-glr-signature

    `Gallery generated by Sphinx-Gallery <https://sphinx-gallery.github.io>`_