.. DO NOT EDIT.
.. THIS FILE WAS AUTOMATICALLY GENERATED BY SPHINX-GALLERY.
.. TO MAKE CHANGES, EDIT THE SOURCE PYTHON FILE:
.. "auto_examples/swissmetro/plot_b01logit_bis.py"
.. LINE NUMBERS ARE GIVEN BELOW.

.. only:: html

    .. note::
        :class: sphx-glr-download-link-note

        :ref:`Go to the end ` to download the full example code.

.. rst-class:: sphx-glr-example-title

.. _sphx_glr_auto_examples_swissmetro_plot_b01logit_bis.py:

Illustration of additional features of Biogeme
==============================================

Same model as b01logit, using LinearUtility and segmentations.

Michel Bierlaire, EPFL
Wed Jun 18 2025, 10:57:53

.. GENERATED FROM PYTHON SOURCE LINES 12-29

.. code-block:: Python

    import os

    from IPython.core.display_functions import display

    import biogeme.biogeme_logging as blog
    from biogeme.biogeme import BIOGEME
    from biogeme.exceptions import BiogemeError
    from biogeme.expressions import Beta, LinearTermTuple, LinearUtility
    from biogeme.models import loglogit
    from biogeme.results_processing import (
        EstimateVarianceCovariance,
        generate_html_file,
        get_pandas_estimated_parameters,
    )
    from biogeme.segmentation import Segmentation

.. GENERATED FROM PYTHON SOURCE LINES 30-31

See the data processing script: :ref:`swissmetro_data`.

.. GENERATED FROM PYTHON SOURCE LINES 31-50

.. code-block:: Python

    from swissmetro_data import (
        CAR_AV_SP,
        CAR_CO_SCALED,
        CAR_TT_SCALED,
        CHOICE,
        GA,
        MALE,
        SM_AV,
        SM_COST_SCALED,
        SM_TT_SCALED,
        TRAIN_AV_SP,
        TRAIN_COST_SCALED,
        TRAIN_TT_SCALED,
        database,
    )

    logger = blog.get_screen_logger(level=blog.INFO)
    logger.info('Example b01logit_bis.py')

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    Example b01logit_bis.py

.. GENERATED FROM PYTHON SOURCE LINES 51-52

Parameters to be estimated.

.. GENERATED FROM PYTHON SOURCE LINES 52-55

.. code-block:: Python

    asc_car = Beta('asc_car', 0, None, None, 0)
    asc_train = Beta('asc_train', 0, None, None, 0)

.. GENERATED FROM PYTHON SOURCE LINES 56-58

Starting values.
We use starting values estimated from a previous run.

.. GENERATED FROM PYTHON SOURCE LINES 58-61

.. code-block:: Python

    b_time = Beta('b_time', -1.28, None, None, 0)
    b_cost = Beta('b_cost', -1.08, None, None, 0)

.. GENERATED FROM PYTHON SOURCE LINES 62-63

Define segmentations.

.. GENERATED FROM PYTHON SOURCE LINES 63-76

.. code-block:: Python

    gender_segmentation = database.generate_segmentation(
        variable=MALE, mapping={0: 'female', 1: 'male'}
    )

    ga_segmentation = database.generate_segmentation(
        variable=GA, mapping={0: 'without_ga', 1: 'with_ga'}
    )

    segmentations_for_asc = [
        gender_segmentation,
        ga_segmentation,
    ]

.. GENERATED FROM PYTHON SOURCE LINES 77-78

Segmentation of the constants.

.. GENERATED FROM PYTHON SOURCE LINES 78-83

.. code-block:: Python

    asc_train_segmentation = Segmentation(asc_train, segmentations_for_asc)
    segmented_asc_train = asc_train_segmentation.segmented_beta()
    asc_car_segmentation = Segmentation(asc_car, segmentations_for_asc)
    segmented_asc_car = asc_car_segmentation.segmented_beta()

.. GENERATED FROM PYTHON SOURCE LINES 84-85

Definition of the utility functions.

.. GENERATED FROM PYTHON SOURCE LINES 85-103

.. code-block:: Python

    terms1 = [
        LinearTermTuple(beta=b_time, x=TRAIN_TT_SCALED),
        LinearTermTuple(beta=b_cost, x=TRAIN_COST_SCALED),
    ]
    v_train = segmented_asc_train + LinearUtility(terms1)

    terms2 = [
        LinearTermTuple(beta=b_time, x=SM_TT_SCALED),
        LinearTermTuple(beta=b_cost, x=SM_COST_SCALED),
    ]
    v_swissmetro = LinearUtility(terms2)

    terms3 = [
        LinearTermTuple(beta=b_time, x=CAR_TT_SCALED),
        LinearTermTuple(beta=b_cost, x=CAR_CO_SCALED),
    ]
    v_car = segmented_asc_car + LinearUtility(terms3)

.. GENERATED FROM PYTHON SOURCE LINES 104-105

Associate utility functions with the numbering of alternatives.

.. GENERATED FROM PYTHON SOURCE LINES 105-107

.. code-block:: Python

    v = {1: v_train, 2: v_swissmetro, 3: v_car}

.. GENERATED FROM PYTHON SOURCE LINES 108-109

Associate the availability conditions with the alternatives.
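The segmented constants and LinearUtility expressions defined above have a simple numeric meaning: a segmented Beta is the reference parameter plus one offset per non-reference segment the observation belongs to, and a linear utility is the sum of beta times attribute over its terms. The following plain-Python sketch illustrates this; every name and numeric value in it is made up for illustration and is not part of Biogeme's API.

```python
# Illustrative sketch (plain Python, not Biogeme): what a segmented constant
# and a linear-in-parameters utility evaluate to for one observation.


def segmented_constant(ref, offsets, observation):
    """Reference value plus the offsets of the segments the observation is in."""
    return ref + sum(
        offset
        for (variable, level), offset in offsets.items()
        if observation[variable] == level
    )


def linear_utility(terms):
    """Sum of beta * x over all (beta, x) terms, as in a linear utility."""
    return sum(beta * x for beta, x in terms)


# A male traveller without a GA pass; only the 'male' offset applies.
asc_train_offsets = {('MALE', 1): 0.5, ('GA', 1): 1.9}
observation = {'MALE': 1, 'GA': 0}

v_train_value = segmented_constant(-0.53, asc_train_offsets, observation) + linear_utility(
    [(-1.2, 1.0), (-1.1, 0.5)]  # (b_time, travel time), (b_cost, cost)
)
print(round(v_train_value, 2))  # -1.78
```

Biogeme builds the same structure symbolically, so the offsets are estimated jointly with the other parameters rather than fixed as here.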
.. GENERATED FROM PYTHON SOURCE LINES 109-111

.. code-block:: Python

    av = {1: TRAIN_AV_SP, 2: SM_AV, 3: CAR_AV_SP}

.. GENERATED FROM PYTHON SOURCE LINES 112-116

Definition of the model. This is the contribution of each observation to the
log likelihood function.

.. GENERATED FROM PYTHON SOURCE LINES 116-118

.. code-block:: Python

    logprob = loglogit(v, av, CHOICE)

.. GENERATED FROM PYTHON SOURCE LINES 119-122

User notes. These notes will be included as such in the report file.

.. GENERATED FROM PYTHON SOURCE LINES 122-130

.. code-block:: Python

    USER_NOTES = (
        'Example of a logit model with three alternatives: Train, Car and'
        ' Swissmetro. Same as 01logit and '
        'introducing some options and features. In particular, bioLinearUtility,'
        ' and automatic segmentation of parameters.'
    )

.. GENERATED FROM PYTHON SOURCE LINES 131-135

Create the Biogeme object. We include the user notes, and we ask Biogeme not to
calculate the second derivatives. The parameter 'calculating_second_derivatives'
is a general instruction for Biogeme: in this case, the second derivatives are
not calculated even after the algorithm has converged, so the statistics have to
rely on the bootstrap or on the BHHH approximation.

.. GENERATED FROM PYTHON SOURCE LINES 135-144

.. code-block:: Python

    the_biogeme = BIOGEME(
        database,
        logprob,
        user_notes=USER_NOTES,
        save_iterations=False,
        bootstrap_samples=100,
        calculating_second_derivatives='never',
    )

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    Biogeme parameters read from biogeme.toml.

.. GENERATED FROM PYTHON SOURCE LINES 145-149

Calculate the null log likelihood for reporting. As we have used starting
values different from 0, the initial model is not the equal probability model.

.. GENERATED FROM PYTHON SOURCE LINES 149-152

.. code-block:: Python

    the_biogeme.calculate_null_loglikelihood(av)
    the_biogeme.model_name = 'b01logit_bis'

.. GENERATED FROM PYTHON SOURCE LINES 153-155

Estimate the parameters.
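The estimation maximizes the sum over observations of the loglogit contributions defined earlier: the log of the logit probability of the chosen alternative, where the denominator runs only over the available alternatives. A minimal self-contained sketch of that formula, with illustrative utility values (this is plain Python, not Biogeme code):

```python
import math


def loglogit_contribution(v, av, choice):
    """log P(choice) = V_choice - log( sum of exp(V_j) over available j )."""
    log_denominator = math.log(
        sum(math.exp(v_j) for j, v_j in v.items() if av[j])
    )
    return v[choice] - log_denominator


# Illustrative utilities for Train, Swissmetro, Car; car is unavailable here,
# so it is excluded from the denominator.
v_example = {1: -1.78, 2: -0.5, 3: -1.2}
av_example = {1: 1, 2: 1, 3: 0}
print(round(loglogit_contribution(v_example, av_example, 2), 4))
```

If the chosen alternative is the only available one, the contribution is 0 (probability 1), which is why availability conditions matter for the value of the log likelihood.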
.. GENERATED FROM PYTHON SOURCE LINES 155-157

.. code-block:: Python

    results = the_biogeme.estimate(run_bootstrap=True)

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    Starting values for the algorithm: {}
    As the model is rather complex, we cancel the calculation of second derivatives. If you want to control the parameters, change the algorithm from "automatic" to "simple_bounds" in the TOML file.
    Optimization algorithm: hybrid Newton/BFGS with simple bounds [simple_bounds]
    ** Optimization: BFGS with trust region for simple bounds
    Iter.  asc_train_ref  asc_train_diff_  asc_train_diff_  b_time  b_cost  asc_car_ref  asc_car_diff_ma  asc_car_diff_wi  Function  Relgrad  Radius  Rho
      0  0  0  0  -1.3  -1.1  0  0  0  5.5e+03  0.12  0.5  -0.69  -
      1  0  0  0  -1.3  -1.1  0  0  0  5.5e+03  0.12  0.25  -0.051  -
      2  -0.25  -0.25  0.25  -1.5  -0.83  0.25  0.25  -0.25  5.3e+03  0.079  0.25  0.41  +
      3  -0.2  -0.5  0.5  -1.8  -0.58  0  0  -0.5  5.3e+03  0.1  0.25  0.2  +
      4  -0.13  -0.55  0.67  -1.5  -0.8  0.13  0.17  -0.53  5.1e+03  0.041  0.25  0.9  +
      5  -0.29  -0.8  0.92  -1.6  -1.1  -0.12  -0.077  -0.78  5.1e+03  0.069  0.25  0.49  +
      6  -0.21  -0.84  1.1  -1.3  -1.1  -0.0014  0.11  -0.78  5e+03  0.041  0.25  0.46  +
      7  -0.24  -0.99  1.4  -1.3  -1  -0.23  0.026  -0.78  5e+03  0.022  0.25  0.69  +
      8  -0.44  -1.2  1.6  -1.3  -1  -0.23  0.2  -0.76  5e+03  0.024  0.25  0.17  +
      9  -0.44  -1.2  1.6  -1.3  -1  -0.23  0.2  -0.76  5e+03  0.024  0.12  -0.21  -
     10  -0.32  -1.2  1.7  -1.2  -1  -0.34  0.16  -0.76  5e+03  0.01  0.12  0.7  +
     11  -0.44  -1.2  1.8  -1.2  -1.1  -0.46  0.23  -0.72  5e+03  0.013  0.12  0.54  +
     12  -0.44  -1.2  1.9  -1.1  -1.1  -0.5  0.34  -0.68  4.9e+03  0.013  0.12  0.18  +
     13  -0.56  -1.1  1.9  -1.2  -1  -0.63  0.45  -0.55  4.9e+03  0.0059  0.12  0.41  +
     14  -0.56  -1.1  1.9  -1.2  -1  -0.63  0.45  -0.55  4.9e+03  0.0059  0.062  0.0044  -
     15  -0.54  -1.1  1.9  -1.2  -1.1  -0.64  0.43  -0.55  4.9e+03  0.0028  0.062  0.75  +
     16  -0.54  -1.1  1.9  -1.2  -1.1  -0.64  0.43  -0.55  4.9e+03  0.0028  0.031  -2.8  -
     17  -0.54  -1.1  1.9  -1.2  -1.1  -0.64  0.43  -0.55  4.9e+03  0.0028  0.016  -0.29  -
     18  -0.55  -1.1  1.9  -1.2  -1.1  -0.63  0.44  -0.54  4.9e+03  0.0021  0.016  0.44  +
     19  -0.54  -1.1  1.9  -1.2  -1.1  -0.63  0.42  -0.53  4.9e+03  0.0021  0.016  0.46  +
     20  -0.54  -1.1  1.9  -1.2  -1.1  -0.62  0.42  -0.51  4.9e+03  0.00047  0.016  0.83  +
     21  -0.53  -1.1  1.9  -1.2  -1.1  -0.62  0.42  -0.5  4.9e+03  0.0014  0.016  0.42  +
     22  -0.54  -1.1  1.9  -1.2  -1.1  -0.61  0.41  -0.48  4.9e+03  0.00067  0.016  0.64  +
     23  -0.53  -1.1  1.9  -1.2  -1.1  -0.61  0.41  -0.46  4.9e+03  0.00032  0.016  0.66  +
     24  -0.53  -1.1  1.9  -1.2  -1.1  -0.61  0.41  -0.45  4.9e+03  0.00047  0.016  0.59  +
     25  -0.53  -1.1  1.9  -1.2  -1.1  -0.61  0.41  -0.43  4.9e+03  0.00049  0.016  0.43  +
     26  -0.53  -1.1  1.9  -1.2  -1.1  -0.61  0.41  -0.43  4.9e+03  0.00049  0.0078  -2.6  -
     27  -0.53  -1.1  1.9  -1.2  -1.1  -0.61  0.41  -0.43  4.9e+03  0.00022  0.0078  0.5  +
     28  -0.53  -1.1  1.9  -1.2  -1.1  -0.61  0.41  -0.43  4.9e+03  0.00022  0.0039  -2.3  -
     29  -0.53  -1.1  1.9  -1.2  -1.1  -0.61  0.41  -0.43  4.9e+03  0.00022  0.002  -1.1  -
     30  -0.53  -1.1  1.9  -1.2  -1.1  -0.61  0.41  -0.43  4.9e+03  0.00022  0.00098  0.064  -
     31  -0.53  -1.1  1.9  -1.2  -1.1  -0.61  0.41  -0.43  4.9e+03  0.00017  0.00098  0.45  +
     32  -0.53  -1.1  1.9  -1.2  -1.1  -0.61  0.41  -0.43  4.9e+03  7.4e-05  0.00098  0.84  +
     33  -0.53  -1.1  1.9  -1.2  -1.1  -0.61  0.41  -0.43  4.9e+03  6.8e-05  0.00098  0.83  +
     34  -0.53  -1.1  1.9  -1.2  -1.1  -0.61  0.41  -0.43  4.9e+03  6.2e-05  0.00098  0.87  +
     35  -0.53  -1.1  1.9  -1.2  -1.1  -0.61  0.41  -0.43  4.9e+03  5.4e-05  0.00098  0.89  +
     36  -0.53  -1.1  1.9  -1.2  -1.1  -0.61  0.41  -0.42  4.9e+03  5.3e-05  0.00098  0.89  +
     37  -0.53  -1.1  1.9  -1.2  -1.1  -0.61  0.41  -0.42  4.9e+03  4.5e-05  0.00098  0.89  +
     38  -0.53  -1.1  1.9  -1.2  -1.1  -0.61  0.41  -0.42  4.9e+03  4.3e-05  0.00098  0.89  +
     39  -0.53  -1.1  1.9  -1.2  -1.1  -0.61  0.41  -0.42  4.9e+03  3.6e-05  0.00098  0.89  +
     40  -0.53  -1.1  1.9  -1.2  -1.1  -0.61  0.41  -0.42  4.9e+03  3.4e-05  0.00098  0.9  +
     41  -0.53  -1.1  1.9  -1.2  -1.1  -0.61  0.41  -0.42  4.9e+03  5.7e-05  0.00098  0.88  +
     42  -0.53  -1.1  1.9  -1.2  -1.1  -0.61  0.41  -0.42  4.9e+03  2.4e-05  0.0098  0.94  ++
     43  -0.53  -1.1  1.9  -1.2  -1.1  -0.61  0.41  -0.42  4.9e+03  2.4e-05  0.0033  -9.7  -
     44  -0.53  -1.1  1.9  -1.2  -1.1  -0.61  0.41  -0.42  4.9e+03  2e-05  0.0033  0.88  +
     45  -0.53  -1.1  1.9  -1.2  -1.1  -0.61  0.41  -0.42  4.9e+03  2e-05  0.0017  -35  -
     46  -0.53  -1.1  1.9  -1.2  -1.1  -0.61  0.41  -0.42  4.9e+03  2e-05  0.00084  -3.5  -
     47  -0.53  -1.1  1.9  -1.2  -1.1  -0.61  0.41  -0.42  4.9e+03  3.8e-06  0.00084  0.86  -
    Optimization algorithm has converged.
    Relative gradient: 3.827951698290516e-06
    Cause of termination: Relative gradient = 3.8e-06 <= 6.1e-06
    Number of function evaluations: 119
    Number of gradient evaluations: 71
    Number of hessian evaluations: 0
    Algorithm: BFGS with trust region for simple bound constraints
    Number of iterations: 48
    Proportion of Hessian calculation: 0/35 = 0.0%
    Optimization time: 0:00:00.269941
    Calculate BHHH
    Re-estimate the model 100 times for bootstrapping
    Bootstraps:   0%|          | 0/100 [00:00

.. container:: sphx-glr-download sphx-glr-download-python

    :download:`Download Python source code: plot_b01logit_bis.py `

.. container:: sphx-glr-download sphx-glr-download-zip

    :download:`Download zipped: plot_b01logit_bis.zip `

.. only:: html

    .. rst-class:: sphx-glr-signature

    `Gallery generated by Sphinx-Gallery `_