biogeme.cnl

Example of usage of the cnl module. This is for programmers who need examples of how to call the functions of the module. The examples are designed to illustrate the syntax.

Michel Bierlaire Sun Jun 29 2025, 02:25:16

import numpy as np
import pandas as pd
from IPython.core.display_functions import display

import biogeme.biogeme_logging as blog
from biogeme.cnl import cnl_cdf, cnl_g
from biogeme.nests import NestsForCrossNestedLogit, OneNestForCrossNestedLogit
from biogeme.tools import CheckDerivativesResults, check_derivatives

logger = blog.get_screen_logger(level=blog.INFO)
logger.info('Logging on')
Logging on

Definition of the nests.

choice_set = [1, 2, 3, 4]
mu_nest_1 = 1.4
alphas_1 = {1: 1, 2: 0.5, 3: 0.2}
nest_1 = OneNestForCrossNestedLogit(
    nest_param=mu_nest_1, dict_of_alpha=alphas_1, name='Nest 1'
)
mu_nest_2 = 1.2
alphas_2 = {2: 0.5, 3: 0.8, 4: 1}
nest_2 = OneNestForCrossNestedLogit(
    nest_param=mu_nest_2, dict_of_alpha=alphas_2, name='Nest 2'
)
nests = NestsForCrossNestedLogit(choice_set=choice_set, tuple_of_nests=(nest_1, nest_2))
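As a quick sanity check (an illustrative convention, not something enforced by the code above), the alpha coefficients of each alternative are often normalized to sum to one across nests. A minimal sketch using the dictionaries defined above:

```python
# Membership degrees from the two nests defined above.
alphas_1 = {1: 1, 2: 0.5, 3: 0.2}
alphas_2 = {2: 0.5, 3: 0.8, 4: 1}
choice_set = [1, 2, 3, 4]

# Each alternative's alphas, summed across all nests, equal one here.
totals = {i: alphas_1.get(i, 0) + alphas_2.get(i, 0) for i in choice_set}
print(totals)
```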

We retrieve the G function of the cross-nested logit and numerically verify the implementation of its derivatives.

G = cnl_g(choice_set, nests)

Draw a random point at which to evaluate the function.

y = np.random.uniform(low=0.01, high=2, size=4)
display(y)
[0.54542382 1.83623941 1.60018716 0.67859271]
check_results: CheckDerivativesResults = check_derivatives(G, y, names=None, logg=True)
Comparing first derivatives
x       Gradient    FinDiff    Difference
x[0]    0.696448   0.696448   2.32375e-08
x[1]    0.841059   0.841059   1.2523e-08
x[2]    0.817379   0.817379   1.34338e-08
x[3]    0.776014   0.776014   4.60067e-09
Comparing second derivatives
Row    Col       Hessian     FinDiff    Difference
x[0]   x[0]    0.36677     0.36677     7.15931e-09
x[1]   x[0]   -0.0886667  -0.0886667  -3.00136e-09
x[2]   x[0]   -0.0232671  -0.0232671  -3.53736e-10
x[3]   x[0]    0           0           0
x[0]   x[1]   -0.0886667  -0.0886667   4.37844e-10
x[1]   x[1]    0.0696261   0.0696261   2.65529e-09
x[2]   x[1]   -0.0384244  -0.0384244   5.77822e-10
x[3]   x[1]   -0.02653    -0.02653     1.27579e-09
x[0]   x[2]   -0.0232671  -0.0232671   2.04933e-09
x[1]   x[2]   -0.0384244  -0.0384244  -3.47211e-09
x[2]   x[2]    0.0712615   0.0712615   4.14298e-09
x[3]   x[2]   -0.0453658  -0.0453658  -5.91513e-10
x[0]   x[3]    0           0           0
x[1]   x[3]   -0.02653    -0.02653    -4.44506e-09
x[2]   x[3]   -0.0453658  -0.0453659   3.23061e-09
x[3]   x[3]    0.178766    0.178766    2.79715e-09
print(f'f = {check_results.function}')
f = 3.7588020261450286
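The value just printed can be reproduced with the textbook form of the cross-nested logit generating function, G(y) = sum_m (sum_i (alpha_im * y_i)^mu_m)^(1/mu_m). A minimal numpy sketch, assuming that form (biogeme's internal implementation may differ) and re-indexing the alternatives from 0:

```python
import numpy as np


def cnl_g_sketch(y, nests):
    """Textbook CNL generating function:
    sum over nests m of (sum over i of (alpha_im * y_i)^mu_m)^(1/mu_m)."""
    total = 0.0
    for mu_m, alphas in nests:
        inner = sum((alpha * y[i]) ** mu_m for i, alpha in alphas.items())
        total += inner ** (1.0 / mu_m)
    return total


# Same nests as above, with alternatives 1..4 re-indexed as 0..3.
nests_sketch = [(1.4, {0: 1, 1: 0.5, 2: 0.2}), (1.2, {1: 0.5, 2: 0.8, 3: 1})]
y_point = np.array([0.54542382, 1.83623941, 1.60018716, 0.67859271])
print(cnl_g_sketch(y_point, nests_sketch))  # approx. 3.7588, as printed above
```

With this form, G is homogeneous of degree one: scaling y by a constant scales G by the same constant.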

We display the differences between the entries of the analytical gradient and the finite-difference gradient.

display(pd.DataFrame(check_results.errors_gradient))
              0
0  2.323753e-08
1  1.252299e-08
2  1.343376e-08
3  4.600673e-09

We display the differences between the entries of the analytical Hessian and the finite-difference Hessian.

display(pd.DataFrame(check_results.errors_hessian))
              0             1             2             3
0  7.159312e-09  4.378444e-10  2.049330e-09  0.000000e+00
1 -3.001357e-09  2.655290e-09 -3.472115e-09 -4.445062e-09
2 -3.537358e-10  5.778219e-10  4.142982e-09  3.230609e-09
3  0.000000e+00  1.275795e-09 -5.915128e-10  2.797153e-09
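The idea behind check_derivatives is to compare analytical derivatives with finite-difference approximations. A simplified illustration with a generic central-difference gradient (a sketch of the technique; biogeme's actual routine may use a different scheme):

```python
import numpy as np


def finite_difference_gradient(f, x, eps=1e-6):
    """Central-difference approximation of the gradient of f at x."""
    grad = np.zeros_like(x, dtype=float)
    for i in range(x.size):
        step = np.zeros_like(x, dtype=float)
        step[i] = eps
        grad[i] = (f(x + step) - f(x - step)) / (2 * eps)
    return grad


# Quick check on a function with a known gradient: f(x) = sum(x**2), grad = 2x.
x0 = np.array([1.0, 2.0, 3.0])
approx = finite_difference_gradient(lambda x: np.sum(x**2), x0)
print(approx)  # close to [2. 4. 6.]
```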

We do the same for the CDF, evaluated at the same point y.

F = cnl_cdf(choice_set, nests)
check_cdf_results: CheckDerivativesResults = check_derivatives(
    F, y, names=None, logg=True
)
Comparing first derivatives
x       Gradient    FinDiff    Difference
x[0]   0.158806   0.158806    2.57403e-11
x[1]   0.0245452  0.0245452   2.65566e-10
x[2]   0.038037   0.038037    2.93833e-09
x[3]   0.13519    0.13519     1.06616e-09
Comparing second derivatives
Row    Col        Hessian      FinDiff    Difference
x[0]   x[0]   -0.0739516   -0.0739516    1.45726e-09
x[1]   x[0]    0.0175312    0.0175312   -3.33241e-10
x[2]   x[0]    0.0229348    0.0229348    3.42736e-10
x[3]   x[0]    0.0765265    0.0765265   -1.33881e-09
x[0]   x[1]    0.0175312    0.0175312    9.50175e-10
x[1]   x[1]   -0.028822    -0.028822    -2.26529e-10
x[2]   x[1]    0.00396066   0.00396066  -1.09097e-10
x[3]   x[1]    0.0139825    0.0139825   -9.12009e-10
x[0]   x[2]    0.0229348    0.0229348    2.6099e-09
x[1]   x[2]    0.00396066   0.00396066  -3.63372e-11
x[2]   x[2]   -0.039943    -0.039943     3.47845e-10
x[3]   x[2]    0.0233566    0.0233566    7.53713e-10
x[0]   x[3]    0.0765265    0.0765265    5.23832e-10
x[1]   x[3]    0.0139825    0.0139825   -1.61359e-10
x[2]   x[3]    0.0233566    0.0233566    3.61726e-10
x[3]   x[3]   -0.0772253   -0.0772253    2.03382e-09
print(f'f = {check_cdf_results.function}')
f = 0.28054306314744876
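The printed CDF value can likewise be reproduced through the MEV relation F(xi) = exp(-G(exp(-xi_1), ..., exp(-xi_J))), applied at the same point y used above. A sketch under the same assumed textbook form of G:

```python
import numpy as np


def cnl_g_sketch(y, nests):
    """Textbook CNL generating function (same sketch as above)."""
    total = 0.0
    for mu_m, alphas in nests:
        inner = sum((alpha * y[i]) ** mu_m for i, alpha in alphas.items())
        total += inner ** (1.0 / mu_m)
    return total


def cnl_cdf_sketch(xi, nests):
    """MEV cumulative distribution: F(xi) = exp(-G(exp(-xi)))."""
    return np.exp(-cnl_g_sketch(np.exp(-np.asarray(xi)), nests))


nests_sketch = [(1.4, {0: 1, 1: 0.5, 2: 0.2}), (1.2, {1: 0.5, 2: 0.8, 3: 1})]
point = np.array([0.54542382, 1.83623941, 1.60018716, 0.67859271])
print(cnl_cdf_sketch(point, nests_sketch))  # approx. 0.2805, as printed above
```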

We display the differences between the entries of the analytical gradient and the finite-difference gradient.

display(pd.DataFrame(check_cdf_results.errors_gradient))
              0
0  2.574033e-11
1  2.655664e-10
2  2.938331e-09
3  1.066158e-09

We display the differences between the entries of the analytical Hessian and the finite-difference Hessian.

display(pd.DataFrame(check_cdf_results.errors_hessian))
              0             1             2             3
0  1.457265e-09  9.501754e-10  2.609898e-09  5.238321e-10
1 -3.332406e-10 -2.265286e-10 -3.633718e-11 -1.613593e-10
2  3.427359e-10 -1.090969e-10  3.478453e-10  3.617260e-10
3 -1.338813e-09 -9.120088e-10  7.537131e-10  2.033817e-09

Total running time of the script: (0 minutes 0.116 seconds)

Gallery generated by Sphinx-Gallery