biogeme.cnl
Example of usage of the cnl module. It is intended for programmers who need examples of how to call the functions of the module; the examples are designed to illustrate the syntax.
Michel Bierlaire, Sun Jun 29 2025, 02:25:16
import numpy as np
import pandas as pd
from IPython.core.display_functions import display
import biogeme.biogeme_logging as blog
from biogeme.cnl import cnl_cdf, cnl_g
from biogeme.nests import NestsForCrossNestedLogit, OneNestForCrossNestedLogit
from biogeme.tools import CheckDerivativesResults, check_derivatives
logger = blog.get_screen_logger(level=blog.INFO)
logger.info('Logging on')
Logging on
Definition of the nests.
choice_set = [1, 2, 3, 4]
mu_nest_1 = 1.4
alphas_1 = {1: 1, 2: 0.5, 3: 0.2}
nest_1 = OneNestForCrossNestedLogit(
    nest_param=mu_nest_1, dict_of_alpha=alphas_1, name='Nest 1'
)
mu_nest_2 = 1.2
alphas_2 = {2: 0.5, 3: 0.8, 4: 1}
nest_2 = OneNestForCrossNestedLogit(
    nest_param=mu_nest_2, dict_of_alpha=alphas_2, name='Nest 2'
)
nests = NestsForCrossNestedLogit(choice_set=choice_set, tuple_of_nests=(nest_1, nest_2))
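In a cross-nested logit, each alternative may belong to several nests, and its degrees of membership are given by the alpha coefficients. A common normalization requires the alphas of each alternative to sum to one across nests. Here is a minimal sketch that checks this property directly on the dictionaries defined above; the helper name membership_totals is illustrative and not part of Biogeme.
from collections import defaultdict

def membership_totals(list_of_alphas: list[dict[int, float]]) -> dict[int, float]:
    """Sum the membership degrees of each alternative across all nests."""
    totals: dict[int, float] = defaultdict(float)
    for alphas in list_of_alphas:
        for alternative, alpha in alphas.items():
            totals[alternative] += alpha
    return dict(totals)

# With the nests defined above, each alternative has a total membership of one.
print(membership_totals([alphas_1, alphas_2]))
# {1: 1.0, 2: 1.0, 3: 1.0, 4: 1.0}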
We retrieve the G function of the cross-nested logit and verify the implementation of its derivatives numerically.
G = cnl_g(choice_set, nests)
Draw a random point at which to evaluate the function.
y = np.random.uniform(low=0.01, high=2, size=4)
display(y)
[0.19753995 0.73907967 0.36701264 1.22185386]
check_results: CheckDerivativesResults = check_derivatives(G, y, names=None, logg=True)
Comparing first derivatives
x Gradient FinDiff Difference
x[0] 0.690605 0.690605 -9.93706e-09
x[1] 0.814957 0.814957 -3.08951e-08
x[2] 0.660398 0.660398 -3.30686e-09
x[3] 0.943367 0.943367 -1.49137e-08
Comparing second derivatives
Row Col Hessian FinDiff Difference
x[0] x[0] 1.01564 1.01564 5.6708e-08
x[1] x[0] -0.245874 -0.245874 3.91355e-09
x[2] x[0] -0.0515219 -0.0515219 -1.32262e-09
x[3] x[0] 0 0 0
x[0] x[1] -0.245874 -0.245874 3.91355e-09
x[1] x[1] 0.165775 0.165775 1.01816e-08
x[2] x[1] -0.0588651 -0.0588651 -1.30608e-09
x[3] x[1] -0.0428424 -0.0428424 4.00143e-09
x[0] x[2] -0.0515219 -0.0515219 6.12796e-09
x[1] x[2] -0.0588651 -0.0588652 6.1445e-09
x[2] x[2] 0.364221 0.364221 1.37493e-08
x[3] x[2] -0.065466 -0.065466 1.86985e-10
x[0] x[3] 0 0 0
x[1] x[3] -0.0428424 -0.0428424 -4.82258e-10
x[2] x[3] -0.065466 -0.065466 5.66927e-09
x[3] x[3] 0.0455789 0.0455789 5.95495e-09
print(f'f = {check_results.function}')
f = 2.1337717888484193
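For reference, the value reported above can be reproduced directly from the closed-form generating function of the cross-nested logit, G(y) = sum over nests m of ( sum over alternatives i in nest m of (alpha_im * y_i)^mu_m )^(1 / mu_m). The sketch below evaluates this formula with the raw parameters defined earlier; the function cnl_generating_function is illustrative and not part of the biogeme.cnl API.
def cnl_generating_function(y: np.ndarray, nest_parameters) -> float:
    """Evaluate G(y) = sum_m (sum_i (alpha_im * y_i)**mu_m) ** (1 / mu_m).

    Alternative i is stored at index i - 1 of the array y.
    """
    total = 0.0
    for mu_m, alphas in nest_parameters:
        inner = sum((alpha * y[i - 1]) ** mu_m for i, alpha in alphas.items())
        total += inner ** (1.0 / mu_m)
    return total

# Reproduces check_results.function at the point drawn above.
print(cnl_generating_function(y, [(mu_nest_1, alphas_1), (mu_nest_2, alphas_2)]))
# Approximately 2.1338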
We display the differences between the entries of the analytical gradient and the finite-difference gradient.
display(pd.DataFrame(check_results.errors_gradient))
0
0 -9.937058e-09
1 -3.089514e-08
2 -3.306858e-09
3 -1.491373e-08
We display the differences between the entries of the analytical Hessian and the finite-difference Hessian.
display(pd.DataFrame(check_results.errors_hessian))
0 1 2 3
0 5.670798e-08 3.913554e-09 6.127961e-09 0.000000e+00
1 3.913554e-09 1.018161e-08 6.144497e-09 -4.822576e-10
2 -1.322620e-09 -1.306083e-09 1.374933e-08 5.669273e-09
3 0.000000e+00 4.001432e-09 1.869853e-10 5.954947e-09
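The FinDiff column and the error matrices above come from comparing the analytical derivatives with finite-difference approximations. The sketch below shows a plain forward-difference gradient; the step size and the exact scheme used internally by check_derivatives are assumptions here.
def forward_difference_gradient(fct, x: np.ndarray, eps: float = 1e-7) -> np.ndarray:
    """Approximate the gradient of a scalar function by forward differences."""
    f0 = fct(x)
    gradient = np.zeros_like(x)
    for i in range(x.size):
        x_step = x.copy()
        x_step[i] += eps
        gradient[i] = (fct(x_step) - f0) / eps
    return gradient

# Applied to the sketch of the generating function defined above, the result is
# close to the Gradient column reported by check_derivatives.
print(
    forward_difference_gradient(
        lambda z: cnl_generating_function(
            z, [(mu_nest_1, alphas_1), (mu_nest_2, alphas_2)]
        ),
        y,
    )
)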
We do the same for the CDF.
xi = np.random.uniform(low=-10, high=10, size=4)
display(xi)
[ 1.74122715 -2.82286266 -3.94238629 -7.03782993]
F = cnl_cdf(choice_set, nests)
check_cdf_results: CheckDerivativesResults = check_derivatives(
    F, y, names=None, logg=True
)
Comparing first derivatives
x Gradient FinDiff Difference
x[0] 0.116469 0.116469 1.2969e-09
x[1] 0.0483297 0.0483297 -2.25068e-09
x[2] 0.0856095 0.0856096 -2.09333e-09
x[3] 0.0355927 0.0355927 1.59681e-09
Comparing second derivatives
Row Col Hessian FinDiff Difference
x[0] x[0] -0.0366193 -0.0366193 -1.49151e-10
x[1] x[0] 0.0436906 0.0436906 2.29011e-11
x[2] x[0] 0.068831 0.068831 4.46004e-10
x[3] x[0] 0.0273434 0.0273434 4.60708e-10
x[0] x[1] 0.0436906 0.0436906 -1.37408e-09
x[1] x[1] -0.0444609 -0.0444609 -8.36803e-10
x[2] x[1] 0.0308524 0.0308524 -6.5811e-10
x[3] x[1] 0.0127604 0.0127604 -5.08305e-10
x[0] x[2] 0.068831 0.068831 -1.41664e-09
x[1] x[2] 0.0308524 0.0308524 -6.5811e-10
x[2] x[2] -0.0477764 -0.0477764 -6.17964e-10
x[3] x[2] 0.0239828 0.0239828 -1.10104e-09
x[0] x[3] 0.0273434 0.0273434 1.32592e-09
x[1] x[3] 0.0127604 0.0127604 5.98229e-10
x[2] x[3] 0.0239828 0.0239828 4.93963e-10
x[3] x[3] -0.032535 -0.032535 2.33081e-10
print(f'f = {check_cdf_results.function}')
f = 0.15160689886687484
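This value can also be checked against the standard GEV relation F(x) = exp(-G(exp(-x_1), ..., exp(-x_J))), evaluated at the same point y that was passed to check_derivatives. The check below reuses the sketch of the generating function defined earlier and is illustrative only.
# The GEV relation, evaluated with the sketch of the generating function above,
# reproduces check_cdf_results.function.
print(
    np.exp(
        -cnl_generating_function(
            np.exp(-y), [(mu_nest_1, alphas_1), (mu_nest_2, alphas_2)]
        )
    )
)
# Approximately 0.1516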
We display the differences between the entries of the analytical gradient and the finite-difference gradient.
display(pd.DataFrame(check_cdf_results.errors_gradient))
0
0 1.296902e-09
1 -2.250682e-09
2 -2.093329e-09
3 1.596814e-09
We display the differences between the entries of the analytical Hessian and the finite-difference Hessian.
display(pd.DataFrame(check_cdf_results.errors_hessian))
0 1 2 3
0 -1.491512e-10 -1.374083e-09 -1.416641e-09 1.325916e-09
1 2.290107e-11 -8.368034e-10 -6.581100e-10 5.982295e-10
2 4.460037e-10 -6.581100e-10 -6.179639e-10 4.939635e-10
3 4.607077e-10 -5.083047e-10 -1.101036e-09 2.330809e-10
Total running time of the script: (0 minutes 0.122 seconds)