Using simulations to understand when heuristics-driven violations of Random Utility Model assumptions are a problem

Julian Sagebiel

Introduction

  • Often we assume a data generating process that aligns with the standard models we use for analysis

    • Random utility model with iid Gumbel errors

    • Unobserved preference heterogeneity

  • Still, we know that the idea of purely RUM-consistent respondents is quite unrealistic

Should we still use RUM?

  • There are good reasons to use RUM

    • It is convenient

    • It may be a sufficiently good approximation

    • It allows welfare calculations
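
For instance, with a linear utility that contains a cost (or price) attribute, the marginal willingness to pay for attribute \(k\) is the familiar coefficient ratio; this is a standard result, stated here only as a reminder:

\[ WTP_k = -\frac{\beta_k}{\beta_{cost}} \]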

Brainstorming

Brainstorming: gather some ideas about potential violations of RUM-based models

RUM model

In RUM, we would assume the following utility function

\[ U = \beta X + \epsilon \]

  • Compensatory behavior: enough of one attribute can compensate for the absence of another attribute (marginal rate of substitution, MRS); see the sketch below
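
To make the compensatory RUM choice rule concrete, here is a minimal Python sketch (all coefficients, attribute ranges, and sample sizes are illustrative assumptions, not values from the experiment): utility is linear in the attributes, errors are iid Gumbel, and respondents pick the alternative with the highest total utility.

```python
import numpy as np

rng = np.random.default_rng(0)
beta = np.array([0.4, 0.2])                   # illustrative taste coefficients
X = rng.uniform(0, 5, size=(10_000, 2, 2))    # 10,000 choices, 2 alternatives, 2 attributes
U = X @ beta + rng.gumbel(size=(10_000, 2))   # U = beta*X + iid Gumbel error
choice = U.argmax(axis=1)                     # utility maximisation
mrs = beta[0] / beta[1]                       # MRS: 2 units of attribute 2 offset 1 unit of attribute 1
```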

Deviations from RUM

  • Attribute non-attendance (no MRS)

  • Elimination by aspects

  • Purely lexicographic preferences

  • Left-right bias

  • Loss aversion

Key question: How do RUM models perform if the data are not generated by a RUM?

  • Can we still stick with RUM, or do we need better models?

Data generating processes

  • A data generating process (DGP) is the way data are created.

  • Example: sunflower height = amount of sunlight + water + soil nutrients + random growth factor (see the sketch after this list)

  • We never know the true DGP; we can only approximate it

  • Any statistical model is based on a DGP
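
As a toy illustration, the sunflower DGP could be written as a few lines of numpy (the ranges, weights, and noise distribution are made up for the example):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500
sunlight = rng.uniform(4, 12, n)     # hours of sunlight per day (assumed range)
water = rng.uniform(0, 3, n)         # litres of water per day
nutrients = rng.uniform(0, 1, n)     # soil nutrient index
noise = rng.normal(0, 5, n)          # random growth factor
height = 8 * sunlight + 10 * water + 30 * nutrients + noise  # the assumed DGP
```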

DGP in Choice Experiments

  • A model is a strong simplification of a real phenomenon

  • A choice model is a strong simplification of how people make choices (i.e. the DGP)

  • Using a conditional logit model implies a precise definition of the DGP

  • The model is nothing more and nothing less than our assumptions about the DGP

Random utility

  • When we apply RUM, we make strong assumptions about the utility function and its specification

\[ U = \beta X + \epsilon \]

  • Estimating a RUM assumes that all respondents make their choices based only on this utility function

  • Estimating a RUM usually gives us some statistically significant results

  • If people do not behave according to RUM, our results may be biased.
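
For reference, a conditional logit of this form can be estimated by maximum likelihood in a few lines. The sketch below (the function name clogit_negll and the simulated data are illustrative, not part of the original slides) generates RUM-consistent choices and checks that the estimator recovers the coefficients:

```python
import numpy as np
from scipy.optimize import minimize

def clogit_negll(beta, X, y):
    """Negative log-likelihood of a conditional logit.
    X: (n_obs, n_alts, n_attrs); y: (n_obs,) index of the chosen alternative."""
    v = X @ beta                                # deterministic utilities V = beta*X
    v = v - v.max(axis=1, keepdims=True)        # guard against overflow
    logp = v - np.log(np.exp(v).sum(axis=1, keepdims=True))
    return -logp[np.arange(len(y)), y].sum()

# simulate RUM-consistent choices and check that the estimator recovers beta
rng = np.random.default_rng(3)
true_beta = np.array([0.3, 0.3, 0.3, 0.3])
X = rng.integers(0, 2, size=(5000, 2, 4)).astype(float)
y = (X @ true_beta + rng.gumbel(size=(5000, 2))).argmax(axis=1)
fit = minimize(clogit_negll, np.zeros(4), args=(X, y), method="BFGS")
print(np.round(fit.x, 2))                       # close to 0.3 for all four attributes
```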

Example: Feed Additives

  • The following model is estimated from the feed additives choice experiment

  • What does the regression table tell us about the DGP and potential deviations from RUM?

Regression results

                             Model 1
basc                         0.17 (0.04)***
bcow                         0.29 (0.02)***
badv                         0.32 (0.02)***
bvet                         0.32 (0.02)***
bfar                         0.27 (0.02)***
bmet                         0.18 (0.01)***
bbon                         0.37 (0.01)***
No. of observations          28800
No. of respondents           3600
Log likelihood (null)        -31640.03
Log likelihood (converged)   -20401.99

Question to all: If you are a reviewer and see these results, would you have any doubts?

Heuristics or not

  • To get a better idea, let's look at some choices

  • The following table shows choices from just a few respondents

  • Can you see anything in the data?

ID Choice_situation alt1_cow alt1_adv alt1_vet alt1_far alt1_met alt1_bon alt2_cow alt2_adv alt2_vet alt2_far alt2_met alt2_bon CHOICE
2 1 0 1 1 1 3 6 1 0 1 0 1 0 1
2162 1 0 1 1 1 3 6 1 0 1 0 1 0 1
2882 1 0 1 1 1 3 6 1 0 1 0 1 0 1
2 2 1 0 0 1 0 7 0 1 0 0 1 2 1
2162 2 1 0 0 1 0 7 0 1 0 0 1 2 3
2882 2 1 0 0 1 0 7 0 1 0 0 1 2 1
2 3 1 0 0 1 1 1 1 1 1 0 3 5 2
2162 3 1 0 0 1 1 1 1 1 1 0 3 5 2
2882 3 1 0 0 1 1 1 1 1 1 0 3 5 2
1 4 0 0 1 1 3 0 0 1 0 0 2 3 1
2161 4 0 0 1 1 3 0 0 1 0 0 2 3 1
2881 4 0 0 1 1 3 0 0 1 0 0 2 3 2
2 5 0 0 1 0 2 4 0 1 0 1 1 7 2
2162 5 0 0 1 0 2 4 0 1 0 1 1 7 1
2882 5 0 0 1 0 2 4 0 1 0 1 1 7 2
2 6 1 1 0 1 2 0 0 0 1 0 1 5 1
2162 6 1 1 0 1 2 0 0 0 1 0 1 5 2
2882 6 1 1 0 1 2 0 0 0 1 0 1 5 2
1 7 1 0 1 0 2 3 0 1 0 1 0 4 2
2161 7 1 0 1 0 2 3 0 1 0 1 0 4 1
2881 7 1 0 1 0 2 3 0 1 0 1 0 4 2
1 8 0 0 0 1 0 5 0 1 1 0 2 7 2
2161 8 0 0 0 1 0 5 0 1 1 0 2 7 2
2881 8 0 0 0 1 0 5 0 1 1 0 2 7 2
1 9 1 1 0 0 0 2 0 0 0 1 2 0 1
2161 9 1 1 0 0 0 2 0 0 0 1 2 0 1
2881 9 1 1 0 0 0 2 0 0 0 1 2 0 3
2 10 1 1 1 0 1 4 0 0 0 1 0 2 1
2162 10 1 1 1 0 1 4 0 0 0 1 0 2 1
2882 10 1 1 1 0 1 4 0 0 0 1 0 2 1
2 11 0 1 1 0 0 1 1 0 0 1 3 4 2
2162 11 0 1 1 0 0 1 1 0 0 1 3 4 2
2882 11 0 1 1 0 0 1 1 0 0 1 3 4 2
1 12 1 1 0 0 1 6 1 0 1 1 3 1 1
2161 12 1 1 0 0 1 6 1 0 1 1 3 1 1
2881 12 1 1 0 0 1 6 1 0 1 1 3 1 1
1 13 0 0 1 1 1 2 1 1 0 0 3 1 1
2161 13 0 0 1 1 1 2 1 1 0 0 3 1 1
2881 13 0 0 1 1 1 2 1 1 0 0 3 1 1
1 14 1 1 0 1 2 5 1 0 1 0 0 6 1
2161 14 1 1 0 1 2 5 1 0 1 0 0 6 1
2881 14 1 1 0 1 2 5 1 0 1 0 0 6 2
2 15 0 0 0 0 3 7 1 1 1 1 0 3 1
2162 15 0 0 0 0 3 7 1 1 1 1 0 3 2
2882 15 0 0 0 0 3 7 1 1 1 1 0 3 1
1 16 0 1 1 0 3 3 1 0 1 1 2 6 2
2161 16 0 1 1 0 3 3 1 0 1 1 2 6 1
2881 16 0 1 1 0 3 3 1 0 1 1 2 6 2

Simulated dataset

  • The data we just looked at are simulated data

  • Simulations give us full control over the data generating process (DGP)

  • The data were simulated using different decision heuristics

  • As the data are simulated, we know the true DGP and thus whether the model is correct or not

Full data

ID Choice_situation alt1_cow alt1_adv alt1_vet alt1_far alt1_met alt1_bon alt2_cow alt2_adv alt2_vet alt2_far alt2_met alt2_bon Block group V_1 V_2 V_3 e_1 e_2 e_3 U_1 U_2 U_3 CHOICE
1 4 0 0 1 1 3 0 0 1 0 0 2 3 1 1 1.5 1.8 0.2 1.2606303 -0.0001064 0.4438955 2.7606303 1.7998936 0.6438955 1
1 7 1 0 1 0 2 3 0 1 0 1 0 4 1 1 2.1 1.8 0.2 0.2663207 0.6080929 -1.1184641 2.3663207 2.4080929 -0.9184641 2
1 8 0 0 0 1 0 5 0 1 1 0 2 7 1 1 1.8 3.3 0.2 1.5130251 1.0353377 -0.1032120 3.3130251 4.3353377 0.0967880 2
1 9 1 1 0 0 0 2 0 0 0 1 2 0 1 1 1.2 0.9 0.2 1.0994573 -0.8836217 -1.0041691 2.2994573 0.0163783 -0.8041691 1
1 12 1 1 0 0 1 6 1 0 1 1 3 1 1 1 2.7 2.1 0.2 2.6612205 0.8030552 -0.6974163 5.3612205 2.9030552 -0.4974163 1
1 13 0 0 1 1 1 2 1 1 0 0 3 1 1 1 1.5 1.8 0.2 2.8531287 0.1904686 0.7794196 4.3531287 1.9904686 0.9794196 1
1 14 1 1 0 1 2 5 1 0 1 0 0 6 1 1 3.0 2.4 0.2 0.0742271 0.5877955 -0.7728690 3.0742271 2.9877955 -0.5728690 1
1 16 0 1 1 0 3 3 1 0 1 1 2 6 1 1 2.4 3.3 0.2 1.1556488 0.7101979 -0.1524561 3.5556488 4.0101979 0.0475439 2
2 1 0 1 1 1 3 6 1 0 1 0 1 0 2 1 3.6 0.9 0.2 0.8530360 0.0108760 0.8482966 4.4530360 0.9108760 1.0482966 1
2 2 1 0 0 1 0 7 0 1 0 0 1 2 2 1 2.7 1.2 0.2 1.0629435 1.4820326 -0.1926237 3.7629435 2.6820326 0.0073763 1
2 3 1 0 0 1 1 1 1 1 1 0 3 5 2 1 1.2 3.3 0.2 -0.7359799 -1.3557277 0.9440878 0.4640201 1.9442723 1.1440878 2
2 5 0 0 1 0 2 4 0 1 0 1 1 7 2 1 2.1 3.0 0.2 0.4510422 0.6350862 2.6084106 2.5510422 3.6350862 2.8084106 2
2 6 1 1 0 1 2 0 0 0 1 0 1 5 2 1 1.5 2.1 0.2 3.7227908 0.2203160 0.7968505 5.2227908 2.3203160 0.9968505 1
2 10 1 1 1 0 1 4 0 0 0 1 0 2 2 1 2.4 0.9 0.2 -0.1005443 -1.1628394 -0.3342054 2.2994557 -0.2628394 -0.1342054 1
2 11 0 1 1 0 0 1 1 0 0 1 3 4 2 1 0.9 2.7 0.2 0.6700349 1.6353508 1.3852486 1.5700349 4.3353508 1.5852486 2
2 15 0 0 0 0 3 7 1 1 1 1 0 3 2 1 3.0 2.1 0.2 -1.1075404 -0.4046616 1.1858642 1.8924596 1.6953384 1.3858642 1
3 4 0 0 1 1 3 0 0 1 0 0 2 3 1 1 1.5 1.8 0.2 -0.8678303 0.5826661 -0.2255671 0.6321697 2.3826661 -0.0255671 2
3 7 1 0 1 0 2 3 0 1 0 1 0 4 1 1 2.1 1.8 0.2 0.6483179 -1.3844568 1.2320727 2.7483179 0.4155432 1.4320727 1
3 8 0 0 0 1 0 5 0 1 1 0 2 7 1 1 1.8 3.3 0.2 0.7359145 0.0371263 -0.9898015 2.5359145 3.3371263 -0.7898015 2
3 9 1 1 0 0 0 2 0 0 0 1 2 0 1 1 1.2 0.9 0.2 -0.3453707 1.8019234 -0.1438140 0.8546293 2.7019234 0.0561860 2
3 12 1 1 0 0 1 6 1 0 1 1 3 1 1 1 2.7 2.1 0.2 1.1001233 0.7355253 0.3302873 3.8001233 2.8355253 0.5302873 1
3 13 0 0 1 1 1 2 1 1 0 0 3 1 1 1 1.5 1.8 0.2 -0.5198692 0.6109552 -1.1900497 0.9801308 2.4109552 -0.9900497 2
3 14 1 1 0 1 2 5 1 0 1 0 0 6 1 1 3.0 2.4 0.2 4.6170242 -0.2357096 -0.3531038 7.6170242 2.1642904 -0.1531038 1
3 16 0 1 1 0 3 3 1 0 1 1 2 6 1 1 2.4 3.3 0.2 0.3302260 0.0131924 0.2629378 2.7302260 3.3131924 0.4629378 2
4 1 0 1 1 1 3 6 1 0 1 0 1 0 2 1 3.6 0.9 0.2 0.1182670 2.5395730 3.2915101 3.7182670 3.4395730 3.4915101 1
4 2 1 0 0 1 0 7 0 1 0 0 1 2 2 1 2.7 1.2 0.2 1.8658662 0.1989536 0.3159225 4.5658662 1.3989536 0.5159225 1
4 3 1 0 0 1 1 1 1 1 1 0 3 5 2 1 1.2 3.3 0.2 1.1832195 -1.2572051 3.2006742 2.3832195 2.0427949 3.4006742 3
4 5 0 0 1 0 2 4 0 1 0 1 1 7 2 1 2.1 3.0 0.2 2.6885540 1.3173080 -0.2843732 4.7885540 4.3173080 -0.0843732 1
4 6 1 1 0 1 2 0 0 0 1 0 1 5 2 1 1.5 2.1 0.2 -0.8023013 1.4371628 0.1593705 0.6976987 3.5371628 0.3593705 2
4 10 1 1 1 0 1 4 0 0 0 1 0 2 2 1 2.4 0.9 0.2 1.2489432 4.2182805 0.7173716 3.6489432 5.1182805 0.9173716 2
4 11 0 1 1 0 0 1 1 0 0 1 3 4 2 1 0.9 2.7 0.2 0.1858225 0.4456220 0.2780780 1.0858225 3.1456220 0.4780780 2
4 15 0 0 0 0 3 7 1 1 1 1 0 3 2 1 3.0 2.1 0.2 0.5343423 1.1190677 -0.5519986 3.5343423 3.2190677 -0.3519986 1
5 4 0 0 1 1 3 0 0 1 0 0 2 3 1 1 1.5 1.8 0.2 -0.1479072 -0.1122433 2.0039114 1.3520928 1.6877567 2.2039114 3
5 7 1 0 1 0 2 3 0 1 0 1 0 4 1 1 2.1 1.8 0.2 -0.7589834 -0.9571965 0.5809490 1.3410166 0.8428035 0.7809490 1
5 8 0 0 0 1 0 5 0 1 1 0 2 7 1 1 1.8 3.3 0.2 0.1044386 1.8715922 -0.5523303 1.9044386 5.1715922 -0.3523303 2
5 9 1 1 0 0 0 2 0 0 0 1 2 0 1 1 1.2 0.9 0.2 -1.4007410 0.2591963 0.8259824 -0.2007410 1.1591963 1.0259824 2
5 12 1 1 0 0 1 6 1 0 1 1 3 1 1 1 2.7 2.1 0.2 0.3666695 1.3490850 -0.0366271 3.0666695 3.4490850 0.1633729 2
5 13 0 0 1 1 1 2 1 1 0 0 3 1 1 1 1.5 1.8 0.2 -1.0621769 -0.6435810 -1.2998089 0.4378231 1.1564190 -1.0998089 2
5 14 1 1 0 1 2 5 1 0 1 0 0 6 1 1 3.0 2.4 0.2 0.3114798 1.0910527 1.4335898 3.3114798 3.4910527 1.6335898 2
5 16 0 1 1 0 3 3 1 0 1 1 2 6 1 1 2.4 3.3 0.2 -0.9292686 -0.7806398 0.9467055 1.4707314 2.5193602 1.1467055 2
6 1 0 1 1 1 3 6 1 0 1 0 1 0 2 1 3.6 0.9 0.2 -1.2044700 0.8835375 1.1245905 2.3955300 1.7835375 1.3245905 1
6 2 1 0 0 1 0 7 0 1 0 0 1 2 2 1 2.7 1.2 0.2 6.3959437 1.8164732 -1.6435748 9.0959437 3.0164732 -1.4435748 1
6 3 1 0 0 1 1 1 1 1 1 0 3 5 2 1 1.2 3.3 0.2 1.8430660 2.1483922 -0.4963695 3.0430660 5.4483922 -0.2963695 2
6 5 0 0 1 0 2 4 0 1 0 1 1 7 2 1 2.1 3.0 0.2 -0.4881596 5.6415180 1.4593814 1.6118404 8.6415180 1.6593814 2
6 6 1 1 0 1 2 0 0 0 1 0 1 5 2 1 1.5 2.1 0.2 -1.2633638 -0.5403731 1.3670424 0.2366362 1.5596269 1.5670424 3
6 10 1 1 1 0 1 4 0 0 0 1 0 2 2 1 2.4 0.9 0.2 1.1766784 0.3423929 0.0161535 3.5766784 1.2423929 0.2161535 1
6 11 0 1 1 0 0 1 1 0 0 1 3 4 2 1 0.9 2.7 0.2 0.4126669 -1.2747480 -1.5636682 1.3126669 1.4252520 -1.3636682 2
6 15 0 0 0 0 3 7 1 1 1 1 0 3 2 1 3.0 2.1 0.2 -0.7363453 1.1162124 -0.4699908 2.2636547 3.2162124 -0.2699908 2
7 4 0 0 1 1 3 0 0 1 0 0 2 3 1 1 1.5 1.8 0.2 0.9653491 1.1158550 -0.7901764 2.4653491 2.9158550 -0.5901764 2
7 7 1 0 1 0 2 3 0 1 0 1 0 4 1 1 2.1 1.8 0.2 3.6893566 0.5624485 -0.2221840 5.7893566 2.3624485 -0.0221840 1
7 8 0 0 0 1 0 5 0 1 1 0 2 7 1 1 1.8 3.3 0.2 -0.4879056 3.5847796 -1.3237347 1.3120944 6.8847796 -1.1237347 2
7 9 1 1 0 0 0 2 0 0 0 1 2 0 1 1 1.2 0.9 0.2 0.8165038 2.6743297 0.9992979 2.0165038 3.5743297 1.1992979 2
7 12 1 1 0 0 1 6 1 0 1 1 3 1 1 1 2.7 2.1 0.2 -0.0311807 -0.6339399 -0.2614249 2.6688193 1.4660601 -0.0614249 1
7 13 0 0 1 1 1 2 1 1 0 0 3 1 1 1 1.5 1.8 0.2 0.0844100 -1.5491443 -0.4829470 1.5844100 0.2508557 -0.2829470 1
7 14 1 1 0 1 2 5 1 0 1 0 0 6 1 1 3.0 2.4 0.2 0.5885939 -0.2695084 1.8439330 3.5885939 2.1304916 2.0439330 1
7 16 0 1 1 0 3 3 1 0 1 1 2 6 1 1 2.4 3.3 0.2 -1.8267379 0.3913881 1.0266739 0.5732621 3.6913881 1.2266739 2
8 1 0 1 1 1 3 6 1 0 1 0 1 0 2 1 3.6 0.9 0.2 0.2146311 3.0537811 0.4380427 3.8146311 3.9537811 0.6380427 2
8 2 1 0 0 1 0 7 0 1 0 0 1 2 2 1 2.7 1.2 0.2 2.0977653 0.7149544 -0.8744680 4.7977653 1.9149544 -0.6744680 1
8 3 1 0 0 1 1 1 1 1 1 0 3 5 2 1 1.2 3.3 0.2 0.1544896 0.6783032 2.7279251 1.3544896 3.9783032 2.9279251 2
8 5 0 0 1 0 2 4 0 1 0 1 1 7 2 1 2.1 3.0 0.2 0.9951714 -1.1309307 1.0282519 3.0951714 1.8690693 1.2282519 1
8 6 1 1 0 1 2 0 0 0 1 0 1 5 2 1 1.5 2.1 0.2 0.1404234 -0.4443173 1.6142149 1.6404234 1.6556827 1.8142149 3
8 10 1 1 1 0 1 4 0 0 0 1 0 2 2 1 2.4 0.9 0.2 0.9250596 2.3320759 1.3967575 3.3250596 3.2320759 1.5967575 1
8 11 0 1 1 0 0 1 1 0 0 1 3 4 2 1 0.9 2.7 0.2 1.9428024 2.1920130 1.0502366 2.8428024 4.8920130 1.2502366 2
8 15 0 0 0 0 3 7 1 1 1 1 0 3 2 1 3.0 2.1 0.2 0.6359381 -0.5364219 1.3727514 3.6359381 1.5635781 1.5727514 1
9 4 0 0 1 1 3 0 0 1 0 0 2 3 1 1 1.5 1.8 0.2 1.1706317 0.7692809 1.9345501 2.6706317 2.5692809 2.1345501 1
9 7 1 0 1 0 2 3 0 1 0 1 0 4 1 1 2.1 1.8 0.2 0.2212748 1.0374326 -0.2779420 2.3212748 2.8374326 -0.0779420 2
9 8 0 0 0 1 0 5 0 1 1 0 2 7 1 1 1.8 3.3 0.2 0.4676582 1.9664472 0.0234001 2.2676582 5.2664472 0.2234001 2
9 9 1 1 0 0 0 2 0 0 0 1 2 0 1 1 1.2 0.9 0.2 -0.5921694 3.3713295 0.2438945 0.6078306 4.2713295 0.4438945 2
9 12 1 1 0 0 1 6 1 0 1 1 3 1 1 1 2.7 2.1 0.2 -0.5150756 -0.7085529 2.7736114 2.1849244 1.3914471 2.9736114 3
9 13 0 0 1 1 1 2 1 1 0 0 3 1 1 1 1.5 1.8 0.2 1.4361750 -0.7134323 0.3206485 2.9361750 1.0865677 0.5206485 1
9 14 1 1 0 1 2 5 1 0 1 0 0 6 1 1 3.0 2.4 0.2 1.5673228 -0.1422197 1.8140208 4.5673228 2.2577803 2.0140208 1
9 16 0 1 1 0 3 3 1 0 1 1 2 6 1 1 2.4 3.3 0.2 -1.0005401 1.6881335 0.6345433 1.3994599 4.9881335 0.8345433 2
10 1 0 1 1 1 3 6 1 0 1 0 1 0 2 1 3.6 0.9 0.2 -1.3840494 3.4267966 0.1517616 2.2159506 4.3267966 0.3517616 2
10 2 1 0 0 1 0 7 0 1 0 0 1 2 2 1 2.7 1.2 0.2 0.3849055 1.6268282 2.0031301 3.0849055 2.8268282 2.2031301 1
10 3 1 0 0 1 1 1 1 1 1 0 3 5 2 1 1.2 3.3 0.2 0.8181420 -0.0641513 0.6330116 2.0181420 3.2358487 0.8330116 2
10 5 0 0 1 0 2 4 0 1 0 1 1 7 2 1 2.1 3.0 0.2 -0.4895110 1.0550499 0.3833172 1.6104890 4.0550499 0.5833172 2
10 6 1 1 0 1 2 0 0 0 1 0 1 5 2 1 1.5 2.1 0.2 0.5126866 -0.0090890 3.2630820 2.0126866 2.0909110 3.4630820 3
10 10 1 1 1 0 1 4 0 0 0 1 0 2 2 1 2.4 0.9 0.2 1.9876511 0.4168428 1.2407532 4.3876511 1.3168428 1.4407532 1
10 11 0 1 1 0 0 1 1 0 0 1 3 4 2 1 0.9 2.7 0.2 0.0757268 0.3020574 -1.1795159 0.9757268 3.0020574 -0.9795159 2
10 15 0 0 0 0 3 7 1 1 1 1 0 3 2 1 3.0 2.1 0.2 -1.1511680 0.8399190 -0.9910629 1.8488320 2.9399190 -0.7910629 2
11 4 0 0 1 1 3 0 0 1 0 0 2 3 1 1 1.5 1.8 0.2 0.9282458 0.0232414 1.5506934 2.4282458 1.8232414 1.7506934 1
11 7 1 0 1 0 2 3 0 1 0 1 0 4 1 1 2.1 1.8 0.2 1.3607399 -0.4737274 0.1729479 3.4607399 1.3262726 0.3729479 1
11 8 0 0 0 1 0 5 0 1 1 0 2 7 1 1 1.8 3.3 0.2 1.3973782 -0.1890841 -0.5690359 3.1973782 3.1109159 -0.3690359 1
11 9 1 1 0 0 0 2 0 0 0 1 2 0 1 1 1.2 0.9 0.2 -0.9596717 0.8572310 -0.1812751 0.2403283 1.7572310 0.0187249 2
11 12 1 1 0 0 1 6 1 0 1 1 3 1 1 1 2.7 2.1 0.2 -0.1112423 0.4320758 0.8665942 2.5887577 2.5320758 1.0665942 1
11 13 0 0 1 1 1 2 1 1 0 0 3 1 1 1 1.5 1.8 0.2 1.4082259 -0.1406057 -0.4491501 2.9082259 1.6593943 -0.2491501 1
11 14 1 1 0 1 2 5 1 0 1 0 0 6 1 1 3.0 2.4 0.2 0.8287018 0.4779937 -0.2796278 3.8287018 2.8779937 -0.0796278 1
11 16 0 1 1 0 3 3 1 0 1 1 2 6 1 1 2.4 3.3 0.2 1.1337371 0.0813474 2.6823146 3.5337371 3.3813474 2.8823146 1
12 1 0 1 1 1 3 6 1 0 1 0 1 0 2 1 3.6 0.9 0.2 0.2744547 1.9491903 0.5279599 3.8744547 2.8491903 0.7279599 1
12 2 1 0 0 1 0 7 0 1 0 0 1 2 2 1 2.7 1.2 0.2 0.9424685 -0.1499867 -0.5220195 3.6424685 1.0500133 -0.3220195 1
12 3 1 0 0 1 1 1 1 1 1 0 3 5 2 1 1.2 3.3 0.2 0.9921861 0.7232335 2.3452723 2.1921861 4.0232335 2.5452723 2
12 5 0 0 1 0 2 4 0 1 0 1 1 7 2 1 2.1 3.0 0.2 1.9371476 -1.2097155 0.5177815 4.0371476 1.7902845 0.7177815 1
12 6 1 1 0 1 2 0 0 0 1 0 1 5 2 1 1.5 2.1 0.2 2.8230491 2.9873047 1.4771512 4.3230491 5.0873047 1.6771512 2
12 10 1 1 1 0 1 4 0 0 0 1 0 2 2 1 2.4 0.9 0.2 0.1656205 1.6004518 -0.0609233 2.5656205 2.5004518 0.1390767 1
12 11 0 1 1 0 0 1 1 0 0 1 3 4 2 1 0.9 2.7 0.2 -0.1083923 1.4698743 0.9353046 0.7916077 4.1698743 1.1353046 2
12 15 0 0 0 0 3 7 1 1 1 1 0 3 2 1 3.0 2.1 0.2 0.9548782 1.3429298 1.9045229 3.9548782 3.4429298 2.1045229 1
13 4 0 0 1 1 3 0 0 1 0 0 2 3 1 1 1.5 1.8 0.2 -0.8056778 -0.2856111 -0.3927758 0.6943222 1.5143889 -0.1927758 2
13 7 1 0 1 0 2 3 0 1 0 1 0 4 1 1 2.1 1.8 0.2 0.2802507 1.1692328 -0.1666276 2.3802507 2.9692328 0.0333724 2
13 8 0 0 0 1 0 5 0 1 1 0 2 7 1 1 1.8 3.3 0.2 -0.3084121 1.9530998 -0.7567399 1.4915879 5.2530998 -0.5567399 2
13 9 1 1 0 0 0 2 0 0 0 1 2 0 1 1 1.2 0.9 0.2 0.5556699 0.0319981 1.6216316 1.7556699 0.9319981 1.8216316 3
13 12 1 1 0 0 1 6 1 0 1 1 3 1 1 1 2.7 2.1 0.2 0.1452410 1.9370333 0.9433656 2.8452410 4.0370333 1.1433656 2
13 13 0 0 1 1 1 2 1 1 0 0 3 1 1 1 1.5 1.8 0.2 -1.8146722 0.7953156 -0.1613881 -0.3146722 2.5953156 0.0386119 2
13 14 1 1 0 1 2 5 1 0 1 0 0 6 1 1 3.0 2.4 0.2 -1.1733101 0.0526088 0.8399431 1.8266899 2.4526088 1.0399431 2
13 16 0 1 1 0 3 3 1 0 1 1 2 6 1 1 2.4 3.3 0.2 0.1448206 0.2348710 -0.2917286 2.5448206 3.5348710 -0.0917286 2
14 1 0 1 1 1 3 6 1 0 1 0 1 0 2 1 3.6 0.9 0.2 2.9195735 2.3342955 -1.1718776 6.5195735 3.2342955 -0.9718776 1
14 2 1 0 0 1 0 7 0 1 0 0 1 2 2 1 2.7 1.2 0.2 -0.6677045 0.9097239 -0.1673405 2.0322955 2.1097239 0.0326595 2
14 3 1 0 0 1 1 1 1 1 1 0 3 5 2 1 1.2 3.3 0.2 0.3283862 -0.2843285 1.2265154 1.5283862 3.0156715 1.4265154 2
14 5 0 0 1 0 2 4 0 1 0 1 1 7 2 1 2.1 3.0 0.2 -0.5521096 -0.1646452 -0.4803340 1.5478904 2.8353548 -0.2803340 2
14 6 1 1 0 1 2 0 0 0 1 0 1 5 2 1 1.5 2.1 0.2 0.8036367 0.7639964 0.2947557 2.3036367 2.8639964 0.4947557 2
14 10 1 1 1 0 1 4 0 0 0 1 0 2 2 1 2.4 0.9 0.2 0.2712418 2.6402984 0.3438280 2.6712418 3.5402984 0.5438280 2
14 11 0 1 1 0 0 1 1 0 0 1 3 4 2 1 0.9 2.7 0.2 0.6655475 0.0852692 0.4024220 1.5655475 2.7852692 0.6024220 2
14 15 0 0 0 0 3 7 1 1 1 1 0 3 2 1 3.0 2.1 0.2 -0.4189039 0.8337069 0.8619900 2.5810961 2.9337069 1.0619900 2
15 4 0 0 1 1 3 0 0 1 0 0 2 3 1 1 1.5 1.8 0.2 3.3662310 -1.0742352 -0.5117387 4.8662310 0.7257648 -0.3117387 1
15 7 1 0 1 0 2 3 0 1 0 1 0 4 1 1 2.1 1.8 0.2 0.4008935 1.5207797 0.4983802 2.5008935 3.3207797 0.6983802 2
15 8 0 0 0 1 0 5 0 1 1 0 2 7 1 1 1.8 3.3 0.2 0.9260945 -0.6222102 -0.9271397 2.7260945 2.6777898 -0.7271397 1
15 9 1 1 0 0 0 2 0 0 0 1 2 0 1 1 1.2 0.9 0.2 -0.5243619 1.8711494 2.8309415 0.6756381 2.7711494 3.0309415 3
15 12 1 1 0 0 1 6 1 0 1 1 3 1 1 1 2.7 2.1 0.2 -0.7616209 0.3573748 1.8320659 1.9383791 2.4573748 2.0320659 2
15 13 0 0 1 1 1 2 1 1 0 0 3 1 1 1 1.5 1.8 0.2 0.1266627 2.2123202 0.5994242 1.6266627 4.0123202 0.7994242 2
15 14 1 1 0 1 2 5 1 0 1 0 0 6 1 1 3.0 2.4 0.2 -0.0121946 -0.6035430 -0.1161679 2.9878054 1.7964570 0.0838321 1
15 16 0 1 1 0 3 3 1 0 1 1 2 6 1 1 2.4 3.3 0.2 -1.5845099 2.6305392 2.3345019 0.8154901 5.9305392 2.5345019 2
16 1 0 1 1 1 3 6 1 0 1 0 1 0 2 1 3.6 0.9 0.2 -0.3738873 -0.9089386 -0.5736057 3.2261127 -0.0089386 -0.3736057 1
16 2 1 0 0 1 0 7 0 1 0 0 1 2 2 1 2.7 1.2 0.2 0.5988429 1.0705506 1.6749715 3.2988429 2.2705506 1.8749715 1
16 3 1 0 0 1 1 1 1 1 1 0 3 5 2 1 1.2 3.3 0.2 2.1800982 1.3332032 -1.3003579 3.3800982 4.6332032 -1.1003579 2
16 5 0 0 1 0 2 4 0 1 0 1 1 7 2 1 2.1 3.0 0.2 0.2012731 0.4369876 1.6261076 2.3012731 3.4369876 1.8261076 2
16 6 1 1 0 1 2 0 0 0 1 0 1 5 2 1 1.5 2.1 0.2 -0.2573656 2.0787436 1.2784146 1.2426344 4.1787436 1.4784146 2
16 10 1 1 1 0 1 4 0 0 0 1 0 2 2 1 2.4 0.9 0.2 1.3139167 -1.1542511 2.3183794 3.7139167 -0.2542511 2.5183794 1
16 11 0 1 1 0 0 1 1 0 0 1 3 4 2 1 0.9 2.7 0.2 -0.9480991 -1.0181748 3.5557648 -0.0480991 1.6818252 3.7557648 3
16 15 0 0 0 0 3 7 1 1 1 1 0 3 2 1 3.0 2.1 0.2 1.8565630 -0.1660831 -0.3234651 4.8565630 1.9339169 -0.1234651 1
17 4 0 0 1 1 3 0 0 1 0 0 2 3 1 1 1.5 1.8 0.2 -1.0977826 -0.3404461 -0.4008508 0.4022174 1.4595539 -0.2008508 2
17 7 1 0 1 0 2 3 0 1 0 1 0 4 1 1 2.1 1.8 0.2 3.1974096 2.0592356 3.0874854 5.2974096 3.8592356 3.2874854 1
17 8 0 0 0 1 0 5 0 1 1 0 2 7 1 1 1.8 3.3 0.2 -1.8217673 0.4812280 -1.2865181 -0.0217673 3.7812280 -1.0865181 2
17 9 1 1 0 0 0 2 0 0 0 1 2 0 1 1 1.2 0.9 0.2 -0.1881533 0.6361656 0.8144051 1.0118467 1.5361656 1.0144051 2
17 12 1 1 0 0 1 6 1 0 1 1 3 1 1 1 2.7 2.1 0.2 1.8219571 -0.7370248 -0.3671571 4.5219571 1.3629752 -0.1671571 1
17 13 0 0 1 1 1 2 1 1 0 0 3 1 1 1 1.5 1.8 0.2 1.4997066 -0.8101609 0.8319353 2.9997066 0.9898391 1.0319353 1
17 14 1 1 0 1 2 5 1 0 1 0 0 6 1 1 3.0 2.4 0.2 -1.4223804 -0.0060769 0.5832801 1.5776196 2.3939231 0.7832801 2
17 16 0 1 1 0 3 3 1 0 1 1 2 6 1 1 2.4 3.3 0.2 0.3769516 1.8240718 -0.4550739 2.7769516 5.1240718 -0.2550739 2
18 1 0 1 1 1 3 6 1 0 1 0 1 0 2 1 3.6 0.9 0.2 2.0155549 1.5512555 2.1012230 5.6155549 2.4512555 2.3012230 1
18 2 1 0 0 1 0 7 0 1 0 0 1 2 2 1 2.7 1.2 0.2 1.0485523 -0.4324169 -0.7062930 3.7485523 0.7675831 -0.5062930 1
18 3 1 0 0 1 1 1 1 1 1 0 3 5 2 1 1.2 3.3 0.2 0.7915597 -0.1261139 1.5728713 1.9915597 3.1738861 1.7728713 2
18 5 0 0 1 0 2 4 0 1 0 1 1 7 2 1 2.1 3.0 0.2 0.3270647 -0.3650944 0.9667361 2.4270647 2.6349056 1.1667361 2
18 6 1 1 0 1 2 0 0 0 1 0 1 5 2 1 1.5 2.1 0.2 -1.7650271 1.1411943 0.7593768 -0.2650271 3.2411943 0.9593768 2
18 10 1 1 1 0 1 4 0 0 0 1 0 2 2 1 2.4 0.9 0.2 0.1624004 -0.1229293 0.3440701 2.5624004 0.7770707 0.5440701 1
18 11 0 1 1 0 0 1 1 0 0 1 3 4 2 1 0.9 2.7 0.2 -0.2408775 -0.5623836 0.2683309 0.6591225 2.1376164 0.4683309 2
18 15 0 0 0 0 3 7 1 1 1 1 0 3 2 1 3.0 2.1 0.2 1.1126180 4.7210839 1.8612242 4.1126180 6.8210839 2.0612242 2
19 4 0 0 1 1 3 0 0 1 0 0 2 3 1 1 1.5 1.8 0.2 -0.4343847 -0.1149750 0.5250572 1.0656153 1.6850250 0.7250572 2
19 7 1 0 1 0 2 3 0 1 0 1 0 4 1 1 2.1 1.8 0.2 0.4024674 0.4119786 1.2110419 2.5024674 2.2119786 1.4110419 1
19 8 0 0 0 1 0 5 0 1 1 0 2 7 1 1 1.8 3.3 0.2 -0.9273641 2.4527016 1.8341532 0.8726359 5.7527016 2.0341532 2
19 9 1 1 0 0 0 2 0 0 0 1 2 0 1 1 1.2 0.9 0.2 -0.2324014 1.1450453 1.3187159 0.9675986 2.0450453 1.5187159 2
19 12 1 1 0 0 1 6 1 0 1 1 3 1 1 1 2.7 2.1 0.2 0.1785370 -0.8792794 0.2099514 2.8785370 1.2207206 0.4099514 1
19 13 0 0 1 1 1 2 1 1 0 0 3 1 1 1 1.5 1.8 0.2 -0.1566453 0.3628500 0.5262657 1.3433547 2.1628500 0.7262657 2
19 14 1 1 0 1 2 5 1 0 1 0 0 6 1 1 3.0 2.4 0.2 1.6552150 -0.4109460 -0.0047152 4.6552150 1.9890540 0.1952848 1
19 16 0 1 1 0 3 3 1 0 1 1 2 6 1 1 2.4 3.3 0.2 -1.0089813 1.7677465 -0.3999686 1.3910187 5.0677465 -0.1999686 2
20 1 0 1 1 1 3 6 1 0 1 0 1 0 2 1 3.6 0.9 0.2 -0.0896211 2.6858677 0.5622052 3.5103789 3.5858677 0.7622052 2
20 2 1 0 0 1 0 7 0 1 0 0 1 2 2 1 2.7 1.2 0.2 -0.8759458 0.0645036 0.0777809 1.8240542 1.2645036 0.2777809 1
20 3 1 0 0 1 1 1 1 1 1 0 3 5 2 1 1.2 3.3 0.2 -1.2208761 -1.0737197 -0.2418767 -0.0208761 2.2262803 -0.0418767 2
20 5 0 0 1 0 2 4 0 1 0 1 1 7 2 1 2.1 3.0 0.2 -0.1879346 -1.3174244 -0.5396619 1.9120654 1.6825756 -0.3396619 1
20 6 1 1 0 1 2 0 0 0 1 0 1 5 2 1 1.5 2.1 0.2 1.9705007 -0.5844581 -0.0885146 3.4705007 1.5155419 0.1114854 1
20 10 1 1 1 0 1 4 0 0 0 1 0 2 2 1 2.4 0.9 0.2 3.5131277 3.4063755 1.5950800 5.9131277 4.3063755 1.7950800 1
20 11 0 1 1 0 0 1 1 0 0 1 3 4 2 1 0.9 2.7 0.2 1.7915652 -1.6519993 1.7995446 2.6915652 1.0480007 1.9995446 1
20 15 0 0 0 0 3 7 1 1 1 1 0 3 2 1 3.0 2.1 0.2 1.4305241 1.4802242 0.2895619 4.4305241 3.5802242 0.4895619 1
21 4 0 0 1 1 3 0 0 1 0 0 2 3 1 1 1.5 1.8 0.2 0.5188243 0.0668446 0.0098922 2.0188243 1.8668446 0.2098922 1
21 7 1 0 1 0 2 3 0 1 0 1 0 4 1 1 2.1 1.8 0.2 -0.5777947 -1.0934556 1.2265592 1.5222053 0.7065444 1.4265592 1
21 8 0 0 0 1 0 5 0 1 1 0 2 7 1 1 1.8 3.3 0.2 -0.7175084 0.9790226 1.9289591 1.0824916 4.2790226 2.1289591 2
21 9 1 1 0 0 0 2 0 0 0 1 2 0 1 1 1.2 0.9 0.2 0.8034195 0.0541033 -0.5460703 2.0034195 0.9541033 -0.3460703 1
21 12 1 1 0 0 1 6 1 0 1 1 3 1 1 1 2.7 2.1 0.2 -0.7673433 4.0229858 3.1984928 1.9326567 6.1229858 3.3984928 2
21 13 0 0 1 1 1 2 1 1 0 0 3 1 1 1 1.5 1.8 0.2 -1.5733257 0.3840076 -0.4517147 -0.0733257 2.1840076 -0.2517147 2
21 14 1 1 0 1 2 5 1 0 1 0 0 6 1 1 3.0 2.4 0.2 1.4148579 -0.4140772 0.6351006 4.4148579 1.9859228 0.8351006 1
21 16 0 1 1 0 3 3 1 0 1 1 2 6 1 1 2.4 3.3 0.2 -0.7754695 2.7113716 -0.2856749 1.6245305 6.0113716 -0.0856749 2
22 1 0 1 1 1 3 6 1 0 1 0 1 0 2 1 3.6 0.9 0.2 1.3838974 -0.3356571 0.5537056 4.9838974 0.5643429 0.7537056 1
22 2 1 0 0 1 0 7 0 1 0 0 1 2 2 1 2.7 1.2 0.2 -0.6389404 1.4495835 0.4080598 2.0610596 2.6495835 0.6080598 2
22 3 1 0 0 1 1 1 1 1 1 0 3 5 2 1 1.2 3.3 0.2 2.3266369 0.7400508 1.3271600 3.5266369 4.0400508 1.5271600 2
22 5 0 0 1 0 2 4 0 1 0 1 1 7 2 1 2.1 3.0 0.2 0.9623242 0.2256843 0.7406459 3.0623242 3.2256843 0.9406459 2
22 6 1 1 0 1 2 0 0 0 1 0 1 5 2 1 1.5 2.1 0.2 0.5528150 -0.4985173 -0.1733400 2.0528150 1.6014827 0.0266600 1
22 10 1 1 1 0 1 4 0 0 0 1 0 2 2 1 2.4 0.9 0.2 0.6768319 -0.0740796 1.5270424 3.0768319 0.8259204 1.7270424 1
22 11 0 1 1 0 0 1 1 0 0 1 3 4 2 1 0.9 2.7 0.2 0.3764531 -0.5842859 -0.9676926 1.2764531 2.1157141 -0.7676926 2
22 15 0 0 0 0 3 7 1 1 1 1 0 3 2 1 3.0 2.1 0.2 -0.1156253 0.0092832 0.9547435 2.8843747 2.1092832 1.1547435 1
23 4 0 0 1 1 3 0 0 1 0 0 2 3 1 1 1.5 1.8 0.2 2.1150525 -0.0501334 0.3169699 3.6150525 1.7498666 0.5169699 1
23 7 1 0 1 0 2 3 0 1 0 1 0 4 1 1 2.1 1.8 0.2 0.4304851 0.8093394 -1.4509626 2.5304851 2.6093394 -1.2509626 2
23 8 0 0 0 1 0 5 0 1 1 0 2 7 1 1 1.8 3.3 0.2 -1.1341497 -0.9246697 -0.0148908 0.6658503 2.3753303 0.1851092 2
23 9 1 1 0 0 0 2 0 0 0 1 2 0 1 1 1.2 0.9 0.2 0.3188452 -0.8388149 0.5028024 1.5188452 0.0611851 0.7028024 1
23 12 1 1 0 0 1 6 1 0 1 1 3 1 1 1 2.7 2.1 0.2 -0.1011105 -0.1872467 -0.6401751 2.5988895 1.9127533 -0.4401751 1
23 13 0 0 1 1 1 2 1 1 0 0 3 1 1 1 1.5 1.8 0.2 6.4338041 0.1311793 0.0552560 7.9338041 1.9311793 0.2552560 1
23 14 1 1 0 1 2 5 1 0 1 0 0 6 1 1 3.0 2.4 0.2 0.3714764 -0.1609574 -0.4877071 3.3714764 2.2390426 -0.2877071 1
23 16 0 1 1 0 3 3 1 0 1 1 2 6 1 1 2.4 3.3 0.2 1.2949188 2.8480000 0.7783896 3.6949188 6.1480000 0.9783896 2
24 1 0 1 1 1 3 6 1 0 1 0 1 0 2 1 3.6 0.9 0.2 1.1361060 -0.6125712 -1.0794398 4.7361060 0.2874288 -0.8794398 1
24 2 1 0 0 1 0 7 0 1 0 0 1 2 2 1 2.7 1.2 0.2 0.4627792 -0.8500202 0.8204245 3.1627792 0.3499798 1.0204245 1
24 3 1 0 0 1 1 1 1 1 1 0 3 5 2 1 1.2 3.3 0.2 0.7799443 -0.0064863 0.6145697 1.9799443 3.2935137 0.8145697 2
24 5 0 0 1 0 2 4 0 1 0 1 1 7 2 1 2.1 3.0 0.2 0.8027735 0.3090594 0.9312733 2.9027735 3.3090594 1.1312733 2
24 6 1 1 0 1 2 0 0 0 1 0 1 5 2 1 1.5 2.1 0.2 0.2715031 -0.8831655 -0.1993149 1.7715031 1.2168345 0.0006851 1
24 10 1 1 1 0 1 4 0 0 0 1 0 2 2 1 2.4 0.9 0.2 1.1804119 2.2918195 -0.2871011 3.5804119 3.1918195 -0.0871011 1
24 11 0 1 1 0 0 1 1 0 0 1 3 4 2 1 0.9 2.7 0.2 0.6324759 1.6886296 -0.5010965 1.5324759 4.3886296 -0.3010965 2
24 15 0 0 0 0 3 7 1 1 1 1 0 3 2 1 3.0 2.1 0.2 0.7000821 1.9812950 0.7678483 3.7000821 4.0812950 0.9678483 2
25 4 0 0 1 1 3 0 0 1 0 0 2 3 1 1 1.5 1.8 0.2 -0.2936195 1.0797569 0.4001707 1.2063805 2.8797569 0.6001707 2
25 7 1 0 1 0 2 3 0 1 0 1 0 4 1 1 2.1 1.8 0.2 4.1840236 0.4616781 2.6931134 6.2840236 2.2616781 2.8931134 1
25 8 0 0 0 1 0 5 0 1 1 0 2 7 1 1 1.8 3.3 0.2 1.0122236 0.2287786 -0.1228385 2.8122236 3.5287786 0.0771615 2
25 9 1 1 0 0 0 2 0 0 0 1 2 0 1 1 1.2 0.9 0.2 0.6942845 0.1088424 -0.3769196 1.8942845 1.0088424 -0.1769196 1
25 12 1 1 0 0 1 6 1 0 1 1 3 1 1 1 2.7 2.1 0.2 1.0029091 -0.9149497 -1.8987643 3.7029091 1.1850503 -1.6987643 1
25 13 0 0 1 1 1 2 1 1 0 0 3 1 1 1 1.5 1.8 0.2 1.1353083 2.7187466 -0.2560346 2.6353083 4.5187466 -0.0560346 2
25 14 1 1 0 1 2 5 1 0 1 0 0 6 1 1 3.0 2.4 0.2 1.5448817 4.7223123 0.0382135 4.5448817 7.1223123 0.2382135 2
25 16 0 1 1 0 3 3 1 0 1 1 2 6 1 1 2.4 3.3 0.2 -1.0505700 -1.4655248 -0.3838638 1.3494300 1.8344752 -0.1838638 2

Heuristics

  • In the data there are three different DGPs

  • Group 1 (60%): RUM (\(\alpha=0.2\) and all \(\beta=0.3\))

\[ U = \alpha + \beta_{cow}\text{cow} + \beta_{adv}\text{adv} + \beta_{vet}\text{vet} + \beta_{far}\text{far} + \beta_{met}\text{met} + \beta_{bon}\text{bon} + \epsilon \]

  • Group 2 (20%): Attribute Non-Attendance (the met attribute is ignored)

\[ U = \alpha + \beta_{cow}\text{cow}+\beta_{adv}\text{adv} + \beta_{vet}\text{vet} + \beta_{far}\text{far}+\beta_{bon}\text{bon} + \epsilon\]

  • Group 3 (20%): Only bonus, with \(\beta_{bon2}=1.9\)

\[ U = \beta_{bon2}\text{bon} + \epsilon\]
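
A minimal sketch of how such a mixed-heuristics dataset could be simulated in Python. The group shares, coefficients, sample size, and attribute levels follow the slides and the example data; the random design itself is an assumption made for illustration.

```python
import numpy as np

rng = np.random.default_rng(7)
n_resp, n_sit, n_alt = 3600, 8, 3             # sample size as in the slides
K = 7                                         # cow, adv, vet, far, met, bon + opt-out ASC

# random design (levels as in the example data: binary attributes, met 0-3, bon 0-7)
X = np.zeros((n_resp, n_sit, n_alt, K))
X[:, :, :2, 0:4] = rng.integers(0, 2, size=(n_resp, n_sit, 2, 4))
X[:, :, :2, 4] = rng.integers(0, 4, size=(n_resp, n_sit, 2))
X[:, :, :2, 5] = rng.integers(0, 8, size=(n_resp, n_sit, 2))
X[:, :, 2, 6] = 1.0                           # third alternative: opt-out, ASC only

beta = np.zeros((3, K))
beta[0, :6] = 0.3                             # Group 1: RUM, all beta = 0.3
beta[0, 6] = 0.2                              # ... and alpha = 0.2 on the opt-out
beta[1] = beta[0]                             # Group 2: same as Group 1 ...
beta[1, 4] = 0.0                              # ... but 'met' is ignored (ANA)
beta[2, 5] = 1.9                              # Group 3: bonus only, beta_bon2 = 1.9

groups = rng.choice(3, size=n_resp, p=[0.6, 0.2, 0.2])
V = np.einsum("ntjk,nk->ntj", X, beta[groups])     # deterministic utilities
U = V + rng.gumbel(size=V.shape)                   # add iid Gumbel errors
y = U.argmax(axis=2)                               # chosen alternative per situation
```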

Separate models

  • As we know the DGP, we can estimate models for each group and compare results to the full model

  • In reality, however, we do not know the group membership

  • Let's see if there is a problem
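
Continuing the simulation sketch above (and reusing clogit_negll from the earlier estimation sketch), the group-specific models can be fitted by subsetting on the known group labels; with real data this subsetting would of course be impossible:

```python
import numpy as np
from scipy.optimize import minimize

# X, y, groups come from the simulation sketch above;
# clogit_negll is the estimator sketched in the Random utility section.
for g, label in enumerate(["RUM", "Non-Attendance", "Bonus only"]):
    mask = groups == g
    Xg = X[mask].reshape(-1, X.shape[2], X.shape[3])   # stack all situations of the group
    yg = y[mask].ravel()
    fit = minimize(clogit_negll, np.zeros(X.shape[3]), args=(Xg, yg), method="BFGS")
    print(label, np.round(fit.x, 2))
```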

Choice Frequencies

Regression results with all models

Statistical models

                             ALL               RUM               Non-Attendance    Price
basc                         0.17 (0.04)***    0.28 (0.05)***    0.17 (0.06)**     0.52 (0.75)
bcow                         0.29 (0.02)***    0.33 (0.02)***    0.27 (0.04)***    -0.05 (0.10)
badv                         0.32 (0.02)***    0.31 (0.02)***    0.37 (0.03)***    -0.96 (1.07)
bvet                         0.32 (0.02)***    0.31 (0.02)***    0.33 (0.04)***    -1.00 (1.13)
bfar                         0.27 (0.02)***    0.29 (0.02)***    0.31 (0.03)***    -0.04 (0.07)
bmet                         0.18 (0.01)***    0.32 (0.01)***    0.00 (0.01)       0.48 (0.33)
bbon                         0.37 (0.01)***    0.31 (0.01)***    0.28 (0.01)***    2.72 (0.76)***
No. of observations          28800             17280             5760              5760
No. of respondents           3600              2160              720               720
Log likelihood (null)        -31640.03         -18984.02         -6328.01          -6328.01
Log likelihood (converged)   -20401.99         -12884.39         -4817.40          -611.21

Latent Class models

  • We may try a latent class model to detect such heuristics (see the sketch below)
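
A minimal sketch of what such a latent class logit could look like: the mixture log-likelihood below is maximised directly with scipy. The function name lc_negll, the class count C, the starting values, and the tiny simulated dataset are all illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import logsumexp

def lc_negll(theta, X, y, C):
    """Negative log-likelihood of a C-class latent class logit.
    X: (N, T, J, K) attributes; y: (N, T) chosen alternative; theta stacks
    C*K class-specific betas plus C-1 free class-share logits."""
    N, T, J, K = X.shape
    betas = theta[:C * K].reshape(C, K)
    s = np.append(0.0, theta[C * K:])              # first class-share logit fixed at 0
    log_pi = s - logsumexp(s)                      # log class shares
    ll_nc = np.empty((N, C))
    for c in range(C):
        v = X @ betas[c]                           # (N, T, J) deterministic utilities
        logp = v - logsumexp(v, axis=2, keepdims=True)
        chosen = np.take_along_axis(logp, y[:, :, None], axis=2)[:, :, 0]
        ll_nc[:, c] = chosen.sum(axis=1)           # whole choice sequence per class
    return -logsumexp(ll_nc + log_pi, axis=1).sum()

# tiny two-class example on simulated data
rng = np.random.default_rng(0)
N, T, J, K, C = 400, 8, 2, 3, 2
X = rng.integers(0, 2, size=(N, T, J, K)).astype(float)
true = np.array([[0.3, 0.3, 0.3], [0.0, 0.0, 1.9]])
cls = rng.choice(C, N, p=[0.7, 0.3])
y = (np.einsum("ntjk,nk->ntj", X, true[cls]) + rng.gumbel(size=(N, T, J))).argmax(axis=2)
fit = minimize(lc_negll, rng.normal(0, 0.1, C * K + C - 1), args=(X, y, C), method="BFGS")
print(np.round(fit.x, 2))
```

In practice one would use an EM algorithm, several starting values, and information criteria to choose the number of classes; this direct-maximisation sketch only illustrates the structure of the likelihood.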