Consider the following game, which costs $2 to play: you roll a fair, six-sided die. You are awarded a $6 prize if, and only if, you roll a six; otherwise, you get nothing. Should you play? Well, considering the odds, the average payout (the "expected utility") is (1/6) × $6 = $1, which is *less* than the $2 cost of playing. Therefore, since over many trials you would lose out, you should not play this game. That line of reasoning sounds OK. But suppose you are given a chance to play only once. What bearing does this "average payout" argument have on the special "one shot" case? If you are in this for a single trial, isn't it obviously irrelevant what the trend is "over many trials"?
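
To make the arithmetic concrete, here is a minimal simulation sketch in Python (the game parameters are taken from the question above; the function names are mine) that estimates the average payout over many plays:

```python
import random

COST = 2    # dollars paid to play once
PRIZE = 6   # dollars awarded for rolling a six

def play_once(rng: random.Random) -> int:
    """Payout of a single play: $6 on a six, $0 otherwise."""
    return PRIZE if rng.randint(1, 6) == 6 else 0

def average_payout(trials: int, seed: int = 0) -> float:
    """Estimate the expected payout per play by simulating many plays."""
    rng = random.Random(seed)
    return sum(play_once(rng) for _ in range(trials)) / trials

if __name__ == "__main__":
    est = average_payout(1_000_000)
    print(f"average payout per play: ${est:.3f}  (exact value: $1.000)")
    print(f"average net per play:    ${est - COST:.3f}")
```

Over a large number of simulated plays the average payout settles near $1, so the average net per play is about -$1. That is the "over many trials you would lose out" claim in numerical form.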

Good question. My own view is that what happens in the long run is irrelevant to the rationality of betting (or, in your case, not betting) according to the odds in the single case. I think it is a basic principle of practical rationality that your choices should be guided by the probabilities and that, surprisingly, there is no further justification for this.

A first point: you say "over many trials you would lose out." If you are talking about a finite number of trials, that is not guaranteed. It is possible (indeed, there is a positive probability) that in a finite number of trials you will win even if you bet against the odds. All we can say is that the probability of winning over many trials is low. So now we are just back with the original problem: why is it rational to avoid doing something just because the probability of success is low?

Does the situation change if we think about an infinite number of trials? Well, it's not even obvious that you are guaranteed to lose there either: the law of large numbers says only that your average payout converges to $1 with probability one, and probability one is still not a guarantee.
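
The earlier point about finite trials can be made exact. You come out ahead over n plays just in case your winnings exceed your costs, that is, just in case the number of sixes W satisfies 6W > 2n, where W is binomially distributed with parameters n and 1/6. A small sketch (Python, standard library only; the function name and parameter names are mine) computes that probability exactly:

```python
from math import comb

def prob_ahead(n: int, p: float = 1/6, prize: int = 6, cost: int = 2) -> float:
    """Exact probability of a positive net after n plays.

    Net = prize * wins - cost * n, so we need wins > cost * n / prize,
    where wins ~ Binomial(n, p). Sum the binomial tail above the threshold.
    """
    threshold = cost * n / prize  # need strictly more wins than this
    return sum(comb(n, k) * p**k * (1 - p) ** (n - k)
               for k in range(n + 1)
               if k > threshold)

if __name__ == "__main__":
    for n in (1, 10, 100, 1000):
        print(f"P(ahead after {n:>4} plays) = {prob_ahead(n):.6f}")
```

For a single play the probability of coming out ahead is exactly 1/6; as n grows the probability shrinks toward zero but remains positive for every finite n, which is all the "long run" argument actually delivers.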