> Expected value doesn't mean jack shit if the game can only be played once.
Thinking like this was the mistake I've made.
While you can play a given game only once, your life will have plenty of such games. So "expected value" definitely is relevant. And this is easy to simulate with a program: the expected wealth of those who take the chance whenever the "local expected value" is better than the certain outcome does tend to be higher.
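A minimal sketch of that simulation (the payoffs here, a sure 1 vs. a 50% shot at 5, are made-up stand-ins for whatever the actual offers are):

```python
import random

def average_final_wealth(take_gamble, rounds=1000, lifetimes=10000):
    """Average wealth at the end of many simulated 'lifetimes'.

    Each round offers either a sure gain of 1 or a 50% chance of 5
    (expected value 2.5). Only the structure matters: sure thing
    vs. a higher-EV coin flip.
    """
    total = 0.0
    for _ in range(lifetimes):
        wealth = 0.0
        for _ in range(rounds):
            if take_gamble:
                wealth += 5.0 if random.random() < 0.5 else 0.0
            else:
                wealth += 1.0
        total += wealth
    return total / lifetimes

print("always take the sure thing:", average_final_wealth(False))  # ~1000
print("always take the gamble:    ", average_final_wealth(True))   # ~2500
```

Of course this only shows the average over many repetitions; the spread of outcomes, and whether a bad streak ruins you, is exactly what the rest of the thread is arguing about.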
Well, life doesn’t always give many chances to play a game. You can only work at so many failed startups, or have so many failed long-term romantic relationships before you’ve used your best years! Someone else already made the point about the risk of walking away empty handed, but I’m just pointing out that some domains allow for many retries and some don’t.
This represents the trap of over-rationalisation which is so prevalent in the Western world. You cannot devise universal rational guidelines suitable for every situation and every subjective experience. There is a multitude of various different factors involved in every particular situation. The lean and precise rational model breaks badly simply because it doesn’t (and can’t) account for all the factors.
> Well, life doesn’t always give many chances to play a game.
Disagree. Sure - you don't get many games involving millions of dollars, but you do get many for smaller amounts.
I could put all my extra money into paying off a low interest mortgage (guaranteed return), or I could put it in an index fund (higher average return, with no guarantees, and a potential for a loss).
And working at startups: not sure the expected value is high there. Maybe it's higher than working at a FAANG; I doubt it.
I assume you agree that some kinds of opportunities are limited. It would be silly not to try new foods because you're afraid of wasting your money, or not to say hi to your neighbor because they might ignore you, but some things are much more complicated. I'm thinking of: surgeries, mate selection, college degrees, white-collar crime, etc. I'm just saying that utility and loss aversion come into play, and "life is long" can't always save the day.
On startups, I think there are people who have been in situations where they have an expected value greater than something like a FAANG $300k/year over 3 years scenario (e.g. they own a large stake in a close-to-IPO company). And they should maybe still walk away, if the 50% chance of a tiny IPO payout would destroy their self esteem and make them feel even further behind their high-salary peers. (Also keep in mind that not everyone lands jobs at FAANG companies, so it shouldn't be super hard to find people who lucked into a startup where their EV is higher than their market salary over a few years). In other words: even if a startup somehow has higher EV, you may want to ignore the EV.
This is a much more profound statement than it seems at first and I wholeheartedly agree with it. Not only that but the gains compound over time.
It's not about the expected value of any one opportunity, it's about the expected value across all the opportunities you will encounter in your life. This also implies that one should do what they can to expose themselves to such opportunities, especially while they're young.
A very common one: You have a debt to pay off (typically mortgage). Should you put all your extra money to pay it off early or should you pay the minimum and invest the rest?
As another commenter pointed out: most investments involve this. In RE circles you often face the same dilemma: buy a rental house in a LCOL area where you get (mostly) guaranteed net income, or buy in a place like California where the rental income won't cover all the expenses, but you feel you can cover the difference and count on profiting from the hoped-for appreciation.
Insurance is also a good example someone else pointed out.
Even: get a guaranteed low-paying job as a relatively unskilled worker, or get into deep debt to go to medical school, do a residency, and earn a lot. The latter carries significant risk: some people don't do well enough to get a residency. Others get the residency but don't have what it takes to complete it. In both cases you're left with a huge amount of debt.
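For the mortgage question above, a rough sketch with made-up numbers (a 3% mortgage vs. an index fund averaging 7% with 15% annual volatility) shows the shape of the trade-off: a higher average, but a real chance of ending up behind the guaranteed option:

```python
import random

def invest_outcome(extra_per_year=6000, years=20, mean=0.07, stdev=0.15):
    """One simulated run of putting the extra money into an index fund
    (assumed 7% average return, 15% annual volatility)."""
    balance = 0.0
    for _ in range(years):
        balance = balance * (1 + random.gauss(mean, stdev)) + extra_per_year
    return balance

def prepay_outcome(extra_per_year=6000, years=20, mortgage_rate=0.03):
    """Guaranteed value of prepaying: each extra dollar effectively
    earns the mortgage rate, risk-free."""
    balance = 0.0
    for _ in range(years):
        balance = balance * (1 + mortgage_rate) + extra_per_year
    return balance

runs = sorted(invest_outcome() for _ in range(10000))
print("prepay mortgage (certain):", round(prepay_outcome()))
print("invest, average outcome:  ", round(sum(runs) / len(runs)))
print("invest, worst 5% outcome: ", round(runs[len(runs) // 20]))
```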
Every time you buy additional insurance that covers small amounts of money. Like airplane ticket insurance (that only refunds the ticket fee if you cancel), or additional rental car insurance.
Assuming that insurance companies are not stupid and only offer insurance that is +EV for them, that means it's -EV for you. If you are in a financial situation where $1-5k won't ruin you, it's rational NOT to take these kinds of insurance.
Every spot in life you encounter that can be seen purely from an EV perspective should be played that way. The only exceptions are long-tail ruinous outcomes, like house fire insurance or health insurance. That's why in many Western nations these types of insurance are mandatory.
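For the small-ticket case the arithmetic fits in a few lines; the numbers below (a $400 ticket, a $25 policy, a 3% chance of cancelling) are made up, but the conclusion is the usual one:

```python
# Made-up numbers: a $400 ticket, a $25 cancellation policy,
# and a 3% chance you actually have to cancel.
ticket, premium, p_cancel = 400, 25, 0.03

ev_with_insurance    = -premium            # the premium is the only cost; a cancellation is reimbursed
ev_without_insurance = -p_cancel * ticket  # you eat the ticket price 3% of the time

print(ev_with_insurance)     # -25
print(ev_without_insurance)  # -12.0
```

The premium has the insurer's overhead and margin baked in, so unless your estimate of the cancellation risk is far above theirs, skipping the policy wins on EV.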
Investing has a degree of this as well. And, in practice, most rational investors will diversify based on a number of factors into fairly safe but low return assets and into potentially higher return but riskier ones.
The investor's situation, I believe, is very different from the common person's. The investor has put themselves in the position of making tons and tons of financial transactions, investments, and so on. They have put themselves in a situation where EV reasoning makes sense. It seems to me that this isn't the situation for the common person.
But I agree... If you are an investor, or maybe a professional poker player, then you'd have put yourself in a position that favors reasoning guided by EV.
There are other ones as well, non-money related. For example, in sports. I believe basketball players probably try to do this. There are so many shots. They're probably using EV to guide their strategy and practice.
FiveThirtyEight writes about this from time to time. Three-point shots in basketball. Going for it on fourth down. Going for a two-point conversion. You can work out the stats for all of this sort of thing -- and there are apparently various biases explaining why coaches/players don't always follow the EV strategy.
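The shot-selection version is just expected points per attempt; with made-up (but roughly league-average-looking) percentages:

```python
# Expected points per attempt, with assumed shooting percentages.
three_point_pct = 0.36
two_point_pct   = 0.52

print("3-point attempt:", 3 * three_point_pct)  # 1.08 expected points
print("2-point attempt:", 2 * two_point_pct)    # 1.04 expected points
```

The fourth-down and two-point cases are the same comparison with different success rates and payoffs plugged in.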
I think I get it, but I'm not so sure I'm convinced. Those examples, however, don't resonate with me (I don't have a car, nor a license to drive one; nor do I own a house; I've been inside an airplane only once).
However, I believe I've done similar things with used electronics. I tend to favor buying a really cheap used one for [sometimes] 1/5 of the price instead of a new one. It could break or be of low quality, but the chances of that are small, and thus (over time -- making an EV-ish calculation) I spend less money on electronics.
I also believe I do this when buying new products. In many situations, I can pay extra for an extra year or two of 'guarantee' (not sure if the right term is 'guarantee' or 'insurance'). However, very often, the first 6 months or 1 year of guarantee is given and has its cost embedded in the price of the product. The question becomes how likely the product is to fail given that it hasn't failed in the first year. I believe the chances are small, so I don't buy it. I guess it's also an EV kind of calculation (just like the example you gave).
However, those don't seem that common, really. Maybe it's just the kind of life that I live.
Is the situation 100%1M vs. 50%50M supposed to exemplify these ones? These not-so-frequent ones for small amounts of money?
Another thing is that expected value has to do with a limit in this situation:
(1/n) * SUM[j = 1 to n] outcome(j) -> E  as n -> oo
(there is an ergodicity assumption going on here -- which doesn't always hold in practice). That limit can be E while the first who-knows-how-many hundreds of values of outcome(j) are very different from E.
How many times will things like that happen in your lifetime? A few dozen? What if you separate out the large-scale ones (like the 100%1M vs 50%50M)? The small-scale ones will be more frequent, and you can just blindly follow the EV approach for them. The large-scale ones will be extremely rare, and maybe another approach is better. No?
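A quick sketch of that limit statement for the 50%-of-50M game (figures in millions): no individual outcome(j) is ever close to E; only the running average is, and only eventually:

```python
import random

E = 0.5 * 50 + 0.5 * 0   # 25: the expected value of one play, in millions

outcomes = [50 if random.random() < 0.5 else 0 for _ in range(100000)]

total = 0.0
for n, x in enumerate(outcomes, start=1):
    total += x
    if n in (1, 10, 100, 100000):
        print(f"n={n}: outcome={x}, running average={total / n:.2f}")
# No single outcome(j) is ever 25; only the running average drifts toward E.
```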
>In many situations, I can pay extra for an extra year or two of 'guarantee' (not sure if the right term is 'guarantee' or 'insurance'). However, very often, the first 6 months or 1 year of guarantee is given and has its cost embedded in the price of the product.
An extended warranty, which is basically insurance. Leaving aside the fact that some credit cards provide it for you anyway, and things like that: yes, for most purchases this is a bad deal, because the expected value is almost certainly negative and -- probably -- if something does break you can just replace it yourself.
Here we're talking about losses rather than gains. The certainty of small losses (extended warranty purchases) vs. the chance of a relatively large loss. But it's the same idea with a negative sign.
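A back-of-the-envelope version, folding in the grandparent's point that the first year is already covered (all numbers made up):

```python
# Made-up failure rates: 10% of units fail within the first year (already
# covered by the included warranty), 15% fail at some point within 3 years.
p_fail_3yr = 0.15
p_fail_1yr = 0.10
p_fail_given_survived_yr1 = (p_fail_3yr - p_fail_1yr) / (1 - p_fail_1yr)

price, extended_warranty = 600, 90
ev_buy  = -extended_warranty                  # later failures are covered
ev_skip = -p_fail_given_survived_yr1 * price  # you replace it yourself ~5.6% of the time

print(round(p_fail_given_survived_yr1, 3))  # 0.056
print(ev_buy, round(ev_skip, 2))            # -90 vs. -33.33
```

As with the travel insurance above, the warranty only starts to make sense if a surprise repair bill would genuinely hurt.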
One of the things that isn't obvious to me, but that seems to be obvious to many people, is the decision to maximize expected value instead of the best worst-case scenario. In this situation, given how exceptional the 100%1M vs. 50%50M situation is and how the 1M will definitely kill your financial problems, it really does seem like you'd want to pick the strategy that maximizes your worst-case scenario (if choice=red, worst-case=1M; if choice=green, worst-case=0). I understand the reasoning behind expected values, I guess; it's just not clear to me that it is of any use here.
To me, the choice looks like "solve your financial issues with the red button; 100% chance" vs. "solve your financial issues and get extra money you won't really need, but with 50% chance through the green button".
I'd have a hard time choosing the green button.
It's curious because I'm a mathematician. I feel like I should know this better, but I've never really studied probability, much less statistics or economics.
(edit)
Another issue is what that "50%" statement would mean in practice. I guess it means that if you played the game long enough, 50M would come out roughly half the time (by counting). This could mean a system in which the first 10 plays always fail, the next 10 always succeed, and the ones after that have their results based on a fair die (1,2,3 -> 50M; 4,5,6 -> 0). This would certainly fit the frequency "definition". In practice, these probabilities don't mean a clean, neat thing very often.

Another issue is that the definition of that 50% says that if you played the game long enough, you'd observe the half-half split -- but you'll play it only once. Again, there is a statement about a limit (a statement about a_n, for n large), but you're only looking at a_1 (it often seems to me that people believe that information about EV transfers to information about a_1 -- it really does not). Even though I can mostly think of artificial examples (stuff like the one above), I'm not sure it'd be clear [in an actual situation] what the meaning of that '50%' is.
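Here's that contrived example as a sketch: the limiting frequency still works out to 50%, which is exactly why it says so little about any single play:

```python
import random

def outcome(j):
    """The contrived sequence from above: the first 10 plays always pay 0,
    the next 10 always pay 50, and everything after is decided by a fair
    die (1,2,3 -> 50; 4,5,6 -> 0)."""
    if j <= 10:
        return 0
    if j <= 20:
        return 50
    return 50 if random.randint(1, 6) <= 3 else 0

plays = 100000
wins = sum(1 for j in range(1, plays + 1) if outcome(j) == 50)
print(wins / plays)   # ~0.5: still a "50%" game in the limit,
                      # even though it says nothing about play #1
```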
If the $1m "solves your financial problems" or is otherwise life-changing, you should almost certainly take the sure thing. As other discussions suggest, once you get into maybe the $3m-$5m net worth range, you presumably already don't have financial problems and another $1m is nice but not really transformative whereas $50m would be even though not a sure thing.
Even for a one-time event, at some point it makes more sense to place the bet, depending on a number of factors.
If it's hard to conceive of in this scenario, pick numbers about which it's easier to have intuition. What if you could take $10 for certain vs. a 50% chance of getting $500? Or pick some other values with the same ratio. 50% in this case just means a coin flip. You're right that no one gets the expected value. They get zero or they get $50m. But that may be a good bet depending on circumstances.
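One way to make "depending on circumstances" concrete is to run the comparison through a utility function instead of raw dollars. A sketch with an assumed CRRA utility (both the functional form and the risk-aversion level are arbitrary choices here, not anything from the thread):

```python
def crra_utility(x, gamma=3.0):
    """Constant-relative-risk-aversion utility; gamma is an assumed
    level of risk aversion (higher = more cautious)."""
    return x ** (1 - gamma) / (1 - gamma)

def prefers_coin_flip(net_worth_m, gamma=3.0):
    """Sure +$1M vs. a 50/50 shot at +$50M, judged by expected utility
    at a given current net worth (all figures in $M)."""
    sure_thing = crra_utility(net_worth_m + 1, gamma)
    coin_flip  = 0.5 * crra_utility(net_worth_m + 50, gamma) \
               + 0.5 * crra_utility(net_worth_m, gamma)
    return coin_flip > sure_thing

for w in [0.5, 1, 2, 3, 5, 10]:
    print(f"net worth ${w}M -> take the coin flip? {prefers_coin_flip(w)}")
```

With these made-up parameters, the coin flip only starts to look better somewhere in the low single-digit millions of existing net worth, which roughly matches the intuition above.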