User:IssaRice/Additivity of small risks

From Machinelearning
Revision as of 20:39, 14 November 2020 by IssaRice (talk | contribs)

Suppose you want to engage in some risky activities, such as driving on the highway or swimming in a river, each of which has some small probability of resulting in death. It turns out that when the probabilities involved are independent and small, one can simply add the probabilities together instead of doing the more complicated correct calculation.

Let <math>p_1, \ldots, p_n</math> be the probabilities of dying from each of activities <math>1, \ldots, n</math>, where each <math>0 \le p_i \le 1</math>. Then the probability of dying from doing all activities is <math>\Pr(\text{death}) = 1 - \prod_i (1 - p_i)</math>. This happens because for independent events we know how to take conjunctions (AND) by multiplying, but not disjunctions (OR), so we must first invert (take complements), use De Morgan's laws, and then invert again.
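As a sanity check, here is a minimal Python sketch comparing the exact calculation above to the naive sum (the probability values are made up for illustration):

```python
def p_death(probs):
    """Exact probability of dying from independent risks: 1 - prod(1 - p_i)."""
    survive = 1.0
    for p in probs:
        survive *= (1.0 - p)  # probability of surviving all activities
    return 1.0 - survive

# Hypothetical small risks for three activities.
probs = [1e-4, 2e-4, 5e-5]
print(p_death(probs))  # exact answer
print(sum(probs))      # naive sum; very close when the p_i are small
```

For small probabilities the two numbers agree to many decimal places; for large ones (say, several risks of 0.5) they diverge badly, since the naive sum can even exceed 1.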

But now let <math>m_i := -\log(1 - p_i)</math>. Then we have <math>\Pr(\text{death}) = 1 - \exp\left(-\sum_i m_i\right)</math>. When each <math>p_i</math> is small, we have <math>m_i \approx p_i</math> (since <math>-\log(1-x) \approx x</math> for small <math>x</math>). Thus <math>\sum_i m_i</math> is also small. For small positive values of <math>x</math> we have <math>1 - \exp(-x) \approx x</math>. Thus we have <math>\Pr(\text{death}) = 1 - \exp\left(-\sum_i m_i\right) \approx \sum_i m_i \approx \sum_i p_i</math>.
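The log-space version of the same calculation can be sketched in Python as follows (again with hypothetical risk values); it shows both that the exact answer is recovered and that each <math>m_i</math> is close to the corresponding <math>p_i</math>:

```python
import math

def p_death_via_logs(probs):
    """Exact probability via m_i = -log(1 - p_i): 1 - exp(-sum of m_i)."""
    m_total = sum(-math.log(1.0 - p) for p in probs)
    return 1.0 - math.exp(-m_total)

probs = [1e-4, 2e-4, 5e-5]       # hypothetical small risks
print(p_death_via_logs(probs))   # exact probability of death
print(sum(probs))                # the simple sum approximation
print(-math.log(1.0 - 1e-4))     # a single m_i, nearly equal to p_i = 1e-4
```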

If <math>y = -\log(1-x)</math> then <math>x = 1 - \exp(-y)</math>, which means that the two functions are inverses of each other. So it makes sense that when either quantity is small, the other is also small.
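A quick numerical check of this inverse relationship (the value of <math>x</math> here is arbitrary):

```python
import math

x = 0.001                     # an arbitrary small probability
y = -math.log(1.0 - x)        # y = -log(1 - x)
x_back = 1.0 - math.exp(-y)   # applying the inverse recovers x
print(x, y, x_back)           # y is also close to x, since x is small
```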