Summary table of probability terms
This page is a summary table of probability terms.
Table
Term | Symbol | Type | Definition |
---|---|---|---|
Reals | $\mathbf{R}$ | set | The set of real numbers |
Borel subsets of the reals | $\mathcal{B}$ or $\mathcal{B}(\mathbf{R})$ | set of subsets of $\mathbf{R}$ | The smallest sigma-algebra on $\mathbf{R}$ containing all the open intervals |
A Borel set | $B$ | element of $\mathcal{B}$ | |
Sample space | $\Omega$ | set | The set of all possible outcomes |
Outcome | $\omega$ | element of $\Omega$ | |
Events or measurable sets | $\mathcal{F}$ | sigma-algebra on $\Omega$ | The subsets of $\Omega$ to which the probability measure assigns probabilities |
Probability measure | $P$ or $\Pr$ or $\mathbf{P}$ | $\mathcal{F} \to [0,1]$ | A countably additive function with $P(\Omega) = 1$ |
Probability triple or probability space | $(\Omega, \mathcal{F}, P)$ | triple | |
Distribution | $\mu_X$ or $P_X$ or $PX^{-1}$ or $P \circ X^{-1}$ or $\operatorname{Law}(X)$ or $\mathcal{L}(X)$ | $\mathcal{B} \to [0,1]$ | $\mu_X(B) = P(X^{-1}(B)) = P(X \in B)$ |
Induced probability space | $(\mathbf{R}, \mathcal{B}, \mu_X)$ | triple | The probability space that $X$ induces on the reals |
Cumulative distribution function or CDF | $F_X$ | $\mathbf{R} \to [0,1]$ | $F_X(x) = P(X \le x) = \mu_X((-\infty, x])$ |
Probability density function or PDF | $f_X$ | $\mathbf{R} \to [0,\infty)$ | A function with $\mu_X(B) = \int_B f_X(x)\,dx$ for all $B \in \mathcal{B}$; when $F_X$ is differentiable, $f_X = F_X'$ |
Random variable | $X$ | $\Omega \to \mathbf{R}$ | A measurable function from the sample space to the reals, i.e. $X^{-1}(B) \in \mathcal{F}$ for every $B \in \mathcal{B}$ |
Preimage of random variable | $X^{-1}$ | $2^{\mathbf{R}} \to 2^{\Omega}$, but all we need is $\mathcal{B} \to \mathcal{F}$ | $X^{-1}(B) = \{\omega \in \Omega : X(\omega) \in B\}$ |
Indicator of an event $A$ | $\mathbf{1}_A$ or $I_A$ | $\Omega \to \{0,1\}$ | $\mathbf{1}_A(\omega) = 1$ if $\omega \in A$ and $0$ otherwise |
Expectation | $E[X]$ or $\mathbf{E}X$ | maps a random variable to a real number | $E[X] = \int_\Omega X \, dP$ |
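To make the table concrete, here is a minimal Python sketch of a toy probability space (two fair coin flips) together with a random variable, its distribution, CDF, and expectation. The example and all names in it are my own illustrative choices, not anything prescribed by the table.

```python
import itertools
from fractions import Fraction

# Sample space Omega: all outcomes of two coin flips.
Omega = list(itertools.product("HT", repeat=2))

# Event space F: here, the full power set of Omega (every subset is measurable).
# Probability measure P: uniform, P(A) = |A| / |Omega|.
def P(A):
    return Fraction(len(A), len(Omega))

# Random variable X: Omega -> R, the number of heads.
def X(omega):
    return omega.count("H")

# Distribution mu_X: mu_X(B) = P(X^{-1}(B)) = P({omega : X(omega) in B}).
def mu_X(B):
    return P([omega for omega in Omega if X(omega) in B])

# Cumulative distribution function F_X(x) = P(X <= x) = mu_X((-inf, x]).
def F_X(x):
    return P([omega for omega in Omega if X(omega) <= x])

# Expectation E[X] = sum over Omega of X(omega) * P({omega}).
E_X = sum(X(omega) * P([omega]) for omega in Omega)

print(mu_X({1}))   # 1/2
print(F_X(1))      # 3/4
print(E_X)         # 1
```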
Dependencies
Let $(\Omega, \mathcal{F}, P)$ be a probability space.
- Given a random variable $X$, we can compute its distribution $\mu_X$. How? Just let $\mu_X(B) = P(X^{-1}(B)) = P(X \in B)$ for each Borel set $B$.
- Given a random variable, we can compute the probability density function. How? Compute the CDF (next item) and differentiate it, assuming the derivative exists; see the Monte Carlo sketch after this list for a numerical version.
- Given a random variable, we can compute the cumulative distribution function. How? $F_X(x) = P(X \le x) = \mu_X((-\infty, x])$.
- Given a distribution, we can retrieve a random variable. But this random variable is not unique: many different functions, on many different sample spaces, share the same distribution. This is why we can say things like "let $X \sim \mu$" without specifying a sample space.
- Given a distribution $\mu$, we can compute its density function. How? Form the CDF $F(x) = \mu((-\infty, x])$ and find the derivative of $F$ (this works whenever $\mu$ has a density at all).
- Given a cumulative distribution function $F$, we can construct a random variable with that CDF: take $\Omega = [0,1]$ with Lebesgue measure and let $X(\omega) = F^{-1}(\omega)$, the generalized inverse; see the inverse transform sketch after this list.
- Given a probability density function, can we get everything else? Yes: just integrate to get the CDF, $F(x) = \int_{-\infty}^{x} f(t)\,dt$, which in turn gets us a random variable and the distribution as above.
- Given a cumulative distribution function, how do we get the distribution? We have $\mu((-\infty, x]) = F(x)$, which gets us the measure of half-infinite intervals, but $\mathcal{B}$ is bigger than this. What do we do about the other Borel sets we need to measure? We can compute intervals like $\mu((a, b]) = F(b) - F(a)$, then finite unions and limiting operations; by the uniqueness theorem for measures (the $\pi$-$\lambda$ theorem), this pins down $\mu$ on all of $\mathcal{B}$.
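As a numerical illustration of the "random variable $\to$ distribution / CDF / PDF" direction, here is a Monte Carlo sketch on the canonical space $([0,1], \mathcal{B}([0,1]), \text{Lebesgue})$. The particular random variable $X(\omega) = -\ln \omega$ (which is Exponential(1)-distributed) and all function names are illustrative assumptions, not anything fixed by the list above.

```python
import numpy as np

rng = np.random.default_rng(0)
omega = rng.uniform(0.0, 1.0, size=100_000)   # draws of outcomes from the sample space [0,1]

def X(w):
    # A random variable X: [0,1] -> R; X(omega) = -ln(omega) is Exponential(1)-distributed.
    return -np.log(w)

x = X(omega)

def mu_X(a, b):
    # Distribution of X on an interval: mu_X((a, b]) ~ fraction of samples landing there.
    return np.mean((x > a) & (x <= b))

def F_X(t):
    # CDF: F_X(t) = P(X <= t).
    return np.mean(x <= t)

def f_X(t, h=0.05):
    # PDF approximated as the slope of the CDF (central finite difference).
    return (F_X(t + h) - F_X(t - h)) / (2 * h)

print(F_X(1.0))        # close to 1 - exp(-1)          ~ 0.632
print(mu_X(0.5, 1.5))  # close to exp(-0.5) - exp(-1.5) ~ 0.383
print(f_X(1.0))        # close to exp(-1)               ~ 0.368
```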
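And for the "distribution / CDF $\to$ random variable" direction, a sketch of inverse transform sampling, which also shows the non-uniqueness mentioned above. The target Exponential(1) CDF is just an example choice.

```python
import numpy as np

def F(t):
    return 1.0 - np.exp(-t)       # the CDF we want to realize

def F_inv(u):
    return -np.log(1.0 - u)       # its generalized inverse (quantile function)

rng = np.random.default_rng(0)
omega = rng.uniform(0.0, 1.0, size=100_000)  # outcomes from the canonical space ([0,1], Lebesgue)
x = F_inv(omega)                             # the constructed random variable X(omega) = F^{-1}(omega)

# Check that the constructed X really has CDF F.
for t in (0.5, 1.0, 2.0):
    print(np.mean(x <= t), F(t))

# Non-uniqueness: X'(omega) = F^{-1}(1 - omega) is a different function on [0,1]
# with the same distribution, which is why "let X ~ mu" picks no particular version.
x_alt = F_inv(1.0 - omega)
print(np.mean(x_alt <= 1.0), F(1.0))
```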
Philosophical details about the sample space
Given a random variable $X$ and any reasonable predicate about $X$, we can replace the predicate with its extension $X^{-1}(B) = \{\omega \in \Omega : X(\omega) \in B\}$ for some Borel set $B$. And from then on, we can write $P(X^{-1}(B))$ as $\mu_X(B)$. In other words, we can just work with Borel sets of the reals (measuring them with the distribution) rather than the original events (measuring them with the original probability measure). Where did the original probability space go? $\mu_X = P \circ X^{-1}$, so you can write $\mu_X$ using $P$. But once you already have $\mu_X$, you don't need to know what $(\Omega, \mathcal{F}, P)$ is.
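As a small illustration of this point, the snippet below (using scipy.stats, with a standard normal chosen purely as an example) computes probabilities of Borel sets directly from the distribution, with no sample space in sight.

```python
from scipy.stats import norm

# The distribution itself, not a function on any Omega.
X = norm(loc=0, scale=1)

# P(X in (a, b]) = F_X(b) - F_X(a): a statement purely about Borel subsets of R.
a, b = -1.0, 1.0
print(X.cdf(b) - X.cdf(a))                 # ~0.683

# P(X in B) for a finite union of intervals B = (-inf, -2] U (2, inf).
print(X.cdf(-2.0) + (1.0 - X.cdf(2.0)))    # ~0.0455
```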
See also
External links
- 254A, Notes 0: A review of probability theory by Terence Tao
- Basic Random Variable Concepts by Kenneth Kreutz-Delgado
- Various questions on Mathematics Stack Exchange:
  - https://math.stackexchange.com/questions/2233731/discarding-random-variables-in-favor-of-a-domain-less-definition
  - https://math.stackexchange.com/questions/18198/what-are-the-sample-spaces-when-talking-about-continuous-random-variables
  - https://math.stackexchange.com/questions/2233721/the-true-domain-of-random-variables
  - https://math.stackexchange.com/questions/712734/domain-of-a-random-variable-sample-space-or-probability-space
  - https://math.stackexchange.com/questions/23006/the-role-of-the-hidden-probability-space-on-which-random-variables-are-defined
  - https://math.stackexchange.com/questions/1612012/how-should-i-understand-the-probability-space-omega-mathcalf-p-what-d
  - https://math.stackexchange.com/questions/2531810/why-does-probability-theory-insist-on-sample-spaces
  - https://math.stackexchange.com/questions/1690289/what-is-a-probability-distribution
  - https://math.stackexchange.com/questions/1073744/distinguishing-probability-measure-function-and-distribution
  - https://math.stackexchange.com/questions/57027/concept-of-probability-distribution