
Expected value

In probability theory, the expected value (also called expectation, expectancy, expectation operator, mathematical expectation, mean, expectation value, or first moment) is a generalization of the weighted average. Informally, the expected value is the arithmetic mean of the possible values a random variable can take, weighted by the probability of those outcomes. Since it is obtained through arithmetic, the expected value sometimes may not even be included in the sample data set; it is not the value you would "expect" to get in reality.


The expected value of a random variable with a finite number of outcomes is a weighted average of all possible outcomes. In the case of a continuum of possible outcomes, the expectation is defined by integration. In the axiomatic foundation for probability provided by measure theory, the expectation is given by Lebesgue integration.


The expected value of a random variable X is often denoted by E(X), E[X], or EX, with E also often stylized as 𝔼 or E.[1][2][3]

Notations

The use of the letter E to denote "expected value" goes back to W. A. Whitworth in 1901.[9] The symbol has since become popular among English-language writers. In German, E stands for Erwartungswert, in Spanish for esperanza matemática, and in French for espérance mathématique.[10]


When "E" is used to denote "expected value", authors use a variety of stylizations: the expectation operator can be stylized as E (upright), E (italic), or (in blackboard bold), while a variety of bracket notations (such as E(X), E[X], and EX) are all used.


Another popular notation is μX, whereas ⟨X⟩, Xav, and X̄ are commonly used in physics,[11] and M(X) in Russian-language literature.

Examples

Let X represent the outcome of a roll of a fair six-sided die. More specifically, X will be the number of pips showing on the top face of the die after the toss. The possible values for X are 1, 2, 3, 4, 5, and 6, all of which are equally likely with a probability of 1/6. The expectation of X is

\operatorname{E}[X] = 1 \cdot \tfrac{1}{6} + 2 \cdot \tfrac{1}{6} + 3 \cdot \tfrac{1}{6} + 4 \cdot \tfrac{1}{6} + 5 \cdot \tfrac{1}{6} + 6 \cdot \tfrac{1}{6} = 3.5.

If one rolls the die n times and computes the average (arithmetic mean) of the results, then as n grows, the average will almost surely converge to the expected value, a fact known as the strong law of large numbers.
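As a quick numerical illustration, the following minimal Python sketch (seed and sample sizes are arbitrary choices) computes the exact expectation and shows running averages of simulated rolls approaching 3.5:

```python
from fractions import Fraction
import random

# Exact expectation of a fair six-sided die: E[X] = (1+2+3+4+5+6)/6 = 3.5.
exact = sum(face * Fraction(1, 6) for face in range(1, 7))
print(float(exact))  # 3.5

# Simulated running averages: by the strong law of large numbers they
# almost surely converge to 3.5 as the number of rolls grows.
random.seed(0)  # arbitrary seed for reproducibility
for n in (10, 1_000, 100_000):
    rolls = [random.randint(1, 6) for _ in range(n)]
    print(n, sum(rolls) / n)
```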

The game of roulette consists of a small ball and a wheel with 38 numbered pockets around the edge. As the wheel is spun, the ball bounces around randomly until it settles down in one of the pockets. Suppose random variable X represents the (monetary) outcome of a $1 bet on a single number ("straight up" bet). If the bet wins (which happens with probability 1/38 in American roulette), the payoff is $35; otherwise the player loses the bet. The expected profit from such a bet will be

\operatorname{E}[\text{gain from \$1 bet}] = -\$1 \cdot \tfrac{37}{38} + \$35 \cdot \tfrac{1}{38} = -\$\tfrac{1}{19}.

That is, the expected value to be won from a $1 bet is −$1/19. Thus, in 190 bets, the net loss will probably be about $10.
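A similar sketch (again with an arbitrary seed) reproduces the exact figure of −$1/19 per bet and simulates one run of 190 bets, using the $35 straight-up payoff described above:

```python
from fractions import Fraction
import random

# Exact expected profit of a $1 straight-up bet in American roulette:
# win $35 with probability 1/38, lose the $1 with probability 37/38.
expected = Fraction(35, 38) - Fraction(37, 38)
print(expected)               # -1/19 dollars, about -5.3 cents per bet
print(float(expected) * 190)  # -10.0: expected net result of 190 bets

# One simulated run of 190 bets; the actual loss fluctuates around $10.
random.seed(1)  # arbitrary seed
net = sum(35 if random.randrange(38) == 0 else -1 for _ in range(190))
print(net)
```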

Properties

Non-negativity: If X ≥ 0 (a.s.), then E[X] ≥ 0.

Linearity of expectation: The expected value operator (or expectation operator) E is linear in the sense that, for any random variables X and Y and a constant a,

\operatorname{E}[X + Y] = \operatorname{E}[X] + \operatorname{E}[Y], \qquad \operatorname{E}[aX] = a \operatorname{E}[X],

whenever the right-hand side is well-defined. By induction, this means that the expected value of the sum of any finite number of random variables is the sum of the expected values of the individual random variables, and the expected value scales linearly with a multiplicative constant. Symbolically, for N random variables X_i and constants a_i (1 ≤ i ≤ N), we have

\operatorname{E}\left[\sum_{i=1}^{N} a_i X_i\right] = \sum_{i=1}^{N} a_i \operatorname{E}[X_i].

If we think of the set of random variables with finite expected value as forming a vector space, then the linearity of expectation implies that the expected value is a linear form on this vector space.[34] A Monte Carlo check of this identity appears in the sketch below.
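The following hedged sketch (arbitrary constants a, b, seed, and sample size) checks E[aX + bY] = aE[X] + bE[Y] for a deliberately dependent pair, since linearity does not require independence:

```python
import random

# Monte Carlo check of linearity: E[aX + bY] = a E[X] + b E[Y].
# Linearity needs no independence, so Y is deliberately made a
# function of X here (Y = X**2).
random.seed(2)  # arbitrary seed
a, b, n = 3.0, -2.0, 200_000
xs = [random.gauss(0, 1) for _ in range(n)]
ys = [x * x for x in xs]

def mean(vs):
    return sum(vs) / len(vs)

lhs = mean([a * x + b * y for x, y in zip(xs, ys)])
rhs = a * mean(xs) + b * mean(ys)
print(lhs, rhs)  # both near -2, since E[X] = 0 and E[X**2] = 1
```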

Monotonicity: If X ≤ Y (a.s.), and both E[X] and E[Y] exist, then E[X] ≤ E[Y].

The proof follows from the linearity and the non-negativity property applied to Z = Y − X, since Z ≥ 0 (a.s.).

Non-degeneracy: If E[|X|] = 0, then X = 0 (a.s.).

If X = Y (a.s.), then E[X] = E[Y]. In other words, if X and Y are random variables that take different values with probability zero, then the expectation of X will equal the expectation of Y.

If X = c (a.s.) for some real number c, then E[X] = c. In particular, for a random variable X with well-defined expectation, E[E[X]] = E[X]. A well-defined expectation implies that there is one number, or rather, one constant that defines the expected value; it follows that the expectation of this constant is just the original expected value.

As a consequence of the formula |X| = X⁺ + X⁻ as discussed above, together with the triangle inequality, it follows that for any random variable X with well-defined expectation, one has |E[X]| ≤ E[|X|].

Let 1A denote the indicator function of an event A; then E[1A] is given by the probability of A. This is nothing but a different way of stating the expectation of a Bernoulli random variable, as calculated in the table above.
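A short simulation illustrates E[1A] = P(A); the event chosen here (a fair die showing 5 or 6) is only an example:

```python
import random

# E[1_A] = P(A): the sample mean of an indicator variable is just the
# relative frequency of the event. Here A = "a fair die shows 5 or 6".
random.seed(3)  # arbitrary seed
n = 100_000
hits = [1 if random.randint(1, 6) >= 5 else 0 for _ in range(n)]
print(sum(hits) / n)  # close to P(A) = 1/3
```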

Formulas in terms of CDF: If F(x) is the cumulative distribution function of a random variable X, then

\operatorname{E}[X] = \int_{-\infty}^{\infty} x \, dF(x),

where the values on both sides are well defined or not well defined simultaneously, and the integral is taken in the sense of Lebesgue–Stieltjes. As a consequence of integration by parts as applied to this representation of E[X], it can be proved that

\operatorname{E}[X] = \int_{0}^{\infty} (1 - F(x)) \, dx - \int_{-\infty}^{0} F(x) \, dx,

with the integrals taken in the sense of Lebesgue.[35] As a special case, for any random variable X valued in the nonnegative integers {0, 1, 2, 3, ...}, one has

\operatorname{E}[X] = \sum_{n=0}^{\infty} \Pr(X > n),

where P denotes the underlying probability measure.
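The tail-sum identity for nonnegative-integer-valued variables can be sanity-checked numerically; the sketch below uses a geometric distribution on {0, 1, 2, ...}, for which P(X > n) = (1 − p)^(n+1) and E[X] = (1 − p)/p in closed form (p = 0.3 is an arbitrary choice):

```python
# Tail-sum check: E[X] = sum over n >= 0 of P(X > n) for a geometric
# variable on {0, 1, 2, ...} with success probability p, where
# P(X > n) = (1 - p)**(n + 1) and, in closed form, E[X] = (1 - p) / p.
p = 0.3
tail_sum = sum((1 - p) ** (n + 1) for n in range(1_000))  # truncated series
print(tail_sum, (1 - p) / p)  # both approximately 2.3333
```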

Non-multiplicativity: In general, the expected value is not multiplicative, i.e. E[XY] is not necessarily equal to E[X]·E[Y]. If X and Y are independent, then one can show that E[XY] = E[X]E[Y]. If the random variables are dependent, then generally E[XY] ≠ E[X]E[Y], although in special cases of dependency the equality may hold.
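A simulation makes the contrast concrete: for independent standard normal X and Y the two sides agree, while for the dependent choice Y = X they differ (E[X²] = 1 versus E[X]² ≈ 0). This is a sketch with an arbitrary seed and sample size:

```python
import random

def mean(vs):
    return sum(vs) / len(vs)

random.seed(4)  # arbitrary seed
n = 200_000
xs = [random.gauss(0, 1) for _ in range(n)]
ys = [random.gauss(0, 1) for _ in range(n)]  # independent of xs

# Independent case: E[XY] and E[X]E[Y] agree (both near 0).
print(mean([x * y for x, y in zip(xs, ys)]), mean(xs) * mean(ys))

# Dependent case Y = X: E[XY] = E[X**2] = 1, but E[X]E[Y] is near 0.
print(mean([x * x for x in xs]), mean(xs) ** 2)
```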

Law of the unconscious statistician: The expected value of a measurable function g(X) of X, given that X has a probability density function f(x), is given by the inner product of f and g:[34]

\operatorname{E}[g(X)] = \int_{\mathbb{R}} g(x) f(x) \, dx.

This formula also holds in the multidimensional case, when g is a function of several random variables, and f is their joint density.[34][36]
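Assuming SciPy is available, the formula can be checked by numerical integration; the sketch takes X standard normal and g(x) = x², for which E[g(X)] = Var(X) = 1:

```python
import math
from scipy.integrate import quad

# LOTUS: E[g(X)] is the integral of g(x) f(x) dx over the real line.
# With X standard normal and g(x) = x**2, the result should be 1.
def f(x):  # density of the standard normal distribution
    return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

def g(x):
    return x * x

value, _err = quad(lambda x: g(x) * f(x), -math.inf, math.inf)
print(value)  # approximately 1.0
```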

See also

Central tendency
Conditional expectation
Expectation (epistemic)
Expectile – related to expectations in a way analogous to that in which quantiles are related to medians
Law of total expectation – the expected value of the conditional expected value of X given Y is the same as the expected value of X
Nonlinear expectation – a generalization of the expected value
Population mean
Predicted value
Wald's equation – an equation for calculating the expected value of a random number of random variables