Tuesday, October 22, 2019
Moment Generating Functions of Random Variables
One way to calculate the mean and variance of a probability distribution is to find the expected values of the random variables X and X². We use the notation E(X) and E(X²) to denote these expected values. In general, it is difficult to calculate E(X) and E(X²) directly. To get around this difficulty, we use some more advanced mathematical theory and calculus. The end result is something that makes our calculations easier. The strategy is to define a new function, of a new variable t, called the moment generating function. This function allows us to calculate moments by simply taking derivatives.

Assumptions

Before we define the moment generating function, we begin by setting the stage with notation and definitions. We let X be a discrete random variable with probability mass function f(x). The sample space that we are working with will be denoted by S.

Rather than calculating the expected value of X, we want to calculate the expected value of an exponential function related to X. If there is a positive real number r such that E(e^(tX)) exists and is finite for all t in the interval [-r, r], then we can define the moment generating function of X.

Definition

The moment generating function is the expected value of the exponential function above. In other words, the moment generating function of X is given by:

M(t) = E(e^(tX)) = Σ e^(tx) f(x),

where the summation is taken over all x in the sample space S. This can be a finite or infinite sum, depending upon the sample space being used.

Properties

The moment generating function has many features that connect to other topics in probability and mathematical statistics. Some of its most important features include:

- The coefficient of e^(tb) is the probability that X = b.
- Moment generating functions possess a uniqueness property. If the moment generating functions for two random variables match one another, then the probability mass functions must be the same. In other words, the random variables describe the same probability distribution.
- Moment generating functions can be used to calculate moments of X.

Calculating Moments

The last item in the list above explains the name of moment generating functions and also their usefulness. Some advanced mathematics says that, under the conditions we laid out, the derivative of M(t) of any order exists at t = 0. Furthermore, in this case, we can change the order of summation and differentiation with respect to t to obtain the following formulas (all summations are over the values of x in the sample space S):

M′(t) = Σ x e^(tx) f(x)
M″(t) = Σ x² e^(tx) f(x)
M‴(t) = Σ x³ e^(tx) f(x)
M^(n)(t) = Σ x^n e^(tx) f(x)

If we set t = 0 in these formulas, then the e^(tx) term becomes e^0 = 1, and we obtain formulas for the moments of the random variable X:

M′(0) = E(X)
M″(0) = E(X²)
M‴(0) = E(X³)
M^(n)(0) = E(X^n)

This means that if the moment generating function exists for a particular random variable, then we can find its mean and its variance in terms of derivatives of the moment generating function. The mean is M′(0), and the variance is M″(0) − [M′(0)]².
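To make these formulas concrete, here is a minimal sketch in Python using the sympy library. The fair six-sided die and every name in the snippet are illustrative choices for this sketch, not part of the discussion above; the same pattern works for any discrete distribution whose moment generating function exists.

```python
import sympy as sp

t = sp.symbols('t')

# Probability mass function of a fair six-sided die (illustrative choice):
# f(x) = 1/6 for x = 1, 2, ..., 6, so the sample space S is {1, ..., 6}.
support = range(1, 7)
f = sp.Rational(1, 6)

# Moment generating function: M(t) = E(e^(tX)) = sum over S of e^(tx) f(x).
M = sum(sp.exp(t * x) * f for x in support)

# Moments come from derivatives of M evaluated at t = 0.
EX = sp.diff(M, t).subs(t, 0)        # M'(0)  = E(X)
EX2 = sp.diff(M, t, 2).subs(t, 0)    # M''(0) = E(X^2)

mean = sp.simplify(EX)               # 7/2
variance = sp.simplify(EX2 - EX**2)  # 35/12

# Sanity check: the same moments computed directly from the definition.
EX_direct = sum(x * f for x in support)
EX2_direct = sum(x**2 * f for x in support)

print(mean, variance)                        # 7/2 35/12
print(EX_direct, EX2_direct - EX_direct**2)  # 7/2 35/12
```

Differentiating the symbolic M(t) and evaluating at t = 0 mirrors the formulas M′(0) = E(X) and M″(0) − [M′(0)]², and the direct sums at the end confirm the same mean of 7/2 and variance of 35/12.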
Summary

In summary, we had to wade into some pretty high-powered mathematics, so some things were glossed over. Although we must use calculus for the above, in the end, our mathematical work is typically easier than calculating the moments directly from the definition.