How Do You Add Random Variables?

Adding random variables means forming a new random variable, their sum, and understanding how properties such as expected value, variance, and distribution carry over to it. Unlike the simple arithmetic addition of fixed numbers, summing random variables involves probabilistic considerations.

Understanding the Sum of Random Variables

When you "add" random variables, you are essentially creating a new random variable whose value is the sum of the outcomes of the individual random variables. For instance, if X represents the outcome of one die roll and Y represents the outcome of another die roll, then Z = X + Y is a new random variable representing the sum of the two rolls.
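To make this concrete, here is a minimal Python sketch (using NumPy, with an assumed 100,000 simulated rolls) that builds Z = X + Y for two fair dice and tabulates its empirical distribution:

```python
import numpy as np

rng = np.random.default_rng(seed=0)
n = 100_000  # assumed number of simulated experiments

x = rng.integers(1, 7, size=n)  # outcome of the first die (1..6)
y = rng.integers(1, 7, size=n)  # outcome of the second die (1..6)
z = x + y                       # the new random variable Z = X + Y

# Empirical distribution of Z: values 2..12, peaked at 7
values, counts = np.unique(z, return_counts=True)
for v, c in zip(values, counts):
    print(f"P(Z = {v:2d}) ~ {c / n:.3f}")
```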

Expected Value of a Sum

One of the most fundamental and powerful properties of adding random variables concerns their expected value (or mean). The expected value of the sum of several random variables is always equal to the sum of their individual expected values. This holds true regardless of whether the random variables are independent or dependent.

  • Principle: For any two random variables, X and Y, the expected value of their sum is given by:
    E[X + Y] = E[X] + E[Y]

This linearity extends to any number of random variables. For example, if you have n random variables X₁, X₂, ..., Xₙ:

  • General Formula: E[X₁ + X₂ + ... + Xₙ] = E[X₁] + E[X₂] + ... + E[Xₙ]

Practical Insight: This property simplifies many calculations in probability and statistics. For example, to find the expected total profit from several projects, you can simply sum the expected profit of each project, without needing to know their joint distribution or whether their outcomes are related.
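As a quick illustration (a simulation sketch with made-up numbers, not a derivation), the following shows the expected values adding up even when Y depends on X:

```python
import numpy as np

rng = np.random.default_rng(seed=1)
n = 200_000

x = rng.normal(loc=5.0, scale=2.0, size=n)  # E[X] = 5
y = 3.0 * x + rng.normal(size=n)            # Y depends strongly on X; E[Y] = 15

# Linearity of expectation holds despite the dependence:
print(np.mean(x + y))           # ~ 20
print(np.mean(x) + np.mean(y))  # ~ 20 (identical up to floating-point rounding)
```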

Variance of a Sum

While the expected value adds linearly without conditions, the variance of a sum of random variables is more nuanced and depends on their relationship:

  • For Independent Random Variables: If X and Y are independent random variables, the variance of their sum is the sum of their individual variances:
    Var[X + Y] = Var[X] + Var[Y]

  • For Dependent Random Variables: If X and Y are dependent, their covariance must be included:
    Var[X + Y] = Var[X] + Var[Y] + 2 Cov[X, Y]
    Where Cov[X, Y] is the covariance between X and Y, which measures how much they change together. If X and Y are independent, their covariance is zero, which reduces the formula to the independent case.

Example:
Imagine you have two investments, X and Y.

  • If their returns are independent (e.g., investing in completely unrelated sectors), the risk (variance) of the combined portfolio is simply the sum of their individual risks.
  • If their returns are positively correlated (e.g., both tend to go up or down together), the combined risk will be higher than just the sum of individual risks. If negatively correlated, the combined risk can be lower, offering diversification benefits.
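The following sketch uses simulated, hypothetical "returns" (not real market data) to check the variance formula numerically for two positively correlated variables:

```python
import numpy as np

rng = np.random.default_rng(seed=2)
n = 500_000

# Two positively correlated "returns" sharing a common market component
common = rng.normal(size=n)
x = common + rng.normal(size=n)   # Var[X] ~ 2
y = common + rng.normal(size=n)   # Var[Y] ~ 2, Cov[X, Y] ~ 1

lhs = np.var(x + y)
rhs = np.var(x) + np.var(y) + 2 * np.cov(x, y)[0, 1]
print(lhs, rhs)  # both ~ 6, i.e. larger than Var[X] + Var[Y] = 4
```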

Distribution of a Sum

Determining the full probability distribution of a sum of random variables can be more complex:

  • Convolution: For continuous random variables, finding the probability density function (PDF) of their sum involves an operation called convolution. For discrete random variables, the analogous operation sums products of their probability mass functions (PMFs); see the sketch after this list.
  • Special Cases:
    • Sum of Normal Variables: If X and Y are independent Normal random variables, their sum X + Y is also a Normal random variable, with mean E[X] + E[Y] and variance Var[X] + Var[Y].
    • Central Limit Theorem (CLT): For a large number of independent and identically distributed (i.i.d.) random variables with finite variance, their sum (or average) tends toward a Normal distribution, regardless of the variables' individual distribution. This is why the Normal distribution is so prevalent in statistics.
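For the discrete case, the convolution can be computed directly. The sketch below (a minimal example, assuming two fair six-sided dice) convolves the two PMFs to obtain the exact distribution of the sum:

```python
import numpy as np

# PMF of one fair die over the outcomes 1..6
die_pmf = np.full(6, 1 / 6)

# Discrete convolution of the two PMFs = PMF of the sum (outcomes 2..12)
sum_pmf = np.convolve(die_pmf, die_pmf)

for outcome, p in zip(range(2, 13), sum_pmf):
    print(f"P(X + Y = {outcome:2d}) = {p:.4f}")
```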

Summary Table of Key Properties

Property | Independent Random Variables (X, Y) | Dependent Random Variables (X, Y)
Expected Value | E[X + Y] = E[X] + E[Y] | E[X + Y] = E[X] + E[Y]
Variance | Var[X + Y] = Var[X] + Var[Y] | Var[X + Y] = Var[X] + Var[Y] + 2 Cov[X, Y]
Distribution | Determined by convolution or specific rules | More complex; generally requires the joint distribution

Understanding these principles is crucial for fields ranging from finance and engineering to scientific research, enabling accurate modeling and prediction of combined outcomes.