
### Course: Statistics and probability > Unit 9

Lesson 4: Combining random variables

- Mean of sum and difference of random variables
- Variance of sum and difference of random variables
- Intuition for why independence matters for variance of sum
- Deriving the variance of the difference of random variables
- Combining random variables
- Example: Analyzing distribution of sum of two normally distributed random variables
- Example: Analyzing the difference in distributions
- Combining normal random variables


# Mean of sum and difference of random variables


## Want to join the conversation?

- Hi, can someone please clarify my basic confusion? Let's say I have two hypothetical independent random variables X and Y like:

X: 1, 2, 3

Y: 4, 5, 6

Now if I combine these two variables, what will the resultant output X + Y be?

Will it be {1, 2, 3, 4, 5, 6}

or {5, 7, 9}? (11 votes)
- Neither. X + Y is a new random variable with one value for every (x, y) pair, so its possible values are {5, 6, 7, 8, 9}, each with its own probability. What you can add directly are the means (and, because X and Y are independent, the variances).

You would get mean: 7

Std. dev.: 1.1547 (5 votes)
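To make the answer above concrete, here is a minimal Python sketch, assuming (as the question implies) that X and Y are independent and uniform on the listed values. It enumerates all nine (x, y) pairs to build the distribution of X + Y, then computes its mean and standard deviation:

```python
from collections import Counter
from fractions import Fraction
from itertools import product
from math import sqrt

# Hypothetical setup from the question: X and Y are independent,
# each uniform on its three listed values.
x_vals = [1, 2, 3]
y_vals = [4, 5, 6]

# X + Y takes one value per (x, y) pair: 9 equally likely pairs.
counts = Counter(x + y for x, y in product(x_vals, y_vals))
n = sum(counts.values())
dist = {s: Fraction(c, n) for s, c in sorted(counts.items())}

mean = sum(s * p for s, p in dist.items())
var = sum((s - mean) ** 2 * p for s, p in dist.items())

# dist is {5: 1/9, 6: 2/9, 7: 3/9, 8: 2/9, 9: 1/9}
print(mean)                  # 7
print(round(sqrt(var), 4))   # 1.1547
```

Note that the mean (3 + 4 = 2 + 5 = 7, using the per-variable means 2 and 5) and standard deviation match the answer, but the full distribution is neither of the sets proposed in the question.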

- Where is the proof of this? Could someone help with a link? (6 votes)
- If you like maths so much, name every number. (2 votes)
- If we define a description of a number as a finite string of symbols that uses a finite alphabet of symbols, then there are only countably many descriptions. However, there are uncountably many real numbers. So almost all real numbers are indescribable! (3 votes)

- I still don't know what a sum is. (1 vote)
- The sum, in layperson's terms, is the result when you add them together. (1 vote)

- So is there a difference between an expected value and a mean? Like, the mean can be a decimal such as 5.6, but you can't actually see 5.6 cats, so would the expected value be 6 because you round upwards? Or would it be 5, because that would give you a more accurate expected value? Or am I thinking too much, and the expected value is actually just equal to the mean and therefore 5.6? (1 vote)
- Expected value *is* the average value.

So, seeing 5.6 cats could very well be the expected value, even though it's definitely not the expected outcome of any given trial. (1 vote)
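To make that concrete, here is a small Python sketch using a made-up distribution of daily cat counts (the values and probabilities are hypothetical, chosen only to produce a non-integer mean):

```python
# Hypothetical pmf of the number of cats seen in a day.
cats = {4: 0.2, 5: 0.3, 6: 0.2, 7: 0.3}

# Expected value: each value weighted by its probability.
expected = sum(value * prob for value, prob in cats.items())
print(round(expected, 1))  # 5.6 -- a valid expected value, even though
                           # no single day can produce 5.6 cats
```

No rounding to 5 or 6 is involved: the expected value is the probability-weighted mean itself, not a prediction for any one trial.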

- Does anyone know where I could find the proof video of this? (1 vote)
- I'm also stuck here. How do we actually combine two random variables? Literally, how do we construct the resultant distribution? What did you do? (1 vote)

- Let's say I have the same dog and cat scenario, with the additional knowledge that we sampled our distributions on the same days and in the same ways. In that case, let's say dogs = {3, 4, 5} and cats = {1, 2, 3}. Can we say that animals = {3 + 1, 4 + 2, 5 + 3}? (1 vote)
- Is the expected value always a mean? A sample mean? A population mean?

Expected value is the sum of multiplying probabilities by their respective events? Weird. (0 votes)
- The expected value, denoted E[X], is a concept in probability theory representing the theoretical average outcome of a random variable over a large number of trials. It's calculated by multiplying each possible value of the random variable by its corresponding probability of occurrence and summing up these products. The expected value provides insight into the central tendency, or average behavior, of the random variable. While the expected value is often referred to as the mean, it's distinct from sample means or population means commonly used in statistics, which are calculated from observed data rather than from theoretical probabilities. (1 vote)

- Prove that the arithmetic mean of the sum of two or more variables is equal to the sum of their means. (0 votes)
- The definition of expected value for a continuous RV X is E(X) = ∫ x p(x) dx. So for two RVs X and Y with joint density p(x, y), E(X + Y) = ∫∫ (x + y) p(x, y) dx dy. The integral is a linear operator, so this splits into ∫∫ x p(x, y) dy dx + ∫∫ y p(x, y) dx dy = ∫ x p(x) dx + ∫ y p(y) dy, which is E(X) + E(Y). The same is true of discrete RVs: expectation is defined as a sum rather than an integral, and a sum is likewise a linear operator. (2 votes)
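The same argument written out for the discrete case, a minimal sketch where p(x, y) denotes the joint pmf of X and Y:

```latex
\begin{aligned}
E[X+Y] &= \sum_x \sum_y (x+y)\,p(x,y) \\
       &= \sum_x x \sum_y p(x,y) \;+\; \sum_y y \sum_x p(x,y) \\
       &= \sum_x x\,p_X(x) \;+\; \sum_y y\,p_Y(y) \\
       &= E[X] + E[Y].
\end{aligned}
```

Note that no independence of X and Y is required; linearity of expectation holds for any pair of random variables with finite means.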

## Video transcript

- [Instructor] Let's say that I have a random variable X, which is equal to the number of dogs that I see in a day, and a random variable Y, equal to the number of cats that I see in a day. Let's say I also know what the mean of each of these random variables is, the expected value. So the expected value of X, which I could also denote as the mean of our random variable X: let's say I expect to see three dogs a day. And similarly for the cats, the expected value of Y, which I could also denote as the mean of Y: and this is just for the sake of (mumbles), let's say I expect to see four cats a day. And we've already defined how you take the mean of a random variable, or the expected value of a random variable.

What we're going to think about now is: what would be the expected value of X plus Y, or, another way of saying that, the mean of the sum of these two random variables? Well it turns out, and I'm not proving it just yet, that the mean of the sum of random variables is equal to the sum of the means. So this is going to be equal to the mean of random variable X plus the mean of random variable Y. And so in this particular case, if I were to ask what's the expected number of dogs and cats that I would see in a given day, well, I would add these two means: it would be three plus four, which is equal to seven.

And similarly, if I were to ask you about the difference, if I were to say how many more cats in a given day would I expect to see than dogs, so the expected value of Y minus X, what would that be? Well, intuitively you might say, hey, if the expected value of the sum is the sum of the expected values, then the expected value, or the mean, of the difference will be the difference of the means, and that is absolutely true. So this is the same thing as the mean of Y minus X, which is equal to the mean of Y minus the mean of X. And in this particular case, it would be equal to four minus three, which is equal to one. So another way of thinking about this intuitively: I would expect to see, on a given day, one more cat than dog.

In the example that I just used, these are discrete random variables: on a given day I wouldn't see 2.2 dogs or pi dogs. The expected value itself does not have to be a whole number, because you could of course average it over many days. But the same idea holds: the mean of a sum is the same thing as the sum of the means, and the mean of a difference of random variables is the same as the difference of the means. In a future video I'll do a proof of this.