
Mean of sum and difference of random variables

Video transcript

- [Instructor] Let's say that I have a random variable X, which is equal to the number of dogs that I see in a day, and a random variable Y, which is equal to the number of cats that I see in a day. Let's say I also know the mean of each of these random variables, the expected value. The expected value of X, which I could also denote as the mean of our random variable X, is three; let's say I expect to see three dogs a day. Similarly for the cats, the expected value of Y, which I could also denote as the mean of Y, is four; let's say I expect to see four cats a day. We've already defined how you take the mean, or expected value, of a random variable.

What we're going to think about now is the expected value of X plus Y, or, another way of saying that, the mean of the sum of these two random variables. Well, it turns out, and I'm not proving it just yet, that the mean of the sum of random variables is equal to the sum of the means. So the expected value of X plus Y is equal to the mean of random variable X plus the mean of random variable Y. In this particular case, if I were to ask what's the expected number of dogs and cats that I would see in a given day, I would add the two means: three plus four, which is equal to seven.

Similarly, if I were to ask about the difference, how many more cats than dogs would I expect to see in a given day, that's the expected value of Y minus X. What would that be? Intuitively you might say, hey, if the expected value of the sum is the sum of the expected values, then the expected value, or the mean, of the difference will be the difference of the means, and that is absolutely true. So the mean of Y minus X is equal to the mean of Y minus the mean of X. In this particular case, that would be four minus three, which is equal to one. Another way of thinking about this intuitively: on a given day I would expect to see one more cat than dog.

The example I just used involves discrete random variables; on a given day I wouldn't see 2.2 dogs or pi dogs. The expected value itself does not have to be a whole number, though, because you could of course average it over many days. But the same idea holds: the mean of a sum is the sum of the means, and the mean of a difference of random variables is the difference of the means. In a future video I'll do a proof of this.
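
As a quick illustration of the idea in the transcript, here is a minimal simulation sketch in Python. It assumes, purely for illustration, that the daily dog and cat counts follow Poisson distributions with means 3 and 4 (the video only specifies the means, not the distributions), and checks empirically that the sample mean of X + Y lands near 3 + 4 = 7 and the sample mean of Y - X lands near 4 - 3 = 1.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

# Assumed model for illustration only: daily counts drawn from Poisson
# distributions with the means given in the video (E[X] = 3, E[Y] = 4).
days = 100_000
x = rng.poisson(lam=3, size=days)  # dogs seen each day
y = rng.poisson(lam=4, size=days)  # cats seen each day

# Mean of a sum equals the sum of the means, and mean of a difference
# equals the difference of the means, so these should print values
# close to 7 and 1 respectively.
print("mean of X + Y:", (x + y).mean())
print("mean of Y - X:", (y - x).mean())
```

The Poisson choice is just a convenient way to generate nonnegative whole-number counts; the sum-of-means and difference-of-means property itself does not depend on which distributions X and Y follow.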