
### Course: Multivariable calculus > Unit 2

Lesson 1: Partial derivatives

# Symmetry of second partial derivatives

There are many ways to take a "second partial derivative", but some of them secretly turn out to be the same thing.  Created by Grant Sanderson.

## Want to join the conversation?

• Is there no second partial derivative with respect to y? So: fyy(x,y)?
(10 votes)
• Just a minor correction: What the previous reply meant to say was either "He just didn't write it down," or "He just forgot to write it down."

At least I hope that is what the person meant to say :)
(25 votes)
• Is taking a partial derivative a commutative operation over the set of all real valued continuous differentiable multivariable functions?
(4 votes)
• I would like to point out that the operands of taking a partial derivative are
-a function
-a variable that you are taking the derivative with respect to
So, using somewhat ugly infix notation (where ' * ' stands for differentiation), it would look like:
f * x is the partial derivative of f with respect to x
We have (f * x) * y = (f * y) * x
But is f * x = x * f? Certainly not.
(5 votes)
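The equality (f * x) * y = (f * y) * x described above can also be seen numerically. A minimal sketch, assuming Python, using the function from the video, f(x, y) = sin(x)·y², and central finite differences to estimate both mixed partials:

```python
import math

def f(x, y):
    # The function used in the video: f(x, y) = sin(x) * y^2
    return math.sin(x) * y**2

def mixed_partial(g, x, y, h=1e-4, order="xy"):
    # Central-difference estimate of a mixed second partial.
    # order "xy" means: differentiate in x first, then in y (d^2 f / dy dx).
    if order == "xy":
        dgdx = lambda x, y: (g(x + h, y) - g(x - h, y)) / (2 * h)
        return (dgdx(x, y + h) - dgdx(x, y - h)) / (2 * h)
    else:
        dgdy = lambda x, y: (g(x, y + h) - g(x, y - h)) / (2 * h)
        return (dgdy(x + h, y) - dgdy(x - h, y)) / (2 * h)

# Both orders agree up to discretization error, as Schwarz's theorem predicts.
a = mixed_partial(f, 0.7, 1.3, order="xy")
b = mixed_partial(f, 0.7, 1.3, order="yx")
exact = 2 * 1.3 * math.cos(0.7)   # f_xy = 2y*cos(x), computed by hand
```

The point (0.7, 1.3) and the step size h are arbitrary choices; any smooth point gives the same agreement.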
• What's the intuition/meaning of taking the partial derivative with respect to x first, then taking the partial derivative with respect to y after that? I mean, in the first step we already consider the y term constant, so the second step seems inconsistent with the first one?
(4 votes)
• I'm late, but d^2f/dydx is essentially saying how the rate of change of f with respect to x changes as you move in the y direction.
So if at a point the rate of change of f with respect to x is 2, but when you move a little bit in the y direction it drops to 1, then d^2f/dydx is negative.
(4 votes)
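The reading above ("how the x-slope changes as you move in y") can be made concrete. A minimal sketch, assuming Python; f(x, y) = x·y is a hypothetical example chosen because its x-slope, df/dx = y, visibly grows as y grows:

```python
def f(x, y):
    # Hypothetical example (not from the video): f(x, y) = x * y, so df/dx = y.
    return x * y

h = 1e-5

def slope_x(x, y):
    # Rate of change of f in the x direction at the point (x, y).
    return (f(x + h, y) - f(x - h, y)) / (2 * h)

# Moving in the y direction increases the x-slope here, so d^2f/dydx > 0.
s0 = slope_x(1.0, 2.0)         # x-slope at y = 2.0 (about 2.0)
s1 = slope_x(1.0, 2.1)         # x-slope at y = 2.1 (about 2.1)
d2f_dydx = (s1 - s0) / 0.1     # about 1.0, the exact mixed partial of x*y
```

Had the x-slope dropped instead of risen between the two y values, the estimate would have come out negative, matching the sign convention described above.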
• Isn't the denominator supposed to be partial squared times x^2?
(4 votes)
• When you take a second derivative using Leibniz notation, think of the 'd's in the numerator as getting squared and the 'dx's in the denominator as getting squared.

So d/dx(dy/dx) = (d*dy) / (dx*dx) = d^2y/dx^2
(3 votes)
• I don't get exactly what happens at . It seems kind of like the chain rule, but since cos(x) is considered constant, wouldn't its derivative be 0?
(1 vote)
• In this case, cos(x) is a constant that is the coefficient of your variable y^2. If you were to differentiate 3x with respect to x, you would get 3. Similarly, differentiating (y^2)cos(x) with respect to y keeps your constant (cos(x)) as is, while you take the derivative of your variable (y^2 becomes 2y).
(7 votes)
• Maybe I am just bad at math, but I thought the derivative of a constant was supposed to be zero... Why in the beginning of the video did he leave the constants after taking the derivative instead of zeroing them out?
(3 votes)
• I don't think he does that. If you are asking why he doesn't touch the part of the expression that belongs to the other variable, then he does that because that is how you take a partial derivative. You only differentiate the part of the expression that contains the variable you are taking the partial derivative of.
(2 votes)
• What if you went into third, fourth, fifth, etc partial derivatives after the end of the video?
(2 votes)
• Yes, you could. You'd just keep differentiating with respect to your chosen variable, as many times as you want. So you could calculate something like fxxyxxyyyx if you had a need to. If you're interested, take f(x,y)=sin(x)e^(2y) and compute fxxyy, then fxyyx, and see what happens.
(3 votes)
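The suggested exercise can be checked without grinding through eight derivatives by hand. A minimal sketch, assuming Python with the sympy library is available:

```python
import sympy as sp

x, y = sp.symbols('x y')
f = sp.sin(x) * sp.exp(2 * y)   # the function suggested in the reply above

# Differentiate in two different orders: x, x, y, y versus x, y, y, x.
fxxyy = sp.diff(f, x, x, y, y)
fxyyx = sp.diff(f, x, y, y, x)

# Both orders give -4*sin(x)*exp(2*y), as the symmetry predicts.
same = sp.simplify(fxxyy - fxyyx) == 0
```

Since each step only uses the symmetry of one adjacent pair of derivatives, any reordering with the same total count in each variable gives the same result.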
• at , why is the second derivative written like this (in the numerator the superscript 2 is written after the partial derivative symbol, while in the denominator it is written after the x)? What is the intuition behind this notation?
(2 votes)
• When you take a second derivative using Leibniz notation, think of the 'd's in the numerator as getting squared and the 'dx's in the denominator as getting squared.

So d/dx(dy/dx) = (d*dy) / (dx*dx) = d^2y/dx^2
(3 votes)
• Please, is there no article on partial derivatives?
(2 votes)
• I am struggling to understand what the second derivative shows for the function in question.
The first partial derivative, if I understand correctly, is the slope of the graph. So is the second derivative the rate of change of the slope? Can someone give me an explanation?
(exhibiting how clueless I am ..)
(3 votes)
• Yes, the second derivative is the rate of change of the slope. It's how the slope of a graph is changing. If the slope is increasing, then the graph will curl upwards like a u. If it is decreasing, it will bend downwards like an n.
(1 vote)
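The single-variable version of "the second derivative is the rate of change of the slope" is easy to check numerically. A quick sketch, assuming Python; f(x) = x² is a hypothetical example whose slope 2x is always increasing, so f'' > 0 and the graph curls upward like a u:

```python
def f(x):
    return x ** 2

h = 1e-4

def slope(x):
    # Central-difference estimate of the slope of f at x.
    return (f(x + h) - f(x - h)) / (2 * h)

# The slope itself grows with x, and its rate of change is f''(x) = 2.
second = (slope(1.0 + h) - slope(1.0 - h)) / (2 * h)
```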

## Video transcript

- [Voiceover] So in the last couple videos, I talked about partial derivatives of multivariable functions. And here, I want to talk about second partial derivatives. So I'm gonna write some kind of multivariable function. Let's say it's well, sine of x times y squared. Sine of x multiplied by y squared. And if you take the partial derivative, you have two options, given that there's two variables. You can go one way and say what's the partial derivative? Partial derivative of f with respect to x. And what you do for that, x looks like a variable as far as this direction is concerned. Y looks like a constant. So we differentiate this by saying the derivative of sine of x is cosine x. You know, you're differentiating with respect to x. And then that looks like it's multiplied by a constant. So you just continue multiplying that by a constant. But you could also go another direction. You could also say, you know, what's the partial derivative with respect to y? And in that case, you're considering y to be the variable. So here it looks at y. And says y squared looks like a variable. X looks like a constant. Sine of x then just looks like sine of a constant, which is a constant. So that will be that constant, sine of x, multiplied by the derivative of y squared. Which is gonna be two times y. Two times y. And these are what you might call first partial derivatives. And there's some alternate notation here, df dy. You could also say f and then a little subscript y. And over here, similarly, you'd say f with a little subscript x. Now each of these two functions, these two partial derivatives that you get are also multivariable functions. They take in two variables and they output a scalar. So we can do something very similar, where here you might then apply the partial derivative with respect to x to that partial derivative of your original function with respect to x, right. It's just like a second derivative in ordinary calculus, but this time we're doing it partial. 
So when you do it with respect to x, cosine x looks like cosine of a variable. The derivative of which is negative sine times that variable. And y squared here just looks like a constant. So it just stays constant at y squared. And, similarly, you could go down a different branch of options here. And say what if you did your partial derivative with respect to y? Of that whole function, which itself is a partial derivative with respect to x. And if you did that, then y squared now looks like the variable. So you're gonna take the derivative of that, which is two y. Two y. And then what's in front of it just looks like a constant as far as the variable y is concerned. So that stays as cosine of x. And the notation here. First of all, just as in single-variable calculus, it's common to kind of do an abuse of notation with this kind of thing and write partial squared of f divided by partial x squared. And this always, I don't know. When I first learned about these things, they always threw me off because here, this Leibniz notation, you have the great intuition of, you know, nudging the x and nudging the f. But you kind of lose that when you do this. But it makes sense if you think of this partial, partial x as being an operator and you're just applying it twice. And over here, the way that that would look, it's a little bit funny. Because you still have that partial squared f on top. But then on the bottom, you write partial y, partial x. And, you know, I'm putting them in this order just because it's as if I wrote it that way, right. This reflects the fact that first I did the x derivative. Then I did the y derivative. And you could do this on this side also. And this might feel tedious, but it's actually kind of worth doing for a result that we end up seeing here that I find a little bit surprising, actually. So here, if we go down the path of doing, in this case, like a partial derivative with respect to x.
And, you know, you're thinking of this as being applied to your original partial derivative with respect to y. It looks here, it says sine of x looks like a variable. Two y looks like a constant. So what we end up getting is derivative of sine of x, cosine x. Multiplied by that two y. And a pretty cool thing worth pointing out here that maybe you take it for granted. Maybe you think it's as surprising as I did when I first saw it. Both of these turn out to be equal, right. Even though it was a very different way that we got there, right? You first take the partial derivative with respect to x and you get cosine x, y squared. Which looks very different from sine x, two y. And then when you take the derivative with respect to y, you know, you get a certain value. And when you go down the other path, you also get that same value. And maybe the way that you write this is that you'd say Let me just copy this guy over here. And what you might say is that the partial derivative of f. When you do it the other way around, when instead of doing x and then y, you do y and then x. Partial x. That these guys are equal to each other. And that's a pretty cool result. And maybe in this case, given that the original function just looks like the product of two things, you can kind of reason through why it's the case. But what's surprising is that this turns out to be true for, I mean, not all functions. There's actually a certain criterion. There's a special theorem, it's called Schwarz's theorem. Where if the second partial derivatives of your function are continuous at the relevant point, that's the circumstance for this being true. But for all intents and purposes, the kind of functions you can expect to run into, this is the case. This order of partial derivatives doesn't matter. It turns out to hold. Which is actually pretty cool. And I'd encourage you to play around with some other functions.
Just come up with any multivariable function, maybe a little bit more complicated than just multiplying two separate things there, and see that it's true. And maybe try to convince yourself why it's true in certain cases. I think that ought to actually be a really good exercise. And just before I go, one thing that I should probably mention, a bit of notation that people will commonly use. With the second partial derivative, sometimes instead of saying partial squared f, partial x squared, they'll just write it as partial and then x, x. And over here, this would be partial. Let's see, first you did it with x, then y. So over here you do it first x and then y. Kind of the order of these reverses. Because you're reading left to right. But when you do it with this, you're kind of reading right to left for how you multiply it in. Which would mean that this guy, let's see, this guy over here. Now he would be partial. First you did the y, and then you did the x. So those two guys are just different notations for the same thing. I mean, that can make it a little bit more convenient when you don't want to write out the entire partial squared f divided by partial x squared or things like that. And with that, I'll call it an end.
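The whole computation the transcript walks through by hand can be reproduced symbolically. A minimal sketch, assuming Python with the sympy library is available, using the video's function f(x, y) = sin(x)·y²:

```python
import sympy as sp

x, y = sp.symbols('x y')
f = sp.sin(x) * y**2          # the function used in the video

fx = sp.diff(f, x)            # cos(x)*y**2  -- first partial in x
fy = sp.diff(f, y)            # 2*y*sin(x)   -- first partial in y

fxy = sp.diff(fx, y)          # x first, then y
fyx = sp.diff(fy, x)          # y first, then x

# Both mixed partials come out to 2*y*cos(x), as shown in the video.
print(sp.simplify(fxy - fyx))  # 0
```

Swapping in any other smooth expression for f is a quick way to do the "play around with some other functions" exercise the video suggests.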