
### Course: Multivariable calculus > Unit 3

Lesson 3: Optimizing multivariable functions

# Second partial derivative test

How to determine if the critical point of a two-variable function is a local minimum, a local maximum, or a saddle point. Created by Grant Sanderson.

## Want to join the conversation?

• So H is the Hessian determinant?
• H = determinant of the Hessian, evaluated at (x0, y0)
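Concretely, the test described in the video can be sketched in a few lines of Python (a minimal sketch; the function name and the example values are made up for illustration):

```python
# A minimal sketch of the second partial derivative test, assuming the
# second partials have already been computed at the critical point (x0, y0).
def classify_critical_point(fxx, fyy, fxy):
    """Classify a critical point from its second partials at (x0, y0)."""
    H = fxx * fyy - fxy**2  # determinant of the Hessian at (x0, y0)
    if H < 0:
        return "saddle point"
    if H > 0:
        return "local minimum" if fxx > 0 else "local maximum"
    return "inconclusive"  # H == 0: the test gives no information

# For f(x, y) = x**2 - y**2, the origin has fxx = 2, fyy = -2, fxy = 0:
print(classify_critical_point(2, -2, 0))  # saddle point
```

Note that when H > 0, fxx and fyy necessarily have the same sign, so checking fxx alone is enough to distinguish minimum from maximum.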
• What should I do if H is equal to zero?
• When H is zero, the second derivatives can't tell you whether the point is a minimum or a maximum, so you need to turn to third derivatives.

When those fail, use fourth derivatives. And fifth. And sixth.

What if all derivatives fail? Well, that means you have a flat (constant) function.
• In the video, Grant was talking about the mixed partial derivative, and he said you could take the derivative of fx with respect to y (giving fxy) OR the derivative of fy with respect to x (giving fyx), because either calculation gives the same answer. Is this true for any problem involving maxes/mins/saddle points of a multivariable function? Or does it only apply here because fx and fy are similar?
• This is due to Clairaut's theorem:
If fxy and fyx are continuous at (a, b), then fxy(a, b) = fyx(a, b).
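Clairaut's theorem is easy to verify numerically for a specific function. Here's a sketch using central finite differences on an arbitrarily chosen smooth example (the function and evaluation point are made up):

```python
# Numerical check of Clairaut's theorem for f(x, y) = exp(x*y) + x**3 * sin(y),
# an arbitrary smooth example, using central finite differences.
import math

def f(x, y):
    return math.exp(x * y) + x**3 * math.sin(y)

def mixed_partial(f, x, y, order, h=1e-4):
    """Approximate fxy (order='xy') or fyx (order='yx') at (x, y)."""
    if order == "xy":   # differentiate in x first, then in y
        fx = lambda x, y: (f(x + h, y) - f(x - h, y)) / (2 * h)
        return (fx(x, y + h) - fx(x, y - h)) / (2 * h)
    else:               # differentiate in y first, then in x
        fy = lambda x, y: (f(x, y + h) - f(x, y - h)) / (2 * h)
        return (fy(x + h, y) - fy(x - h, y)) / (2 * h)

fxy = mixed_partial(f, 0.5, 1.2, "xy")
fyx = mixed_partial(f, 0.5, 1.2, "yx")
print(abs(fxy - fyx) < 1e-5)  # the two mixed partials agree numerically
```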
• Is this test usable if I have more than 2 variables?
• In general, at a stationary point of a twice continuously differentiable function, you have a minimum if the Hessian matrix is positive definite, a maximum when it is negative definite, and neither if it has both negative and positive eigenvalues.
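One way to check definiteness without a linear-algebra library is Sylvester's criterion: the Hessian is positive definite when all leading principal minors are positive, and negative definite when they alternate in sign starting negative. A minimal sketch under that approach (helper names are made up):

```python
# Classify a critical point of an n-variable function from its Hessian,
# using Sylvester's criterion on leading principal minors:
#   all minors > 0            -> positive definite -> local minimum
#   minors alternate -, +, -  -> negative definite -> local maximum

def det(M):
    """Determinant by cofactor expansion (fine for small Hessians)."""
    n = len(M)
    if n == 1:
        return M[0][0]
    total = 0.0
    for j in range(n):
        minor = [row[:j] + row[j+1:] for row in M[1:]]
        total += (-1) ** j * M[0][j] * det(minor)
    return total

def classify(hessian):
    n = len(hessian)
    minors = [det([row[:k] for row in hessian[:k]]) for k in range(1, n + 1)]
    if all(m > 0 for m in minors):
        return "local minimum"
    if all(m > 0 if k % 2 == 1 else m < 0
           for k, m in enumerate(minors)):   # signs -, +, -, ...
        return "local maximum"
    return "saddle or inconclusive"  # indefinite, or a zero minor

# f(x, y, z) = x^2 + y^2 + z^2 has Hessian 2I: a local minimum everywhere.
print(classify([[2, 0, 0], [0, 2, 0], [0, 0, 2]]))  # local minimum
```

In the two-variable case this reduces exactly to the video's test: the first minor is fxx and the second is H = fxx·fyy − fxy².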
• Why don’t we incorporate the Laplacian somehow? In the intuition video for the Laplacian it seemed like he was going to build up to optimisation.
• I was just wondering: how do we determine a local maximum or minimum for a function of 3 variables? Suppose we have a local maximum at (x0, y0, z0) for a 3-variable function; then the product of the three second derivatives at (x0, y0, z0) will be negative!
• From what I understand, the general form of the second partial derivative test uses the determinant of the Hessian matrix. I assume the relations for H still work out, though I don't think the saddle points could still be called saddle points, since it wouldn't be a 3D graph any more. If I'm wrong, corrections are appreciated.
• The Hessian is the Jacobian of the gradient, so H = det(Hessian), right? Under the transformation described by the Hessian, why does orientation being inverted [det(H) < 0] at a point correspond to a saddle point, while orientation being preserved [det(H) > 0] corresponds to a local maximum or minimum?
• If the Hessian determinant is negative and fxx = fyy = 0, won't the test still be inconclusive?
• It probably sounds like a very simple question, but what are x0 and y0 meant to represent? Just an arbitrary value, or the point where x and y = 0?
• It represents a point. It can be any point, including (0, 0). It's used in the formula for the second derivative test because the purpose of the test is to decide whether a given point is an extremum or a saddle point: plug the point's coordinates in, look at the result, and from it determine what type of point it is.
• What if H > 0, but fxx and fyy = 0?
Did you ever end up making the video about if h = 0?
• It's not possible. If either fxx or fyy is 0, then fxx·fyy − (fxy)² can never be positive: the first term would be 0, and we subtract (fxy)², which is at least 0 since it's a square, so the whole expression can only be 0 or negative.
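The argument above can be checked directly: with fxx = 0, H reduces to −(fxy)², which is never positive. A quick brute-force check over a grid of made-up values:

```python
# Verify that fxx = 0 forces H = fxx*fyy - fxy**2 <= 0, for a grid of
# sample values of fyy and fxy (arbitrary illustrative ranges).
def H(fxx, fyy, fxy):
    return fxx * fyy - fxy**2

print(all(H(0, fyy, fxy) <= 0
          for fyy in range(-5, 6)
          for fxy in range(-5, 6)))  # True
```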