
The Jacobian Determinant

How to interpret the determinant of a Jacobian matrix, along with some examples.

Want to join the conversation?

  • Gustavo Soares
    How can I interpret negative values for Jacobian Determinant?
    (33 votes)
    • robshowsides
      Great question! It means that the orientation of the little area has been reversed. For example, if you travel around a little square in the clockwise direction in the parameter space, and the Jacobian Determinant in that region is negative, then the path in the output space will be a little parallelogram traversed counterclockwise. Another way to think about it is that two little vectors with a positive cross-product (right-hand rule gives a result in the positive z-direction) will get mapped to two new vectors that give a negative result (right-hand rule gives a result in the negative z-direction).
      (50 votes)
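As a numeric sketch of this answer (the reflection matrix and the two vectors here are hypothetical, chosen only to make the sign flip visible): a map with negative determinant sends a pair of vectors whose cross product points in the positive z-direction to a pair whose cross product points in the negative z-direction.

```python
# Reflection across the line y = x: determinant is -1, so it reverses orientation.
A = [[0, 1],
     [1, 0]]

def det2(m):
    # Determinant of a 2x2 matrix: product of the main diagonal
    # minus the product of the other diagonal.
    return m[0][0] * m[1][1] - m[0][1] * m[1][0]

def apply(m, v):
    # Matrix-vector product.
    return (m[0][0] * v[0] + m[0][1] * v[1],
            m[1][0] * v[0] + m[1][1] * v[1])

def cross_z(u, v):
    # z-component of the cross product of two 2D vectors.
    return u[0] * v[1] - u[1] * v[0]

u, v = (1, 0), (0, 1)
print(det2(A))                             # -1
print(cross_z(u, v))                       # 1 (right-hand rule gives +z)
print(cross_z(apply(A, u), apply(A, v)))   # -1 (orientation reversed)
```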
  • abaybektursun
    At the end you said "I will see you in the next video". But this is the last one :'(
    (20 votes)
  • Evtushenko Georgy
    I've just published interactive visualization of Jacobian matrix https://senior-zero.github.io/JacobianVisualization/ , maybe it will be useful for someone)
    (19 votes)
  • ajaynandhivarman
    Suppose I take (x,y) = (0,0); then the Jacobian determinant is 0. How should I interpret that geometrically?
    (6 votes)
    • Alan Lorenzato
      Well, I noticed that too during the video. There are other points as well that satisfy this equation, such as (0,2pi) or (pi,pi) or every other point such that cos(x)cos(y)=1. Basically the determinant there is zero, meaning that those little squares of space get literally squeezed to zero thickness.
      If you look closely during the video, you can see that at the point (0,0) the transformation makes the x and y axes meet, and at (0,0) they're perfectly overlapping!
      (6 votes)
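A quick check of this answer in Python, using the determinant 1 - cos(x)cos(y) worked out in the video: it vanishes exactly at the points named above, where cos(x)cos(y) = 1.

```python
import math

# Jacobian determinant of f(x, y) = (x + sin y, y + sin x) from the video:
#   det J(x, y) = 1 - cos(x) * cos(y)
def jacobian_det(x, y):
    return 1 - math.cos(x) * math.cos(y)

# Points where cos(x) * cos(y) = 1, so the determinant vanishes:
for point in [(0, 0), (0, 2 * math.pi), (math.pi, math.pi)]:
    print(point, round(jacobian_det(*point), 10))  # each prints 0.0 (up to rounding)
```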
  • Timothy Simons
    I understand that the Jacobian determinant returns a factor by which a small area gets changed when undergoing a certain transformation. How then do we approximate an area over a certain region that is much larger than a small neighbourhood?
    (5 votes)
  • I. Bresnahan
    I've got a bit of a question here pertaining to where this set of videos about the Jacobian should be located.
    From the background knowledge that I have in linear algebra (3blue1brown's essence series) and the background I have on calculus (I-III from Khan Academy and calc I in school) it makes far more sense to put the Jacobian in the linear algebra playlist, at least in my eyes.
    First, the Jacobian matrix and its implications depend on your understanding of how linear transformations work and how the determinant is graphically represented. Luckily, the essence series was plenty of background for me to understand this, but others watching the calculus III playlist and not the linear algebra one might have some trouble grasping the concept or the practicality of the Jacobian since they lack that background in linear algebra.
    Second, and this is more of a personal opinion than a truth, but it seems as though calculus III material is being used to extend a linear algebra idea rather than linear algebra material being used to extend a calculus III idea. Calculus is being used to make nonlinear transformations and determinants able to be calculated precisely, not the other way around. Linear algebra isn't just being used here, it's being extended into nonlinear functions. Calculus isn't being extended.
    For these reasons, it makes sense to put the Jacobian in the linear algebra playlist instead of here. Please share thoughts, if you agree, or more importantly, if you disagree, so that I can better understand why it belongs here.
    (1 vote)
  • MandyZhang1673
    How does this work for the practice? There are 3 by 3 grids for the determinant, and I'm a bit confused as to what to multiply.
    (3 votes)
  • Jacob Webber
    Am I correct in saying that the first part of the Jacobian Determinant is the same as the divergence?
    (df1/dx * df2/dy) - (df1/dy * df2/dx)
    ^^^^^^^^^^^^^^^^^

    Is there a nice way to reason about this geometrically? What does the other part of the determinant represent and why do we subtract it to get a "stretching outness" of space?
    (3 votes)
  • parizat2709
    this is 3blue1brown right?
    (2 votes)
  • Surya Raju
    Is this used in some version of a multivariable integral?
    (2 votes)

Video transcript

- [Voiceover] In this video, I want to talk about something called the Jacobian determinant. And it's, more or less, just what it sounds like. It's the determinant of the Jacobian matrix that I've been talking to you the last couple videos about. And before we jump into it, I just want to give a quick review of how you think about the determinant itself, just in an ordinary linear algebra context. So if I'm taking the determinant of some kind of matrix, let's say, three, zero, one, two, something like this, to compute the determinant, you take these diagonal terms here, so you take three multiplied by that two, and then you subtract off the other diagonal, subtract off one multiplied by zero. And in this case, that evaluates to six. But there is, of course, much more than just a computation going on here. There's a really nice geometric intuition. Namely, if we think of this matrix, three, zero, one, two, as a linear transformation, as something that's gonna take this first basis vector over to the coordinates three, zero, and that second basis vector over to the coordinates one, two, you know, thinking about the columns, you can think of the determinant as measuring how much this transformation stretches or squishes space. And in particular, you'll notice how I have this yellow region highlighted, and this region starts off as the unit square, a square with side lengths one so its area is one. And there's nothing special about this particular region. It's just nice as a canonical shape, with an area of one, so that we can compare it to what happens after the transformation. Ask, how much does that area get stretched out? And the answer is, it gets stretched out by a factor of the determinant. That's kind of what the determinant means, is that all areas, if you were to draw up any kind of shape, not just that one square, are gonna get stretched out by a factor of six. And we can actually verify, looking at this parallelogram that the square turned into. 
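The determinant computation just described, along with the parallelogram-area check, can be sketched in a few lines of plain Python:

```python
# The video's matrix, with columns showing where the basis vectors land:
# (1, 0) goes to (3, 0), and (0, 1) goes to (1, 2).
a, b = 3, 1
c, d = 0, 2

# Determinant: product of the main diagonal minus the other diagonal.
det = a * d - b * c
print(det)  # 6

# The unit square (area 1) maps to the parallelogram spanned by (3, 0)
# and (1, 2); its area is base 3 times height 2, matching the determinant.
base, height = 3, 2
print(base * height)  # 6
```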
It has a base of three and then the height is two. And three times two is six. And that has everything to do with the fact that this three showed up here and this two showed up there. So now, let's think about what this might mean in the context of what I've been describing in the last couple videos. And if you'll remember, we had a multivariable function, something that you can write out as f one with two inputs and then the second component, f two, also with two inputs. And the function that I was looking at, that we were kind of analyzing to learn about the Jacobian, had the first component, x plus sine of y, x plus sine y, and the second component was y plus the sine of x. And the idea was that this function is not at all linear. It's gonna make everything very curvy and complicated. However, if we zoom in around a particular region, which is what this outer yellow box represents, zooming in, it will look like a linear transformation. In fact, I can kind of play this forward, and we see that even though everything is crazy, inside that zoomed in version, things loosely look like a linear function. And you'll notice I have this inner yellow box highlighted, and this yellow box inside corresponds to the unit square that I was showing in the last animation. And again, it's just a placeholder as something to watch to see how much the area of any kind of blob in that region gets stretched. So, in this particular case, when you play out the animation, areas don't really change that much. They get stretched out a little bit, but it's not that dramatic. So, if we know the matrix that describes the transformation that this looks like zoomed in, the determinant of that matrix will tell us the factor by which areas tend to get stretched out. And in particular, you can think of this little yellow box and the factor by which it gets stretched. And as a reminder, the matrix describing that zoomed in transformation is the Jacobian. 
It is this thing that kind of holds all of the partial differential information. You take the partial derivative of f, with respect to x, sorry, partial of f one of that first component, and then the partial derivative of the second component, with respect to x, and then on the other column, we have the partial derivative of that first component, with respect to y, and the partial derivative of that second component, with respect to y. And if you... Let's see, I'm gonna close this off. Close off this matrix. And if you evaluate each one of these partial derivatives at a particular point, at whatever point we happen to zoom in on, in this case, it was negative two, one, once you plug that into all of these, you get some matrix that's just full of numbers. And what turns out to be a very useful thing later on in multivariable calc concepts, is to take the determinant of that matrix, to kind of analyze how much space is getting stretched or squished in that region. So in the last video, we worked this out for this specific example here, where that top left function turned out just to be the constant function, one, right, because we were taking the partial derivative of this guy with respect to x and that was one. And likewise, in the bottom right, that was also a constant function of one. And then the others were cosine functions. This one was cosine x because we were taking the partial derivative of this second component here with respect to x. And then the top right of our matrix was cosine of y. And these are, in general, functions of x and y because you know, you're gonna plug in whatever the input point you're zooming in on. And when we're thinking about the determinant here, let's just go ahead and take the determinant in this form, in the form as a function. So I'm going to ask about the determinant of this matrix, or maybe you think of it as a matrix-valued function. And in this case, we do the same thing. 
I mean, procedurally, you know how to take a determinant. We take these diagonals, so that's just gonna be one times one, and then we subtract off the product of the other diagonal, subtract off cosine of x multiplied by cosine of y. And as an example, let's plug in this point here that we're zooming in on, negative two, one. So I'm going to plug in x is equal to negative two, and y is equal to one. And when you plug in cosine of negative two, that's gonna come out to be approximately negative 0.42. And when you plug in cosine of y, cosine of one in this case, that's gonna come out to be about 0.54. And when we multiply those, the product is gonna be about negative 0.227. And that's all stuff that you can plug into your calculator if you want. And what that means is that the total determinant, evaluated at that point, the Jacobian determinant at the point negative two, one, is about 1.227. So that's telling you that areas tend to get stretched out by this factor around that point. And that kind of lines up with what we see. We see that areas get stretched out maybe a little bit, but not that much, right? It's only by a factor of about 1.2. And now, let's contrast this. If instead we zoom in at the point where x is equal to zero and y is equal to one, so I'm gonna go over here and all I'm gonna change, all I'm gonna change is that x is equal to zero and y will still equal one, and what that means is that cosine of x, instead of being negative 0.42, well what's cosine of zero, that's actually precisely equal to one, right? We don't have to approximate on this one, which means when we multiply them, one times 0.54, well that, that's gonna now be about 0.54, right? So this one, once we actually perform the subtraction, instead when you take one minus 0.54, that's gonna give us 0.46.
So even before watching, because this determinant of the Jacobian around the point zero, one is less than one, this is telling us we should expect areas to get squished down. Precisely, they should be squished by a factor of 0.46. And let's see if this looks right, right? We're looking at the zoomed in version around that point, and areas should tend to contract around that. And indeed, they do. You see it got squished down, it looks like by a fair bit, and from our calculation, we can conclude that they got scaled down precisely by a factor of 0.46. That's what the determinant means. So like I said, this is actually a very nice notion throughout multivariable calculus, is that you look at a tiny little local neighborhood around a point, and if you just want to get a general feel for, does this function, as a transformation, tend to stretch out that region or to squish it together, how much do areas change in that little neighborhood, that's exactly what this Jacobian determinant is, you know, built to solve. So with that, I'll see you guys next video.
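The transcript's two evaluations can be reproduced in a few lines. (Note that with full-precision cosines the value at (-2, 1) comes out to about 1.225; the video's 1.227 results from first rounding the cosines to 0.42 and 0.54.)

```python
import math

# Jacobian determinant of f(x, y) = (x + sin y, y + sin x):
#   det J(x, y) = 1 * 1 - cos(x) * cos(y)
def jacobian_det(x, y):
    return 1 - math.cos(x) * math.cos(y)

# Around (-2, 1): determinant > 1, so areas get stretched out.
print(round(jacobian_det(-2, 1), 3))  # about 1.22

# Around (0, 1): determinant < 1, so areas get squished down.
print(round(jacobian_det(0, 1), 2))   # about 0.46
```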