
Partial derivatives of vector-valued functions

Created by Sal Khan.

Questions

  • rearmed527
    Is there a difference between ∂s and ds? If dy = 2 dx, is it legal to say (∂r/∂s) ds = ∂r?
    (12 votes)
    • Nnamdi Nwaokorie
      - ∂s is reserved specifically for partial derivatives, while ds indicates that an ordinary (regular) derivative is being taken.

      - Since ds and ∂s are two separate things, it would not be legal to do what you asked in your second question.

      Note: I know this question was asked two years ago! I am simply posting the answers for any who may read your post and also wonder the same things!
      . . . I also want to try and get one of those question badges!
      (20 votes)
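A quick sketch of the distinction above, using SymPy (not from the video; the function f = x² + t with x depending on t is a made-up example): when x itself depends on t, the total derivative df/dt picks up a chain-rule term that the partial derivative ∂f/∂t (x held fixed) does not.

```python
# Sketch: partial derivative (∂f/∂t, other variables held fixed)
# versus total derivative (df/dt, where x itself depends on t).
import sympy as sp

t = sp.symbols('t')
x = sp.Function('x')(t)          # x depends on t
f = x**2 + t                     # f(x, t) with x = x(t)

total = sp.diff(f, t)            # df/dt = 2*x*dx/dt + 1 (chain rule)
print(total)

# Holding x fixed (the partial derivative) drops the dx/dt term:
X = sp.symbols('X')              # treat x as an independent symbol
partial = sp.diff(X**2 + t, t)   # ∂f/∂t = 1
print(partial)
```

The two results differ exactly by the chain-rule term, which is why ∂ and d cannot be mixed freely.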
  • Iron Programming
    I hear Sal & Grant talk about how "Mathematicians cringe" when we treat differentials like variables/tiny-changes-in-a-direction.

    While I've heard some people say that, I've also heard many others say that it is perfectly fine to do.

    Also, all of this differential math seems to work out beautifully for me, so why would mathematicians not like it? Is there some rigorous reason some people think that mathematicians cringe?

    Who are the "official" mathematicians out there who get to overrule others, and why does everyone think that they are cringing so often (lol)?

    Thanks for your time! :-)
    (5 votes)
    • lakern
      Good question... I'll preface by saying I've never entered this debate myself. As someone who knows a decent amount of math, though, I'd say that there's no reason not to treat the differentials as tiny changes in a direction as long as we keep clear what we're talking about. Our notation df/dx implies a certain limit, while Δf/Δx implies a small step. In some ways it's an abuse of notation to say "Consider df/dx as a small step in the x direction," but really all that matters is that your audience is on the same page as you. There are many contexts in which you would want the exact derivative (think actual calculus/physics), and others where a finite difference works great (think numerical approximation). As long as you personally have the two concepts straight in your head, and you make sure your audience is following, then you're all set. Just my take on the issue :)
      (6 votes)
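A minimal numerical sketch of the distinction described above, using the made-up function f(x) = x²: the exact derivative df/dx is a limit, while a finite difference Δf/Δx is a concrete small step that approaches it as the step shrinks.

```python
# Exact derivative vs. finite difference for f(x) = x**2.
# The exact derivative at x = 3 is 2*3 = 6.
def f(x):
    return x**2

x0 = 3.0
exact = 2 * x0                   # derivative of x**2 is 2x

for dx in (1.0, 0.1, 0.001):
    finite = (f(x0 + dx) - f(x0)) / dx
    print(dx, finite)            # approaches 6 as dx shrinks
```

With dx = 1.0 the finite difference is 7.0; with dx = 0.001 it is about 6.001, illustrating that Δf/Δx only becomes df/dx in the limit.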
  • rissmac26
    Can you make the videos shorter?
    (3 votes)
  • brian.g.neaves
    On the left side of each of the two boxed equations, what is the difference between the dt (or ds) and the ∂t with the curly d (and the ∂s with the curly d)? Thanks
    (4 votes)
  • Daniel Sun
    So what is the real difference between dx and delta x?
    (2 votes)
  • csolo921
    Any videos about derivatives of algebraic functions? :(
    (2 votes)
  • Zain-ul-Abideen Sahu
    A little before the point where Sal represents the limits in terms of partial differentials, wouldn't there be a ∂r/∂x term times i (and ∂r/∂y for j, ∂r/∂z for k), or is that redundant? Because in my understanding, r is a function of x, y, and z, which are then functions of s and t.

    Thank you.
    (2 votes)
  • Surya Raju
    I feel like ∂s doesn’t give us much information about which variable is being varied to cause ∂s. Is there some alternate notation that we can use to represent the differential of s while also telling us with respect to which variable we are talking about?
    (2 votes)
    • Iron Programming
      I think you may be confused. When calculating partial derivatives of a function f(x, y), we would write something like this:
      ∂f/∂x = derivative of f(x, y) with respect to x
      ∂f/∂y = derivative of f(x, y) with respect to y

      So the notation above does tell us which variable we are taking the partial derivative with respect to.

      Hope this helps,
      - Convenient Colleague
      (2 votes)
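A small SymPy sketch of the point above (the function f = x²y + sin y is hypothetical): the symbol in the denominator of ∂f/∂x names exactly which variable is being varied.

```python
# The variable passed to diff() is the one being varied;
# all other symbols are held fixed.
import sympy as sp

x, y = sp.symbols('x y')
f = x**2 * y + sp.sin(y)

df_dx = sp.diff(f, x)   # vary x, hold y fixed: 2*x*y
df_dy = sp.diff(f, y)   # vary y, hold x fixed: x**2 + cos(y)
print(df_dx, df_dy)
```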
  • LAWES123DC
    I feel like the flux is maybe the source of the surface integral; I can't figure out what happens when the surface is in 3 dimensions.
    (2 votes)
  • festavarian2
    At one point, Sal multiplies both sides of the partial derivative by ds.
    Does this convert the vector r(s+ds, t) − r(s, t) from a "velocity" vector to a "displacement" vector? Wouldn't a pristine partial derivative be a tangent (velocity) vector? So if you multiply it by ds, wouldn't it become a displacement vector?
    (1 vote)

Video transcript

Let's have the vector-valued function r of s and t be equal to-- well, x is going to be a function of s and t. So we'll just write it as x of s and t times the x unit vector, or i, plus y of s and t times the y unit vector, or j, plus z of s and t times the z unit vector, k. So given that we have this vector-valued function, let's define, or let's think about, what it means to take the partial derivative of this vector-valued function with respect to one of the parameters, s or t. I think it's going to be pretty natural, nothing completely bizarre here. We've taken partial derivatives of non-vector-valued functions before, where we only vary one of the variables. We only take it with respect to one variable; you hold the other one constant. We're going to do the exact same thing here. And we've taken regular derivatives of vector-valued functions. The derivative in those cases just ended up being the regular derivative of each of the terms. And we're going to see, it's going to be the same thing here with the partial derivative. So let's define the partial derivative of r with respect to s. And everything I do with respect to s, you can just swap with t, and you're going to get the same exact result. I'm going to define it as being equal to the limit as delta s approaches 0 of r of s plus delta s comma t-- we're only finding the limit with respect to a change in s; we're holding t, as you can imagine, constant-- minus r of s and t, all of that over delta s. Now, if you do a little bit of algebra here-- r of s plus delta s comma t, that's the same thing as x of s plus delta s comma t, times i, plus y of s plus delta s comma t, times j, plus z of the same thing, all of that minus this thing. If you do a little bit of algebra with that-- and if you don't believe me, try it out-- this is going to be equal to the limit as delta s approaches 0-- and I'm going to write it small because it'd take up a lot of space-- of x of s plus delta s comma t minus x of s and t. I think you know where I'm going.
This is all a little bit monotonous to write out, but it never hurts. Divided by delta s, times i-- and then I'll do it in different colors, so it's less monotonous-- plus y, where that limit as delta s approaches 0 applies to every term I'm writing out here. y of s plus delta s comma t minus y of s comma t, all of that over delta s, times j. And then finally, plus z of s plus delta s comma t minus z of s and t, all of that over delta s, times the z unit vector, k. And this all comes out of this definition. If you literally just put s plus delta s in place of s-- you evaluate all this, do a little algebra-- you're going to get the exact same thing. And this, hopefully, pops out at you as: gee, we're just taking the partial derivative of each of these functions with respect to s. And these functions right here-- this x of s and t-- this is a non-vector-valued function. This y, this is also a non-vector-valued function. z is also a non-vector-valued function. When you put them all together, it becomes a vector-valued function, because we're multiplying the first one times a vector, the second one times another vector, the third one times another vector. But independently, these functions are non-vector-valued. So this is just the definition of the regular partial derivatives, where we're taking the limit as delta s approaches 0 in each of these cases. So this is the exact same thing. This is equal to the partial derivative of x with respect to s times i, plus the partial derivative of y with respect to s times j, plus the partial derivative of z with respect to s times k. I'm going to do one more thing here, and this is pseudo-mathy, but it's going to come out-- the whole reason I'm even doing this video is that it's going to give us some good tools in our tool kit for the videos that I'm about to do on surface integrals.
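The componentwise rule just described can be sketched in SymPy; the components x = st, y = s² + t, z = t·sin s below are made up for illustration, not from the video.

```python
# The partial derivative of a vector-valued function r(s, t) is
# taken component by component, holding the other parameter fixed.
import sympy as sp

s, t = sp.symbols('s t')
x = s * t                      # hypothetical component x(s, t)
y = s**2 + t                   # hypothetical component y(s, t)
z = sp.sin(s) * t              # hypothetical component z(s, t)

r = sp.Matrix([x, y, z])       # r(s, t) = x i + y j + z k
dr_ds = r.diff(s)              # = [∂x/∂s, ∂y/∂s, ∂z/∂s]
print(dr_ds)                   # each entry differentiated w.r.t. s
```

Swapping `s` for `t` in the `diff` call gives ∂r/∂t the same way, just as the transcript says.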
So I'm going to do one thing here that's a little pseudo-mathy, and that's really because differentials are these things that are very hard to define rigorously, but I think it'll give you the intuition of what's going on. So this thing right here, I'm going to say this is also equal to-- and you're not going to see this in any math textbook, and hard-core mathematicians are going to kind of cringe when they see me do this. But I like to do it because I think it'll give you the intuition on what's going on when we take our surface integrals. So I'm going to say that this whole thing right here is equal to r of s plus the differential of s-- a super small change in s-- comma t, minus r of s and t, all of that over that same super small change in s. So hopefully you understand at least why I view things this way. When I take the limit as delta s approaches 0, these delta s's are going to get super duper small. And in my head, that's how I imagine differentials. When someone writes the derivative of y with respect to x-- and let's say that they say that that is 2-- and we've done a little bit of math with differentials before-- you can imagine multiplying both sides by dx, and you get dy is equal to 2 dx. We've done this throughout calculus. The way I imagine it is: a super small change in y-- an infinitely small change in y-- is equal to 2 times an equally small change in x. If you have a super small change in x, your change in y is going to be still super small, but it's going to be 2 times that. I guess that's the best way to view it. But in general, I view differentials as super small changes in a variable. So with that out of the way, and me explaining to you that many mathematicians would cringe at what I just wrote, hopefully this shows you that this isn't some crazy thing I did. I'm just saying: delta s, as delta s approaches 0-- I kind of imagine that as ds.
And the whole reason I did that is, if you take this side and that side, and multiply both sides times this differential ds, then what happens? On the left-hand side, you get the partial of r with respect to s times ds. I'll do ds in maybe pink. This ds is just a regular differential, a super small change in s; the other one is a partial with respect to s. That's going to be equal to-- well, if you multiply this side of the equation times ds, this guy's going to disappear. So it's going to be r of s plus our super small change in s, comma t, minus r of s and t. Now let me put a little square around this. This is going to be valuable for us in the next video. We're going to actually think about what this means and how to visualize this on a surface. As you can imagine, this is a vector right here. You have two vector-valued functions and you're taking the difference. And we're going to visualize it in the next video. It's going to really help us with surface integrals. By the same exact logic, everything we did here with s, we can do with t as well. So we can define the partial of r with respect to t-- let me do it in a different color, completely different color. It's orange. The partial of r with respect to t-- the definition is just right here: the limit as delta t approaches 0 of r of s comma t plus delta t, minus r of s and t. In this situation we're holding s, you can imagine, constant. We're finding its change in t, all of that over delta t. And the same thing falls out. This is equal to the partial of x with respect to t times i, plus the partial of y with respect to t times j, plus the partial of z with respect to t times k. Same exact thing, you just kind of swap the s's and the t's. And by that same logic, you'd have the same result, but in terms of t. If you do this pseudo-mathy thing that I did up here, then you would get the partial of r with respect to t times a super small change in t.
dt, our t differential, you could imagine, is equal to r of s comma t plus dt, minus r of s and t. So let's box these two guys away. And in the next video, we're going to actually visualize what these mean. And sometimes, when you do a bunch of silly math like this, you're always like, all right, what is this all about? Remember, all I did is I said: what does it mean to take the derivative of this with respect to s or t? I played around with it a little bit, and I got this result. These two are going to be very valuable for us, I think, in getting the intuition for why surface integrals look the way they do.
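The boxed relation (∂r/∂s) ds ≈ r(s + ds, t) − r(s, t) can be checked numerically. A small sketch with a hypothetical r(s, t) (the components are made up; the video never specifies a concrete function):

```python
# Numerical check of the differential relation from the video:
# (∂r/∂s) * ds ≈ r(s + ds, t) - r(s, t) for small ds.
import math

def r(s, t):
    # hypothetical vector-valued function, components (x, y, z)
    return (s * t, s**2 + t, math.sin(s) * t)

def dr_ds(s, t):
    # exact partial derivative of each component w.r.t. s
    return (t, 2 * s, math.cos(s) * t)

s0, t0, ds = 1.0, 2.0, 1e-5
left = tuple(c * ds for c in dr_ds(s0, t0))
right = tuple(a - b for a, b in zip(r(s0 + ds, t0), r(s0, t0)))
print(left)
print(right)   # nearly equal componentwise for small ds
```

The two tuples agree to within roughly ds², which is exactly the sense in which the pseudo-mathy differential equation in the video holds.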