# Proving vector dot product properties

Proving the "associative", "distributive" and "commutative" properties for vector dot products. Created by Sal Khan.

## Want to join the conversation?

• In Linear Algebra, we are also learning about inner products. I was wondering what the difference was between dot products and inner products, and if you could make a video about inner products. Thanks!
• I think the best answer I can give you is that the inner product is a generalized version of the dot product. The dot product is defined on Euclidean vector spaces, while the inner product is defined so that it also works on abstract vector spaces, mapping a pair of vectors to a real number.

In any case, all the important properties remain:
1. The norm (or "length") of a vector is the square root of the inner product of the vector with itself.
2. The inner product of two orthogonal vectors is 0.
3. And the cosine of the angle between two vectors is the inner product of those vectors divided by the product of their norms.

Hope that helps!
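The three properties above can be checked numerically. Here is a minimal Python sketch (not from the video; the function names `dot`, `norm`, and `angle` are my own):

```python
import math

def dot(x, y):
    """Dot product of two equal-length vectors."""
    return sum(xi * yi for xi, yi in zip(x, y))

def norm(x):
    """Property 1: the length of a vector is sqrt(x . x)."""
    return math.sqrt(dot(x, x))

def angle(x, y):
    """Property 3: cos(theta) = (x . y) / (|x| |y|)."""
    return math.acos(dot(x, y) / (norm(x) * norm(y)))

u = [3.0, 4.0]
v = [-4.0, 3.0]   # perpendicular to u
print(norm(u))      # 5.0
print(dot(u, v))    # 0.0 -- property 2: orthogonal vectors have dot product 0
print(angle(u, v))  # pi/2, about 1.5708
```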
• If a vector is a matrix with one column or row, mustn't we apply the rules of matrix multiplication? In that case we could only multiply, say, a 1x3 row vector by a 3x1 column vector to get a scalar (a 1x1 matrix). Is that what the dot product is doing, just without formally writing the first vector as a row vector?
But for matrix multiplication the commutative property does not apply.
• Since the vectors are one column matrices, why aren't we multiplying vectors the same way we multiply matrices? Matrix multiplication does not allow for commutativity, and yet the dot product does. I am willing to "allow" that the dot product gives us a scalar, not another vector (as one would expect when multiplying two matrices together), but why can we do this with vectors and not matrices? I even can understand the idea that the scalar is the "shadow" of one vector onto another --but where does the matrix behavior appear? Or do matrices have their own "dot products"?
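The matrix view of the two questions above can be made concrete. In this sketch (my own illustration, not from the video), multiplying the row form by the column form recovers the dot product, while swapping the order produces a 3x3 outer product instead; that is why matrix multiplication is not commutative even though the dot product is:

```python
def matmul(A, B):
    """Naive matrix multiply for matrices stored as lists of rows."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))]
            for i in range(len(A))]

row = [[1, 2, 3]]      # a written as a 1x3 row vector
col = [[4], [5], [6]]  # b written as a 3x1 column vector

print(matmul(row, col))  # [[32]] -- a 1x1 matrix holding a . b
print(matmul(col, row))  # a 3x3 outer product: a completely different object
```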
• In this video, Sal uses Rn to generalize the proof to all n-tuples, but wouldn't the proof be just as valid if proven only in R2? (Since the proof is routine, R2 would save space and time.)
• If he proved it only in R2, the proof might not carry over to higher dimensions. In this case it happens that the R2 argument does generalize, but you can't know that until you've actually proven it in Rn. So you might as well prove it in Rn from the start.
• How can i solve the equation of dot product when vectors are parallel?
• When they both point in the same direction, the dot product is equal to their magnitudes multiplied by each other:
`a·b ≡ |a|*|b|*cos(θ)`
`a·b = |a|*|b|*cos(0)` when they point in the same direction
`a·b = |a|*|b|`
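A quick numerical check of the parallel case (my own sketch, not from the video):

```python
import math

def dot(x, y):
    return sum(xi * yi for xi, yi in zip(x, y))

def mag(x):
    return math.sqrt(dot(x, x))

a = [2.0, 0.0]
b = [5.0, 0.0]  # same direction as a, so theta = 0 and cos(theta) = 1
print(dot(a, b))        # 10.0
print(mag(a) * mag(b))  # 10.0 -- the two sides agree
```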
• But my physics instructor said ∇.F != F.∇ for a force field F. Why so?
• I don't want to step on any physics people's toes, but "del notation" is a non-mathematical hack that only provides useful memory aids for valuable real-world ideas. Be sure to think of ∇· as a single operator, not as the dot product of a "del" vector with something. There is danger in taking the metaphor too far.

• Why does the vector cross product not form a group?
• For a set G to be a group under a binary operation x [formally, we say the ordered pair (G, x) is a group], the following must hold for all elements u, v, and w in G:
1. There is an identity element e, where u x e = e x u = u.
2. For every element u, there is an element -u called u inverse such that u x -u = -u x u = e.
3. The operation is associative, i.e. (u x v) x w = u x (v x w).

The cross product, for one, fails associativity. It also has no identity element.

Thus R^3 under the cross product binary operation is not a group.
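The failure of associativity (condition 3 above) is easy to exhibit with a concrete counterexample. A small Python sketch (my own, not from the answer):

```python
def cross(u, v):
    """Cross product of two vectors in R^3."""
    return [u[1] * v[2] - u[2] * v[1],
            u[2] * v[0] - u[0] * v[2],
            u[0] * v[1] - u[1] * v[0]]

u, v, w = [1, 0, 0], [1, 1, 0], [1, 1, 1]
print(cross(cross(u, v), w))  # [-1, 1, 0]
print(cross(u, cross(v, w)))  # [0, 0, -1] -- different, so (u x v) x w != u x (v x w)
```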
• Is the dot product of a vector and a scalar quantity possible?
(1 vote)
• It looks to me like you're adding the first term plus the second term and so on up to n, which looks like a series up to n. Is there a connection between vectors and series? Could you please elaborate on this in a video (can the relation be proved, for example)?
• Well, a series is just a compact notation for writing arbitrary sums. In the case of dot products, if we have two vectors x = (x1, x2, ... , xn), and y = (y1, y2, ... , yn), and we wanted to write the dot product as a series (which we can because we can write every sum as a series), then it would be like this:
x dot y = Σ (from i=1 to n) of xi*yi.
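That summation translates directly into code. A one-liner sketch of my own, assuming the components are stored in equal-length lists:

```python
def dot(x, y):
    """x . y written as the sum over i of x_i * y_i."""
    return sum(x[i] * y[i] for i in range(len(x)))

print(dot([1, 2, 3], [4, 5, 6]))  # 1*4 + 2*5 + 3*6 = 32
```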