# Introduction to orthonormal bases

Looking at sets and bases that are orthonormal -- that is, where all the vectors have length 1 and are orthogonal to each other. Created by Sal Khan.

## Want to join the conversation?

• It seems to me that when Sal proves that the orthonormal set is linearly independent, he just proves that a member of his set isn't a multiple of another member of his set. Shouldn't he prove that the member is not a linear combination of all other members of his set, so that the set, not just 2 vectors, would be proven linearly independent?
If no 2 vectors in his set are scalar multiples of each other, the set could still be linearly dependent. Please elaborate, thank you.
• You're right, but the proof can be extended to show the v's are linearly independent.

First suppose that the v's are linearly dependent. Then some v_i is a linear combination of the others: v_i = c_1*v_1 + c_2*v_2 + ... + c_{i-1}*v_{i-1} + c_{i+1}*v_{i+1} + ... + c_n*v_n. Since you've defined the v's so that they are all orthogonal, v_i . v_k = 0 for every k != i, so (c_1*v_1 + c_2*v_2 + ... + c_{i-1}*v_{i-1} + c_{i+1}*v_{i+1} + ... + c_n*v_n) . v_k = 0. All the terms on the left hand side except for c_k*v_k are wiped out by orthogonality, leaving (c_k*v_k) . v_k = c_k*||v_k||^2 = 0. Since you defined ||v_k||^2 = 1, c_k must be zero, and since k was an arbitrary index, c_k = 0 for every k != i. But then v_i = 0, which contradicts ||v_i|| = 1. Thus the v's are linearly independent.
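As a small numeric sanity check of the argument above (not from the video), here is a Python sketch using the standard orthonormal basis of R^3 as the v's. Dotting a linear combination with each v_k isolates exactly the coefficient c_k, which is why all coefficients are forced to zero in the proof:

```python
def dot(u, v):
    """Dot product of two vectors given as lists of numbers."""
    return sum(a * b for a, b in zip(u, v))

# The standard basis of R^3: an orthonormal set.
v = [
    [1.0, 0.0, 0.0],
    [0.0, 1.0, 0.0],
    [0.0, 0.0, 1.0],
]

# Orthonormality: v_i . v_j = 1 if i == j, else 0.
for i in range(3):
    for j in range(3):
        assert dot(v[i], v[j]) == (1.0 if i == j else 0.0)

# Build a linear combination c_1*v_1 + c_2*v_2 + c_3*v_3 ...
c = [2.0, -3.0, 5.0]
combo = [sum(c[i] * v[i][m] for i in range(3)) for m in range(3)]

# ... and dot it with each v_k: orthonormality wipes out every term
# except c_k*(v_k . v_k) = c_k, recovering the coefficient exactly.
for k in range(3):
    assert dot(combo, v[k]) == c[k]
```

So if the combination were the zero vector, every c_k would have to be zero, which is the linear-independence condition.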
• What are the prerequisites for this lesson? How do I determine what other videos I need to watch in order to understand this one?

I have < 1 week (for a Quantum Computing course), it mentions specifically this and one other Linear Algebra topic (eigenvalues/vectors). I've been serially watching every video in the "Linear Algebra" section from the beginning, but there will not be enough time.

So, how to determine what videos I can skip in order to reach this one and be able to understand it?
• In my honest opinion, you will require more than one week to get to this point. For example, I have been working up to this for about 2 months and I practice every day.
• All these concepts are directly applied to electrons in atoms. In that sense, if I consider Vi to be the wave function of the i^th electron, is it correct to think of it as follows:

Normalized vector Vi: the total probability of finding the electron is 1 (Vi.Vi = 1)
No two electrons can be in the same place (because Vi.Vj = 0)
Vi and Vj are linearly independent: one electron's position does not overlap with another electron's.

Please give corrections and suggestions for further reading at a basic level on this.
• Hello there, big fan of yours... how come you say "it can span V"? Doesn't it have to span V in the first place? Isn't the definition of an orthonormal set just a standard independent set of vectors at a slightly different angle from the normal x/y/z/... axes?
Hopefully I explained myself OK with my lousy English. Thanks in advance, keep up the incredible work, you make me love math, simply love it.
• He said it spans a subspace, not the entire space.
• If you have an orthonormal basis set u, then is the inner product <u|u> defined to be 1?
• I think you're confusing sets and their elements. An orthonormal basis is a set of vectors, whereas "u" is a vector. Say B = {v_1, ..., v_n} is an orthonormal basis for the vector space V, with some inner product defined, say < , >. Now <v_i, v_j> = d_ij, where d_ij = 0 if i is not equal to j, and 1 if i = j. This is called the Kronecker delta. This says that if you take an element of my set B, such as v_1, and consider <v_1, v_1>, then this value must be 1. If the subscripts differ, you will always get zero! The short answer is yes, but you had a slight conceptual mishap in your question.
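To illustrate (a sketch, not from the video), the Kronecker delta property also holds for orthonormal bases that are not axis-aligned, such as a rotated basis of R^2:

```python
import math

def dot(u, v):
    """Dot product of two vectors given as lists of numbers."""
    return sum(a * b for a, b in zip(u, v))

theta = 0.7  # any angle: rotating the standard basis keeps it orthonormal
v1 = [math.cos(theta), math.sin(theta)]
v2 = [-math.sin(theta), math.cos(theta)]

# <v_i, v_j> = d_ij (up to floating-point rounding):
assert abs(dot(v1, v1) - 1.0) < 1e-12  # d_11 = 1
assert abs(dot(v2, v2) - 1.0) < 1e-12  # d_22 = 1
assert abs(dot(v1, v2)) < 1e-12        # d_12 = 0
```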
(1 vote)
• Must a scalar multiple of an orthogonal matrix be orthogonal as well? Is this answered in another video?
• Do you mean: if "M" is an orthogonal matrix, is "kM" orthogonal? If so, let's check the definition. I would recommend trying some examples.

"M" is orthogonal if its columns are orthonormal: unit vectors that are mutually orthogonal. Multiplying "M" by "k" scales every column by "k", so each column of "kM" has length |k| instead of 1. So "kM" is not orthogonal unless |k| = 1.
(1 vote)
• Is it called "Orthonormal bases" or "Orthonormal basis"?
It was "bases" in the title, but he said and wrote "basis" in the video.
(1 vote)
• When bases is the plural of base, it is pronounced bay-sez. One base, several bay-sez.
When bases is the plural of basis, it is pronounced bay-sees. One basis, several bay-sees.
(i.e. you never have several basises)
• For expressing the dot product of the vectors, shouldn't we put the first vector transposed?
(1 vote)
• If you treat the vectors as 1-column matrices, then yes, in order to do the dot product you have to express your first vector as a 1-row matrix. But if you are using normal vector notation (as most of the video does), then you are not committed to the matrix representation of vectors, and as such each vector can be seen as either a 1-column matrix, a 1-row matrix, a tuple of numbers, or even as an arrow in space.

In notation, there is no difference between:
```
    ⎡a_x⎤
a = ⎢a_y⎥    and    a = [a_x  a_y  a_z]
    ⎣a_z⎦
```
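To make the matrix view concrete (a sketch, not from the video): treating a and b as 3x1 column matrices, the product a^T b is a 1x1 matrix whose single entry equals the ordinary dot product a . b.

```python
def matmul(A, B):
    """Multiply two matrices given as lists of rows."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

a = [[1.0], [2.0], [3.0]]   # 3x1 column matrix
b = [[4.0], [5.0], [6.0]]   # 3x1 column matrix

a_T = [[row[0] for row in a]]   # transpose: a 1x3 row matrix
product = matmul(a_T, b)        # 1x1 matrix: [[32.0]]

# The same value as the plain dot product 1*4 + 2*5 + 3*6 = 32.
dot_ab = sum(x[0] * y[0] for x, y in zip(a, b))
assert product[0][0] == dot_ab == 32.0
```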