### Course: Linear algebra > Unit 3

Lesson 1: Orthogonal complements

- Orthogonal complements
- dim(v) + dim(orthogonal complement of v) = n
- Representing vectors in Rn using subspace members
- Orthogonal complement of the orthogonal complement
- Orthogonal complement of the nullspace
- Unique rowspace solution to Ax = b
- Rowspace solution to Ax = b example

© 2024 Khan Academy

# dim(v) + dim(orthogonal complement of v) = n

Showing that if V is a subspace of Rn, then dim(V) + dim(V's orthogonal complement) = n. Created by Sal Khan.
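The statement can be checked numerically. Here is a minimal sketch with NumPy, using an arbitrarily chosen basis (not the one from the video): stack the basis vectors of V as the columns of a matrix A; then dim(V) = rank(A), and dim of the orthogonal complement is n minus rank(A transpose).

```python
import numpy as np

# Two linearly independent vectors in R^4, so V is a 2-dimensional
# subspace of R^4 (this particular basis is an arbitrary choice).
v1 = np.array([1.0, 0.0, 2.0, 1.0])
v2 = np.array([0.0, 1.0, 1.0, 3.0])
A = np.column_stack([v1, v2])                # 4 x 2 matrix whose columns span V

n = A.shape[0]                               # ambient dimension of R^n
dim_V = np.linalg.matrix_rank(A)             # dim(V) = rank(A)
dim_V_perp = n - np.linalg.matrix_rank(A.T)  # nullity(A^T) = n - rank(A^T)

print(dim_V, dim_V_perp, n)                  # prints: 2 2 4
```

The two dimensions add up to n = 4, matching the result proved in the video.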

## Want to join the conversation?

- If V is a basis, then doesn't that imply that its columns are linearly independent, and that the dimension of V perp is 0, since dim(A) = dim(A transpose) (so dim nul A = 0 = dim nul A transpose)?
- I think where you're confused is that V is **not** a basis. V is a subspace (see the beginning of the video). Then Sal defined k vectors which form a basis **for** V.

Also, V doesn't have columns because it's not a matrix; it's a subspace. To be fair, you **could** represent it as a matrix, but if k > 0, as in the example, it would have an infinite number of columns (all the linear combinations of the basis vectors v1 to vk).

- Is there such a thing as a 3-D matrix? Like a box of numbers in three dimensions?

I feel like that should exist for some reason... an n x m x k matrix?
- Tensors can be seen as a generalization of matrices to higher dimensions (they are more complicated than that, but it's a starting point), so you could argue that a rank-3 tensor could be seen as a 3D matrix.

But other than that, matrices are always a 2D array of numbers.

- Isn't this the rank and nullity theorem?
- The result is essentially the rank-nullity theorem, which tells us that given an m by n matrix A, rank(A) + nullity(A) = n. Sal started off with an n by k matrix A but ended up with the equation rank(A transpose) + nullity(A transpose) = n. Notice that A transpose is a k by n matrix, so if we set A transpose equal to B, where both matrices have the same sizes and entries, then you get the standard form of the rank-nullity theorem: rank(B) + nullity(B) = n.
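The identity is easy to spot-check. Below is a small sketch with NumPy (the matrix B is an arbitrary example, not from the video); the nullity is computed from the singular values rather than as n minus rank, so the two sides of the equation are obtained independently.

```python
import numpy as np

def nullity(M, tol=1e-10):
    """Dimension of the null space of M: columns minus nonzero singular values."""
    s = np.linalg.svd(M, compute_uv=False)
    return M.shape[1] - int(np.sum(s > tol))

# Arbitrary 2 x 3 matrix whose second row is twice the first, so rank is 1.
B = np.array([[1., 2., 3.],
              [2., 4., 6.]])

rank = np.linalg.matrix_rank(B)
print(rank, nullity(B))                    # prints: 1 2
print(rank + nullity(B) == B.shape[1])     # rank-nullity: 1 + 2 == 3 -> True
```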

- Why do we have to use the column space of V to represent V instead of using the row space?
- He starts with V and then wants to talk about a basis for it, so he declares the vectors that make up the basis. I'm not sure there is any more special reason than that it is more common to use column vectors.

If he declared them as row vectors he would reach the same conclusion via the row space of V being the orthogonal complement of the null space N(V).
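That last claim — the row space is the orthogonal complement of the null space — can be checked numerically. A small sketch with NumPy (the matrix is a made-up example): a basis for N(A) comes from the trailing right singular vectors of A, and every row of A turns out to be orthogonal to each of them.

```python
import numpy as np

# Arbitrary 2 x 4 matrix: its rows span the row space.
A = np.array([[1., 0., 2., 1.],
              [0., 1., 1., 3.]])

rank = np.linalg.matrix_rank(A)

# The last (n - rank) right singular vectors form a basis for N(A).
_, _, Vt = np.linalg.svd(A)
null_basis = Vt[rank:]               # shape (2, 4): two basis vectors of N(A)

# Every row of A is orthogonal to every null-space basis vector,
# i.e. the row space and the null space are orthogonal complements in R^4.
print(np.allclose(A @ null_basis.T, 0))   # prints: True
```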

- Rename this as the rank-nullity theorem to make it easier to find.
- When you are indicating the number of rows and columns in a matrix, you usually choose from k, m, and n. In a square matrix, obviously, you use the same letter for both rows and columns.

Is there some rationale as to when you use an **n x k** matrix as opposed to an **n x m** matrix?

In other words, why did you choose **k**? Does it imply a different relationship to **n** than does **m**?

I hope I explained what confuses me clearly. If you understand what I am trying to say, can you help me word it better before you answer it?

Thanks.
- Short answer: usually m >= n >= k.

Long answer: I think he chose k because he wanted to concentrate on the vectors (columns) more and show that their number doesn't have to be equal to n, but it really doesn't matter that much. Most notation in math is as it is because someone wrote it that way, and from that point everyone else followed to avoid confusing people with different notation. Like in your case: the less-used k shows up and you start to be a little confused, or at least curious. :)

- What is n? Is it the number of rows of a matrix formed from the column vectors of V, or the number of column vectors of V?
- `n` is the dimension of the space in which the vectors live: `ℝⁿ`.

- If A is visualized in n-dimensional space, are the dimensions that A cannot reach null(A transpose)? Is that what null(A transpose) really is?

The rref(A) shows where it spans; why does A have to be transposed to know where it cannot span?
- From what I understood, null(A transpose) just represents the part of the overall vector that includes the free variables (a set of redundant values), hence finishing the full "k" length of a vector (along with the pivot variables that are present).

- I am having a really hard time with this.
- Why don't we say that the null space of A equals the orthogonal complement of the column space?

I mean, the orthogonal complement is the set of all x's that satisfy the equation x1v1 + x2v2 + ... + xnvn = 0, where v1, v2, ..., vn are the columns of A?

I am really confused.
- Let A be an m x n matrix. Null-space vectors live in R^n. Vectors in the column space live in R^m.

Vectors in the orthogonal complement of the column space still live in R^m. Unless m = n, there is no way to compare R^n vectors to R^m vectors.

For example, there is no notion of adding a triple (1, 0, 2) to the pair (5, -6), or of asking how we could compare the two vectors.
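The size mismatch described above is easy to see concretely. A small sketch with NumPy (the matrix is a made-up example): null-space vectors of a 2 x 3 matrix live in R^3, while column-space vectors live in R^2.

```python
import numpy as np

# A 2 x 3 matrix: m = 2 rows, n = 3 columns.
A = np.array([[1., 2., 0.],
              [0., 0., 1.]])

# A null-space vector must satisfy A @ x = 0, so x lives in R^3.
x = np.array([2., -1., 0.])
print(np.allclose(A @ x, 0))   # True: x is in N(A), a subspace of R^3

# Column-space vectors are combinations of A's columns, so they live in R^2.
c = A @ np.array([1., 1., 1.])
print(c.shape, x.shape)        # (2,) vs (3,): different ambient spaces
```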

## Video transcript

Let's say I've got some subspace
of Rn called V. So V is a subspace of Rn. And let's say that
I know its basis. Let's say the set. So I have a bunch of-- let me
make that bracket a little nicer-- so let's say the set of
the vectors v1, v2, all the way to vk, let's say that this
is equal to-- this is the basis for V. And just as a reminder, that
means that these vectors both span V and are linearly
independent. You can kind of see it's a
minimal set of vectors in Rn that spans V. So, if I were to ask you what
the dimension of V is, that's just the number of vectors
you have in your basis for the subspace. So we have 1, 2 and we
count to k vectors. So it is equal to k. Now let's think about, if we
can somehow figure out what the dimension of the orthogonal complement of V can be. And to do that, let's
construct a matrix. Let's construct a matrix
whose column vectors are these basis vectors. So let's construct a matrix
A, and let's say it looks like this. First column is v1. This first basis vector
right there. v2 is the second one, and then
you go all the way to vk. Just to make sure we remember
the dimensions, we have k of these vectors, so we're going
to have k columns. And then how many rows
are we going to have? Well, as a member of Rn, so
these are all going to have n entries in each of these
vectors, there's going to be an n-- we're going to have
n rows and k columns. It's an n by k matrix. Now, what's another way of
expressing the subspace V? Well, the basis for V is-- or
V is spanned by these basis vectors, which are the
columns of this matrix. So if I talk about the span--
so let me write out this-- V is equal to the span of
these guys, v1, v2, all the way to vk. And that's just the same thing
as the column space of A. Right? These are the column vectors,
and the span of them, that's equal to the column
space of A. Now, I said a little while ago,
we want to somehow relate this to the orthogonal
complement of V. Well, what's the orthogonal
complement of the column space of A? The orthogonal complement of
the column space of A, I showed you-- I think it was two
or three videos ago-- that the column space of A's
orthogonal complement is equal to-- you could either view
it as the null space of A transpose, which is also
called the left null space of A. This is equivalent to the
orthogonal complement of the column space of A, which is
also going to be equal to-- since this piece
right here is the same thing as V, if you take its orthogonal
complement, that's the same thing as V's orthogonal
complement. So if we want to figure out
the orthogonal complement of-- if we want to figure out
the dimension-- if we want to figure out the dimension of
the orthogonal complement of V, we just need to figure out
the dimension of the left null space of A, or the null space
of A's transpose. Let me write that down. So the dimension-- it gets you
tongue-tied sometimes-- the dimension of the orthogonal
complement of V is going to be equal to the dimension
of A transpose. Or another way to think of it
is-- sorry, not just the dimension of A transpose, the
dimension of the null space of A transpose. And if you have a good memory,
I don't use the word a lot, this thing is the nullity--
this is the nullity of A transpose. The dimension of your null
space is nullity, the dimension of your column
space is your rank. Now let's see what
we can do here. So let's just take A transpose,
so you can just imagine A transpose
for a second. I can just even draw it out. It's going to be a k by n matrix
that looks like this. These columns are going
to turn into rows. This is going to be v1
transpose, v2 transpose, all the way down to vk transpose
vectors. These are all now row vectors. So we know one thing. We know one relationship between
the rank and nullity of any matrix. We know that their sum is equal to
the number of columns we have. We know that the rank of A
transpose plus the nullity of A transpose is equal
to the number of columns of A transpose. We have n columns. Each of these rows has n entries. So it is equal to n. We saw this a while ago. And if you want just a bit of a
reminder of where that comes from, when you take a-- if I
wrote A transpose as a bunch of column vectors, which I can,
or maybe let me take some other matrix B, because I want
to just remind you why this made sense. If I take some matrix B here,
and it has got a bunch of column vectors, b1, b2, all the
way to bn, and I put it into reduced row echelon form,
you're going to have some pivot columns and some
non-pivot columns. So let's say this is
a pivot column. You know, I got a 1 and a bunch
of 0's, let's say that this is one of them, and then
let's say I got one other one that's out, and it would be a 0
there, it's a 1 down there, and everything else is
a non-pivot column. I showed you in the last video
that the dimension of your column space is the number of
pivot columns you have. So these guys are pivot columns. The corresponding column vectors
form a basis for your column space. I showed you that in
the last video. And so, if you want to know the
dimension of your column space, you just have to
count these things. You just count these things. This was equal to the number of,
well, for this B's case, the rank of B is just equal to
the number of pivot columns I have. Now the nullity is the
dimension of your null space. We've done multiple problems
where we found the null space of matrices. And every time, the dimension,
it's a bit obvious, and I actually showed you this proof,
it's related to the number of free columns you have,
or non-pivot columns. So, if you have no free
columns-- if all of your columns are pivot
columns, and none of them are
associated with free variables-- then your null
space is going to be trivial. It's just going to have
the 0 vector. But the more free variables
you have, the more dimensionality your
null space has. So the free columns correspond
to the null space, and they form actually a basis
for your null space. And because of that, the number of basis
vectors for your null space, plus the number of basis vectors for your column
space, is equal to the total number of columns you have. I
showed that to you in the past, but it's always good
to remind ourselves where things come from. But this was just
a bit of a side. I did this with a separate
vector B. Just to remind ourselves
where this thing right here came from. Now, in the last video, I showed
you that the rank of A transpose is the same thing
as the rank of A. This is equal to, this part
right here, is the same thing as the rank of A. I showed you that in
the last video. When you transpose a matrix, it
doesn't change its rank, or it doesn't change the dimension
of its column space. So we can rewrite this
statement, right here, as the rank of A plus the nullity of A
transpose is equal to n, and the rank of A is the same thing
as the dimension of the column space of A. And then the nullity of A
transpose is the same thing as the dimension of the null space
of A transpose-- that's just the definition of nullity--
they're going to be equal to n. Now what's the dimension--
what's the column space of A? The column space of A, that's
what's spanned by these vectors right here, which
were the basis for V. So this is the same thing
as the dimension of V. The dimension of the column space of A is the
same thing as the dimension of my subspace V that I started
this video with. And what is the null space
of A transpose? The null space of A transpose,
we saw already, that's the orthogonal complement of V. So I could write this as plus
the dimension of the orthogonal complement
of V is equal to n. And that's the result
we wanted. If V is a subspace of Rn-- that n
being the same n-- then the dimension of V plus
the dimension of the orthogonal complement of V is
going to be equal to n.
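The argument in the transcript can be replayed numerically. Below is a sketch with NumPy using an arbitrarily chosen basis (k = 3 vectors in R^5, not the ones from the video): build A with the basis vectors as columns, use the fact that rank(A transpose) = rank(A), and read off the dimension of the orthogonal complement as the nullity of A transpose.

```python
import numpy as np

# Basis for a subspace V of R^5: three arbitrary independent vectors.
basis = [np.array([1., 0., 0., 1., 0.]),
         np.array([0., 1., 0., 0., 1.]),
         np.array([0., 0., 1., 1., 1.])]
A = np.column_stack(basis)              # n x k = 5 x 3 matrix

n = A.shape[0]
rank_A = np.linalg.matrix_rank(A)
rank_At = np.linalg.matrix_rank(A.T)
assert rank_A == rank_At                # transposing preserves rank

dim_V = rank_A                          # the columns of A are a basis for V
dim_V_perp = n - rank_At                # nullity(A^T) = dim of V's complement

print(dim_V + dim_V_perp == n)          # prints: True
```

Here dim(V) = 3 and the orthogonal complement has dimension 2, and they add up to n = 5, exactly as the video concludes.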