
# Subspace projection matrix example

Example of a transformation matrix for a projection onto a subspace. Created by Sal Khan.

## Want to join the conversation?

- I have one question.

Is A (A^T A)^(-1) A^T equal to A A^(-1) (A^T)^(-1) A^T, and therefore equal to I I = I?

I am confused about the projection matrix.

Thank you in advance! (9 votes)
- The property (AB)^(-1) = B^(-1) A^(-1) is valid only when both A and B are invertible and when the matrix product AB is defined.

If A is invertible, then it follows that A^T is also invertible. The product A^T A is always defined, because the number of columns of A^T equals the number of rows of A. In that case the simplification A (A^T A)^(-1) A^T = A A^(-1) (A^T)^(-1) A^T = I is valid, so the projection of x onto the column space is simply x.

In fact, this makes sense, because when A is invertible the system Ax = b has a unique solution for every b in R^n. This implies that the columns of A are a basis for R^n (since they are linearly independent and span R^n), and therefore the projection of an arbitrary vector x onto the subspace spanned by the columns of A is simply x, since x is already in the column space of A.

However, if A is not invertible, then some elements of R^n are not in the column space of A, so it makes sense to speak of the projection of arbitrary vectors onto C(A), which can be computed using the projection matrix A (A^T A)^(-1) A^T. Keep in mind that even if A itself is not invertible, A^T A is invertible as long as the columns of A are linearly independent. (7 votes)
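Both cases in this answer can be checked numerically. The sketch below (using NumPy; the matrices chosen are illustrative, not from the original thread) shows that the formula collapses to the identity when A is square and invertible, but gives a genuine, non-identity projection when A is tall with independent columns:

```python
import numpy as np

def projection_matrix(A):
    """Projection onto the column space of A: A (A^T A)^(-1) A^T."""
    return A @ np.linalg.inv(A.T @ A) @ A.T

# Case 1: A square and invertible. The formula collapses to the identity,
# because every vector is already in the column space of A.
A_inv = np.array([[2.0, 1.0],
                  [1.0, 1.0]])
print(np.allclose(projection_matrix(A_inv), np.eye(2)))  # True

# Case 2: A tall (4x2) with linearly independent columns. A itself has no
# inverse, but A^T A (2x2) does, so the projection matrix still exists.
A_tall = np.array([[1.0, 0.0],
                   [0.0, 1.0],
                   [0.0, 0.0],
                   [1.0, 1.0]])
P = projection_matrix(A_tall)
print(np.allclose(P, np.eye(4)))  # False: some of R^4 lies outside C(A)
print(np.allclose(P @ P, P))      # True: projecting twice changes nothing
```

The idempotence check at the end (P @ P == P) is the defining property of any projection.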

- So the final transformation matrix at 12:07 is symmetric. Is there anything significant about this? Will it always be symmetric? (2 votes)
- Yes, it will always be symmetric.

If A is a×b, then A^T is b×a. So A^T A is b×b, (A^T A)^(-1) is still b×b, and A (A^T A)^(-1) A^T is an a×a matrix.

Since in this case we are dealing with R4, we expect a vector in R4 as input, so the final transformation matrix has 4 columns. The projection is also a vector in R4, so the matrix is 4x4. The interesting thing here is that the 3rd row is all zeros. So the projection matrix takes a vector in R4 and returns a vector in R4 whose 3rd component is 0 (so the image behaves like a copy of R3).

Why is the 3rd row all zeros? Note that all the basis vectors of V have a zero in the 3rd position, so this subspace only spans vectors with components in the 1st, 2nd, and 4th dimensions. If you project something onto V, you are bound to lose the 3rd component (which is exactly what we see in the matrix, whose 3rd row is all zeros).

To conclude: yes, the transformation matrix is symmetric, and it is a genuine matrix transforming R4 vectors to R4 vectors, but some components of the output might be zero, placing the outputs in a smaller subspace. (2 votes)
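A quick numerical check of both claims in this answer, using the basis from the video (NumPy assumed; this sketch is not part of the original thread). Symmetry in general follows from transposing A (A^T A)^(-1) A^T, which returns the same expression:

```python
import numpy as np

# Basis for V from the video: columns (1, 0, 0, 1) and (0, 1, 0, 1).
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.0, 0.0],
              [1.0, 1.0]])
P = A @ np.linalg.inv(A.T @ A) @ A.T

print(np.allclose(P, P.T))   # True: the projection matrix is symmetric
print(np.allclose(P[2], 0))  # True: the 3rd row is all zeros, as noted above
```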

- How can we find the projection of a vector in R4 onto R2? (1 vote)
- R2 isn't a subspace of R4, it's an entirely separate vector space; so you can't. However, if you're asking how we can find the projection of a vector in R4 onto the plane spanned by the î and ĵ basis vectors, then all you need to do is take the [x y z w] form of the vector and change it to [x y 0 0]. For example:
`S = span(î, ĵ)`

`v = [2 3 7 1]`

`proj(v onto S) = [2 3 0 0]`

(1 vote)
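The shortcut in this answer agrees with the general formula: with î and ĵ as the columns of A, A^T A is the identity, so the projection matrix reduces to A A^T, which simply zeroes out the last two components. A small check (NumPy assumed; not part of the original thread):

```python
import numpy as np

# Columns are the basis vectors î and ĵ of the subspace S inside R^4.
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.0, 0.0],
              [0.0, 0.0]])
P = A @ np.linalg.inv(A.T @ A) @ A.T  # here A^T A = I, so P = A A^T

v = np.array([2.0, 3.0, 7.0, 1.0])
print(P @ v)  # [2. 3. 0. 0.]
```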

- What if you try to project a vector in R5 onto this subspace? How would you multiply a 1x5 vector by a 4x4 matrix? (1 vote)
- Does anyone have any good cookie recipes? (0 votes)

## Video transcript

Let's say I've got some subspace
V, which tends to be our favorite letter for
subspaces, and it's equal to the span of two vectors in R4. Let's say that the first vector
is 1 0 0 1, and the second vector is 0 1 0 1. That is my subspace V. And you can see that these
are going to be a basis. That these are linearly
independent. Two vectors that are linear-- or
any set of vectors that are linearly independent and that
span a subspace are a basis for that subspace. You can see they are linearly
independent. This guy has a 1 here. There's no way you can take some combination of this guy to somehow get a 1 there. And this guy has a 1 here. There's no way you can get some linear combination of these zeroes here to get a 1 there, so they're linearly independent. You can also call this
a basis for V. Now, given that, let's see
if we can find out the transformation matrix for the
projection of any arbitrary vector onto this subspace. So let's say that X-- we're
dealing in R4 here, right? Let's say that x is a member
of R4, and I want to figure out a transformation
matrix for the projection onto V of x. Now, in the last video, we came
up with a general way to figure this out. We said if A is a transformation
matrix-- sorry. If A is a matrix whose columns
are the basis for the subspace, so let's say A is
equal to 1 0 0 1, 0 1 0 1. So A is a matrix whose columns
are the basis for our subspace, then the projection
of x onto V would be equal to-- and this is kind of hard. The first time you look at, it
gives you a headache, but there's a certain pattern or
symmetry or a way of-- you could say it's A times, you're
gonna have something in the middle, and then you have A
transpose times your vector x. And the way I remember it is in
the middle, you have these two guys switched around. So then you have A transpose
A, and you take the inverse of it. You probably won't be using
this in your everyday life five or ten years from now, so
it's OK if you don't memorize it, but temporarily, put this
in your medium-term memory because it's a good thing
to know for doing these projection problems. So if we want to find the
general matrix for this transformation, we just have to
determine what this matrix is equal to, and that's just a
bunch of matrix operations. So that's A. What is A transpose? A transpose is going to be equal
to just all the rows turn into columns. So the first column becomes
the first row. So it becomes 1 0 0 1. The second column becomes
the second row 0 1 0 1. That's what A transpose is. Now, what's A transpose A? To figure out that, I want to
figure out what A transpose times A is. So let me multiply A
transpose times A. So I'll rewrite A right here. 1 0 0 1, 0 1 0 1. This is giving us some
good practice on matrix-matrix products. This is going to be
equal to what? Well, first of all, this is
a 2-by-4 matrix, and I'm multiplying it by a 4-by-2
matrix, so it's going to be a 2-by-2 matrix. So the first entry is
essentially the dot product of that row with that column. So it's 1 times 1 plus 0
times 0 plus 0 times 0 plus 1 times 1. So it's just going to be 2 for
that first entry right there. And then you take the dot
product of this guy with this guy right here. So it's 1 times 0, which is 0,
plus 0 times 1, which is 0, plus 0 times 0, which is 0, plus
1 times 1, which is 1. Now, we do this guy dotted with
this column right there. 0 times 1 is 0 plus 1 times 0 is
0 plus 0 times 0 is 0 plus 1 times 1 is 1. And then finally,
this row dotted with this second column. Second row, second column. 0 times 0 is 0, 1 times 1
is 1, 0 times 0 is 0, 1 times 1 is 1. So we have 1 times
1 plus 1 times 1. It's going to be 2. It's going to be equal to 2. So this right here
is A transpose A. But that's not good enough. We need to figure out what the
inverse of A transpose A is. This is A transpose A. But we need to figure out
A transpose A inverse. So what's the inverse of this? So let me write it here. The inverse A transpose
A inverse is going to be equal to what? It's 1 over the determinant
of this guy. What's the determinant here? It's going to be 1 over the
determinant of this. The determinant is 2 times 2,
which is 4, minus 1 times 1. So it's 4 minus 1, which is 3. So 1 over the determinant times
this guy, where if I swap these two, so I swap the
1's-- sorry, I swap the 2's. So this 2 goes here, and then
this orange 2 goes over here. And then I make these
1's negative. This becomes a minus 1 and
this becomes a minus 1. We learned that this is a
general solution for the inverse of a 2-by-2 matrix. I think it was 10 or 11 videos
ago, and you probably learned this in your Algebra II class,
frankly, but there you go. We have A transpose A inverse. So we have this guy. We have this whole guy here
is just this matrix. I could multiply the 1/3 into
it, but I don't have to do that just yet. But let's figure out the
whole matrix now. The whole A times this
guy, A transpose A inverse times A transpose. Let me write it this way. So the projection onto the
subspace V of x is going to be equal to A. 1 0 0 1-- let me write a little
bit bigger like this. So 1 0 0 1, 0 1 0 1 times A
transpose A inverse, right? A times A transpose A inverse,
which is this guy right here. Let's just put the 1/3 out
front just because that's just a scalar. I'll put the 1/3 out front
times this guy. This A transpose A inverse
is 1/3 times 2 minus 1, minus 1, 2. And then I'm going to multiply
it times A transpose. And all that times
our vector x. So A transpose is right there. It is 1 0 0 1, 0 1 0 1. And then all of that's going
to be times your vector x. So we still have some
nice matrix-matrix products ahead of us. Let's see if we can do these. So the first one, let's just
multiply these two guys. I don't think there's any
simple way to do it. This is a 2-by-2 matrix and this
is a 2-by-4 matrix, so when I multiply them,
I'm going to end up with a 2-by-4 matrix. Let me write that 2-by-4
matrix right here. And then I can write this
guy right here. 1 0 0 1, 0 1 0 1. And then I have the 1/3 that
was from A transpose A inverse, but I put the scaling
factor out there. And all of this is equal to the
projection of x onto V. So let's do this product. So this first entry is going to
be 2 times 1 plus minus 1 times 0, so that is just 2. Then you're going to have 2
times 0 plus minus 1 times 1. Well, that's minus 1. Then you have 2 times 0
plus minus 1 times 0. Well, that's just 0. And then you're going to
have 2 times 1 plus minus 1 times 1. That's 2 minus 1. That's just 1, right? 2 times 1 plus minus
1 times 1. Fair enough. Now, let's do the second row. Minus 1 times 1 plus 2 times
0, so that's just minus 1. Minus 1 times 0 plus
2 times 1. Well, that's just 2. Minus 1 times 0 plus
2 times 0. That's just 0. Minus 1 times 1 plus
2 times 1. Well, that's minus 1 plus
2, so that is 1. Almost there, and, of course, we
have to multiply it times x at the end. That's what the transformation
is. But this right here is our
transformation matrix. One more left to do. Let's hope I haven't made any
careless mistakes and that I won't make any when doing
this product. This is going to be a little
more complicated because this is a 4 by 2 times a 2 by 4. I'm going to end up with
a 4-by-4 matrix. Let me give myself some
breathing room here because I'm going to generate a 4-by-4
matrix right there. And so what am I going to get? So this first entry is going
to be 1 times 2 plus 0 times minus 1. So it's just going
to be equal to 2. The next entry: 1 times-- this
row times any column here is just going to be the first entry
in the column because it gets zeroed out. So 1 times 2 plus 0 times
minus 1 is just 2. 1 times minus 1 plus 0 times
2 is just minus 1. 1 times 0 plus 0 times 0 is 0. 1 times 1 plus 0 times
1 is just 1. When you take this row and you
multiply it times these columns, you literally just
got your first row there. Now, let's do this row
times these columns. Now, you've got a 0 here, so
you're going to have a 0 times the first entry of all
of these and a 1 times the second one. So 0 times 2 plus 1 times
minus 1 is minus 1. 0 times minus 1 plus
1 times 2 is 2. You're just going to get
the second row here. 2 0 1. That actually makes sense,
because if you just look at this part of the matrix, it's
the 2-by-2 identity matrix. So, anyway, that's a little hint
why this looks very much like that, but we're just
going to go through this matrix product. Now, you multiply this-- let me
do it in a different color. You multiply this guy times
each of these columns. That guy dotted with that is
just going to be 0 because this guy is essentially the 0 row
vector, so you're just going to get a bunch of zeroes. And then, finally, this last
row, it's 1 times the first entry plus 1 times
the second entry. So this guy's going to be 2
plus minus 1, which is 1. Minus 1 plus 2, which is 1. 0 plus 0, which is 0. And then 1 plus 1, which is 2. And all that times x. And there you have it. This is exciting! The projection onto V of x
is equal to this whole matrix times x. So this thing right here, I
could multiply the 1/3 into it, but we don't have
to do that. That'll just make it a little
bit more messy. This thing right here is the
transformation matrix. As you can see, since we're
transforming-- remember, this projection onto V,
this is a linear transformation from R4 to R4. You give me some member of R4,
and I'll give you another member of R4 that's in my
subspace that is the projection. So this is going to be
a 4-by-4 matrix. You can see it right there. Anyway, hopefully, you found
that useful to actually see a tangible result. R4 is very abstract, so this
would even be beyond our three-dimensional programming
example. We're dealing with a more
abstract data set where we're interested in finding
a projection.
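The entire computation in the video can be reproduced numerically. The sketch below (using NumPy, which is not part of the video itself) builds A from the basis of V, forms A^T A, inverts it, and assembles the full 4x4 transformation matrix, then checks it against the result derived in the transcript:

```python
import numpy as np

# Basis for V: the columns of A, as in the video.
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.0, 0.0],
              [1.0, 1.0]])

AtA = A.T @ A                 # [[2, 1], [1, 2]]
AtA_inv = np.linalg.inv(AtA)  # (1/3) * [[2, -1], [-1, 2]]
P = A @ AtA_inv @ A.T         # the 4x4 transformation matrix

# The transcript's final answer: (1/3) times this integer matrix.
expected = np.array([[ 2, -1, 0, 1],
                     [-1,  2, 0, 1],
                     [ 0,  0, 0, 0],
                     [ 1,  1, 0, 2]]) / 3
print(np.allclose(P, expected))  # True

# Sanity checks: projecting a basis vector of V returns it unchanged,
# and projecting twice is the same as projecting once.
print(np.allclose(P @ A[:, 0], A[:, 0]))  # True
print(np.allclose(P @ P, P))              # True
```

Applying P to any x in R4 gives the projection of x onto V, exactly as the transcript's matrix-vector product does.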