- Introduction to eigenvalues and eigenvectors
- Proof of formula for determining eigenvalues
- Example solving for the eigenvalues of a 2x2 matrix
- Finding eigenvectors and eigenspaces example
- Eigenvalues of a 3x3 matrix
- Eigenvectors and eigenspaces for a 3x3 matrix
- Showing that an eigenbasis makes for good coordinate systems
Example solving for the eigenvalues of a 2x2 matrix
Created by Sal Khan.
- Does the order matter when you use the equation det(λI - A) = 0? In my linear algebra textbook they have the order flipped around: det(A - λI) = 0. Thank you very much!
- They are actually the same thing. Remember that for an n×n matrix, det(cA) = c^n det(A). In this case our c will be -1, since (λI - A) = -(A - λI). So det(λI - A) = det(-(A - λI)) = (-1)^n det(A - λI). The two determinants differ at most by a sign, so one is zero exactly when the other is, and the two equations have the same solutions.
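A minimal NumPy sketch of this point, using the 2x2 matrix from the video further down this page (NumPy itself is an assumption here, not something the original uses): both determinants vanish at the same values of λ.

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [4.0, 3.0]])   # the matrix from the video
I = np.eye(2)

# det(lam*I - A) and det(A - lam*I) differ only by a factor of (-1)^n,
# so they are zero for exactly the same values of lam.
for lam in (5.0, -1.0):      # the eigenvalues found later on this page
    print(np.isclose(np.linalg.det(lam * I - A), 0.0),
          np.isclose(np.linalg.det(A - lam * I), 0.0))
# prints "True True" twice
```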
- At 4:05 Sal derives the "characteristic polynomial". This seems to be a simple quadratic equation that can be solved (as long as b^2 - 4ac >= 0).
So does that mean that most 2-by-2 matrices have an eigenvalue?
Also, does the existence of the two eigenvalues mean that the columns are linearly dependent? It doesn't seem like that to me, though.
- You can actually find the eigenvalues even if b^2 - 4ac < 0. That would just mean that the eigenvalues are imaginary. Matrices can have complex eigenvalues.
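A small illustration of this answer (NumPy is an assumed tool here, not part of the original discussion): a 90-degree rotation matrix has characteristic polynomial λ² + 1, so b² - 4ac = -4 < 0 and there are no real eigenvalues, but NumPy happily reports the complex ones.

```python
import numpy as np

# 90-degree rotation matrix: characteristic polynomial lam^2 + 1,
# so the discriminant is -4 and there are no real roots.
R = np.array([[0.0, -1.0],
              [1.0,  0.0]])

vals = np.linalg.eigvals(R)   # NumPy returns complex eigenvalues
print(vals)                   # i and -i, in some order
```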
- At the end of the video Sal gets the factors (λ-5)(λ+1), but says that the eigenvalues are 5 and -1. Why did the signs change?
- To find the eigenvalues you find the characteristic polynomial P and set it equal to zero. In this case P = (λ-5)(λ+1). Set this to zero and solve for λ: λ-5=0 gives λ=5, and λ+1=0 gives λ=-1.
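The same root-finding step can be sketched numerically (using NumPy, which is an assumption on my part): feed the coefficients of the characteristic polynomial from the video to a polynomial root solver.

```python
import numpy as np

# Coefficients of the characteristic polynomial lam^2 - 4*lam - 5
# from the video, highest degree first.
roots = np.roots([1.0, -4.0, -5.0])
print(np.sort(roots.real))   # the eigenvalues -1 and 5
```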
- Hi Sal, in my notes I have the characteristic equation |A - hI| = 0, which is the reverse of yours. It doesn't seem to matter in this case whether you use A - hI or hI - A. Do you know if this is always the case?
- What do we do when the characteristic equation does not have any roots?
- When working over only the real numbers, you are not guaranteed roots, so you factor the characteristic polynomial as much as possible, until it is no longer factorable.
However, the Fundamental Theorem of Algebra tells us that if we shift our perspective from real eigenvalues to complex eigenvalues (i.e. eigenvalues with imaginary components), then the characteristic polynomial is guaranteed to have all of its roots: an nth-degree characteristic polynomial has exactly n complex roots, counted with multiplicity.
An extension/generalization of this answer (if you know some field theory): if your matrix A is n×n over a field F, then since every field F has an algebraic closure F', working over F' the matrix A has n eigenvalues. In other words, the characteristic polynomial has n roots in F'.
- Can you tell me the basic applications of eigenvalues and eigenvectors?
- Machine Learning. Advanced use case but worthy of a mention.
- Why are there only two eigenvalues for this matrix?
For any reflection transformation surely there are infinite eigenvalues, because all of the vectors along the line of reflection would not be changed, nor would those orthogonal to it. And there are infinitely many vectors along these lines, thus infinitely many eigenvalues for these vectors.
It seems weird to me that there would only be two eigenvalues for a transformation - I would have thought there would either be 0 (i.e. rotation) or infinite.
Is there some limit to eigenvectors (thus eigenvalues) along a line? i.e. can they only be unit vectors before the transformation?
This is ignoring imaginary eigenvalues, of course.
- We only count eigenvectors as separate if one is not just a scaling of the other. Otherwise, as you point out, every matrix would have either 0 or infinitely many eigenvectors. And we can show that if v and cv (for some scalar c) are eigenvectors of a matrix A, then they have the same eigenvalue.
Suppose v and cv have eigenvalues p and q, so Av = pv and A(cv) = q(cv).
By linearity, A(cv) = c(Av) = c(pv).
Comparing with the second equation, q(cv) = c(pv).
Since v is an eigenvector, it cannot be the 0 vector, so qc = cp, and since c is nonzero (cv is also an eigenvector), q = p. The eigenvalues are the same.
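A quick numerical check of this answer (a sketch using NumPy and the matrix from the video; the specific eigenvector is one I computed, not something stated in the thread): every nonzero scaling of an eigenvector is again an eigenvector with the same eigenvalue.

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [4.0, 3.0]])

v = np.array([1.0, 2.0])       # A @ v = [5, 10] = 5*v, so v is an eigenvector for 5
for c in (2.0, -3.0, 0.5):     # arbitrary nonzero scalings of v
    w = c * v
    print(np.allclose(A @ w, 5.0 * w))   # True: cv has the same eigenvalue 5
```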
- What is trivial and non-trivial?
- A trivial solution is the obvious one that always works. For the equation Av = λv, or equivalently (λI - A)v = 0, the zero vector v = 0 is always a solution, so it is called the trivial solution.
Non-trivial refers to the other, nonzero solutions. Those are the ones we care about here, since an eigenvector is required to be nonzero.
- The sum of the eigenvalues is equal to the trace of the matrix.
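This comment checks out on the matrix from the video (a sketch using NumPy, which is my assumption, not part of the thread): the trace 1 + 3 = 4 equals the eigenvalue sum 5 + (-1) = 4.

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [4.0, 3.0]])

# trace(A) = 1 + 3 = 4, and the eigenvalues 5 and -1 also sum to 4
print(np.trace(A), np.linalg.eigvals(A).sum())
```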
- How is this λI matrix with 1's and 0's generated? (I mean, how do I determine when to put 1 or 0?)
- It's the identity matrix multiplied by λ. In the identity matrix, the 1's are all on the diagonal from top-left to bottom-right and the 0's are everywhere else; multiplying by λ puts λ in each diagonal spot.
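A one-liner illustrating this answer (NumPy is an assumed tool here; the value of λ is an arbitrary example):

```python
import numpy as np

lam = 7.0                # any value of lambda
M = lam * np.eye(2)      # identity matrix: 1's on the diagonal, 0's elsewhere
print(M)
# [[7. 0.]
#  [0. 7.]]
```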
In the last video we were able to show that if some lambda satisfies this equation for some non-zero vector v, then the determinant of lambda times the identity matrix minus A must be equal to 0. Or we could rewrite this as saying: lambda is an eigenvalue of A if and only if the determinant of lambda times the identity matrix minus A is equal to 0.

Now, let's see if we can actually use this in any kind of concrete way to figure out eigenvalues. So let's do a simple 2 by 2, let's do it in R2. Let's say that A is equal to the matrix 1, 2, and 4, 3. And I want to find the eigenvalues of A. So if lambda is an eigenvalue of A, then this right here tells us that the determinant of lambda times the identity matrix, so it's going to be the identity matrix in R2, so lambda times 1, 0, 0, 1, minus A, 1, 2, 4, 3, is going to be equal to 0.

Well, what is this equal to? This right here is the determinant. Lambda times this is just lambda times all of these terms. So lambda times 1 is lambda, lambda times 0 is 0, lambda times 0 is 0, lambda times 1 is lambda. And from that we'll subtract A. So you get 1, 2, 4, 3, and this has got to equal 0. And then we take the determinant of this matrix, or this difference of matrices. The first term's going to be lambda minus 1. The second term is 0 minus 2, so it's just minus 2. The third term is 0 minus 4, so it's just minus 4. And then the fourth term is lambda minus 3, just like that.

So, a shortcut to see what happened: the terms off the diagonal all became negative, right? We negated everything. And the terms along the diagonal picked up a lambda out front, so each one is lambda minus the original entry. That was essentially the byproduct of this expression right there. So what's the determinant of this 2 by 2 matrix? Well, the determinant of this is just this times that, minus this times that.
So it's lambda minus 1, times lambda minus 3, minus these two guys multiplied by each other. Minus 2 times minus 4 is plus 8, and we're subtracting it, so minus 8. This is the determinant of this matrix right here, which simplified to that matrix. And this has got to be equal to 0. And the whole reason why that's got to be equal to 0 is because, as we saw earlier, this matrix has a non-trivial null space. And because it has a non-trivial null space, it can't be invertible, so its determinant has to be equal to 0.

So now we have an interesting polynomial equation right here. We can multiply it out. We get what? Let's multiply it out. We get lambda squared, right, minus 3 lambda, minus lambda, plus 3, minus 8, is equal to 0. Or lambda squared, minus 4 lambda, minus 5, is equal to 0. And just in case you want to know some terminology, this expression right here is known as the characteristic polynomial.

But if we want to find the eigenvalues for A, we just have to solve this right here. This is just a basic quadratic problem, and this is actually factorable. Let's see, two numbers whose product is minus 5 and whose sum is minus 4: that's minus 5 and plus 1. So you get lambda minus 5, times lambda plus 1, is equal to 0, right? Minus 5 times 1 is minus 5, and then minus 5 lambda plus 1 lambda is equal to minus 4 lambda. So the two solutions of our characteristic polynomial set equal to 0, our characteristic equation, are lambda is equal to 5 or lambda is equal to minus 1.

So just like that, using the information that we proved to ourselves in the last video, we're able to figure out that the two eigenvalues of A are lambda equals 5 and lambda equals negative 1. Now, that only solves part of the problem, right? We're looking for eigenvalues and eigenvectors, and we know that this equation can be satisfied with lambda equal to 5 or minus 1.
So we know the eigenvalues, but we've yet to determine the actual eigenvectors. So that's what we're going to do in the next video.
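The whole worked example above can be double-checked numerically (a sketch using NumPy, which the transcript itself does not use):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [4.0, 3.0]])   # the matrix from the video

vals = np.linalg.eigvals(A)
print(np.sort(vals))         # the eigenvalues -1 and 5, matching the factorization
```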