The Student Room Group

eigenvalues and vectors question

For number 7b) I got the first eigenvector, the one corresponding to eigenvalue = 1,

and for the second, for which the eigenvalue = 4, I wrote down the eigenvector as
(1 1 2), which is in the MS, but the MS says 'any two independent vectors satisfying...'

How was I supposed to know that there were 2 eigenvectors that I needed to write?

Because I got the second part wrong, I got part c wrong as well. I wrote that the eigenvector
(1 1 2) gives an invariant line, but the MS says an invariant plane.

Could you please explain what it was in the question that hinted that there are 2 eigenvectors for eigenvalue = 4?

Thanks


Reply 1
Original post by liemluji
How was I supposed to know that there were 2 eigenvectors that I needed to write?


When you have a repeated eigenvalue and you are trying to find an eigenvector, you should end up with all three of your equations coming to the same thing (e.g. x + 2y + 3z = 0). You should recognise this as the equation of a plane, not a line. If you haven't seen it before, this is the time to learn that you are expected to find two non-parallel eigenvectors that satisfy this equation.
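For example (using the plane above rather than the one from your actual question): with x + 2y + 3z = 0, setting y = 0 gives (3, 0, -1) and setting x = 0 gives (0, 3, -2). Neither is a multiple of the other, so they would be a valid pair of independent eigenvectors for that repeated eigenvalue.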
Reply 2
Original post by Pangol
When you have a repeated eigenvalue and you are trying to find an eigenvector, you should end up with all three of your equations coming to the same thing (e.g. x + 2y + 3z = 0). You should recognise this as the equation of a plane, not a line. If you haven't seen it before, this is the time to learn that you are expected to find two non-parallel eigenvectors that satisfy this equation.


Thanks for your reply. You said all three of the equations come to the same thing, but that also happens for the eigenvector corresponding to eigenvalue = 1.

Also, for the eigenvector with eigenvalue = 1, I had equations like (2x - y + z = 0), (-x + 2y + z = 0) and (x + y + 2z = 0). Aren't these equations of planes too?

Your answer is really helpful, but please help me clarify things a bit more. I'm so bad at FP4 it's unbelievable.
Reply 3
Original post by Pangol
When you have a repeated eigenvalue and you are trying to find an eigenvector, you should end up with all three of your equations coming to the same thing (e.g. x + 2y + 3z = 0). You should recognise this as the equation of a plane, not a line. If you haven't seen it before, this is the time to learn that you are expected to find two non-parallel eigenvectors that satisfy this equation.


Just because there is a repeated eigenvalue doesn't mean that there is only one independent eigenvector associated with it.
Reply 4
Wow, I'm utterly confused right now.
Reply 5
Original post by liemluji
Wow, I'm utterly confused right now.


When all three equations are the same, then all you have is, say, x + y + z = 0. Notice that this is the Cartesian equation that defines a plane, so you have an eigenplane, not just a single eigenvector. You will be able to find two linearly independent eigenvectors that parametrically define the plane.
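For instance, with x + y + z = 0 you could take (1, -1, 0) and (1, 0, -1): both satisfy the equation, neither is a multiple of the other, and every other solution is a linear combination of them.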
Reply 6
Original post by liemluji
Thanks for your reply. You said all three of the equations come to the same thing, but that also happens for the eigenvector corresponding to eigenvalue = 1.

Also, for the eigenvector with eigenvalue = 1, I had equations like (2x - y + z = 0), (-x + 2y + z = 0) and (x + y + 2z = 0). Aren't these equations of planes too?

Your answer is really helpful, but please help me clarify things a bit more. I'm so bad at FP4 it's unbelievable.


Sorry, been out since my last post.

Can you post the question, or at least a link to the paper it is from? It doesn't show up for me in the first post. I think I recognise it, but I can't be bothered to trawl through all the FP4 papers to find it! Once I see which one it is, I'll try to be clearer.
Reply 7
Original post by Pangol
Sorry, been out since my last post.

Can you post the question, or at least a link to the paper it is from? It doesn't show up for me in the first post. I think I recognise it, but I can't be bothered to trawl through all the FP4 papers to find it! Once I see which one it is, I'll try to be clearer.


I'm sorry, I thought I attached them after I edited it but here you go.
Thanks for your help :smile:
Reply 8
Original post by liemluji
I'm sorry, I thought I attached them after I edited it but here you go.
Thanks for your help :smile:

So I assume that you've happily been able to show that 1 and 4 are the eigenvalues, and that 4 is a repeated eigenvalue.

When you use the eigenvalue of 4, all three equations reduce to x+y-z = 0. This is the equation of a plane. You can therefore find two linearly independent eigenvectors that satisfy this equation. The easiest way to do this is to find one where x=0 and one where y=0 (or use z=0 in place of either of these).
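So here, for example, x = 0 gives (0, 1, 1) and y = 0 gives (1, 0, 1), both of which satisfy x + y - z = 0 and are not multiples of each other.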

When you use the eigenvalue of 1, I claim that you get these equations:

2x-y+z = 0
-x+2y+z=0
x+y+2z=0

Now, these are all the equations of planes, but the important point is that they are not all the same plane. You need an eigenvector that lies on all three planes at the same time. You therefore have to do a bit of playing about to see what happens. It turns out that the three planes meet along a line (which must pass through the origin), but you should be expecting this to happen in any case. The important point is that there is only one eigenvector that satisfies all three equations (up to multiplication by a scalar), whereas for the repeated eigenvalue, there are two linearly independent solutions (again up to multiplication by a scalar).
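If it helps, here is a quick check in Python with sympy. This is only a sketch: the matrix below is reconstructed from the equations quoted in this thread, so do check it against the actual question.

import sympy as sp

# Matrix assumed from the equations in this thread: the rows of A - I are
# 2x - y + z, -x + 2y + z and x + y + 2z. Verify against the paper.
A = sp.Matrix([[3, -1, 1],
               [-1, 3, 1],
               [1, 1, 3]])

for eigenvalue, multiplicity, vectors in A.eigenvects():
    print(eigenvalue, multiplicity, [list(v) for v in vectors])
# Eigenvalue 1 gives a single direction; eigenvalue 4 (repeated) gives two
# independent eigenvectors, both satisfying x + y - z = 0.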

Hope this helps!
Reply 9
As I said above, I would like to add that just because you have a repeated eigenvalue does not mean that you necessarily have two linearly independent eigenvectors for that particular eigenvalue.
I think they will at A level though, because they often ask about diagonalisation, and so the matrix will need 3 linearly independent eigenvectors (at least for 3x3 matrices) for this to be possible.
Reply 10
Original post by Pangol
So I assume that you've happily been able to show that 1 and 4 are the eigenvalues, and that 4 is a repeated eigenvalue.

When you use the eigenvalue of 4, all three equations reduce to x+y-z = 0. This is the equation of a plane. You can therefore find two linearly independent eigenvectors that satisfy this equation. The easiest way to do this is to find one where x=0 and one where y=0 (or use z=0 in place of either of these).


Thanks for your help, but please help me clarify this a bit more. I get that a repeated eigenvalue will not always mean that there is more than one independent eigenvector, as someone else mentioned, but for the eigenvector corresponding to the eigenvalue of 4, I found that (1 1 2) works in all 3 of the planes, which are the same. So I don't quite get why there's a need to find two independent eigenvectors.

Thanks for your help, you are helping me a lot!
Reply 11
Original post by liemluji
for the eigenvector corresponding to the eigenvalue of 4, I found that (1 1 2) works in all 3 of the planes, which are the same.


You are quite right - this is one possible eigenvector, and so is any scalar multiple of it. But you can find another eigenvector for this eigenvalue that is not a scalar multiple of it, so you need to do that to have a full set of eigenvectors for the eigenvalue 4.

For the eigenvalue 1, once you have found one eigenvector, it is not possible to find any other that is not a scalar multiple of the one you have already found, so you stop at the first one.
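For example, (1, 0, 1) and (0, 1, 1) both satisfy x + y - z = 0, and your (1 1 2) is just their sum, so any two of these three that aren't multiples of each other would do as the pair of eigenvectors for the eigenvalue 4.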
Reply 12
Original post by Pangol
You are quite right - this is one possible eigenvector, and so is any scalar multiple of it. But you can find another eigenvector for this eigenvalue that is not a scalar multiple of it, so you need to do that to have a full set of eigenvectors for the eigenvalue 4.

For the eigenvalue 1, once you have found one eigenvector, it is not possible to find any other that is not a scalar multiple of the one you have already found, so you stop at the first one.


Oh okay, I understand a bit better now. I'll have to work on these more though. Thanks a lot!!
Original post by B_9710
As I said above, I would like to add that just because you have a repeated eigenvalue does not mean that you necessarily have two linearly independent eigenvectors for that particular eigenvalue.
I think they will at A level though, because they often ask about diagonalisation, and so the matrix will need 3 linearly independent eigenvectors (at least for 3x3 matrices) for this to be possible.
It's odd that the question doesn't specifically ask you to find 2 eigenvectors for the repeated eigenvalue. It certainly implies that, "according to the syllabus", repeated eigenvalues => multiple (independent) eigenvectors.

Original post by liemluji
Thanks for your help, but please help me clarify this a bit more. I get that a repeated eigenvalue will not always mean that there is more than one independent eigenvector, as someone else mentioned, but for the eigenvector corresponding to the eigenvalue of 4, I found that (1 1 2) works in all 3 of the planes, which are the same. So I don't quite get why there's a need to find two independent eigenvectors.
Unless you're told otherwise, you're pretty much always going to be trying to end up with a "complete" set of lin. indep. eigenvectors. That is, 3 eigenvectors for a 3x3 matrix, 4 eigenvectors for a 4 x 4 matrix, etc.

One fact (that your syllabus may or may not cover): if a matrix is symmetric, as is the case here, it always has a complete set of linearly independent eigenvectors, and moreover you can always choose these eigenvectors to be orthogonal (i.e. the dot product of any 2 different eigenvectors will be 0). I'm wondering if you are supposed to know at least some of this, and that's why they expect you to know there will be 2 eigenvectors for the repeated root.
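To illustrate with this question: if I've solved the eigenvalue-1 equations from earlier in the thread correctly, the eigenvector there is (1, 1, -1), which is exactly the normal to the eigenvalue-4 plane x + y - z = 0, so it is automatically orthogonal to every eigenvector for 4. Within that plane you could then pick, say, (1, -1, 0) and (1, 1, 2), whose dot product is also 0, giving a fully orthogonal set of three.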
Reply 14
Original post by DFranklin
One fact (that your syllabus may or may not cover): if a matrix is symmetric, as is the case here, it always has a complete set of linearly independent eigenvectors, and moreover you can always choose these eigenvectors to be orthogonal (i.e. the dot product of any 2 different eigenvectors will be 0). I'm wondering if you are supposed to know at least some of this, and that's why they expect you to know there will be 2 eigenvectors for the repeated root.


The specification is a bit vague on this, but the orthogonal part is certainly way beyond what is expected. Every time that this has ever come up, repeated eigenvalues do indeed lead to two linearly independent eigenvectors.

I am interested in the situations when a repeated eigenvalue does not lead to two eigenvectors. Are these unusual cases? Complex eigenvalues? Matrices with non-real elements? Or are there some really obvious cases that I'm not aware of?
Original post by Pangol
The specification is a bit vague on this, but the orthogonal part is certainly way beyond what is expected. Every time that this has ever come up, repeated eigenvalues do indeed lead to two linearly independent eigenvectors.

I am interested in the situations when a repeated eigenvalue does not lead to two eigenvectors. Are these unusual cases? Complex eigenvalues? Matrices with non-real elements? Or are there some really obvious cases that I'm not aware of?
The canonical example where repeated eigenvalues don't lead to linearly independent eigenvectors is a matrix like:

\begin{pmatrix}\lambda & 1 \\ 0 & \lambda\end{pmatrix}

Obviously, you can also end up with the characteristic polynomial having non-real roots; at a university level you'd generally always factor over \mathbb{C}, so this doesn't arise. In any event, this is more a case of "missing" (i.e. non-real) roots, not repeated ones.
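For example, a quick sympy check of the matrix above (with λ = 2 chosen just to make it concrete):

import sympy as sp

# Jordan-block-style matrix: the eigenvalue 2 is repeated,
# but its eigenspace is only one-dimensional.
J = sp.Matrix([[2, 1],
               [0, 2]])

print(J.eigenvects())         # only one independent eigenvector, (1, 0)
print(J.is_diagonalizable())  # False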

Edit: regarding orthogonal eigenvectors: this was "on syllabus" when I did FM A-level, so I wasn't sure what its status is now. It's not really much "more" to learn - it was literally being told "you can always do this with (real) symmetric matrices" and then doing some examples.
Reply 16
Original post by Pangol
The specification is a bit vague on this, but the orthogonal part is certainly way beyond what is expected. Every time that this has ever come up, repeated eigenvalues do indeed lead to two linearly independent eigenvectors.

I am interested in the situations when a repeated eigenvalue does not lead to two eigenvectors. Are these unusual cases? Complex eigenvalues? Matrices with non-real elements? Or are there some really obvious cases that I'm not aware of?


\begin{pmatrix} 1&1&1\\0&-1&1\\0&0&1 \end{pmatrix} only has 2 linearly independent eigenvectors.
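Indeed: 1 is the repeated eigenvalue here, but solving (A - I)v = 0 gives y + z = 0 and -2y + z = 0, which force y = z = 0, so (1, 0, 0) is the only eigenvector direction for it; together with the eigenvector for -1 that is just 2 independent eigenvectors in total.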
Reply 17
Original post by DFranklin
The canonical example where repeated eigenvalues don't lead to linearly independent eigenvectors is a matrix like:

\begin{pmatrix}\lambda & 1 \\ 0 & \lambda\end{pmatrix}


That's interesting, although I now realise that I am more interested in 3x3 than 2x2 examples (I was actually aware of a 2x2 that fits the bill, and had wondered why it didn't give the same outcome). Are you aware of a general condition where a 3x3 has a repeated eigenvalue but does not have two independent eigenvectors to go with it?
Reply 18
Original post by B_9710
\begin{pmatrix} 1&1&1\\0&-1&1\\0&0&1 \end{pmatrix} only has 2 linearly independent eigenvectors.


Again, very interesting - but is there some way we could see this in advance? Is there a general condition that can be checked? I'm expecting it's something to do with all the zeroes.
Original post by Pangol
That's interesting, although I now realise that I am more interested in 3x3 than 2x2 examples (I was actually aware of a 2x2 that fits the bill, and had wondered why it didn't give the same outcome). Are you aware of a general condition where a 3x3 has a repeated eigenvalue but does not have two independent eigenvectors to go with it?
I'm not sure there's any criterion that's more "useful" than "try to diagonalize it and see what happens".

I don't know if you know the Cayley-Hamilton theorem (a matrix satisfies its own characteristic equation).

That is, if p(X) is the characteristic polynomial for a matrix A, then p(A) = 0.

There's also the concept of the minimal polynomial for A. That is, the polynomial q of minimal degree such that q(A) = 0.

Note that q(X) | p(X) (since otherwise the remainder r(X) from dividing p(X) by q(X) would also have to satisfy r(A) = 0 and would be of lower degree than q).

Then the condition for diagonalizability is that this minimal polynomial must have no repeated roots.

The problem is that I'm aware of no method for finding the minimal polynomial that isn't at least as hard as "for each eigenvalue of multiplicity k, see if you can find k lin. indep. eigenvectors".
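For what it's worth, here is a rough sympy sketch of that criterion, applied to the 3x3 example from earlier in the thread. It multiplies (A - rI) over the distinct roots of the characteristic polynomial and checks whether the product vanishes, which is equivalent to the minimal polynomial having no repeated roots.

import sympy as sp

x = sp.symbols('x')
A = sp.Matrix([[1, 1, 1],
               [0, -1, 1],
               [0, 0, 1]])   # the 3x3 example from earlier in the thread

# Distinct roots of the characteristic polynomial
distinct_roots = sp.roots(A.charpoly(x).as_expr()).keys()

# Product of (A - r I) over the distinct roots only
product = sp.eye(3)
for r in distinct_roots:
    product *= (A - r * sp.eye(3))

print(product == sp.zeros(3, 3))  # False here, so the minimal polynomial has a repeated root
print(A.is_diagonalizable())      # False, as expected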
