The Student Room Group

How to Quickly Compute Eigenvectors

Screenshot from 2019-01-15 14-35-18.png

The question is only worth 4 marks but it took me 3 pages to work out the solution. For each eigenvector, I formed a system of equations and tried to solve it. Is there a more efficient way?
Original post by esrever


The question is only worth 4 marks but it took me 3 pages to work out the solution. For each eigenvector, I formed a system of equations and tried to solve it. Is there a more efficient way?


Perhaps a quicker approach than yours:

For \lambda = 1:

\displaystyle \begin{pmatrix} 0 & 2 & -2 \\ 6 & 3 & -6 \\ 6 & 5 & -8 \end{pmatrix}

can be reduced to echelon form by

(a) swapping the first and second rows, then (b) performing row operations (*) to effectively get down to:

\displaystyle \begin{pmatrix} 6 & 3 & -6 \\ 0 & 2 & -2 \\ 0 & 0 & 0 \end{pmatrix}

So if (v_1, v_2, v_3) is our eigenvector then

6v_1 + 3v_2 - 6v_3 = 0
2v_2 - 2v_3 = 0

and we are free to make one of these variables our free parameter. Let v_3 = 1; then v_2 = 1, hence v_1 = \frac{1}{2}.

Thus the eigenvector is (\frac{1}{2}, 1, 1), but of course any nonzero scalar multiple is also valid, so you can rewrite it as the vector (1, 2, 2).

(*) After swapping rows, first subtract the first row from the third row, then subtract the second row from the third row.
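For anyone who wants to check this mechanically, here's a short Python sketch (my own, not from the post) of the row operations (*) and the back-substitution above, using exact fractions to avoid rounding:

```python
from fractions import Fraction as F

# (A - I) for lambda = 1, after swapping rows 1 and 2 as described
M = [[F(6), F(3), F(-6)],
     [F(0), F(2), F(-2)],
     [F(6), F(5), F(-8)]]

# (*) subtract row 1 from row 3, then row 2 from row 3
M[2] = [a - b for a, b in zip(M[2], M[0])]
M[2] = [a - b for a, b in zip(M[2], M[1])]

# Back-substitute with the free parameter v3 = 1
v3 = F(1)
v2 = v3                                # from 2*v2 - 2*v3 = 0
v1 = (F(6) * v3 - F(3) * v2) / F(6)    # from 6*v1 + 3*v2 - 6*v3 = 0

print(M[2])        # [0, 0, 0] -- the rank-deficient row
print(v1, v2, v3)  # 1/2 1 1, i.e. the direction (1, 2, 2)
```

Scaling (1/2, 1, 1) by 2 gives the tidier eigenvector (1, 2, 2).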
Reply 2
Original post by esrever
Screenshot from 2019-01-15 14-35-18.png

The question is only worth 4 marks but it took me 3 pages to work out the solution. For each eigenvector, I formed a system of equations and tried to solve it. Is there a more efficient way?


I'm not sure why it would take 3 pages to work out a solution. Just subbing the values (lambda = 1) in gives
0 2 -2
6 3 -6
6 5 -8
The first row gives v2 = v3, so it's fairly simple to solve now? Similarly, lambda = -1
2 2 -2
6 5 -6
6 5 -6
The last two rows are identical, and combining the first two rows gives v2 = 0, so again fairly simple now? lambda = -2
3 2 -2
6 6 -6
6 5 -5
Again, the last two rows give v2 = v3, so simple to solve?

Four marks does seem a bit mean, but 3 pages of working seems a bit long.
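If you want to sanity-check the eigenpairs read off from these shifted matrices, here's a quick Python check (my own; the matrix A is reconstructed from the A - lambda*I blocks above, and each eigenvector is verified directly against A*v = lambda*v):

```python
# A reconstructed by adding lambda*I back onto the shifted matrices above
A = [[1, 2, -2],
     [6, 4, -6],
     [6, 5, -7]]

def matvec(M, v):
    """Multiply a 3x3 matrix by a 3-vector."""
    return [sum(m * x for m, x in zip(row, v)) for row in M]

# Eigenpairs read off from each reduced system
pairs = [(1, [1, 2, 2]), (-1, [1, 0, 1]), (-2, [0, 1, 1])]
for lam, v in pairs:
    assert matvec(A, v) == [lam * x for x in v]
print("all eigenpairs check out")
```

Verifying A*v = lambda*v like this takes seconds and catches exactly the sort of slip that eats 3 pages of working.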
Original post by esrever
For each eigenvector, I formed a system of equations and tried to solve it. Is there a more efficient way?

Not really, no. However I agree with:
Original post by mqb2766
I'm not sure why it would take 3 pages to work out a solution. ~snip~

Four marks does seem a bit mean, but 3 pages of working seems a bit long.

Something to realise is that you almost always only need to consider 2 equations (i.e. 2 rows) to find the eigenvector (when working in 3D).
A "trick" I found helpful is to use the fact that "any multiple of the eigenvector will do" to assume, right at the start, that one component, v_1 say, equals 1. Then 2 rows of the matrix will give you 2 equations in 2 unknowns that you can solve for v_2, v_3. Solving such a set of equations should be 2-3 lines of working at most.

The one caveat is that this can go wrong if the actual eigenvector has v_1 = 0, because in that case there's no multiple that has v_1 = 1. This is very unlikely to happen, but you should always check your solution "works" for the 3rd row of the matrix just to make sure. With experience you can usually spot this when it comes up (and just pick a different component to assume = 1) without doing much extra work.
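As a concrete sketch of this trick (my own illustration, applied to the lambda = 1 case from the question): fix v_1 = 1, use two rows to pin down v_2 and v_3, then check the unused row:

```python
from fractions import Fraction as F

# Rows of A - I (lambda = 1) from the question
# r1: 0*v1 + 2*v2 - 2*v3 = 0  =>  v2 = v3
# r2: 6*v1 + 3*v2 - 6*v3 = 0  =>  6 - 3*v3 = 0 once v2 = v3
v1 = F(1)               # the "any multiple will do" assumption
v3 = F(6) / F(3)        # solving 6 - 3*v3 = 0
v2 = v3                 # from row 1

# Always check the row you didn't use: 6*v1 + 5*v2 - 8*v3 should be 0
r3 = F(6) * v1 + F(5) * v2 - F(8) * v3
print(v1, v2, v3, r3)   # 1 2 2 0 -> eigenvector (1, 2, 2)
```

Here the third-row check returns 0, so the assumption v_1 = 1 was safe; a nonzero residual would be the signal to restart with a different component set to 1.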
Reply 4
I read something about cross product being used to compute eigenvector. Does anyone know about that method?
Original post by esrever
I read something about cross product being used to compute eigenvector. Does anyone know about that method?


Never heard of it, but it does seem *a bit* easier after looking it up. If you're at A-level, the explanation behind it is beyond the syllabus, but you can apply it blindly, I suppose.

But the method works only when your matrix A has distinct eigenvalues \lambda_i, and only if you can pick out two linearly independent rows of A - \lambda_i I. If so, you just take the cross product of these two row vectors and you get the eigenvector.

https://math.stackexchange.com/questions/178830/cross-product-technique-to-find-the-eigenspaces-of-a-3-times-3-matrix

So that's the drawback (and there could be more conditions/cases that I can't be bothered to look up), and it's not really worth the hassle in general.


In your question, this works.

For \lambda = 1 we have A - \lambda I = \displaystyle \begin{pmatrix} 0 & 2 & -2 \\ 6 & 3 & -6 \\ 6 & 5 & -8 \end{pmatrix} and the two rows which are linearly independent are

\mathbf{a}_1^T = \begin{pmatrix} 6 & 3 & -6 \end{pmatrix}
and
\mathbf{a}_2^T = \begin{pmatrix} 6 & 5 & -8 \end{pmatrix}

Their cross product yields \mathbf{a}_1 \times \mathbf{a}_2 = \begin{pmatrix} 6 \\ 12 \\ 12 \end{pmatrix}

which is in the same direction as the eigenvector \begin{pmatrix} 1 \\ 2 \\ 2 \end{pmatrix}
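For completeness, the cross-product step is a one-liner to code up (my own sketch, pure Python):

```python
def cross(a, b):
    """Cross product of two 3-vectors."""
    return [a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0]]

a1 = [6, 3, -6]   # row 2 of A - I
a2 = [6, 5, -8]   # row 3 of A - I
v = cross(a1, a2)
print(v)  # [6, 12, 12], a multiple of the eigenvector (1, 2, 2)
```

The result is automatically orthogonal to both chosen rows, which is exactly why it spans the (one-dimensional) null space of A - \lambda I here.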
Reply 6
Original post by RDKGames
Never heard of it, but it does seem *a bit* easier after looking it up. ~snip~


Thank you so much :smile:
Original post by esrever
I read something about cross product being used to compute eigenvector. Does anyone know about that method?

That's going to be a method that only works for 3x3 matrices, so you're going to have to learn how to solve the general case anyhow. I'm unconvinced of the utility of learning methods for special cases unless they save a *lot* of time, and I don't think this one does.
