The Student Room Group

Matrices (Generalised Inverse 2)

Consider the solution of the matrix system M x = r, where the matrix and right-hand side are given by

M = \begin{pmatrix} 4 & 4 & 0 \\ 4 & 5 & 2 \\ 0 & 2 & 4 \end{pmatrix}; \qquad r = \begin{pmatrix} 2 \\ 5 \\ 6 \end{pmatrix}



The previous parts of this question were verifying the given eigenvectors (simple dot products) and writing down the associated eigenvalues.

The question I'm stuck on:

We can write the vector x as a linear combination of the eigenvectors.

Writing x = \sum_i \beta_i V_i and r = \sum_i \alpha_i V_i, derive a formula for \beta_i in terms of \alpha_i and the eigenvalues \lambda_i.

Edit: Is something up with latex? Wow...
(edited 7 years ago)
Original post by Jagwar Ma
Consider the solution of the matrix system M x = r, where the matrix and right-hand side are given by

M = \begin{pmatrix} 4 & 4 & 0 \\ 4 & 5 & 2 \\ 0 & 2 & 4 \end{pmatrix}; \qquad r = \begin{pmatrix} 2 \\ 5 \\ 6 \end{pmatrix}



The previous parts of this question were verifying the given eigenvectors (simple dot products) and writing down the associated eigenvalues.

The question I'm stuck on:

We can write the vector x as a linear combination of the eigenvectors.

Writing x = \sum_i \beta_i V_i and r = \sum_i \alpha_i V_i, derive a formula for \beta_i in terms of \alpha_i and the eigenvalues \lambda_i.

Edit: Is something up with latex? Wow...


Yes, something appears to be up in latex land...anyway, press on...

How far have you got? Have you derived the relation \sum_i \beta_i \lambda_i V_i = \sum_i \alpha_i V_i ?

If V_3 is the eigenvector corresponding to the zero eigenvalue, what does that imply about \beta_3?
(edited 7 years ago)
Reply 2
Original post by Gregorius
Yes, something appears to be up in latex land...anyway, press on...

How far have you got? Have you derived the relation \sum_i \beta_i \lambda_i V_i = \sum_i \alpha_i V_i ?

Now you should notice that the matrix is singular, so one of the eigenvalues is zero. If V_3 is the eigenvector corresponding to the zero eigenvalue, what does that imply about \beta_3?


v3 is not the eigenvector that corresponds to the zero eigenvalue though?

I did manage to solve it, but now I'm trying to use the result to determine the solution vector x
(edited 7 years ago)
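For the record, the derivation being discussed can be sketched like this (a reconstruction, not quoted from the thread):

```latex
% Apply M to the expansion x = \sum_i \beta_i V_i and use M V_i = \lambda_i V_i:
\[
  M x \;=\; \sum_i \beta_i \lambda_i V_i \;=\; \sum_i \alpha_i V_i \;=\; r .
\]
% Since M is symmetric, the V_i are orthogonal, so coefficients match term by term:
\[
  \beta_i \lambda_i = \alpha_i
  \quad\Longrightarrow\quad
  \beta_i = \frac{\alpha_i}{\lambda_i} \qquad (\lambda_i \neq 0),
\]
% while for the zero eigenvalue the equation reads 0 = \alpha_i: a solution
% exists only if that \alpha_i = 0, and the corresponding \beta_i is free.
```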
Original post by Jagwar Ma
v3 is not the eigenvector that corresponds to the zero eigenvalue though?


I gave them an arbitrary ordering for convenience: v3 = (2,-2,1)
Reply 4
Original post by Gregorius
I gave them an arbitrary ordering for convenience: v3 = (2,-2,1)


Ah, yeah that's my V1 :wink:

I'm trying to find the solution vector x but don't really understand my lecturer's working (on a similar problem).
Mx = r
(edited 7 years ago)
Original post by Jagwar Ma
Ah, yeah that's my V1 :wink:

I'm trying to find the solution vector x but don't really understand my lecturer's working (on a similar problem).
Mx = r


OK, so I take v1 = (4,5,2); v2 = (-1,0,2) and v3 = (2,-2,1). Then by inspection, r = v1 + 2 v2. So you now have to solve

\displaystyle \beta_1 \lambda_1 V_1 + \beta_2 \lambda_2 V_2 = V_1 + 2 V_2

remembering the comment about \beta_3.
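Matching coefficients of V_1 and V_2 (which are linearly independent) then pins down the two coefficients. As a sketch, with the eigenvalues \lambda_1 = 9 and \lambda_2 = 4 computed from M V_i = \lambda_i V_i (they are not quoted in the thread):

```latex
\[
  \beta_1 \lambda_1 = 1, \qquad \beta_2 \lambda_2 = 2
  \quad\Longrightarrow\quad
  \beta_1 = \tfrac{1}{9}, \qquad \beta_2 = \tfrac{2}{4} = \tfrac{1}{2},
\]
% with \beta_3 left arbitrary, since \lambda_3 = 0.
```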
Reply 6
Original post by Gregorius
OK, so I take v1 = (4,5,2); v2 = (-1,0,2) and v3 = (2,-2,1). Then by inspection, r = v1 + 2 v2. So you now have to solve

\displaystyle \beta_1 \lambda_1 V_1 + \beta_2 \lambda_2 V_2 = V_1 + 2 V_2

remembering the comment about \beta_3.


r = (2, 5, 6). I've already completed that stage.

Trying to find the x in Mx = r
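A minimal NumPy sketch of the remaining computation, using the eigenvectors quoted above; the eigenvalues (9, 4, 0) and the minimum-norm choice beta3 = 0 are filled in here, not taken from the thread:

```python
import numpy as np

# Matrix and right-hand side from the thread.
M = np.array([[4.0, 4.0, 0.0],
              [4.0, 5.0, 2.0],
              [0.0, 2.0, 4.0]])
r = np.array([2.0, 5.0, 6.0])

# Eigenvectors in Gregorius's ordering (v3 spans the null space).
v1 = np.array([4.0, 5.0, 2.0])
v2 = np.array([-1.0, 0.0, 2.0])
v3 = np.array([2.0, -2.0, 1.0])

# Eigenvalues from M v = lambda v (read off a nonzero component).
lam1 = (M @ v1)[0] / v1[0]   # 9
lam2 = (M @ v2)[2] / v2[2]   # 4
# M @ v3 is the zero vector, so lam3 = 0.

# By inspection r = 1*v1 + 2*v2 + 0*v3, so alpha = (1, 2, 0),
# and beta_i = alpha_i / lam_i for the nonzero eigenvalues.
beta1 = 1.0 / lam1
beta2 = 2.0 / lam2
beta3 = 0.0   # arbitrary; 0 gives the minimum-norm solution

x = beta1 * v1 + beta2 * v2 + beta3 * v3
print(x)        # (-1/18, 5/9, 11/9), up to rounding
print(M @ x)    # recovers r = (2, 5, 6), up to rounding
```

Any value of beta3 gives a valid solution, since v3 lies in the null space of M; beta3 = 0 picks out the minimum-norm one, which is what the generalised inverse of the thread title would return.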
