The Student Room Group

matrices problem

Question: The matrix M is given by M = \begin{pmatrix} a & b\\ c & d \end{pmatrix}, where a, b, c, d \in \mathbb{R}.
a) Find M^{2}.
b) Given that M^{2} = M and that b and c are non-zero, prove that M is singular.
c) Prove that in this case the transformation T, defined by T: \begin{pmatrix} x\\ y \end{pmatrix} \mapsto M \begin{pmatrix} x\\ y \end{pmatrix},
maps all points of the plane to points of the line (1-a)x = by.

my attempt:
M^2 = \begin{pmatrix} a^2 + bc & ab+bd \\ ac+dc & bc+d^2 \end{pmatrix}

prove that M is singular:
\begin{pmatrix} a^2 + bc & ab+bd \\ ac+dc & bc+d^2 \end{pmatrix} = \begin{pmatrix} a & b \\ c & d \end{pmatrix}

M is singular if and only if \det(M) = 0,
thus we need to prove that \det(M) = ad - bc = 0

Equating entries:
a^2 + bc = a \quad (1)
ab + bd = b \quad (2)
ac + dc = c \quad (3)
bc + d^2 = d \quad (4)

from (1):
bc = a - a^2


from (2):
ab + bd = b
bd = b - ab
bd = b(1-a)
d = 1-a \quad (since b \neq 0)
substitute bc and d into ad - bc:

thus
ad - bc = a(1-a) - (a - a^2) = 0

since ad - bc = 0, M is singular.
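As a quick sanity check of the algebra above (my own sketch with sympy, not part of the original working): with d = 1 - a from equation (2) and the relation bc = a - a^2 from equation (1), the determinant collapses to zero.

```python
# Sketch: verify symbolically that M^2 = M with b, c nonzero forces det(M) = 0.
import sympy as sp

a, b, c = sp.symbols('a b c')
d = 1 - a                              # from equation (2), since b != 0
M = sp.Matrix([[a, b], [c, d]])
det = M.det()                          # a*(1 - a) - b*c
det = det.subs(b*c, a - a**2)          # impose bc = a - a^2 from equation (1)
print(sp.simplify(det))                # prints 0, so M is singular
```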

I need help with part c, please help.
Reply 1
bump
Reply 2
There are a couple of ways, but one is to use the fact that the second column of M contains much of the information, given that you've already found a relationship between a and d. You can also use the det expression to get the equivalent relationship for c and consider the column space of the matrix.

Original post by bigmansouf
bump
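Following that hint, here is a sketch (my own, using sympy, and assuming b \neq 0 so that c = (a - a^2)/b) checking that every image point of T lies on the line (1-a)x = by:

```python
# Sketch: with d = 1 - a and c = (a - a^2)/b, every image (x', y') of T
# should satisfy (1 - a) x' = b y' for all x, y.
import sympy as sp

a, b, x, y = sp.symbols('a b x y')
c = (a - a**2) / b                     # from bc = a - a^2, b nonzero
d = 1 - a
M = sp.Matrix([[a, b], [c, d]])
xp, yp = M * sp.Matrix([x, y])         # image point (x', y')
print(sp.simplify((1 - a)*xp - b*yp))  # prints 0: the image is on the line
```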
