This discussion is closed.
Gaz031 · #1
If I find an eigenvector to be (2, -1, 0)^T, yet the answer in the back states it as (-2, 1, 0)^T, is my answer still correct, as it is merely a scalar multiple away? Would it matter which form you showed it in during an exam?

Basically, are you allowed to multiply an eigenvector by whatever value you like in order to put it in its simplest form?
RichE · #2
Any non-zero scalar multiple of an eigenvector is another eigenvector.

The eigenspace (in this case the line z=0=x+2y) is unique though.
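To spell out the one-line check (a sketch in the usual notation, with A the matrix, λ the eigenvalue, v the eigenvector and c any non-zero scalar):

$$A(c\mathbf{v}) = c(A\mathbf{v}) = c(\lambda\mathbf{v}) = \lambda(c\mathbf{v}),$$

so cv satisfies the same eigenvalue equation as v does.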
Gaz031 · #3
(Original post by RichE)
Any non-zero scalar multiple of an eigenvector is another eigenvector.

The eigenspace (in this case the line z=0=x+2y) is unique though.
Thanks very much. It makes sense as the scalar multiple is parallel to the original eigenvector.

I'm just introducing myself to matrices and they seem fairly interesting.

I'll give you some reputation when I can.
Gauss · #4
(Original post by RichE)
Any non-zero scalar multiple of an eigenvector is another eigenvector.

The eigenspace (in this case the line z=0=x+2y) is unique though.
Eigenvectors are vector spaces??
I thought eigenvectors must be non-trivial, i.e. not equal to the 0 vector!
jpowell · #5
Erm, eigenvectors are not vector spaces; however, seeing as any vector gives a basis for a one-dimensional subspace, it follows that an eigenvector has an associated vector subspace. And as RichE said, any non-zero scalar multiple of an eigenvector is another eigenvector.
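In symbols (a standard formulation, not quoted from the thread): a non-zero vector v spans

$$\langle\mathbf{v}\rangle = \{c\mathbf{v} : c \text{ a scalar}\},$$

a one-dimensional subspace with basis {v}.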
Gauss · #6
(Original post by AntiMagicMan)
Erm, eigenvectors are not vector spaces; however, seeing as any vector gives a basis for a one-dimensional subspace, it follows that an eigenvector has an associated vector subspace. And as RichE said, any non-zero scalar multiple of an eigenvector is another eigenvector.
Yes, I know that; I thought RichE was claiming that eigenvectors are vector spaces themselves. For an eigenvector to be a vector space, the 0 vector would have to be an eigenvector, which doesn't hold.
RichE · #7
(Original post by Galois)
Yes, I know that; I thought RichE was claiming that eigenvectors are vector spaces themselves. For an eigenvector to be a vector space, the 0 vector would have to be an eigenvector, which doesn't hold.
Yes, but the definition of an eigenspace is the collection of eigenvectors associated with a particular eigenvalue, together with 0, and then such an eigenspace will be a subspace.
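Written out (a standard formulation, not a quote from any particular book): for an eigenvalue λ of A,

$$E_\lambda = \{\mathbf{v} : A\mathbf{v} = \lambda\mathbf{v}\} = \ker(A - \lambda I),$$

and a kernel is automatically a subspace, which is exactly why including the 0 vector makes everything tidy.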
Gaz031 · #8
Another short query would be:
Does it matter in which order you write the columns in an orthogonal matrix formed from normalised eigenvectors? I've just been adopting the approach of writing the eigenvector for the lowest eigenvalue first, and the book seems to do the same.
Thanks.
RichE · #9
(Original post by Gaz031)
Another short query would be:
Does it matter in which order you write the columns in an orthogonal matrix formed from normalised eigenvectors? I've just been adopting the approach of writing the eigenvector for the lowest eigenvalue first, and the book seems to do the same.
Thanks.
It only matters to the extent that the diagonal entries of P^T A P (the eigenvalues) appear in the same order as the corresponding unit eigenvectors in the columns of P.
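If you want to see this numerically, here's a minimal numpy sketch (the symmetric matrix A below is made up purely for illustration): swapping the columns of P swaps the diagonal entries of P^T A P.

```python
import numpy as np

# Made-up symmetric matrix with eigenvalues 1 and 3.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

eigvals, P = np.linalg.eigh(A)      # columns of P are unit eigenvectors
print(P.T @ A @ P)                  # approx diag(1, 3): matches column order

P_swapped = P[:, ::-1]              # reverse the eigenvector columns
print(P_swapped.T @ A @ P_swapped)  # approx diag(3, 1): diagonal reorders too
```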
RichE · #10
Also there may be times when you want P to represent a rotation of axes, in which case you want the columns to form a right-handed orthonormal basis, or, put more simply, want det P = +1.
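A hedged sketch of one common fix (same made-up matrix as above): if det P comes out as -1, negating one column keeps every column a unit eigenvector but flips the determinant to +1.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])          # same made-up symmetric matrix as above
_, P = np.linalg.eigh(A)

# -v is still a unit eigenvector for the same eigenvalue, so negating one
# column leaves P orthogonal but flips the sign of its determinant.
if np.linalg.det(P) < 0:
    P[:, 0] = -P[:, 0]
print(np.linalg.det(P))             # +1, so P now represents a rotation
```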
Gauss · #11
(Original post by RichE)
Yes, but the definition of an eigenspace is the collection of eigenvectors associated with a particular eigenvalue, together with 0, and then such an eigenspace will be a subspace.
Fair enough, I wasn't aware of that definition.

(Original post by AntiMagicMan)
... however, seeing as any vector gives a basis for a one-dimensional subspace ...
I just realised that statement is wrong. Any non-zero vector gives a basis for a one-dimensional subspace.

Galois.
Gaz031 · #12
(Original post by RichE)
It only matters to the extent that the diagonal entries of P^T A P (the eigenvalues) appear in the same order as the corresponding unit eigenvectors in the columns of P.
Thanks. I tried using an orthogonal matrix I generated and got a diagonalised matrix. I just wanted to check that it's okay generally.
Gaz031 · #13
Here's a query about the vector product:

I know it can be stated as:
a × b = |a| |b| sin t n^

Is it true that n^ is simply the unit vector in the direction of a × b?
Hence, if the magnitude of a × b is Y, can the equation be rewritten (taking magnitudes and dividing by Y) as:
1 = |a| |b| sin t (1/Y)
Hence: Y = |a| |b| sin t
sin t = Y/(|a| |b|) and t = arcsin(Y/(|a| |b|))?
RichE · #14
(Original post by Gaz031)
Here's a query about the vector product:

I know it can be stated as:
a × b = |a| |b| sin t n^

Is it true that n^ is simply the unit vector in the direction of a × b?
Hence, if the magnitude of a × b is Y, can the equation be rewritten (taking magnitudes and dividing by Y) as:
1 = |a| |b| sin t (1/Y)
Hence: Y = |a| |b| sin t
sin t = Y/(|a| |b|) and t = arcsin(Y/(|a| |b|))?
n^ is the unit vector which is perpendicular to both a and b, and such that {a, b, n^} is right-handed.
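As a numerical check of the working above (the vectors a and b here are made up for illustration), taking the magnitude of a × b = |a| |b| sin t n^ does recover the angle. One caveat: arcsin only ever returns angles up to 90°, so for possibly-obtuse angles the dot product with arccos is the safer route.

```python
import numpy as np

a = np.array([1.0, 0.0, 0.0])       # made-up example vectors
b = np.array([1.0, 1.0, 0.0])

cross = np.cross(a, b)
Y = np.linalg.norm(cross)           # Y = |a x b|
t = np.arcsin(Y / (np.linalg.norm(a) * np.linalg.norm(b)))
print(np.degrees(t))                # 45.0, the angle between a and b

n_hat = cross / Y                   # the unit normal, here (0, 0, 1)
print(n_hat)
```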
Gaz031 · #15
(Original post by RichE)
n^ is the unit vector which is perpendicular to both a and b, and such that {a, b, n^} is right-handed.
Right-handed?
jpowell · #16
(Original post by Galois)
I just realised that statement is wrong. Any non-zero vector gives a basis for a one-dimensional subspace.
That is just pedantic.
Gauss · #17
(Original post by AntiMagicMan)
That is just pedantic.
Mathematicians are born to be pedantic.
RichE · #18
(Original post by Gaz031)
Right handed?
Yes, so that they have the same orientation as i,j,k in that order, rather than i,k,j which is left-handed.

I guess you've realised there are two unit vectors normal to a and b. Well if you have the formula

a × b = |a| |b| sin@ n^

then the angle @ is measured anti-clockwise from a to b as viewed from the top of n^. (If you can picture that.)
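In symbols, the right-handed convention above is the familiar cycle

$$\mathbf{i}\times\mathbf{j}=\mathbf{k},\qquad \mathbf{j}\times\mathbf{k}=\mathbf{i},\qquad \mathbf{k}\times\mathbf{i}=\mathbf{j},$$

whereas swapping any pair, e.g. i × k = -j, is what the left-handed ordering produces.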
BCHL85 · #19
(Original post by RichE)
Yes, so that they have the same orientation as i,j,k in that order, rather than i,k,j which is left-handed.

I guess you've realised there are two unit vectors normal to a and b. Well if you have the formula

a × b = |a| |b| sin@ n^

then the angle @ is measured anti-clockwise from a to b as viewed from the top of n^. (If you can picture that.)
Yeah, it's similar to finding the direction of the magnetic field around a current, or you can find it from the direction a right-handed screw advances.
evariste · #20
(Original post by Galois)
Fair enough, I wasn't aware of that definition.

I just realised that statement is wrong. Any non-zero vector gives a basis for a one-dimensional subspace.

Galois.
The space generated by the zero vector, {0}, satisfies the conditions for a vector space, and the usual scalar identity axiom needs no special treatment: 1.v = v holds for every v in the space, since the only vector in the zero space is 0 itself and 1.0 = 0.