# Short eigenvector query.

This discussion is closed.

If I find an eigenvector to be

(2, -1, 0)^T

yet the answer in the back states it as

(-2, 1, 0)^T

is my answer still correct, as it is merely a scalar constant away? Would it matter which form you showed it in during an exam?

Basically, are you allowed to multiply an eigenvector by whatever value you like in order to put it in its simplest form?

#2

Any non-zero scalar multiple of an eigenvector is another eigenvector.

The eigenspace (in this case the line z = 0 = x + 2y) is unique though.
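This is easy to verify numerically. Below is a minimal pure-Python sketch; the 2x2 matrix A is a made-up example (not from the thread) chosen so that v = (2, -1) is an eigenvector with eigenvalue 3:

```python
def matvec(M, v):
    """Multiply a 2x2 matrix by a 2-vector."""
    return [M[0][0]*v[0] + M[0][1]*v[1],
            M[1][0]*v[0] + M[1][1]*v[1]]

A = [[2, -2],
     [-1, 1]]      # made-up example: eigenvalues 3 and 0
lam = 3
v = [2, -1]        # eigenvector for eigenvalue 3

for c in (1, -1, 0.5, 100):        # any non-zero scalar
    w = [c*x for x in v]           # w = c*v
    assert matvec(A, w) == [lam*x for x in w]   # A(cv) = lambda*(cv)
```

So (2, -1, 0) and (-2, 1, 0) span the same eigenspace; either is a correct answer unless the question fixes a particular normalisation.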

#3

(Original post by **RichE**)
Any non-zero scalar of an eigenvector is another eigenvector.
The eigenspace (in this case the line z=0=x+2y) is unique though.

I'm just introducing myself to matrices and they seem fairly interesting. I'll give you some reputation when I can.

#4

(Original post by **RichE**)
Any non-zero scalar of an eigenvector is another eigenvector.
The eigenspace (in this case the line z=0=x+2y) is unique though.

I thought eigenvectors must be non-trivial, i.e. not equal to the 0 vector!

#5

Erm. Eigenvectors are not vector spaces, however seeing as any vector gives a basis for a one-dimensional subspace it follows that an eigenvector has an associated vector subspace. And as RichE said, any **non-zero** scalar multiple of an eigenvector is another eigenvector.

#6

(Original post by **AntiMagicMan**)
Erm. Eigenvectors are not vector spaces, however seeing as any vector gives a basis for a one-dimensional subspace it follows that an eigenvector has an associated vector subspace. And as RichE said, any **non-zero** scalar multiple of an eigenvector is another eigenvector.

#7

(Original post by **Galois**)
Yes I know that, I thought that RichE was claiming that eigenvectors are vector spaces themselves. For an eigenvector to be a vector space the 0 vector must be an eigenvector, which doesn't hold.

#8

Another short query would be:

Does it matter in which order you write the columns in an orthogonal matrix formed from normalised eigenvectors? I've just been adopting the approach of writing the eigenvector from the lowest eigenvalue first, and the book seems to use the same one.

Thanks.

#9

(Original post by **Gaz031**)
Another short query would be:

Does it matter in which order you write the columns in an orthogonal matrix formed from normalised eigenvectors? I've just been adopting the approach of writing the eigenvector from the lowest eigenvalue first, and the book seems to use the same one.

Thanks.

#10

Also there may be times when you want P to represent a rotation of axes, in which case you want the columns to form a right-handed orthonormal basis, or, put more simply, you want det P = +1.
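The det P = +1 condition is cheap to check by hand or by machine. A pure-Python sketch (the basis vectors below are my own illustration, not from the thread): swapping two columns, or negating one, flips the sign of the determinant, so an orthonormal basis can always be re-ordered or re-signed to be right-handed.

```python
def det3(M):
    """Determinant of a 3x3 matrix (cofactor expansion along row 0)."""
    return (M[0][0]*(M[1][1]*M[2][2] - M[1][2]*M[2][1])
          - M[0][1]*(M[1][0]*M[2][2] - M[1][2]*M[2][0])
          + M[0][2]*(M[1][0]*M[2][1] - M[1][1]*M[2][0]))

def cols(*vs):
    """Build the matrix whose columns are the given vectors."""
    return [list(row) for row in zip(*vs)]

i, j, k = [1, 0, 0], [0, 1, 0], [0, 0, 1]
assert det3(cols(i, j, k)) == +1          # right-handed: P is a rotation
assert det3(cols(i, k, j)) == -1          # swap two columns: left-handed
assert det3(cols([-1, 0, 0], j, k)) == -1 # or negate one column
```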

#11

(Original post by **RichE**)
Yes, but the definition of an eigenspace is the collection of eigenvectors associated with a particular eigenvalue, together with 0, and then such an eigenspace will be a subspace.

Fair enough, I wasn't aware of that definition.

(Original post by **AntiMagicMan**)
... however seeing as any vector gives a basis for a one dimensional subspace ...

I just realised that statement is wrong. Any **non-zero** vector gives a basis for a one-dimensional subspace.

Galois.

#12

(Original post by **RichE**)
It only matters to the extent that the diagonal entries in P^T A P will appear in the order that the unit eigenvectors were put into the columns of P.
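This can be checked numerically. Below is a pure-Python sketch with a made-up symmetric matrix (A, its eigenvalues 1 and 3, and the unit eigenvectors are my own example, not from the thread): changing the column order of P just permutes the diagonal of P^T A P.

```python
import math

def matmul(X, Y):
    """Matrix product of X (n x m) and Y (m x p)."""
    return [[sum(X[i][k]*Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

def T(X):
    """Transpose of a matrix given as a list of rows."""
    return [list(r) for r in zip(*X)]

A = [[2, 1],
     [1, 2]]                 # made-up symmetric example
s = 1 / math.sqrt(2)
u1 = [s, -s]                 # unit eigenvector, eigenvalue 1
u2 = [s,  s]                 # unit eigenvector, eigenvalue 3

for order, expected in (((u1, u2), [1, 3]), ((u2, u1), [3, 1])):
    P = T([list(order[0]), list(order[1])])   # eigenvectors as columns
    D = matmul(matmul(T(P), A), P)            # P^T A P
    diag = [round(D[0][0]), round(D[1][1])]
    assert diag == expected   # diagonal follows the column order
```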

#13

Here's a query about the vector product:

I know it can be stated as:

a x b = |a| |b| sin t n^

Is it true that n^ is simply the unit vector corresponding to a x b?

Hence, if the magnitude of a x b is Y, can the equation be rewritten as:

1 = |a| |b| sin t (1/Y)

Hence: Y = |a| |b| sin t,

sin t = Y/(|a| |b|) and t = arcsin(Y/(|a| |b|))?
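The chain of steps does check out numerically, with one caveat worth flagging: arcsin of a non-negative input returns an angle in [0, 90 degrees], so t = arcsin(Y/(|a| |b|)) recovers the true angle only when it is acute, since sin(180 - t) = sin t. A quick pure-Python check with made-up vectors:

```python
import math

def cross(a, b):
    """Vector product of two 3-vectors."""
    return [a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0]]

def norm(v):
    return math.sqrt(sum(x*x for x in v))

a = [1, 0, 0]
b = [1, 1, 0]                       # 45 degrees from a
Y = norm(cross(a, b))               # Y = |a x b|
n = [x / Y for x in cross(a, b)]    # n^ = (a x b) / |a x b|
assert n == [0, 0, 1]               # the unit vector along a x b

t = math.asin(Y / (norm(a) * norm(b)))
assert abs(t - math.pi/4) < 1e-12   # recovers the acute angle

# Caveat: for an obtuse angle the same formula returns pi - t.
c = [-1, 1, 0]                      # 135 degrees from a
t2 = math.asin(norm(cross(a, c)) / (norm(a) * norm(c)))
assert abs(t2 - math.pi/4) < 1e-12  # NOT 3*pi/4
```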

#14

(Original post by **Gaz031**)
Here's a query about the vector product:

I know it can be stated as:

a x b = |a| |b| sin t n^

Is it true that n^ is simply the unit vector corresponding to a x b?

Hence, if the magnitude of a x b is Y, can the equation be rewritten as:

1 = |a| |b| sin t (1/Y)

Hence: Y = |a| |b| sin t,

sin t = Y/(|a| |b|) and t = arcsin(Y/(|a| |b|))?

#15

(Original post by **RichE**)
n^ is the unique vector which is perpendicular to a, b and such that {a, b, n^} is right-handed.

#16

(Original post by **Galois**)
I just realised that statement is wrong. Any **non-zero** vector gives a basis for a one-dimensional subspace.

#17

(Original post by **AntiMagicMan**)
That is just pedantic.

#18

(Original post by **Gaz031**)
Right handed?

Yes, so that they have the same orientation as i, j, k in that order, rather than i, k, j, which is left-handed.

I guess you've realised there are two unit vectors normal to a and b. Well, if you have the formula

a x b = |a| |b| sin@ n^

then the angle @ is measured anti-clockwise from a to b as viewed from the top of n^. (If you can picture that.)
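The orientation convention can also be tested numerically: {a, b, n^} is right-handed exactly when the scalar triple product (a x b) . n^ is positive, which always holds when n^ = (a x b)/|a x b|. A pure-Python sketch with made-up vectors:

```python
import math

def cross(a, b):
    """Vector product of two 3-vectors."""
    return [a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0]]

def triple(a, b, c):
    """Scalar triple product (a x b) . c = det of the matrix [a b c]."""
    axb = cross(a, b)
    return sum(axb[i]*c[i] for i in range(3))

a = [2, 0, 1]
b = [0, 3, 1]                        # made-up, non-parallel vectors
Y = math.sqrt(sum(x*x for x in cross(a, b)))
n = [x / Y for x in cross(a, b)]     # n^ = (a x b) / |a x b|

assert triple(a, b, n) > 0                      # {a, b, n^} right-handed
assert triple(a, b, [-x for x in n]) < 0        # the other normal: left-handed
```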

#19

(Original post by **RichE**)
Yes, so that they have the same orientation as i, j, k in that order, rather than i, k, j, which is left-handed.

I guess you've realised there are two unit vectors normal to a and b. Well, if you have the formula

a x b = |a| |b| sin@ n^

then the angle @ is measured anti-clockwise from a to b as viewed from the top of n^. (If you can picture that.)

#20

(Original post by **Galois**)
Fair enough, I wasn't aware of that definition.

I just realised that statement is wrong. Any **non-zero** vector gives a basis for a one dimensional subspace.

Galois.

**0** satisfies the conditions for a vector space with the proviso that we ignore the usual scalar identity element being "1",

since 1.**v** = **v**.1 = **v** for all **v** in a vector space V is the definition of the scalar identity, and we can replace 1 by 0 in the 0-space as we have 0.**0** = **0**.0 = **0**,

i.e. for all **v** in the 0-space, 0.**v** = **v**.0 = **v**

(since the only vector in the 0-space is the zero vector).
