# Vector Spaces

#1
1. We know that the zero vector of a vector space V is the vector z such that v + z = v for all v in V.

Then, is it true that λz = z for any scalar λ? And if so, why is this true?

2. For a subset W of a vector space V to be a subspace, one of the conditions is that it must be a vector space in its own right, i.e. it must contain the zero vector. As W lies in V, the zero vector of W is the zero vector of V, right? So e.g. if W is a plane in R3, we have to show it contains the zero vector of R3, right? I.e. the plane must pass through the origin?

3. For a set X to span a vector space V, every vector in V must be expressible as a linear combination of the vectors in X.
If Ax = b admits at least one solution for every b, then every vector b can be written in at least one way as a linear combination of the vectors in X, and therefore X spans V. If Ax = b has infinitely many solutions for every b, then every vector b can be written as a linear combination of the vectors in X in infinitely many ways (and X still spans V). Conversely, a set does not span V if Ax = b has no solution for some b in Rn, i.e. at least some vectors cannot be written as a linear combination of the vectors in X (it doesn't span all of V, so the system Ax = b is inconsistent for some values of b?).
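As a quick numerical sketch of point 3 (using NumPy; the matrices here are made-up examples, with the vectors of X taken as the columns of A):

```python
import numpy as np

def spans_R_n(A):
    """Columns of A span R^n (n = number of rows) iff rank(A) == n."""
    return np.linalg.matrix_rank(A) == A.shape[0]

# Three columns that span R^3 (rank 3): Ax = b is solvable for every b.
A_span = np.array([[1.0, 0.0, 1.0],
                   [0.0, 1.0, 1.0],
                   [0.0, 0.0, 1.0]])

# Three columns whose span is only a plane (rank 2): Ax = b is
# inconsistent for some b, so the columns do not span R^3.
A_plane = np.array([[1.0, 0.0, 1.0],
                    [0.0, 1.0, 1.0],
                    [0.0, 0.0, 0.0]])

print(spans_R_n(A_span))   # True
print(spans_R_n(A_plane))  # False
```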

4. The definition of a basis: a subset B of a vector space V is a basis of V if and only if every v ∈ V can be expressed as a unique linear combination of the vectors in B.

Logically, is this simply because: 'linear combination' holds since B spans V, i.e. Lin(B) = V, and 'unique' holds because the vectors in B are linearly independent, i.e. there is only one linear combination giving a particular v ∈ V, since you cannot write any vector in B as a linear combination of the others, hence it is unique?
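And a sketch of point 4 (NumPy; B and v are made-up): when the columns of B form a basis, solving Bx = v finds the one and only coordinate vector of v:

```python
import numpy as np

# Columns of B form a (made-up) basis of R^2; v is a target vector.
B = np.array([[1.0, 1.0],
              [0.0, 2.0]])
v = np.array([3.0, 4.0])

# B is invertible (columns linearly independent and spanning), so
# Bx = v has exactly one solution: the unique coordinates of v in B.
coords = np.linalg.solve(B, v)
print(coords)  # [1. 2.], i.e. v = 1*(1,0) + 2*(1,2)
```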
Last edited by Chittesh14; 2 years ago
#2
1) Yes, just multiply the previous equation by lambda to get this.
2) Yes, it must contain the zero vector.
3) Be careful how you're defining the terms in Ax = b. Typically, the columns of A would be the vectors in the set X, x would be the vector of linear coefficients, and b the vector in V. But yes, you could imagine that there are fewer columns than rows, hence fewer x parameters, hence the system is potentially inconsistent.
4) Yes, a basis is linearly independent.

Last edited by mqb2766; 2 years ago
#3

We know that the zero vector z of a vector space V satisfies v + z = v for all v in V.
So is it like this:
1. Multiply both sides of v + z = v by λ:
λv + λz = λv, and so λz = λv - λv = z?

2. Thank you

3. Yes, sorry, I was meant to post a picture with it. Fair enough, but is it true that if free parameters exist, this means that each vector in V can be written as a linear combination of the vectors in the set X in infinitely many ways?

4. Thanks, but is the linearly independent part linked to the 'unique' part of the characterisation?
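For reference, the computation in point 1 written out (one valid route; the subtraction step uses the existence of additive inverses):

```latex
\lambda(v + z) = \lambda v
\;\Rightarrow\; \lambda v + \lambda z = \lambda v
\;\Rightarrow\; \lambda z = \lambda v - \lambda v = z .
```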
#4
1) Yes: λv is in V, so by the definition of z you have the result. The final subtraction is unnecessary; just use the definition.
3) Yes, if the set is linearly dependent, or there are free parameters, or ..., then v can be expressed in infinitely many ways. You'd have more columns than rows in A and there would be a nontrivial null space.
4) Yes, linear independence is equivalent to uniqueness (zero null space, ...).
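A small numerical illustration of 3) (NumPy; A, b, and n are made-up): with more columns than rows, a solution of Ax = b is never unique, because any null-space vector can be added to it:

```python
import numpy as np

# 2x3 system: more columns than rows, so the null space is nontrivial.
A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]])
b = np.array([2.0, 3.0])

x0 = np.linalg.lstsq(A, b, rcond=None)[0]  # one particular solution
n = np.array([1.0, 1.0, -1.0])             # a null-space vector: A @ n = 0

assert np.allclose(A @ n, 0)
for t in (0.0, 1.0, -2.5):                 # three of infinitely many solutions
    assert np.allclose(A @ (x0 + t * n), b)
```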

#5
Thank you, I appreciate your help. Also, by definition 0v = z (where 0 is the scalar), so in a linear combination, 0v1 + 0v2 + ... + 0vn = z + z + ... + z (n times) = z?
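Spelled out (a sketch of the sum in question; the last step uses z + z = z, which follows from taking v = z in the defining property v + z = v):

```latex
0v_1 + 0v_2 + \cdots + 0v_n = z + z + \cdots + z = z .
```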
#6
Yes, I suppose so.
#7
Thank you!