Hehe, that spoiler was more for David to look at, to see what had slipped my mind. I know my explanation was bad; I don't really grasp this well enough myself to be confident with it.
$\nabla$ is the same as grad (it's an operator, not the derivative).
$\delta$ is just a symbol, but usually used to describe a small increase in something (in the same way $\Delta$ or $\mathrm{d}$ are), or occasionally an area.
edit: $\cdot$ should be it. Apologies, as I can't get the $c \cdot T$ statement to show up in LaTeX.

nota bene · #41 · 11-03-2008 23:18

insparato · #42 · 11-03-2008 23:18

#43 · 11-03-2008 23:25
(Original post by Zhen Lin)
Well, { i, j, k } form a basis for 3D space, but not for the reasons you describe. A basis is a linearly independent set of vectors (that is, there are no nontrivial solutions for the scalars $a_1, \dots, a_n$ in the equation $a_1 v_1 + \dots + a_n v_n = 0$, where $v_1, \dots, v_n$ are the elements in the set) that spans the space; that is, all vectors in the space are linear combinations of those basis vectors.
(Original post by generalebriety)
A more intuitive idea of a basis than Zhen's explanation above is a set of vectors (think about our 'space' as normal 3D space if it helps) that you can make any vector in your space out of (i.e. they're a spanning set) and where you have no superfluous vectors (i.e. they're not linearly dependent).
(Original post by generalebriety)
A set of vectors is linearly dependent if there's one of them you can remove without losing the ability to 'reach' any vector in the space.
(Original post by generalebriety)
(This obviously places an upper bound on the number of vectors you can have: if you take four vectors that span 3D space, one of them is always superfluous, because you can always remove one and make any vector you like out of the other three.) A set of vectors is a spanning set if you can make any vector in the space you like by scaling vectors in that set and adding them together (this obviously places a lower bound on the number of vectors you can have: if you take two linearly independent vectors in 3D, there'll always be some vectors you can't make, e.g. the vector defined by their cross product). A set of vectors is a basis for a space if they are linearly independent and span the space.
So how do you know that a (number of) vectors span a space??
Is there a book somewhere, with numerical examples in, that I could look at? I haven't had anything to think about that's a concrete example that I can work through myself to see if I've got my head around this. Sorry. I'm probably being a bit slow, but I think I understand most of the idea but I don't quite know it as I haven't tried it out for myself. 
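For a concrete numerical example of the determinant test for a basis, here is a sketch of my own using NumPy (the specific vectors are made up for illustration, not from the thread):

```python
import numpy as np

# Three candidate vectors for 3D space (chosen for illustration).
v1 = np.array([1.0, 0.0, 1.0])
v2 = np.array([0.0, 1.0, 1.0])
v3 = np.array([1.0, 1.0, 0.0])

# Stack them as the columns of a matrix. A nonzero determinant
# means the columns are linearly independent, so they span R^3
# and form a basis.
M = np.column_stack([v1, v2, v3])
print(np.linalg.det(M))  # about -2.0: nonzero, so {v1, v2, v3} is a basis

# A dependent set: the third vector is v1 + v2, so the
# determinant is 0 and the three vectors only span a plane.
N = np.column_stack([v1, v2, v1 + v2])
print(np.linalg.det(N))  # 0.0 (up to rounding)
```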
#44 · 11-03-2008 23:30
(Original post by grace_)
What's a trivial solution? One where all the relevant values are zero?
How does this tie in to a set of equations (vectors?) being linearly dependent if the determinant of the matrix (presumably the matrix consisting of these vectors) is zero?
So how do you know that a (number of) vectors span a space?? 
#45 · 11-03-2008 23:32
So if I had a scalar function $f(x, y, z)$, then
$\nabla f = \left(\dfrac{\partial f}{\partial x}, \dfrac{\partial f}{\partial y}, \dfrac{\partial f}{\partial z}\right)$
but what is this thing, as a physical idea? Is it the gradient of the function f(x, y, z) in some sense?
insparato · #46 · 11-03-2008 23:35
Yes, if the given function f(x, y, z) represents a scalar field. So if you plop numbers in for x, y, z, you will get a scalar value. Grad f is essentially the gradient.
I think it's easier to understand if you deal with scalar and vector fields in terms of physical things.
A scalar field is one where a scalar value is associated with every point in space.
So take a room with a radiator: different parts of the room are hotter than others. The room represents space, and different points in the room correspond to different temperatures (the scalar values).
Now, the gradient of a scalar field (grad f):
Take the room again, and pick a point in the room. The gradient at this point shows you the direction in which the temperature rises quickest (grad f is a vector, and it always points in the direction where the scalar values rise quickest). The magnitude of the gradient shows you how fast the temperature rises.
http://en.wikipedia.org/wiki/Image:Gradient2.svg This should help.
I think Billy might have my nuts off; I've diverged pretty far from the discussion, but if you're interested you can PM me, or one of the others (probably better), to find out more. 
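The radiator picture above can be made concrete with a small numerical sketch (my own example; the temperature field T and all the numbers are invented for illustration):

```python
import numpy as np

# A made-up temperature field for the room: hottest at the
# origin (the "radiator") and falling off with distance.
def T(x, y, z):
    return 100.0 * np.exp(-(x**2 + y**2 + z**2))

# Numerical gradient of T at a point, via central differences.
def grad_T(p, h=1e-6):
    p = np.asarray(p, dtype=float)
    g = np.zeros(3)
    for i in range(3):
        step = np.zeros(3)
        step[i] = h
        g[i] = (T(*(p + step)) - T(*(p - step))) / (2 * h)
    return g

# At (1, 0, 0) the gradient points in the -x direction: back toward
# the radiator, the direction in which temperature rises quickest.
print(grad_T([1.0, 0.0, 0.0]))  # roughly [-73.6, 0, 0]
```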
DFranklin · #47 · 11-03-2008 23:35
(Original post by grace_)
I'm glad someone can understand this! 
#48 · 11-03-2008 23:36
(Original post by Zhen Lin)
The minimum number of vectors you need to span a space is the dimension of the space. To prove that a set of vectors spans a space... well, if the number of vectors is equal to the dimension, then you can use the matrix determinant. If not, there's something called row reduction you can use to find out which ones are linearly independent and how many.
So if you need three linearly independent vectors to reach any given point in the space (and exactly three, no more and no fewer), then you have a three-dimensional space?
I think I remember doing something about row reduction when learning about finding the determinant of a matrix: you can add/subtract any multiple of a row from any other row without changing the value of the determinant? Is that a similar idea to being able to see whether one of the vectors you've got is linearly dependent on the others, or not? 
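The row-reduction idea can be tried on a small example (my own, using NumPy; `np.linalg.matrix_rank` is a shortcut for the counting of independent rows described above):

```python
import numpy as np

# Row-reduce by adding multiples of one row to another
# (which, as noted above, doesn't change the determinant).
A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0],   # = 2 * row 1, so the rows are dependent
              [0.0, 1.0, 1.0]])

A[1] -= 2 * A[0]   # row 2 becomes all zeros
print(A)
# The zero row exposes the dependence: only 2 independent rows remain.
print(np.linalg.matrix_rank(A))  # 2
```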
#49 · 11-03-2008 23:38
Correct: a smallest spanning set is a largest linearly independent set and is a basis, so it has as many vectors as there are dimensions.

nota bene · #50 · 11-03-2008 23:40
(Original post by grace_)
What's a trivial solution? One where all the relevant values are zero?
The cross product of two vectors gives you a vector which is perpendicular to the original two, is that correct? So you wouldn't be able to reach that vector by combining (adding multiples of) the original 2 vectors in 3D space.
Is there a book somewhere, with numerical examples in, that I could look at? I haven't had anything to think about that's a concrete example that I can work through myself to see if I've got my head around this. Sorry. I'm probably being a bit slow, but I think I understand most of the idea but I don't quite know it as I haven't tried it out for myself.
You're not being slow! You grasp things in this thread in 24 hrs that take me at least a week to understand. 
#51 · 11-03-2008 23:45
(Original post by grace_)
The cross product of two vectors gives you a vector which is perpendicular to the original two, is that correct? So you wouldn't be able to reach that vector by combining (adding multiples of) the original 2 vectors in 3D space. 
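The perpendicularity claim quoted above checks out numerically; a quick NumPy sketch of my own (vectors chosen arbitrarily):

```python
import numpy as np

# The cross product of two 3D vectors is perpendicular to both,
# so dotting it with either original vector gives 0.
a = np.array([1.0, 2.0, 3.0])
b = np.array([4.0, 0.0, -1.0])
c = np.cross(a, b)

print(c)              # [-2. 13. -8.]
print(np.dot(c, a))   # 0.0
print(np.dot(c, b))   # 0.0
```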
#52 · 11-03-2008 23:46
Sorry to bother you all, but we've all been using this word "span" and I have no real idea what it means?? How do you know if a set of vectors spans a space?

#53 · 11-03-2008 23:51
(Original post by DFranklin)
Hey, I didn't start it. Blame nota...
(Grace isn't exactly a normal GCSE student either...)
(Original post by nota bene)
That's right, I'll take the blame!
This time we littered a uni student's thread with something relatively relevant to the topic (at least we're still discussing eigenvalues). And as said, Grace isn't the average student in Year 11... Billy, will you let us all off? 
nota bene · #54 · 11-03-2008 23:51
(Original post by DFranklin)
Though I think nota might be on his own on this one. I don't understand it... (Though I think as much as anything this is because the discussion is scattered over about 20 posts. I looked at it, thought, "what's h supposed to be again?", scrolled up to the top and couldn't see it in any post and lost the will to carry on). 
nota bene · #55 · 11-03-2008 23:54
edit: Sorry to bother you all, but we've all been using this word "span" and I have no real idea what it means?? How do you know if a set of vectors spans a space? 
#56 · 12-03-2008 00:05
I apologise for being pedantic, but:
1. A 2D vector is not a 3D vector. That is to say, $(x, y)$ and $(x, y, 0)$ are not the same kind of vector.
2. You can't take the cross product of 2D vectors. The cross product is only defined in $\mathbb{R}^3$. 
#57 · 12-03-2008 00:09
Um.
1. So the span of a set of vectors is basically the set of points that can be reached using linear combinations of those vectors?
2. So a set of vectors spans a space if you can reach any point in the space using a linear combination of those vectors?
3. If you have the minimum number of vectors in that set, then they form a basis for the space?
4. The basis is orthogonal if all those vectors are perpendicular to one another.
5. The basis is orthonormal if all those vectors are perpendicular to one another and have length 1.
6. "(Three) vectors are linearly independent" means that you cannot make the third vector by adding together (multiples of) the first two.
Corrections? 
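Points 2, 4 and 5 above can be tested numerically; here is a sketch of my own (the basis vectors are an arbitrary rotated orthonormal set, not from the thread):

```python
import numpy as np

s = 1 / np.sqrt(2)
e1 = np.array([s, s, 0.0])
e2 = np.array([s, -s, 0.0])
e3 = np.array([0.0, 0.0, 1.0])
B = [e1, e2, e3]

# Orthonormal: dot products are 1 on the diagonal (unit length)
# and 0 off it (mutually perpendicular).
for i in range(3):
    for j in range(3):
        expected = 1.0 if i == j else 0.0
        assert np.isclose(np.dot(B[i], B[j]), expected)

# Because B spans R^3, any vector is a linear combination of it;
# for an orthonormal basis the coefficients are just dot products.
v = np.array([4.0, -2.0, 7.0])
coeffs = [np.dot(v, e) for e in B]
rebuilt = sum(c * e for c, e in zip(coeffs, B))
print(rebuilt)  # recovers [ 4. -2.  7.]
```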
nota bene · #58 · 12-03-2008 00:13
(Original post by Zhen Lin)
I apologise for being pedantic
edit: Your 6 points above all seem correct, Grace. 
generalebriety · #59 · 12-03-2008 03:05
(Original post by grace_)
How does this tie in to a set of equations (vectors?) being linearly dependent if the determinant of the matrix (presumably the matrix consisting of these vectors) is zero?
I'll assume throughout that M is 3x3. It's quite easy to show that if a matrix M = (c_{1} c_{2} c_{3}) is made of three column vectors, with components $c_1 = (x_1, x_2, x_3)^T$, $c_2 = (y_1, y_2, y_3)^T$, $c_3 = (z_1, z_2, z_3)^T$, then adding a multiple of one column to another won't change the determinant. So, for example,
$\det(c_1\ c_2\ c_3) = \det\big((c_1 - 3c_2)\ c_2\ c_3\big).$
You can also swap columns without affecting the determinant much (i.e. the magnitude of the determinant stays the same, but each time you swap two columns the determinant gains a minus sign).
It's also obvious that if M has a zero column, the determinant will be zero. So if you can add together linear combinations of the three column vectors to get the zero vector (this is the definition of "linearly dependent"), you can do this column reduction to M and get a zero column, and the determinant is zero. A similar argument can be applied to the rows, but the following is much more interesting and important.
Let's assume we've done this 'column reduction' with a specific aim in mind: we're gonna add multiples of c_2 and c_3 to the first column to try and get a column of zeros. If we can't do that, we're gonna do the best we can and try and just leave one number in the top left corner.
(We can always get at least two zeros. Why? Well, consider (x_2, x_3), (y_2, y_3), (z_2, z_3) informally as 2D vectors. There's obviously one superfluous one, so we can always take two of them and make the third out of those two; i.e. a linear combination of them will always give the zero vector. Just do this to the bottom six elements of the matrix, and ignore what happens to the top three while you're doing so.)
Then, after doing that, we're gonna add multiples of (the new) c_1 and c_3 to the second column to get as many zeros as we can. (We can always get at least one zero by the same argument as above: we have (y_3) and (z_3), two one-dimensional column vectors, and it's obvious that you can multiply z_3 by something to get y_3.) The type of matrix we get at the end we call 'upper triangular', and you can see why: the lower-left three elements of the matrix (which form a triangle) are zero. Now, count the number of nonzero columns; this is called the column rank of M. If there's at least one nonzero number in each column, its rank is 3; if there's one zero column, its rank is 2; if there are two zero columns, its rank is 1; if there are three zero columns, its rank is 0. (See examples of column reduction in the spoiler below.)
Spoiler:
In the first transformation, I've subtracted the second column from the first; then in the second transformation, I've subtracted the third column from the first. This leaves me my three zeroes in the bottom left. Upper triangular form. No zero columns: column rank = 3.
I'm sure you can work out the transformations yourself. One zero column, so column rank = 2.
I'll leave you to do the column reduction. (Note that the second column is a multiple of the first.) Column rank = 1.
Column rank = 0. For obvious reasons.
Now, if we do the same thing to the rows (just using row reduction), we also get a row rank. The amazing thing is that row rank is always equal to column rank, and this is called the rank of the matrix. The rank of the matrix is the number of linearly independent vectors (read as columns or rows, it doesn't matter) it contains; the rank is also therefore the dimension of the space that these vectors span. The determinant is therefore nonzero (and hence the matrix is invertible) if and only if the rank of the matrix is the dimension of the space it's working in.
We can also define the kernel of the matrix M as the set of all vectors v with Mv = 0, and this defines a legitimate subspace of R^3 or whatever space we're working in. (By "subspace", I just mean it defines a nice point or line or plane or other 'nice' space through the origin; rather than there just being a few vectors dotted around R^3 which satisfy Mv = 0.) As this is a subspace, we can talk about its dimension (i.e. the dimension of the space that the vectors in the kernel span), and this is called the nullity. In addition to our stuff above, we also have the rank-nullity theorem operating, which states that the rank of a matrix plus its nullity is the dimension of the space (e.g. 3 here).
Example:
Spoiler:
has rank 2. So by the rank-nullity theorem, 2 + nullity = 3, and so nullity = 1, so the kernel has dimension 1 (i.e. is a line). Which line? Well, I said before it had to go through the origin (because M0 = 0...). By looking at the column reduction we did, or solving simultaneous equations, or guessing, you can also work out that (1, 1, 1) is in the kernel. But then if M(1, 1, 1) is zero, so is M times any multiple of (1, 1, 1) (note: including 0 times it, which is just 0). So the kernel is the line in the direction (1, 1, 1) through 0.
Take the rank 1 matrix we had earlier in the previous spoiler; nullity = 2. Its kernel contains (0, 0, 1) and (1, 2, 0) (again by guessing, or more sophisticated stuff), because if you multiply the matrix by either of these you get zero. These define a plane.
Take the rank 0 matrix we had earlier (the zero matrix). Its kernel contains (1, 0, 0), (0, 1, 0), (0, 0, 1), which form a basis for R^3, so the kernel is all of R^3.
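The rank-nullity bookkeeping can be checked numerically too. A sketch of my own (this matrix is hypothetical, chosen so that (1, 1, 1) spans its kernel, since the thread's original matrices didn't survive the scrape):

```python
import numpy as np

# A hypothetical 3x3 matrix with (1, 1, 1) in its kernel:
# row 3 = row 1 + row 2, so the rows are dependent.
M = np.array([[1.0, -1.0,  0.0],
              [0.0,  1.0, -1.0],
              [1.0,  0.0, -1.0]])

rank = np.linalg.matrix_rank(M)
print(rank)  # 2, so by rank-nullity: nullity = 3 - 2 = 1 (the kernel is a line)

# Verify that the line through (1, 1, 1) is killed by M:
v = np.array([1.0, 1.0, 1.0])
print(M @ v)        # [0. 0. 0.]
print(M @ (5 * v))  # any multiple of v is also in the kernel
```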
In summary, if it's a 3x3 matrix, then the following statements are all equivalent:
 the rank is 3
 the determinant isn't 0
 the matrix is invertible
 the three column vectors that form the matrix are linearly independent and spanning / a basis for R^3
 the three row vectors that form the matrix are linearly independent and spanning / a basis for R^3
 the kernel is 0dimensional, and therefore contains only the 0 vector.
[/geekery]
Hope I haven't made any mistakes there...
Edit 10000: Incidentally, if you look on MIT OpenCourseWare (google it; videoed lectures, basically), they have a good (fairly slow-paced) linear algebra course which explains this stuff over about 20 hours, broken up into lots of small lectures. Have a look if you're interested. It goes into quite a bit of detail, a bit more than my vectors and matrices course did last term. Ok, should go to bed now: last day of lectures tomorrow. Yawn... 
generalebriety · #60 · 12-03-2008 03:15
(Original post by insparato)
I think Billy might have my nuts off; I've diverged pretty far from the discussion, but if you're interested you can PM me, or one of the others (probably better), to find out more.
That said, I've just realised grace_ is a year 11 student, and so a lot of my above post might be completely inappropriate. Ah well... I think I've tried to give a fairly intuitive approach rather than a stone cold hard theory approach, so hopefully she'll understand. Might make a few additions, though.
(After all, intuition is what maths is about. Proof is, in the end, not what gets results; it's a formality that mathematicians are particularly proud of, but no mathematician seriously thinks in epsilons and deltas naturally unless they have to...)