I'm doing a past paper (second year physics) and it's asking for the gradient of the Cartesian coordinate $z$ in terms of $\hat{r}$ and $\hat{\theta}$, i.e. express $\nabla z$ in spherical polar coordinates.
Okay, so I attempted to use the definition of the gradient in spherical coordinates:
$$\nabla z = \frac{\partial z}{\partial r}\,\hat{r} + \frac{\partial z}{\partial \theta}\,\hat{\theta} + \frac{\partial z}{\partial \phi}\,\hat{\phi}$$
Where $z = r\cos\theta$ and
$$\nabla = \hat{r}\,\frac{\partial}{\partial r} + \hat{\theta}\,\frac{\partial}{\partial \theta} + \hat{\phi}\,\frac{\partial}{\partial \phi}$$
(I assume this is correct for spherical - it works in Cartesian at least!)
But this doesn't give the correct result when applied to the vector given in the question.
After a bit of playing around with it I realised that it works if
$$\nabla = \hat{r}\,\frac{\partial}{\partial r} + \frac{\hat{\theta}}{r}\,\frac{\partial}{\partial \theta} + \frac{\hat{\phi}}{r\sin\theta}\,\frac{\partial}{\partial \phi}$$
so that $\nabla z = \cos\theta\,\hat{r} - \sin\theta\,\hat{\theta}$.
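A quick sanity check of that version - the one with the $1/r$ and $1/(r\sin\theta)$ scale factors - using sympy, just applying it to $z = r\cos\theta$ (the variable names here are my own):

```python
# Sanity check: apply the spherical gradient, WITH the 1/r and
# 1/(r sin(theta)) scale factors, to z = r*cos(theta).
import sympy as sp

r, th, ph = sp.symbols('r theta phi', positive=True)
z = r * sp.cos(th)

# Components of grad z in the (r_hat, theta_hat, phi_hat) basis:
grad_r  = sp.diff(z, r)                         # d z / d r
grad_th = sp.diff(z, th) / r                    # (1/r) d z / d theta
grad_ph = sp.diff(z, ph) / (r * sp.sin(th))     # (1/(r sin th)) d z / d phi

print(grad_r, grad_th, grad_ph)
# i.e. grad z = cos(theta) r_hat - sin(theta) theta_hat, which is exactly z_hat
```

So the scale-factored operator really does reproduce the mark-scheme answer.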
And this gives the correct value as provided in the mark scheme. I emailed my lecturer and he replied, pointing me at the "function of a function" (chain) rule.
But taking what he said about the function-of-a-function rule, I deduce that
$$\nabla f = \frac{\partial f}{\partial r}\,\nabla r + \frac{\partial f}{\partial \theta}\,\nabla\theta + \frac{\partial f}{\partial \phi}\,\nabla\phi$$
If I substitute $\mathrm{d}f$, $\mathrm{d}r$ etc. for $\nabla f$, $\nabla r$ etc. then it seems to work, but that doesn't feel 'right', as I'd have to define an infinitesimal change in these quantities to be equal to their gradient... which just sounds plain wrong to me.
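One reason the $\mathrm{d} \leftrightarrow \nabla$ swap "works": $\nabla r$, $\nabla\theta$ and $\nabla\phi$ (computed the ordinary Cartesian way) have magnitudes $1$, $1/r$ and $1/(r\sin\theta)$ - exactly the scale factors above. A quick numerical check with central finite differences (helper names are my own):

```python
import math

# r, theta, phi as functions of Cartesian (x, y, z)
def spherical(x, y, z):
    r = math.sqrt(x*x + y*y + z*z)
    return r, math.acos(z / r), math.atan2(y, x)

def num_grad(f, p, h=1e-6):
    """Cartesian gradient of f at point p by central differences."""
    g = []
    for i in range(3):
        q1, q2 = list(p), list(p)
        q1[i] += h
        q2[i] -= h
        g.append((f(*q1) - f(*q2)) / (2 * h))
    return g

# An arbitrary sample point given in spherical form
r0, th0, ph0 = 2.0, 0.7, 1.1
p = (r0 * math.sin(th0) * math.cos(ph0),
     r0 * math.sin(th0) * math.sin(ph0),
     r0 * math.cos(th0))

grad_r  = num_grad(lambda x, y, z: spherical(x, y, z)[0], p)
grad_th = num_grad(lambda x, y, z: spherical(x, y, z)[1], p)
grad_ph = num_grad(lambda x, y, z: spherical(x, y, z)[2], p)

# Magnitudes: approximately 1, 1/r0, 1/(r0*sin(th0))
print([math.hypot(*g) for g in (grad_r, grad_th, grad_ph)])
```

Since $\nabla r = \hat{r}$, $\nabla\theta = \hat{\theta}/r$ and $\nabla\phi = \hat{\phi}/(r\sin\theta)$, the lecturer's formula is just the scale-factored gradient in disguise.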
So, anyone got any ideas about how this might be resolved?
Thanks for taking the time to read this - please don't let my beautiful latex-ing go to waste!
(oh and apparently this question is supposed to take under 5 minutes)
Gradient of unit vectors in Spherical Coords
22-07-2016
(Original post by Jonny W)
Your lecturer's (vector) formula
$$\nabla f = \frac{\partial f}{\partial r}\,\nabla r + \frac{\partial f}{\partial \theta}\,\nabla\theta + \frac{\partial f}{\partial \phi}\,\nabla\phi$$
is really, taking a component at a time, three scalar formulas in one:
$$\frac{\partial f}{\partial x} = \frac{\partial f}{\partial r}\frac{\partial r}{\partial x} + \frac{\partial f}{\partial \theta}\frac{\partial \theta}{\partial x} + \frac{\partial f}{\partial \phi}\frac{\partial \phi}{\partial x}$$
and similarly for $y$ and $z$.
Those all look to me like applications of the chain rule.
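For what it's worth, the $z$-component of that chain-rule reading can be checked symbolically, using the standard results $\partial r/\partial z = \cos\theta$ and $\partial\theta/\partial z = -\sin\theta/r$ (variable names are my own):

```python
import sympy as sp

r, th, ph = sp.symbols('r theta phi', positive=True)
f = r * sp.cos(th)  # f is just the Cartesian coordinate z written in sphericals

# Standard results for the partials of (r, theta, phi) with respect to z,
# expressed back in spherical form:
dr_dz, dth_dz, dph_dz = sp.cos(th), -sp.sin(th) / r, 0

# Chain rule: df/dz = (df/dr)(dr/dz) + (df/dth)(dth/dz) + (df/dph)(dph/dz)
df_dz = (sp.diff(f, r) * dr_dz + sp.diff(f, th) * dth_dz
         + sp.diff(f, ph) * dph_dz)

print(sp.simplify(df_dz))  # prints 1, as it must, since f = z
```

The $\cos^2\theta + \sin^2\theta = 1$ that drops out is exactly why the "wrong-looking" substitution gave the right answer.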
+rep heading your way.