
The Proof is Trivial!


Reply 2000
Original post by henpen
Solution 286
For even n,

\displaystyle \prod_{m=1}^{2r}\Gamma \left(\frac{m}{2r+1} \right)=\prod_{m=1}^{r}\frac{\pi}{\sin \left(\frac{\pi m}{2r+1} \right)}= \frac{\pi^r 2^{r}}{\sqrt{2r+1}}

(pairing m with 2r+1-m via the reflection formula \Gamma(z)\Gamma(1-z)=\frac{\pi}{\sin(\pi z)}, then using \prod_{m=1}^{r}\sin \left(\frac{\pi m}{2r+1} \right)=\frac{\sqrt{2r+1}}{2^r} for the sine product)



For odd n, I'll try to generalise 24 later, but I need to sleep now.

Well done man, I'll say that was blagged! Once you do the odd case, you will realize that it is essentially the same as the even case!
Hence, we have: \displaystyle \prod_{k=1}^{m-1} \Gamma \left( \frac{k}{m} \right) = \frac{(2 \pi)^{\frac{m-1}{2}}}{\sqrt{m}}, which implies that the answer was \frac{(2 \pi)^{\frac{9}{2}}}{\sqrt{10}},
which I suppose can be seen as yet another reason to introduce \tau as being equal to 2 \pi.

Does anyone have any strong views on this? Is \pi more arbitrary than \tau?

This is essentially a consequence of the multiplication theorem in the case where we let z=\frac{1}{m}.
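
A quick numerical sanity check of that product identity (a sketch for the curious, not part of the original posts), using only the Python standard library:

```python
# Check prod_{k=1}^{m-1} Gamma(k/m) = (2*pi)^((m-1)/2) / sqrt(m) numerically.
from math import gamma, pi, prod, sqrt

for m in range(2, 12):
    lhs = prod(gamma(k / m) for k in range(1, m))
    rhs = (2 * pi) ** ((m - 1) / 2) / sqrt(m)
    assert abs(lhs - rhs) < 1e-9 * rhs
```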
Original post by Jkn
Does anyone have any strong views on this? Is \pi more arbitrary than \tau?


The only place I've seen where pi was more natural than tau was in the integral from -inf to inf of exp(-x^2). Then we have sqrt(pi).
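
For the curious, a two-line numerical check of that integral (not from the original posts; assumes scipy is available):

```python
# Verify that the integral of exp(-x^2) over the real line equals sqrt(pi).
import numpy as np
from scipy.integrate import quad

value, _ = quad(lambda x: np.exp(-x * x), -np.inf, np.inf)
assert abs(value - np.sqrt(np.pi)) < 1e-10
```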
Reply 2002
Original post by Smaug123
The only place I've seen where pi was more natural than tau was in the integral from -inf to inf of exp(-x^2). Then we have sqrt(pi).

'The only place'? There are hundreds! The most convincing is probably that it's the angle in a straight line (which, to me, seems less arbitrary than the angle in a circle). :tongue:

Problem 290*

Evaluate \displaystyle \cos^2 \alpha + \cos^2 \beta + \cos^2 \gamma in the case where \alpha, \beta, \gamma are, in 3-dimensional Euclidean space, the angles respectively between the line connecting the origin with any arbitrary point and any arbitrarily chosen orthonormal basis.

Show that this can be seen as a generalization of Pythagoras' Theorem (or, equivalently, a means of alternative proof).

Can the result be generalized to n-dimensional Euclidean space where the co-ordinates lie in \mathbb{R}^n? (bonus-only ***)
Original post by Jkn
'The only place'? There are hundreds! The most convincing is probably that it's the angle in a straight line (which, to me, seems less arbitrary than the angle in a circle). :tongue:

They're both so un-arbitrary that the choice between them is arbitrary :tongue:

(quibble with the wording of your Problem 290: "any three orthonormal bases" should be "any three orthonormal vectors" or "any orthonormal basis")
Reply 2004
Original post by Smaug123
(quibble with the wording of your Problem 290: "any three orthonormal bases" should be "any three orthonormal vectors" or "any orthonormal basis")

Cheers. How is "orthonormal bases" not correct wording? I essentially just mean axes as the application of vectors to the problem is not a necessity. :smile:

Btw, why have you not been solving my problems? :colone:
Original post by Jkn
Cheers. How is "orthonormal bases" not correct wording? I essentially just mean axes as the application of vectors to the problem is not a necessity. :smile:

Btw, why have you not been solving my problems? :colone:


An orthonormal basis is a set of orthogonal unit vectors - so "any three arbitrarily chosen ON bases" means "any three sets of three orthogonal unit vectors". I understood what you meant, though :smile:

Bleugh, I've got homework :frown: I'll have a go at some :smile:
Original post by Jkn
What I've been doing in some cases is trying to think of some myself; often generalisations of other problems I have tried/found :smile: Such as...

Problem 96*

Let x, y and a be positive integers such that x^3+ax^2=y^2. Given that 1 \leq x \leq b where b is a positive integer, find, in terms of a and b, the number of possible pairs (x,y) that satisfy the equation.


Is this not just \lfloor \sqrt{a+b} \rfloor? The expression is x^2 (a+x) = y^2, so a+x is square; there are sqrt(a+b) square numbers available for the a+x term.
Not taken much brainpower over that, though, so I might be wrong :P
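
A brute-force counter for testing guesses like this one (a sketch, not from the thread; count_pairs is a hypothetical helper). It counts the x in [1, b] for which a + x is a perfect square, which is the condition derived above:

```python
# Count pairs (x, y) of positive integers with x^3 + a*x^2 = y^2, 1 <= x <= b.
# Since x^2 * (a + x) = y^2, this is the number of x in [1, b] making a + x
# a perfect square (each such x gives the single y = x * sqrt(a + x)).
from math import isqrt

def count_pairs(a: int, b: int) -> int:
    return sum(1 for x in range(1, b + 1) if isqrt(a + x) ** 2 == a + x)

print(count_pairs(3, 20))  # squares in (3, 23] are 4, 9, 16, so this prints 3
```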
Reply 2007
Original post by Jkn
Well done man, I'll say that was blagged!


Sort of, I can't seem to generalise result 24 when the '1006' is odd: the argument of \prod_{k=1}^{2r-1}e^{\frac{ik2 \pi}{2r}} isn't equal to 0, so the result isn't as simple.

Wait, I may have gotten that wrong earlier in my sleep-induced stupor, it may very well generalise.
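
A quick numerical look at that product (a sketch, not from the original post): its argument comes out as \pi rather than 0, since the exponents sum to i \pi (2r-1), an odd multiple of i \pi.

```python
# The product of exp(2*pi*i*k/(2r)) over k = 1, ..., 2r-1 equals -1 for all r,
# because the exponents sum to i*pi*(2r - 1), an odd multiple of i*pi.
import cmath

for r in range(1, 8):
    product = complex(1.0)
    for k in range(1, 2 * r):
        product *= cmath.exp(2j * cmath.pi * k / (2 * r))
    assert abs(product - (-1.0)) < 1e-9
```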
Reply 2008
Original post by Jkn

Problem 290*

Evaluate \displaystyle \cos^2 \alpha + \cos^2 \beta + \cos^2 \gamma in the case where \alpha, \beta, \gamma are, in 3-dimensional Euclidean space, the angles respectively between the line connecting the origin with any arbitrary point and any arbitrarily chosen orthonormal basis.

Show that this can be seen as a generalization of Pythagoras' Theorem (or, equivalently, a means of alternative proof).

Can the result be generalized to n-dimensional Euclidean space where the co-ordinates lie in \mathbb{R}^n? (bonus-only ***)


'Solution' 290

Take a vector such that |\mathbf{v}|=1. \cos(\alpha_1) is its projection onto the x_1-axis, \cos(\alpha_2) onto the x_2-axis, and so on.

Thus \sqrt{\cos^2( \alpha_1)+\cos^2( \alpha_2)} is its projection onto the x_1x_2-plane, and \sqrt{\cos^2( \alpha_1)+\cos^2( \alpha_2)+\cos^2( \alpha_3)} is its projection onto the x_1x_2x_3-'3D space' (what word do I use here, \mathbb{R}^3?), and \sqrt{\sum_{k=1}^n\cos^2( \alpha_k)} is its projection onto the x_1x_2...x_n-'nD space'. But this projection is a line of length 1, so \sqrt{\sum_{k=1}^n\cos^2( \alpha_k)}=1 and \sum_{k=1}^n\cos^2( \alpha_k)=1.

I hope you understood what I meant. Could you help formalise the bits where I was hopelessly off with regards to the mathematical language?
Reply 2009
Original post by Smaug123
An orthonormal basis is a set of orthogonal unit vectors - so "any three arbitrarily chosen ON bases" means "any three sets of three orthogonal unit vectors". I understood what you meant, though :smile:

Bleugh, I've got homework :frown: I'll have a go at some :smile:


Can you not also have \sin(kx),\cos(kx), k \in \mathbb{N} as orthonormal basis vectors? I hear that definition's got something to do with them being orthogonal under integration (between certain limits).
Reply 2010
Original post by Smaug123
Is this not just \lfloor \sqrt{a+b} \rfloor? The expression is x^2 (a+x) = y^2, so a+x is square; there are sqrt(a+b) square numbers available for the a+x term.
Not taken much brainpower over that, though, so I might be wrong :P

Looks right! I think people just tried to over-complicate things and so got put off :lol:

Why not try some of my fiendish integral/E-M/Gamma function problems? (I'm hoping people will solve them so that I won't feel bad posting more :colone:). Why not post some yourself?

I'm getting quite into my vectors now so look forward to some questions of that type :tongue:
Original post by henpen
Sort of, I can't seem to generalise result 24 when the '1006' is odd: the argument of \prod_{k=1}^{2r-1}e^{\frac{ik2 \pi}{2r}} isn't equal to 0, so the result isn't as simple.

Wait, I may have gotten that wrong earlier in my sleep-induced stupor, it may very well generalise.

Why not try a different approach?

Actually, what I found most thrilling is that we now have a rather strange alternate solution to problem 24 using the ever-powerful gamma function (i.e. multiplication theorem). Waaaay too hungover to type it up though.. :|
Reply 2011
Original post by henpen
'Solution' 290

For the sake of clear explanation, it would probably be helpful to note why it is that the choice of such a vector does not lose you any generality. Alternatively, you could just consider a vector of arbitrary length?

Also, your last part is right but it seems as though, at this point, it is merely a conjecture based on the initial geometric argument. Perhaps it would be better to use a more abstract algebraic argument based on a more sophisticated (and more abstract) understanding of vector principles?

Smaug seemed far more well-versed with vector terminology etc., so I will leave it to him to formalize your language (seems fine to me :dontknow:).
Reply 2012
Original post by Jkn
For the sake of clear explanation, it would probably be helpful to note why it is that the choice of such a vector does not lose you any generality. Alternatively, you could just consider a vector of arbitrary length?

Also, your last part is right but it seems as though, at this point, it is merely a conjecture based on the initial geometric argument. Perhaps it would be better to use a more abstract algebraic argument based on a more sophisticated (and more abstract) understanding of vector principles?

Smaug seemed far more well-versed with vector terminology etc., so I will leave it to him to formalize your language (seems fine to me :dontknow:).


Okay, how about: choose the vector \mathbf{v} such that |\mathbf{v}|=1 and let \mathbf{x}_k, k \in \{1,...,n\} be the orthonormal basis vectors in \mathbb{R}^n.

Expand the vector componentwise:
\displaystyle \mathbf{v}= \sum_{i=1}^n \mathbf{x}_i v_i

and its squared modulus:
\displaystyle |\mathbf{v}|^2=1= \sum_{i=1}^n v_i^2

But note that \left \langle \mathbf{v},\mathbf{x}_i \right \rangle = v_i by the vector's expansion, and \displaystyle \left \langle \mathbf{v},\mathbf{x}_i \right \rangle = |\mathbf{v}||\mathbf{x}_i|\cos( \alpha _{\mathbf{x}_i, \mathbf{v}}) by the definition of \alpha, and thus v_i = \cos( \alpha _{\mathbf{x}_i, \mathbf{v}}), giving

\displaystyle |\mathbf{v}|^2=1= \sum_{i=1}^n \cos^2( \alpha _{\mathbf{x}_i, \mathbf{v}})
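
A numerical illustration of the result (a sketch, not part of the original post): for a random nonzero point of \mathbb{R}^n, the squared direction cosines against the standard orthonormal basis sum to 1.

```python
# For random nonzero v in R^n, cos(alpha_i) = <v, e_i> / |v| against the
# standard basis e_i, and the squared direction cosines sum to 1.
import math
import random

n = 5
v = [random.uniform(-10.0, 10.0) for _ in range(n)]
norm = math.sqrt(sum(c * c for c in v))   # |v|, nonzero with probability 1
cosines = [c / norm for c in v]           # the n direction cosines of v
assert abs(sum(c * c for c in cosines) - 1.0) < 1e-12
```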
Original post by henpen
Can you not also have \sin(kx),\cos(kx), k \in \mathbb{N} as orthonormal basis vectors? I hear that definition's got something to do with them being orthogonal under integration (between certain limits).


You can; they are the basis vectors of functions used when dealing with Fourier series.

They are orthogonal under the inner product defined by <f,g> = \displaystyle\int_{-\pi}^{\pi}f(x)g(x)\ dx

edit: Problem 291*

Prove the above statement that sines and cosines are orthogonal under the inner product given. That is, show that the integrals of \sin(mx)\cos(nx), \cos(mx)\cos(nx), and \sin(mx)\sin(nx) are zero if and only if m \not= n.
Original post by henpen
'Solution' 290


I think \mathbb{R}^n is fine for "nD space" - although I would probably say "it is the length of the projection onto the span of x_1, x_2, x_3" - not sure, though.
Reply 2015
Original post by henpen

and its squared modulus:
\displaystyle |\mathbf{v}|^2=1= \sum_{i=1}^n v_i^2

Yeah, that's what I had in mind! "Inner product" I think it's called?

|| \mathbf{v} || = \sqrt{\sum_{i=1}^{n} v_i^2}

I eliminated the need for geometry by interpreting cosine by its place in the dot product (I prefer to 'get on with' the algebra in questions of this sort). :tongue:

Post some questions bro?
Reply 2016
Original post by FireGarden
You can; they are the basis vectors of functions used when dealing with Fourier series.

They are orthogonal under the inner product defined by <f,g> = \displaystyle\int_{-\pi}^{\pi}f(x)g(x)\ dx

I remember that from QM! :colone: (one function should be conjugated btw, unless their codomain is real)
Original post by Jkn
I remember that from QM! :colone: (one function should be conjugated btw, unless their codomain is real)

Even if their codomain is real, it doesn't hurt :smile:
Problem 292***

Prove Lagrange's Theorem: if H is a subgroup of G, then |H| divides |G|.

Hence prove that if g is in G, then o(g), the order of g (that is, the minimum positive n such that g^n is the identity), divides |G|.
Corollary: Prove Fermat's Little Theorem: that if p is prime and (a,p) = 1 then a^{p-1} \equiv 1 \pmod{p}.
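
(A brute-force spot-check of the corollary, not from the original post:)

```python
# Fermat's little theorem: a^(p-1) = 1 (mod p) when p is prime, gcd(a, p) = 1.
from math import gcd

for p in (2, 3, 5, 7, 11, 13, 17):
    for a in range(1, 100):
        if gcd(a, p) == 1:
            assert pow(a, p - 1, p) == 1
```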
Reply 2019
Original post by FireGarden
You can; they are the basis vectors of functions used when dealing with Fourier series.

They are orthogonal under the inner product defined by <f,g> = \displaystyle\int_{-\pi}^{\pi}f(x)g(x)\ dx

edit: Problem 291*

Prove the above statement that sines and cosines are orthogonal under the inner product given. That is, show that the integrals of \sin(mx)\cos(nx), \cos(mx)\cos(nx), and \sin(mx)\sin(nx) are zero if and only if m \not= n.


I think you mean, |m| \not= |n|. :tongue:

As, if m = -n, we have <\sin {mx}, \sin {nx}> = -\pi.

Solution 291

Let m be distinct from n. Then, integrating by parts twice, we have

\displaystyle \int_{-\pi}^{\pi} \sin {mx} \sin {nx}\ dx = \frac{n}{m} \int_{-\pi}^{\pi} \cos {mx} \cos {nx}\ dx = \frac{n^2}{m^2} \int_{-\pi}^{\pi} \sin {mx} \sin {nx}\ dx

But, as m^2 \not= n^2, it follows that \displaystyle \int_{-\pi}^{\pi} \sin {mx} \sin {nx}\ dx = 0.

You get exactly the same result when you perform \displaystyle \int_{-\pi}^{\pi} \cos {mx} \cos{nx}\ dx.

Further, if m=n, we simply have

\displaystyle \int_{-\pi}^{\pi} \sin^2 {mx}\ dx = \int_{-\pi}^{\pi} \cos^2 {mx}\ dx = \frac{1}{2} \int_{-\pi}^{\pi} 1 - \cos {2mx}\ dx = \pi

And that should be enough to imply that the inner product is 0 iff |m| \not= |n|.
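
A numerical spot-check of the solution (a sketch, not from the thread; assumes scipy): the sin-sin and cos-cos inner products vanish for distinct positive m, n and equal \pi when m = n, while the sin-cos inner product always vanishes, its integrand being odd.

```python
# Check the orthogonality relations on [-pi, pi] for small positive m, n.
import numpy as np
from scipy.integrate import quad

def inner(f, g):
    value, _ = quad(lambda x: f(x) * g(x), -np.pi, np.pi)
    return value

for m in range(1, 4):
    for n in range(1, 4):
        expected = np.pi if m == n else 0.0
        assert abs(inner(lambda x: np.sin(m * x), lambda x: np.sin(n * x)) - expected) < 1e-8
        assert abs(inner(lambda x: np.cos(m * x), lambda x: np.cos(n * x)) - expected) < 1e-8
        assert abs(inner(lambda x: np.sin(m * x), lambda x: np.cos(n * x))) < 1e-8
```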
