Out of my depth with this

This discussion is closed.

AT82 · #1 · Thread starter · 13 years ago
With my poor-grade GCSE maths :confused:

For an assignment we have to develop our own compression and encoding system using entropy theory:

http://en.wikipedia.org/wiki/Information_theory

Those formulas may as well be written in Urdu.

trev · #2 · 13 years ago
That is hard. I think A-level maths students would know it. Can someone tell me how this kind of maths relates to IT-type jobs in the future?

It's quite interesting that you have to do maths in your course. I thought a multimedia and internet technology course wouldn't have such hard maths content in it.

KingsComp · #3 · 13 years ago
Umm, I can help you with understanding what all those symbols mean if you want, because I did Shannon's law and information content in my first and second years. It's really pretty basic, and this stuff isn't really taught at GCSE anyway.

What specifically do you need to know?

AT82 · #4 · Thread starter · 13 years ago
That is the way they like to market it, because it would put students off if they thought it had any significant maths content; however, there is less maths content than in computer science.

That course is almost A-level maths; in fact, in computer science you have to sit an A-level-standard maths module in the first year if you haven't already done it before university.

Up until now this is the most advanced maths we have done, and I am not sure we need to fully understand those formulas; we still have to apply the theory to our own encoding system.

I know a lot of logarithms are involved in it, but they are supposed to be quite easy. There is a lot of basic physics on my course too, i.e. you need to understand stuff like fret, but you don't need to understand all the maths behind it.

AT82 · #5 · Thread starter · 13 years ago
(Original post by KingsComp)
Umm, I can help you with understanding what all those symbols mean if you want, because I did Shannon's law and information content in my first and second years. It's really pretty basic, and this stuff isn't really taught at GCSE anyway.

What specifically do you need to know?
Well, basically: what exactly is entropy? From what I gathered, it is the amount of data that needs to be sent before it becomes information (i.e. enough to be encoded).

Just seen the rest of it; it looks like I need to go to the library and read up on Huffman, because there is so much stuff I need to know.

I didn't understand the first thing about that entropy formula, btw. He might explain it all to us next week; it's just background reading at this stage.

trev · #6 · 13 years ago
(Original post by AT82)
That is the way they like to market it, because it would put students off if they thought it had any significant maths content; however, there is less maths content than in computer science.

That course is almost A-level maths; in fact, in computer science you have to sit an A-level-standard maths module in the first year if you haven't already done it before university.
Some unis (especially the good ones) have two maths modules in the first year: one for people who did GCSE maths and got a grade C or above, and another for people who have done A-level maths. Some unis do cover A-level maths material in the first year, and the entry requirements are sometimes GCSE maths and/or A-level maths.

I compared the maths content across computing-type courses. Most of them don't have calculus; it's stuff like set theory, discrete mathematics, etc. I think that kind of maths would be useful.

Toy Soldier · #7 · 13 years ago
Oh god... formulae... damn. I used to be able to do this sort of thing, and I'm going to have to again when I go back to Uni. I've just realised that I've completely forgotten everything about maths I've ever learned. Uh-oh. And... err... pretty much everything else I learned at school. Taking a year out destroys your brain! AAH!

KingsComp · #8 · 13 years ago
Oooh, Huffman coding (that takes me back).

OK, let's see if I remember it all. Basically, most data will not have a uniform distribution in the frequency count of its symbols. For instance, in English the letter 'e' will occur more frequently than the letter 'h', which in turn will occur more frequently than 'q', and so on.
Information theory (this is where entropy comes in) basically says that if you have, say, N symbols, and the probability of symbol S_i occurring in the data is P_i, then the information content, in bits per symbol, is:

H = -\sum_{i=1}^{N} P_i \log_2 P_i

Remember that if all the symbols occur with equal probability, then P_i = 1/N, and the sum becomes -\sum_{i=1}^{N} (1/N) \log_2 (1/N) = \log_2 N, which is the upper bound.

Also remember that if the frequency distribution in the data is skewed rather than uniform, then the information conveyed per symbol decreases, even if you are using the same number of bits to store the data.

Hope that makes sense; sorry if I haven't made it clear, and I hope I haven't missed anything out, because I did do it quite a while ago. You will definitely have to check out some books as well to give you a firm grounding in the topic, but overall it's not too bad and is pretty basic once you get the overall concepts.
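
To make the formula concrete, here is a minimal Python sketch (my own illustration, not from the original thread; the skewed probabilities are made-up values) of the entropy calculation KingsComp describes:

```python
import math

def entropy(probabilities):
    """Shannon entropy in bits per symbol: H = -sum(P_i * log2(P_i))."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# Uniform distribution over N = 4 symbols: entropy reaches the
# upper bound log2(4) = 2 bits per symbol.
print(entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0

# A skewed distribution conveys less information per symbol,
# even though there are still 4 symbols.
print(entropy([0.7, 0.1, 0.1, 0.1]))  # ~1.357
```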

AT82 · #9 · Thread starter · 13 years ago
Thanks, I think I get it. I still don't get the formula though: what does stigma mean?

I guess it's like the Morse code example we were given in the lecture: the letter E appears more often than the letter Z, so the letter E takes up less space to transmit.

trev · #10 · 13 years ago
(Original post by AT82)
Thanks, I think I get it. I still don't get the formula though: what does stigma mean?

I guess it's like the Morse code example we were given in the lecture: the letter E appears more often than the letter Z, so the letter E takes up less space to transmit.
http://en.wikipedia.org/wiki/Stigma

KingsComp · #11 · 13 years ago
Hehe, OK, what I'll do is give you an example.

Assume you have a question: the code set is {A, B, C, D} and the respective probabilities are {0.5, 0.25, 0.125, 0.125}.

Then, using H = -\sum_i P_i \log_2 P_i, the information content will be:

H = -(0.5 \log_2 0.5 + 0.25 \log_2 0.25 + 0.125 \log_2 0.125 + 0.125 \log_2 0.125)
  = 0.5 + 0.5 + 0.375 + 0.375
  = 1.75 bits per symbol.

Sigma basically means 'the sum of'; it has lower and upper limits, which are shown below and above the sigma sign.
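
As a quick check of that arithmetic (a hypothetical snippet, not part of the thread):

```python
import math

probs = [0.5, 0.25, 0.125, 0.125]  # P(A), P(B), P(C), P(D)
print(-sum(p * math.log2(p) for p in probs))  # 1.75 bits per symbol
```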

KingsComp · #12 · 13 years ago
Also, just to add: Huffman coding basically exists to increase the efficiency of the encoding. For instance, if you have a non-uniform frequency distribution (such as the alphabet, which I mentioned earlier), then you use Huffman coding, which varies the number of bits used to encode each letter depending on its probability.

For instance, since E occurs very frequently in English, it is better to encode it with few bits, as it will come up often. Z, on the other hand, occurs quite infrequently, so it is better to encode it with more bits, as it has a much smaller probability. Using these probabilities, Huffman coding allocates the bits used to represent each symbol.

You should also read up on run-length encoding, as it's along the same lines, but it encodes runs of repeated data as counts (the same way fax machines work, with runs of 0s representing white space).
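
Below is a minimal Python sketch of both ideas (my own illustration, not from the thread; the sample strings and function names are made up). The first function builds a Huffman code table with a priority queue so that frequent symbols get short bit strings; the second shows run-length encoding collapsing runs of repeated symbols into counts:

```python
import heapq
from collections import Counter

def huffman_codes(text):
    """Build a Huffman code table: frequent symbols get shorter codes."""
    freq = Counter(text)
    # Heap entries are (frequency, tie_breaker, tree); a tree is either
    # a symbol (leaf) or a (left, right) tuple (internal node).
    heap = [(f, i, sym) for i, (sym, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    tie = len(heap)
    while len(heap) > 1:
        f1, _, left = heapq.heappop(heap)    # take the two rarest trees
        f2, _, right = heapq.heappop(heap)
        heapq.heappush(heap, (f1 + f2, tie, (left, right)))  # merge them
        tie += 1
    _, _, root = heap[0]

    codes = {}
    def walk(tree, prefix):
        if isinstance(tree, tuple):          # internal node: recurse
            walk(tree[0], prefix + "0")
            walk(tree[1], prefix + "1")
        else:                                # leaf: record the symbol's code
            codes[tree] = prefix or "0"      # handle a one-symbol alphabet
    walk(root, "")
    return codes

def run_length_encode(data):
    """Collapse runs of repeated symbols into (symbol, count) pairs,
    roughly how a fax machine compresses long runs of white space."""
    runs = []
    for sym in data:
        if runs and runs[-1][0] == sym:
            runs[-1][1] += 1
        else:
            runs.append([sym, 1])
    return [(sym, count) for sym, count in runs]

table = huffman_codes("this is an example of huffman coding")
for sym in sorted(table, key=lambda s: len(table[s])):
    print(repr(sym), table[sym])   # common symbols (like ' ') come first

print(run_length_encode("0000001100000"))  # [('0', 6), ('1', 2), ('0', 5)]
```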

Chris87 · #13 · 13 years ago
(Original post by AT82)
Thanks, I think I get it. I still don't get the formula though: what does stigma mean?
Don't you mean 'sigma'? Sigma basically means the sum of all the values, or in other words, you add everything together (the stuff after the sigma symbol).
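
To make that concrete (my own toy example, not from the thread): \sum_{i=1}^{3} i means 1 + 2 + 3 = 6; the numbers below and above the sigma say where the index i starts and stops.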

AT82 · #14 · Thread starter · 13 years ago
(Original post by Chris87)
Don't you mean 'sigma'? Sigma basically means the sum of all the values, or in other words, you add everything together (the stuff after the sigma symbol).
OK, thanks, that sounds easy enough. It's amazing what you don't get taught at school (or at least what I didn't).