# TSR Philosophy Society (TSR PhilSoc)

This discussion is closed.
13 years ago
#341
(Original post by Calvin)
The challenge then is: Can you prove that in the past you have been doing addition rather than quuaddition?
Can we use induction, or will you just complain that that's demonstrating a high probability for the fact that we've been adding all along, rather than conclusively (ie. somehow deductively) proving it?

Coz:
1) All instances of addition performed in the past have followed exactly the same principles and resulted in a number 'higher' or 'bigger' than the two added to form it;
2) the past is generally a reliable guide to the future (one could say especially in this case, as addition can be done abstractly and so does not depend upon environmental contingencies);
3) Therefore, it is at least likely that for any future addition sums, the same rules that we have been using successfully up until this point will apply;
4) Therefore, there is no need for this distinction between 'addition' and 'quaddition', as we have no reason to believe that there are in fact two different categories of addition.
5) ERGO, we have been adding all along.

Ok, so maybe I haven't exactly proved it...

ZarathustraX
0
13 years ago
#342
(Original post by Zarathustra)
Can we use induction
No, it's a waste of time - you can't prove anything about the past because of the "5 minute" problem (i.e. that the world was created 5 minutes ago, complete with all your memories).
0
13 years ago
#343
(Original post by Calvin)
We learn how the plus function works by deriving a rule from examples given by a teacher.
This misses the point that addition doesn't exist as such - it's just a name relating two numbers. What we learn at school is the rule for all single digits, together with a rule for how to build bigger numbers in base 10. This rule can be shown to apply for any number, no matter how large. It's not sensible to argue that it can only be proved by induction, because it's built from two rules, not one. In any case, it's only a rule for getting from one number to another. To critique the rule, you'd need a meta-mathematical definition of number and addition. Then you need meta-meta-maths to critique that, and so on ad infinitum. In other words, it all collapses under Kantian pressure.
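The two-rule picture described here (a rote-learned table of single-digit sums plus a base-10 carrying rule) can be sketched in code. This is my own illustration, not anything from the thread; note that building the table with `divmod` quietly uses `+`, standing in for the part a pupil would memorise from examples:

```python
# (a) the rote-learned part: single-digit sums, stored as (digit, digit) -> (carry, units)
DIGIT_SUMS = {(a, b): divmod(a + b, 10) for a in range(10) for b in range(10)}

def add(x: int, y: int) -> int:
    """(b) the carry rule: add two non-negative integers of any size
    using only table lookups plus carrying in base 10."""
    xs = [int(d) for d in str(x)][::-1]  # digits, least significant first
    ys = [int(d) for d in str(y)][::-1]
    digits, carry = [], 0
    for i in range(max(len(xs), len(ys))):
        a = xs[i] if i < len(xs) else 0
        b = ys[i] if i < len(ys) else 0
        c1, units = DIGIT_SUMS[(a, b)]          # look up the digit sum...
        c2, units = DIGIT_SUMS[(units, carry)]  # ...then fold in the carry
        digits.append(units)
        carry = c1 + c2  # at most one of c1, c2 can be 1
    if carry:
        digits.append(carry)
    return int("".join(map(str, reversed(digits))))
```

The point of the sketch is that the carry rule applies uniformly to numbers of any length, whether or not a particular sum has ever been encountered before.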

we haven't been given the answer for every possible set of additions.
Yes, you have.
But then we can argue the following: in the past you haven't been performing addition, instead you've been performing some other function. This other function is exactly the same as addition except if one of the numbers you are adding is over 253 then the answer to your sum will always be 5. Call this new function Quuaddition.
So 4 quus 7 equals 11. But 291 quus 62 equals 5.
Except I have added numbers bigger than 253 - it happens all the time in binary (255 = the biggest unsigned value for 8 bits), so this fails straight away. However, let's suppose there's a value n which is bigger than any number I've manipulated before. How could we tell that it's not quaddition for numbers bigger than n? Actually, that's pretty simple - addition is based on a strict set of rules which do not rely on rote learning. They apply to all numbers, irrespective of whether we've encountered them. The 'paradox' arises because Wittgenstein is arguing that we learn all numbers by example, rather than that we learn the underlying rule, plus rote learning for the numbers 0 to 10.

Sorry - there isn't a paradox at all, just an error.
0
13 years ago
#344
(Original post by Calvin)
Ok I think I can explain this clearly now...

Wittgenstein's Paradox (and no, I'm not obsessed with the man)

Essentially that no course of action can be determined by a rule because we can make any course of action fit with some rule or other.

Saul Kripke gives the following example:
Consider addition in maths. The 'Plus' function. You take two numbers, you plus them together and you get an answer. We've all done it.

We learn how the plus function works by deriving a rule from examples given by a teacher. The teacher gives us various numbers and performs this addition function and gives us the answer and from this we derive the rule for how addition works. Then we go off and add things together and enjoy ourselves.
Notably though we are learning this rule from having seen only a finite number of examples, we haven't been given the answer for every possible set of additions.

Now as you've only performed so many additions it must be the case that there is one number bigger than any other you've ever added before. For the sake of simplicity imagine the biggest addition you've ever performed is 253+12=265
But then we can argue the following: in the past you haven't been performing addition, instead you've been performing some other function. This other function is exactly the same as addition except if one of the numbers you are adding is over 253 then the answer to your sum will always be 5. Call this new function Quuaddition.
So 4 quus 7 equals 11. But 291 quus 62 equals 5.
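For concreteness, the quus function described above can be written out directly (a sketch of mine, using the thread's threshold of 253 rather than Kripke's original 57):

```python
def quus(x: int, y: int) -> int:
    """The sceptic's rival to '+': identical to addition unless either
    argument exceeds 253, in which case the answer is always 5."""
    if x > 253 or y > 253:
        return 5
    return x + y

# Within past experience (all arguments <= 253) quus and plus agree...
print(quus(4, 7))     # 11, same as 4 + 7
# ...and only beyond the largest sum ever performed do they come apart.
print(quus(291, 62))  # 5, where 291 + 62 = 353
```

The sceptical point is that every sum actually performed fits both functions equally well, so no finite record of past behaviour can discriminate between them.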

The Paradox is this: Wittgenstein argues there is no fact you can point to to show that in the past you have been performing addition rather than quuaddition.
(the paradoxical part being that this seems really obviously wrong)

(This essentially boils down to the statement made at the beginning. You can't make adding conform to a rule because all finite examples of behaviour accord equally well with some other rule. Similarly you can invent a rule to explain anything, even something totally random.)

The challenge then is: Can you prove that in the past you have been doing addition rather than quuaddition?

Isn't this just the problem of induction? You can never be absolutely certain that a rule applies, you can only apply Occam's razor and use the simplest possible explanation.

As for Wittgenstein, does anyone find his 'private language argument' at all convincing? I'm almost convinced that I can't have understood it properly as it seems so weak.
0
13 years ago
#345
Yeek, that'll teach me to post a complicated problem and then go away for a day.

GrumballCake
Nice to see your assertive self again
I think you miss the point too. If your claim is that you rote-learn the addition for all single-digit numbers and then apply another rule to combine numbers together when they are larger, I need only suggest a different 'quus' interpretation. Perhaps single-digit numbers, when quused together, give a different answer if they are part of a whole number larger than the largest number you've so far added together.

On Using Induction

To clarify, there are two problems here for us:
1. What past fact is there that we have previously meant addition rather than quuaddition?
2. If there is no such fact, on what do we base our confidence in how we should answer a question like "271+12=?" now?

Induction:
As such induction alone will fail on this point. We cannot infer from past to present that we were using addition because the evidence we have is all equally compatible with both. In that sense this part is a little like a more generalised version of Goodman's Grue Paradox.

Induction + Occam's Razor (after a lot of thinking which turned out to be on the wrong track) is a little trickier, but still equally ineffective I think:

It doesn't answer (1) because it doesn't provide a fact about our past that shows we were using addition rather than quuaddition. I can still argue that in the past you meant quuaddition and so far there is no fact you can appeal to to show you didn't.

Private Language Argument
Is based on this paradox, I think. If language cannot be governed by rules, Wittgenstein argues, it is governed by agreement. I see you doing something that looks like what I'm doing. You get the answers I would have got, so I say to you 'well done, you've learned how to add'. Adding isn't being able to follow a rule but rather getting the same answer everybody else does, most of the time. The point then, I think, is that without other people we have no agreement and thus no language. What I don't see is how this refutes solipsism. Couldn't I have agreement from the mindless zombies I perceive? Why do these zombies need to have minds for me to be able to have a language? Especially considering Wittgenstein has been demonstrating language is about appearing to do something rather than having a rule in mind.
0
13 years ago
#346
(Original post by Calvin)
On Using Induction

To clarify, there are two problems here for us:
1. What past fact is there that we have previously meant addition rather than quuaddition?
2. If there is no such fact, on what do we base our confidence in how we should answer a question like "271+12=?" now?

Induction:
As such induction alone will fail on this point. We cannot infer from past to present that we were using addition because the evidence we have is all equally compatible with both. In that sense this part is a little like a more generalised version of Goodman's Grue Paradox.

Induction + Occam's Razor (after a lot of thinking which turned out to be on the wrong track) is a little trickier, but still equally ineffective I think:

It doesn't answer (1) because it doesn't provide a fact about our past that shows we were using addition rather than quuaddition. I can still argue that in the past you meant quuaddition and so far there is no fact you can appeal to to show you didn't.

1. Memory. I remember adding, not quadding - and I know what addition and quaddition are, and I remember which rule I was applying. There's your past fact, although you'll have to take my word for it - but that's always a given with mental events anyway.

2. Simplicity in Occam's razor is a bit of a tricky concept, but it's easiest when you're talking about avoiding inventing new extra rules to explain things that can be explained with fewer or no new rules. In this case, it applies because you have to invent quaddition to provide an alternative explanation, when the already existing rule of addition is explanation enough - you are unnecessarily complicating things.

(Original post by Calvin)
Private Language Argument
Is based on this paradox, I think. If language cannot be governed by rules, Wittgenstein argues, it is governed by agreement. I see you doing something that looks like what I'm doing. You get the answers I would have got, so I say to you 'well done, you've learned how to add'. Adding isn't being able to follow a rule but rather getting the same answer everybody else does, most of the time. The point then, I think, is that without other people we have no agreement and thus no language. What I don't see is how this refutes solipsism. Couldn't I have agreement from the mindless zombies I perceive? Why do these zombies need to have minds for me to be able to have a language? Especially considering Wittgenstein has been demonstrating language is about appearing to do something rather than having a rule in mind.
Exactly - I don't see why it's so debated, as it doesn't seem to effectively refute solipsism at all.
0
13 years ago
#347
(Original post by wanderer)
1. Memory. I remember adding, not quadding - and I know what addition and quaddition are, and I remember which rule I was applying. There's your past fact, although you'll have to take my word for it - but that's always a given with mental events anyway.

2. Simplicity in Occam's razor is a bit of a tricky concept, but it's easiest when you're talking about avoiding inventing new extra rules to explain things that can be explained with fewer or no new rules. In this case, it applies because you have to invent quaddition to provide an alternative explanation, when the already existing rule of addition is explanation enough - you are unnecessarily complicating things.
What do you remember doing? By hypothesis you never remember giving yourself the instruction that for the sum 271+12 your answer should be 283, because (by hypothesis) you have never performed this sum.
Similarly, you cannot just have given yourself instructions that you should continue as you have been in the finite number of preceding cases, because again by hypothesis that doesn't discriminate between quuaddition and addition. So precisely what is your memory of? What fact is exhibited by your memory to prove you have been plusing rather than quusing?

This I think also addresses your second point. I am not inventing new rules and saying you've been following this rule rather than that rule. I'm saying there is no fact of the matter that you have been using one rule rather than another. Although you have clearly been following some kind of procedure, there is no fact of what that procedure is, because we have only analyzed a finite number of cases of how that procedure operates. All past demonstrations are consistent with an infinite number of possible rules (though they do rule out some, for instance the rule 'always answer "1"').
You cannot argue I'm introducing a new rule, because that begs the question that you have been using a different rule up until now, and you have yet to show that this is the case.
So again:
What fact shows you have previously been using one rule rather than another?
0
13 years ago
#348
(Original post by Calvin)
What do you remember doing? By hypothesis you never remember giving yourself the instruction that for the sum 271+12 your answer should be 283, because (by hypothesis) you have never performed this sum.
Similarly, you cannot just have given yourself instructions that you should continue as you have been in the finite number of preceding cases, because again by hypothesis that doesn't discriminate between quuaddition and addition. So precisely what is your memory of? What fact is exhibited by your memory to prove you have been plusing rather than quusing?
The subjective perception of applying a rule. You don't define addition by looking at all previous cases and copying them. The instructions given in the past were to add two numbers, in any particular case.

(Original post by Calvin)
This I think also addresses your second point. I am not inventing new rules and saying you've been following this rule rather than that rule. I'm saying there is no fact of the matter that you have been using one rule rather than another. Although you have clearly been following some kind of procedure, there is no fact of what that procedure is, because we have only analyzed a finite number of cases of how that procedure operates. All past demonstrations are consistent with an infinite number of possible rules (though they do rule out some, for instance the rule 'always answer "1"').
You cannot argue I'm introducing a new rule, because that begs the question that you have been using a different rule up until now, and you have yet to show that this is the case.
So again:
What fact shows you have previously been using one rule rather than another?
I don't mean you're introducing a new rule instead of a previous explanation for my past additions, I mean you're inventing a new kind of mathematical process to explain them - addition is already defined, as one of the basic tools of number theory. To claim that I haven't been adding, you have to invent a new process for that purpose, which is unnecessary and so violates Occam's razor.
0
13 years ago
#349
(Original post by wanderer)
The subjective perception of applying a rule. You don't define addition by looking at all previous cases and copying them. The instructions given in the past were to add two numbers, in any particular case.
Perhaps you could give an example of what these instructions might be?
The best I can think of is something like 'to add two numbers, collect an equivalent number of marbles for each number, combine the two piles of marbles into one pile, and then count the total number of marbles. This is your answer.'
But this only pushes the problem back a step. The same sceptical problem Wittgenstein is posing can be repeated for 'combining'. What is the rule for combining? Or counting? Ultimately, rules on their own cannot justify your doing one thing rather than another. "Quombining", "quounting".
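The regress described here can be made concrete: define addition via a supposedly more basic operation (counting up by one), and the sceptic simply relocates the doubt to that operation. A hypothetical sketch of mine, reusing the thread's 253 threshold:

```python
def count_up(n: int) -> int:
    """The supposedly basic operation: move to the next number."""
    return n + 1

def quount_up(n: int) -> int:
    """The sceptic's rival: agrees with counting until we pass 253."""
    return n + 1 if n < 253 else 5

def add_by_counting(x: int, y: int, step=count_up) -> int:
    """'Combine the piles, then count the total': add y by counting up y times."""
    total = x
    for _ in range(y):
        total = step(total)
    return total

# Within past experience, 'counting' and 'quounting' give identical sums,
# so grounding addition in counting leaves the sceptical problem intact.
print(add_by_counting(12, 7))                  # 19
print(add_by_counting(12, 7, step=quount_up))  # 19
```

Every finite record of counting behaviour fits both `count_up` and `quount_up`, so the move to a "more basic" operation buys no new fact of the matter.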

(Original post by Wanderer)
I don't mean you're introducing a new rule instead of a previous explanation for my past additions, I mean you're inventing a new kind of mathematical process to explain them - addition is already defined, as one of the basic tools of number theory. To claim that I haven't been adding, you have to invent a new process for that purpose, which is unnecessary and so violates Occam's razor.
...Hmm, I think I (well, Kripke and Wittgenstein) have dealt with this in the previous paragraph. I'm only introducing a new process if you can demonstrate that addition is already defined, and in such a way that we cannot simply repeat the sceptical problem at a 'more fundamental level'. To argue I am changing the process is to presuppose that it is a fact that you have a fixed, rule-governed process for me to change. But you have yet to show that you do. Occam's Razor can't come into effect until you can show this (and by then I think Wittgenstein would give up anyway).
0
13 years ago
#350
(Original post by Calvin)
Perhaps you could give an example of what these instructions might be?
The best I can think of is something like 'to add two numbers, collect an equivalent number of marbles for each number, combine the two piles of marbles into one pile, and then count the total number of marbles. This is your answer.'
But this only pushes the problem back a step. The same sceptical problem Wittgenstein is posing can be repeated for 'combining'. What is the rule for combining? Or counting? Ultimately, rules on their own cannot justify your doing one thing rather than another. "Quombining", "quounting".

...Hmm, I think I (well, Kripke and Wittgenstein) have dealt with this above. I'm only introducing a new process if you can demonstrate that addition is already defined, and in such a way that we cannot simply repeat the sceptical problem at a 'more fundamental level'. To argue I am changing the process is to presuppose that it is a fact that you have a fixed, rule-governed process for me to change. But you have yet to show that you do. Occam's Razor can't come into effect until you can show this (and by then I think Wittgenstein would give up anyway).
Ah-hah, this leads back to the whole meaning thing. Off the top of my head, I'd say you define it in terms of concepts that are 'basic' - i.e. we have a simple subjective perception of them. In that case, I'd refer you to addition as joining two sets (the basic mathematical definition) and claim that I have a basic idea in my head of what joining is.
0
13 years ago
#351
(Original post by Calvin)
perhaps single digit numbers when quused together give a different answer if they are part of a whole number larger than the largest number you've so far added together.
Hang on a minute, why is there any obligation on me to show that I used addition? Why should Wittgenstein occupy a privileged position to define what constitutes proof? What fact does he have to show that anyone, anywhere has ever used quaddition? What's sauce for the goose is sauce for the gander.

It falls into the traditional inconsistency of empiricism - if you say that only facts count, then what facts do you produce to prove that only facts count?

2. If there is no fact for this on what do we base our confidence for how we should answer a question like "271+12=?" now?
That's not a meaningful question. Mathematics is a tautologous system and the rules for how we perform operations are fixed. Our confidence lies in the fact that mathematics also provides a way of testing whether we have followed the rules.

If language cannot be governed by rules Wittgenstein argues it is governed by agreement.
I think you need to look at more modern work like MacIntyre's stuff on community. Language is an example of how knowledge resides in a community of thought. It has rules which can be codified, but those rules are derived from the usage in the community. A language is not an objective thing as such - it is always in a state of flux within a community.

One refutation of the private language argument is when you say a word but mean something else. I expect we can all think of examples where people have looked at us strangely and said "Did you mean x?" and I reply "Of course, that's what I said, didn't I?". The failure is between your private language (of thought) and the public language you use to express that thought.

Another example: Do babies think? Of course they do - they can show joy, anger etc. Do they have public language? No.
0
13 years ago
#352
(Original post by wanderer)
Ah-hah, this leads back to the whole meaning thing. Off the top of my head, I'd say you define it in terms of concepts that are 'basic' - i.e. we have a simple subjective perception of them. In that case, I'd refer you to addition as joining two sets (the basic mathematical definition) and claim that I have a basic idea in my head of what joining is.
It does in fact lead back to meaning, but only tangentially. It's more about the use of rules in general.

What is a 'basic idea' supposed to be? The complete idea of the rule for counting/joining? I can certainly see when two sets are joined, but I can't see the rule for joining, so it doesn't help you to make it perceptual. Similarly, once joined, you need to count the totals to provide your answer, and we can produce the same sceptical problem for following the rule for counting.

Essentially you seem to want to boil this down to an ultimate rule for joining which doesn't need reference to any other rule. But in that case you admit you have no justification for joining like this rather than joining in some other way. Your decision to join in one way is completely arbitrary and not justified by previous cases of joining, because you admit there is no further underlying rule you are following. When joining you are just doing something, not following a rule. If you were following a rule then we can apply the sceptical problem and say 'what rule?'. In short, reducing the rules for counting, adding etc. to something basic without rules seems to me to concede the point, I think...
0
13 years ago
#353
(Original post by grumballcake)
Hang on a minute, why is there any obligation on me to show that I used addition? Why should Wittgenstein occupy a privileged position to define what constitutes proof? What fact does he have to show that anyone, anywhere has ever used quaddition? What's sauce for the goose is sauce for the gander.

It falls into the traditional inconsistency of empiricism - if you say that only facts count, then what facts do you produce to prove that only facts count?
Because you say you used addition. Wittgenstein isn't saying you used quuaddition. He's asking what shows you were using addition. If you were using addition then there should be some fact of the matter. If there isn't a fact of the matter then how can you claim to be following the rule for addition?
"Things apart from facts count in this case"? That sounds fascinating. So your use of addition isn't demonstrated by anything to do with yourself or the world but by something else?

(Original post by Grumballcake)
That's not a meaningful question. Mathematics is a tautologous system and the rules for how we perform operations are fixed. Our confidence lies in the fact that mathematics also provides a way of testing whether we have followed the rules.
Fixed as what?

(Original post by Grumballcake)
I think you need to look at more modern work like MacIntyre's stuff on community. Language is an example of how knowledge resides in a community of thought. It has rules which can be codified, but those rules are derived from the usage in the community. A language is not an objective thing as such - it is always in a state of flux within a community.
I think you need to re-read Wittgenstein and/or my post. Perhaps it's my clarity that is at fault but despite some confident looking claims you have yet to address the problem effectively anywhere that I can see.

I'm not claiming anything about the nature of language. It's a paradox. It's supposed to challenge our claims and make us realise something. I'm not saying what that something is at all. You're free to escape it in any legitimate way you want.

(Original post by Grumballcake)
One refutation of the private language argument is when you say a word but mean something else. I expect we can all think of examples where people have looked at us strangely and said "Did you mean x?" and I reply "Of course, that's what I said, didn't I?". The failure is between your private language (of thought) and the public language you use to express that thought.
This argument only works if you can dismiss the sceptical paradox given above. The private language argument is based on Wittgenstein's response to the sceptical paradox. Your responses to him ignore that paradox and so are ineffective until you can deal with this paradox some other way.

(Original post by grumballcake)
Another example: Do babies think? Of course they do - they can show joy, anger etc. Do they have public language? No.
Actually that's a pretty good question. Do babies think? Thinking certainly isn't required to show anger- that can happen reflexively. And thinking is something more active than just being conscious. I think in words but I wonder if there is a more basic level of thinking where you don't use words. Hmmm....
0
13 years ago
#354
(Original post by Calvin)
(Original post by Grumballcake)
One refutation of the private language argument is when you say a word but mean something else. I expect we can all think of examples where people have looked at us strangely and said "Did you mean x?" and I reply "Of course, that's what I said, didn't I?". The failure is between your private language (of thought) and the public language you use to express that thought.
This argument only works if you can dismiss the sceptical paradox given above. The private language argument is based on Wittgenstein's response to the sceptical paradox. Your responses to him ignore that paradox and so are ineffective until you can deal with this paradox some other way.
Sorry for the triple post. But it just occurred to me that actually this argument fails to touch the private language argument at all. Wittgenstein says that it's not "if somebody is following the same rule as me then the answer they should give is X" but rather "if somebody gives the same answer as me then we should say they are following the same rule".

But this is a long-term thing. It depends on giving the same or very similar answers to the rest of the community over a long period of time. If I use a word in the wrong circumstances, that is only one example; and in many cases, like Freudian slips, it's understandable why the mistake has been made, so we allow that the same procedure was being followed.
0
13 years ago
#355
(Original post by Calvin)
Your decision to join in one way is completely arbitrary
Well, you'd better set out a fully justified reasoning as to why 'arbitrary' is unacceptable.
0
13 years ago
#356
(Original post by Calvin)
What is a 'basic idea' supposed to be?
Plantinga says that a 'basic' rule is something which I believe without having to justify it. It's an axiom. Anyone who thinks that you can operate without axioms needs to read Kant's Critique of Pure Reason.

Addition is axiomatic in maths. Your whole argument assumes that quaddition exists at all, but it misses the point of maths. Maths is a tautology. It relies on no facts whatsoever, it simply relies on certain self-consistent axioms.

The second question would be: what difference did it make whether I used addition or quaddition as long as the numbers concerned were less than n? There is no functional difference between the two below the boundary condition.
0
13 years ago
#357
(Original post by Calvin)
If you were using addition then there should be some fact of the matter.
Why? What rule are you proposing here? That all events should be capable of a single explanation?

You keep begging the question, as far as I can see.
Not unless you can give an account of why there must be 'facts' (whatever they are). Can you really not see how recursive this argument is?
You're free to escape it in any legitimate way you want.
Escape what? An imagined paradox? Firstly you have to show that there is necessarily a paradox to escape. Only then need I bother trying.
0
13 years ago
#358
Yes thank you for the repetition.
A rock is falling. There is no rule that says the rock is falling. What there is is a fact of the matter.

Take the grue paradox as a different familiar example. What fact shows that something has been green rather than grue? Your response is: "ahh... but what rule says there must be a fact of the matter?" But I think what you want to say is "what fact of the matter makes it the case that there must be a fact of the matter?"
A rule is something like "In case X, do Y". A fact of the matter is something like "there is an X".

So I think in fact it is you who is being circular.
0
13 years ago
#359
(Original post by grumballcake)
Plantinga says that a 'basic' rule is something which I believe, without having to justify it. It's an axiom. Anyone who thinks that you can operate without axioms needs to read Kant's critique of pure reason.

Addition is axiomatic in maths. Your whole argument assumes that quaddition exists at all, but it misses the point of maths. Maths is a tautology. It relies on no facts whatsoever, it simply relies on certain self-consistent axioms.
Then please define it for me. But you should realise now that I will produce the same argument again if you try to define it based on a rule of some kind.
Surely you see this paradox extends beyond maths; it works for any case of claiming to follow a rule, be that in law, maths, logic, social interaction, etc.

(Original post by Grumballcake)
The second question would be: what difference did it make whether I used addition or quaddition as long as the numbers concerned were less than n? There is no functional difference between the two below the boundary condition.
The point is not what difference has it made, because clearly it would not have made any difference. The point is that contrary to what we might have thought we don't appear to be able to say we have used one or the other. And on top of that we don't have any justification therefore for how we should answer the maths question we are confronted with now.
0
13 years ago
#360
(Original post by Calvin)
It does in fact lead back to meaning, but only tangentially. It's more about the use of rules in general.

What is a 'basic idea' supposed to be? The complete idea of the rule for counting/joining? I can certainly see when two sets are joined, but I can't see the rule for joining, so it doesn't help you to make it perceptual. Similarly, once joined, you need to count the totals to provide your answer, and we can produce the same sceptical problem for following the rule for counting.

Essentially you seem to want to boil this down to an ultimate rule for joining which doesn't need reference to any other rule. But in that case you admit you have no justification for joining like this rather than joining in some other way. Your decision to join in one way is completely arbitrary and not justified by previous cases of joining, because you admit there is no further underlying rule you are following. When joining you are just doing something, not following a rule. If you were following a rule then we can apply the sceptical problem and say 'what rule?'. In short, reducing the rules for counting, adding etc. to something basic without rules seems to me to concede the point, I think...
What do you mean by arbitrary here? Just because my use of a rule is arbitrary doesn't mean that it isn't a rule. I don't define a rule based on previous cases, I have an abstract conceptualisation of it. I know how to count without referring to previous cases of counting, and there is no question as to whether in previous cases I was counting because I remember counting, not doing something else.
0