Wednesday, 23 October 2013

Human and Machine Logic

Quoting Good via J. R. Lucas:

We can imagine a human operator playing a game of one-upmanship against a programmed computer. If the program is Fn, the human operator can print the theorem Gn, which the programmed computer, or, if you prefer, the program, would never print, if it is consistent. This is true for each whole number n, but the victory is a hollow one since a second computer, loaded with program C, could put the human operator out of a job.... It is useless for the `mentalist' to argue that any given program can always be improved since the process for improving programs can presumably be programmed also; certainly this can be done if the mentalist describes how the improvement is to be made. If he does give such a description, then he has not made a case.
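
The construction Good alludes to can be made explicit. A minimal sketch, assuming the standard Gödelian setting (the notation $F_n$, $G_n$ follows the quote; the rest is textbook material, not anything Good or Lucas spell out here): for each consistent, recursively axiomatized theory $F_n$ strong enough for arithmetic, Gödel's method yields a sentence $G_n$ that is true but unprovable in $F_n$:

$$F_n \nvdash G_n \qquad\text{and}\qquad \mathbb{N} \models G_n \qquad \text{(both provided } F_n \text{ is consistent).}$$

The human "wins" by printing $G_n$; but since the map $n \mapsto G_n$ is itself computable, a program can play the same move, passing from $F_n$ to $F_{n+1} = F_n + G_n$. That mechanical improvement step is Good's program C.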


Here is J. R. Lucas's summary:

Good, together with Turing and Benacerraf, argues against `mentalism' by denying that there could be any peculiarly mental powers, since if there were, they could be described, and if they could be described, a computer could be programmed to simulate them.


Here is my reply (not J. R. Lucas's):

An abacus can be used so as to simulate our grasp of arithmetical truth. This does not mean an abacus knows arithmetic as well as I do. It knows nothing. And we all know that. The capacity to do arithmetic is not a proof that arithmetic knowledge belongs to each and every object having that capacity. The capacity of an abacus to do, or to facilitate the doing of, arithmetic depends not on its own knowledge of arithmetic but on its maker's.
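
To make the point concrete, here is a minimal sketch in Python (the Abacus class and its methods are my own illustration, nothing from the post): a toy abacus functions arithmetically while knowing nothing, and whatever arithmetic knowledge is in play belongs to whoever wrote, and whoever reads, the rules.

```python
# A toy "abacus": it changes state according to rules its maker wrote down.
# Any arithmetic "knowledge" here lives in the author and user of these lines,
# not in the object that executes them.

class Abacus:
    def __init__(self):
        self.beads = 0  # beads pushed to the counting side

    def push(self, n):
        # Moving n beads across. The abacus does not know this "means" addition;
        # the convention that bead positions stand for numbers is the user's.
        self.beads += n

    def reading(self):
        return self.beads

a = Abacus()
a.push(2)
a.push(3)
print(a.reading())  # 5 -- correct functioning, but no understanding anywhere in `a`
```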

Arithmetic for an abacus and for a man, logic for a game-playing machine and for a man, language for Google Translate and for a man: these are not at all predicated in the same way, and we all know that.

Just as "healthy" or "medical" is not said in the same way of a prescription and of the material means it applies. Only the doctor making the prescription has the knowledge. Only the material substance will effect the cure (without "bothering to" think). The doctor is only indirectly healthy, by prescribing what is healthy. The cure is only indirectly medical, by being prescribed by someone having that knowledge.

A man like Turing (or his disciple Good) would reply: "you say you know, but you cannot verify it."

I do not need to "verify" that an orange is not a baby; as soon as I know it is an orange, I know it is no baby.

And I do not need to "verify" that an abacus has no arithmetic knowledge or understanding, though it has arithmetic functioning, since I know an abacus is an object put together out of inanimate parts.

But as to Google Translate, it is easy for any man knowing two languages to verify. Put in a short phrase of two words, and it will give one correct translation. But what if the two words are ambiguous? Google Translate will not be able to tell the two meanings apart; it will give one. Use a longer text, and any words not programmed into Google Translate will be left untranslated (and that includes spelling variants: by spelling my Swedish in an old-fashioned way, I make Google Translate inapplicable to some more of it). And if the sentences are long, Google Translate will not be able to separate the phrases the right way every time.
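
The ambiguity failure is easy to exhibit with a deliberately naive word-for-word lookup. A minimal sketch in Python, assuming nothing about Google Translate's internals (the mini-dictionary is my own invented illustration; Swedish "får" really does mean both "sheep" and "get(s)/may", and "hvad" is an old spelling of "vad", "what"):

```python
# A deliberately naive word-for-word "translator": NOT how Google Translate
# works, but it makes the two failure modes named in the text visible.

LEXICON = {
    "jag": "I",
    "ett": "a",
    "får": "sheep",  # only one sense stored; the verb sense "get(s)" is lost
}

def translate(sentence: str) -> str:
    out = []
    for word in sentence.lower().split():
        # Ambiguous words: the lookup always yields the single stored sense.
        # Unknown words (e.g. old-fashioned spellings) pass through untranslated.
        out.append(LEXICON.get(word, word))
    return " ".join(out)

print(translate("jag får ett får"))  # "I sheep a sheep" -- wrong sense chosen
print(translate("hvad får jag"))     # "hvad sheep I" -- old spelling untranslated
```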

You can say this applies to learners too. But a learner will be told what he did wrong, and he will say "aha".

A computer programme could of course simulate the sound "aha" or its spelling, but there would be no meaning to it.

And a learner of any language will certainly progress to a better understanding of it than Google Translate.

Hans-Georg Lundahl
Nanterre UL
St John Capistran
23-X-2013
