Ramble, Part 10: A Crash of Symbols
Jan. 2nd, 2007 06:55 pm

"It is a profoundly erroneous truism, repeated by all copy-books and by eminent people when they are making speeches, that we should cultivate the habit of thinking of what we are doing. The precise opposite is the case. Civilization advances by extending the number of important operations which we can perform without thinking about them." - Alfred North Whitehead
Whitehead goes on to explain that thinking should be reserved for the decisive moments, rather as a general holds back the cavalry until their charge can have greatest effect. Whitehead wrote those words in 1911, and cavalry was about to become obsolete as an instrument of war, but I still like the metaphor.
Terminology matters. ("It's just semantics" - as if what your words mean weren't a matter of supreme importance!) Notation matters. Good terminology and good notation (and those are two versions of the same thing) make thinking easier; they contain within themselves the compressed thinking of the past, so that we may expend our thinking-time in new areas. That's going to be the main theme of the next few Ramble posts, in several ways. I'm afraid, though, that the first round is likely to be rather dry, as the symbol-systems I want to look at are, well, things one learns in grade school. I hope that what I have to say will be of some interest; as a precaution, though, it's under the cut.
India, during the European Middle Ages, was the original home of quite a bit of interesting mathematics; it appears that trigonometry was born there, and Indian mathematicians of that era proved a number of nice theorems in geometry. But their greatest contribution, in terms of overall impact, was the development of what are now called Hindu-Arabic numerals. I'd like to spend a little time talking about numeration systems in general, before getting into the reasons why this was such a big deal.
First off, a stipulation. Numbers are concepts; numerals are symbols used to represent those concepts. In the broadest sense, the words we use to refer to numbers are numerals, of a sort; so are the - heh - digital configurations we call "counting on your fingers". I'm going to concentrate on the various written (but non-phonetic) numeration systems. Most Westerners are familiar with several such systems: Hindu-Arabic numerals, Roman numerals, various sorts of tally-marks, and the non-decimal relatives of Hindu-Arabic numerals, in particular binary and hexadecimal notation. There's also "scientific notation", which is an offshoot of Hindu-Arabic numeration, but has some distinctive properties.
Consider the various uses of a numeration system. Numbers can be used simply as labels; hence it is desirable to be able to easily distinguish the numerals which represent different numbers. Numbers can be used to establish ordering; hence we want to be able to compare two numerals and determine which represents the larger number. Most importantly, we need to be able to compute with them; there must be algorithms for addition and multiplication, and (of somewhat less importance) for subtraction and division as well. The ease of these tasks provides us with ways to evaluate different numeration systems. (Every numeration system can be used to carry out these tasks, within certain practical limits; not every system handles them equally well.)
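As a concrete aside - a minimal Python sketch of my own, assuming numerals written in the usual way without leading zeros - the ordering test for Hindu-Arabic numerals is purely mechanical: a longer numeral always names a larger number, and numerals of equal length compare digit by digit from the left.

def larger_numeral(a: str, b: str) -> str:
    # A longer numeral always names a larger number; numerals of
    # equal length compare digit by digit from the left, which for
    # strings of digit characters is just lexicographic order.
    if len(a) != len(b):
        return a if len(a) > len(b) else b
    return a if a >= b else b

assert larger_numeral("907", "98") == "907"   # length decides first
assert larger_numeral("412", "421") == "421"  # then the leftmost differing digit

That left-to-right rule works only because the leftmost digit carries the most weight - one of the quiet virtues of positional notation. (Try stating the equivalent rule for Roman numerals; it can be done, but not in two lines.)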
It is also desirable that the system be (at least in principle) indefinitely extensible, and relatively compact. Tally marks are extensible ad libitum, but not particularly compact. The ancient Egyptians used a system which involved a separate symbol for each power of ten; up to nine of each symbol could be used to represent a desired number. Extending this system required the coining of more power-of-ten symbols, which is a bit of a handicap. Roman numerals were similar in concept, with the added feature of symbols representing 5 times a power of ten. (The subtractive principle - "IV" for "4" - was a medieval development, as far as I know.) At some point, Roman numerals were made extensible by the convention that placing a bar over a numeral had the effect of multiplying it by a thousand. This works in principle, but aesthetically it's a kludge. In any case, the Roman system (and, even more, the Egyptian system) was not particularly compact. Hindu-Arabic numerals were designed to be extensible from the beginning; the principle of extension is transparent, once the basic idea is grasped. The system is considerably more compact than Roman numeration, as well.
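To make the structural contrast concrete, here's a short Python sketch of my own (the value table, including the subtractive pairs, is just the standard medieval convention; I haven't bothered with the overbar extension):

# Standard symbol values, largest first, with the subtractive pairs.
ROMAN = [(1000, "M"), (900, "CM"), (500, "D"), (400, "CD"),
         (100, "C"), (90, "XC"), (50, "L"), (40, "XL"),
         (10, "X"), (9, "IX"), (5, "V"), (4, "IV"), (1, "I")]

def to_roman(n: int) -> str:
    # Greedily spend the largest symbols first.
    assert 0 < n < 4000, "plain Roman numerals run out just below 4000"
    out = []
    for value, symbol in ROMAN:
        while n >= value:
            out.append(symbol)
            n -= value
    return "".join(out)

assert to_roman(1911) == "MCMXI"  # the year of the Whitehead epigraph
assert to_roman(2500) == "MMD"

Notice the hard ceiling just below 4000: without the overbar kludge the symbols simply run out, whereas positional notation extends itself for free.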
Probably the most important point of comparison between the various familiar systems has to do with the algorithms used to add and multiply numbers. We all spent a good deal of time in grade school "learning our plus and times tables", and the amount of memorization required for this isn't trivial. Algorithms for computations with Roman numerals are easy to devise, and do not require as much memorization; on the other hand, knowing that V times V is XXV doesn't carry over well to, say, L times L. There's a somewhat higher memory-load, then, for Hindu-Arabic notation, but once learned it applies everywhere. (The corresponding memory-load for, say, binary numerals is almost trivial, but the cost is a loss of compactness in notation - the binary numeral for a given number is roughly 3.3 times as long, on average, as the corresponding Hindu-Arabic numeral, the ratio tending to the base-2 logarithm of 10, about 3.32.)
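The point that only the tables change from base to base can be made concrete. Here's a sketch of my own of the grade-school addition algorithm, written once for any base up to 36 (assuming numerals use the usual digit characters, with lowercase letters standing in past 9):

from itertools import zip_longest

DIGITS = "0123456789abcdefghijklmnopqrstuvwxyz"

def add_numerals(a: str, b: str, base: int = 10) -> str:
    # Walk from the rightmost digit leftward, carrying exactly as
    # taught in school; only the single-digit sums depend on the base.
    result, carry = [], 0
    for da, db in zip_longest(reversed(a), reversed(b), fillvalue="0"):
        carry, digit = divmod(DIGITS.index(da) + DIGITS.index(db) + carry, base)
        result.append(DIGITS[digit])
    if carry:
        result.append(DIGITS[carry])
    return "".join(reversed(result))

assert add_numerals("478", "256") == "734"
assert add_numerals("1011", "110", base=2) == "10001"  # 11 + 6 = 17

The carrying procedure never changes; all that varies is the size of the "plus table" you must carry in your head - four entries' worth for binary, a hundred for decimal.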
Hindu-Arabic notation also has another outstanding feature: there are relatively simple error-checking procedures, most notably "casting out nines", which was already known to the Indians of that era. Given the importance of such techniques in, say, bookkeeping, the implications for finance - for banks and merchants, and for governments as well - are enormous. It is not a coincidence that double-entry bookkeeping was developed in Europe soon after Hindu-Arabic numerals came into common use; it may not be a coincidence that governments became stronger and more centralized at roughly the same time. (One reason for the perennial weakness of medieval European states was their absolutely atrocious financial recordkeeping, or so I have read.)
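For anyone who never met it in school: casting out nines rests on the fact that a number and the sum of its decimal digits leave the same remainder on division by nine (because 10, 100, 1000, and so on each leave remainder 1). Here is a minimal Python sketch of my own, using the standard procedure:

def cast_out_nines(n: int) -> int:
    # Repeatedly sum the decimal digits until one digit remains;
    # the result is congruent to n modulo 9.
    while n > 9:
        n = sum(int(d) for d in str(n))
    return n

def product_check(a: int, b: int, claimed: int) -> bool:
    # A necessary (but not sufficient) test that a * b == claimed.
    return cast_out_nines(cast_out_nines(a) * cast_out_nines(b)) == cast_out_nines(claimed)

assert product_check(347, 28, 9716)      # the true product passes
assert not product_check(347, 28, 9616)  # a one-digit slip is caught

The check isn't airtight - it is blind to digit transpositions, for instance, since 9716 and 9176 have the same digit-sum - but as a fast screen over a page of ledger entries it earns its keep.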
And yet, and yet... If Hindu-Arabic notation had such obvious benefits, why did it take so long for Europeans to adopt it? Gerbert of Aurillac, later Pope Sylvester II, was familiar with Hindu-Arabic numerals in the late tenth century, but nothing came of it; it was not until the turn of the thirteenth century that the notation finally broke through. The reasons for that delay, and the way in which it ended, will be the subject of the next Ramble.
Previous Next
Ramble Contents