Wednesday, March 21, 2012

Zero

At a young age we enter into the world of numbers. We learn that 1 is the first number and that it introduces the counting numbers - 1, 2, 3, 4, 5 ... Counting numbers do exactly what their name implies; they count real things - apples, oranges, sheep, etc. It's only later that we learn how to count the number of apples in an empty box.

Even the early Greeks, who brought about giant leaps in science and mathematics, and the Romans, the greatest engineers in the ancient world, didn't have a way to deal with the number of apples in an empty box. They both failed to give "nothing" a numerical name. The Romans had their own ways of combining I, V, X, L, C, D, and M, but where was 0? They never tried to count "nothing."

The use of a symbol to designate "nothingness" probably originated thousands of years ago. The Maya of Mesoamerica used zero in various forms. The astronomer Claudius Ptolemy, influenced by the Babylonians, used a symbol similar to our modern 0 as a placeholder in his number system. As a placeholder, zero could be used to distinguish between numbers such as 75 and 705 instead of relying on context as in Babylonian mathematics. This is somewhat like the introduction of the comma into written language as a way of clarifying meaning. And like the comma, zero comes with its own set of rules for usage.
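To see what the placeholder buys us, here is a small Python sketch (the helper name and the digit lists are mine, purely for illustration) that expands digits by place value:

    def place_value(digits):
        """Expand a list of base-10 digits into the number they represent."""
        total = 0
        for d in digits:
            total = total * 10 + d  # each step shifts everything one place left
        return total

    print(place_value([7, 5]))     # 75
    print(place_value([7, 0, 5]))  # 705 -- the 0 holds the tens place open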

The seventh century Indian mathematician Brahmagupta treated zero as a number, not merely a placeholder, and set up rules for dealing with it. These included "...the sum of zero and a positive number is positive" and "...the sum of zero and zero is zero." The Hindu-Arabic numbering system that included zero as both a number and a placeholder was introduced into Western mathematics in the early 1200s by Leonardo of Pisa, better known as Fibonacci.

The use of zero in our number system posed a problem that Brahmagupta had addressed: how was this new number to be treated? How could zero be integrated into the existing system of arithmetic algorithms in a precise way? When it came to addition and multiplication, 0 fit in neatly, but it was more problematic in subtraction and division.

Adding and multiplying with zero is straightforward and presents little challenge. You can append (i.e., attach) 0 to 10 to get 100, but when 0 is used as a number in an algorithm, adding it to a number leaves that number unchanged, since you are adding nothing to it, and multiplying a number by 0 gives 0, since you are combining a given number of sets with nothing in them. Subtraction is a simple operation but may lead to negative numbers: 7 - 0 = 7, but 0 - 7 = -7.
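These rules can be checked mechanically. Here is a minimal Python sketch, with an arbitrary value standing in for "a number":

    n = 7
    assert 0 + n == n   # adding nothing leaves the number unchanged
    assert n * 0 == 0   # n sets with nothing in them amount to nothing
    assert n - 0 == n   # taking nothing away changes nothing
    assert 0 - n == -n  # subtracting from nothing gives a negative number
    print("all of the zero rules hold for", n)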

Division using zero raised some difficulties. Imagine a length of 24 feet to be measured with a standard 3 foot yardstick. Division involves finding out how many yardsticks we can lay along that 24 foot path. The answer is 24 divided by 3, or symbolically, 24/3 = 8.
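Division in this picture is just counting how many times the yardstick fits. A small Python sketch of that repeated-subtraction view (the function name is mine, for illustration only):

    def count_yardsticks(path_length, stick_length):
        """Count how many whole sticks fit by repeated subtraction."""
        count = 0
        while path_length >= stick_length:
            path_length -= stick_length
            count += 1
        return count

    # 8, the same answer as 24/3; note that a stick of length 0 would
    # never shorten the path, so the loop would run forever -- a first
    # hint of the trouble with dividing by zero.
    print(count_yardsticks(24, 3))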

What, then, can be made of 0 divided by 3? Using algebraic notation, that could be written as 0/3 = x. By cross multiplication, this expression is equivalent to 0 = 3 * x. The only possible value for x is zero itself, because if the product of two numbers is zero, at least one of them must be zero.
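In the yardstick picture, 0/3 asks how many 3 foot sticks fit along a path of no length at all, and the count never gets started. A one-line check in Python agrees:

    # No stick fits along an empty path, so the quotient is exactly zero.
    assert 0 / 3 == 0
    print(0 / 3)  # 0.0, matching x = 0 in the equation 0/3 = x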

This still isn't the main difficulty with zero. The sticking point is division by zero. If we treat 3/0 in the same way that we treated 0/3, we would have the equation 3/0 = x. Cross multiplication would produce 0 * x = 3, leading to the nonsensical conclusion that 0 = 3. Admitting the possibility that 3/0 is a number opens the door to mathematical mayhem! The way out was to say that 3/0 is undefined. We can't arrive at a sensible answer by dividing any nonzero number by zero, so we simply don't allow the operation to take place. This is similar to not allowing a comma to be placed in the middle of a word to avoid the creation of linguistic nonsense.
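Programming languages enforce the same exclusion. In Python, for example, dividing by zero doesn't return a number at all; it raises an error:

    try:
        x = 3 / 0
    except ZeroDivisionError as e:
        print("undefined:", e)  # Python refuses: "division by zero"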

The 12th century Indian mathematician Bhaskara suggested that a number divided by zero was infinite. He asked how many times one could go to a man with five stones in his hand and take nothing away from him. By adopting this form of reasoning, we become sidetracked into the concept of infinity, which offers no mathematical resolution. Infinity doesn't conform to the usual rules of arithmetic and isn't a number in the usual sense.
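Bhaskara's answer does survive in one corner of modern computing: IEEE 754 floating-point arithmetic defines a nonzero number divided by zero as infinity (NumPy follows this convention; plain Python raises an error instead). But even there, infinity breaks the usual rules, as a quick check shows:

    import math

    inf = math.inf
    print(inf + 1 == inf)  # True: adding 1 to infinity changes nothing
    print(inf - inf)       # nan: "infinity minus infinity" has no value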

Considering 0/0 is also an interesting avenue of exploration. By using cross multiplication on the equation 0/0 = x, we arrive at the conclusion that 0 = 0. While this isn't nonsense, neither is it particularly illuminating. It also leads to the conclusion that the value of x could be anything. So we use the term "indeterminate" to describe 0/0. All in all, when we consider dividing by zero, we arrive at the conclusion that it's best to exclude that operation from our system of algorithms. Math can get along perfectly well without it.
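The "indeterminate" verdict is mirrored in floating point, where 0.0/0.0 is defined to produce NaN ("not a number") rather than any particular value of x. Plain Python simply raises the same error as it does for 3/0:

    import math

    try:
        x = 0 / 0
    except ZeroDivisionError as e:
        print("indeterminate, so excluded:", e)

    # IEEE 754 floats encode the same verdict as NaN:
    nan = math.nan     # the value 0.0/0.0 yields under IEEE 754 rules
    print(nan == nan)  # False: NaN isn't even equal to itself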

So what use is zero? Simply put, we can't get along without it. It is one of the base concepts that makes the number system, algebra, and geometry, along with all their related disciplines, functional. On the number line, 0 is the number that separates the positive and negative integers. In the decimal system, 0 makes it possible to write and use both huge and microscopically small numbers.

When 0 was introduced it must have seemed like an exceedingly odd idea, but mathematicians have a habit of fastening onto strange concepts that prove useful much later. Zero was just such a concept, and its acceptance and use became one of the greatest inventions of man.
