
Wednesday, August 7, 2019

A Brief History of Zero

Today, the existence of zero is a given. After finishing dessert, people say things like "there are zero cookies left" and "I have zero energy to get up" without thinking much about throwing zero into their sentences. (Presumably their minds are far more occupied with all the sugar they just ate.) However, until relatively recently in human history, zero as a concept was unheard of!

All across the world, people devised number systems soon after they needed to keep track of how much stuff they had. If you have five sheep and give one away, it is helpful to write down that you now have four sheep so you won't frantically wonder where the missing sheep is. Number systems popped up from India to Greece to Rome to Central America. However, these number systems were usually limited to positive integers. There was no need to account for negative livestock!

Zero first appeared in a few number systems as a placeholder, though not in all of them. The Roman system didn't need placeholders: each letter stood for a fixed quantity, and letters were combined to build up larger numbers. A positional system like the Hindu-Arabic one, however, needed placeholders to tell apart numbers like 404 and 44. Without the 0 indicating that there are no tens in 404, 404 and 44 would be indistinguishable. The Mayan number system developed a placeholder zero independently, while the adoption of zero as a placeholder throughout much of Asia came about through trade routes.
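To make the placeholder idea concrete, the positional expansion in modern notation reads:

404 = 4 \times 10^2 + 0 \times 10^1 + 4 \times 10^0
44 = 4 \times 10^1 + 4 \times 10^0

The 0 in the tens place carries no quantity of its own; it simply holds the column open so the two strings of digits don't collapse into each other.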

Up until the 600s, zero was still used as a placeholder rather than as a number in its own right. However, the Indian mathematician Brahmagupta argued that zero should be treated as an integer, with all the properties of an integer, including the tricky situation of division. This idea gained traction throughout much of Asia but failed to make an immediate mark in Europe, which still held tightly to the Roman system of counting. Only in the 1200s, when European mathematicians such as Fibonacci advocated for the more useful Hindu-Arabic numeral system, did Europe adopt zero as both an integer and a concept.
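For reference, the rules Brahmagupta laid out in his 628 CE treatise, the Brahmasphutasiddhanta, can be summarized in modern notation roughly as:

a + 0 = a, \quad a - 0 = a, \quad a \cdot 0 = 0, \quad \frac{0}{0} = 0

That last rule, zero divided by zero equals zero, did not survive: modern mathematics leaves division by zero undefined.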

This adoption proved useful a few hundred years later, during the time of Isaac Newton. As anyone who has taken calculus knows, the limit definition of a derivative requires dividing an expression by something that approaches zero. This fundamental idea of calculus wouldn't have been possible without the notion that the absence of quantity is itself something worth writing down and then treating as a number.
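In modern notation, that limit definition of the derivative reads:

f'(x) = \lim_{h \to 0} \frac{f(x+h) - f(x)}{h}

The increment h never equals zero, but the definition only makes sense in a system where zero is a genuine number that quantities can approach.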
