On nothing in particular

Students seem amazed that the idea of "0" as a number is a relatively new concept, yet they see "0" as a numeral, as solid and inevitable as a "3" or a "7"--as most of us do. How could zero be invisible to the Greeks? (It wasn't, of course...we only think it was because we misunderstand our own concept of "0".)




XKCD rules!


That's not how we use it, though, at least not most of the time in the simple measurements or arithmetic we do in class. Zero serves mostly as a placeholder.

Take the number "306"--the "0" means there are no chunks of ten beyond those already captured by the three hundreds signified by the "3". We have 6 units, true, but not quite enough to form a chunk of ten, so we need a way to mark the empty tens place: 3_6.
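The placeholder idea can be sketched in a few lines of code. This is a minimal illustration, not anything from the post itself; the helper name `place_values` is my own.

```python
# A minimal sketch: break a number into (digit, place value) pairs,
# showing that the "0" in 306 marks an empty tens place.
def place_values(n: int) -> list[tuple[int, int]]:
    """Return (digit, place value) pairs from most to least significant."""
    digits = str(n)
    return [(int(d), 10 ** (len(digits) - 1 - i)) for i, d in enumerate(digits)]

print(place_values(306))  # [(3, 100), (0, 10), (6, 1)]
```

Drop the zero and the positions collapse: "36" puts the 3 in the tens place, not the hundreds.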

This is glaringly obvious to many of us who grew up without electronic calculators. The calculators we occasionally used in elementary school were the abacus, which requires grasping the concept of placeholders in order to manipulate the beads, and the slide rule, which requires mental placeholders to make any sense at all.


If you do not grasp the very real difference between zero and the other nine digits, subtle problems arise--problems that become not so subtle when playing with significant figures ("sig figs"), the digits that carry real meaning when we apply math to the natural world.

We too often teach sig figs as a set of rules, rules that make little sense if the student has a poor grasp of zero, as many--through no fault of their own--do. The rules seem arbitrary and arcane when, in fact, they define the limits of numerical reality when we measure the natural world.

"Numerical reality" comes off a bit too abstract. I should rephrase this: significant figures define the limits of observable mathematical relationships found in the natural world. The relationships are real, insofar as they can be measured by independent observers. In class, however, sig figs become an exercise in futility for many.
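What the rules actually do can be shown mechanically. A minimal sketch (my own illustration, with an assumed helper name `round_sig`) of rounding a measurement to a given number of significant figures:

```python
import math

def round_sig(x: float, figs: int) -> float:
    """Round x to the given number of significant figures."""
    if x == 0:
        return 0.0
    # Position of the leading digit, e.g. 306 -> 2, 0.0045 -> -3.
    exponent = math.floor(math.log10(abs(x)))
    return round(x, figs - 1 - exponent)

print(round_sig(306.0, 2))     # 310.0
print(round_sig(0.004521, 2))  # 0.0045
```

Note that 310.0 is where the trouble starts for students: the trailing zero is a placeholder, not a claim of precision, and a calculator display cannot tell them which role it is playing.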

I think it gets down to the zero. Electronic calculators add a level of abstraction to numeracy, leading to the ridiculous assertion science teachers hear at least once a week: "But my calculator says..."

And indeed, the calculator says exactly what it's supposed to say, when its operators plug in numbers without any feeling for what they represent.

And how could they? The Greeks had the concept without the symbol. We, alas, now have the symbol without the concept.






Yes, I know, the Mayans and Indians and Phoenicians and on and on and on had it down pat.
Thank Fibonacci for getting it to the western folk.


XKCD comic used within his guidelines--very broad ones, at that. Thanks!

And yes, the sig fig rules are a tad arbitrary--range of error makes more sense.
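The range-of-error alternative mentioned here can be sketched as well--a toy illustration of my own, not a standard library or the post's method, using worst-case interval addition:

```python
# A toy sketch of range-of-error arithmetic: each measurement is a
# value plus an uncertainty, and worst-case uncertainties add.
def add_intervals(a: float, da: float, b: float, db: float) -> tuple[float, float]:
    """Add measurements a±da and b±db; return (sum, combined error)."""
    return a + b, da + db

value, err = add_intervals(12.3, 0.05, 4.56, 0.005)
print(f"{value:.2f} ± {err:.3f}")
```

Unlike the sig-fig rules, the error range says directly what the measurement does and does not claim to know.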
