Reconciling Calendar Controversies and Debugging the Y2K "Humind" Computer

The year 2000 was as controversial as it was confrontational. One prominent area of disagreement was the significance of the Y2K problem and the exact start of the new Millennium. First, the dates in computers were supposed to roll over to zeros and send all the data in them haywire. Then there was a heated debate as to whether the 3rd Millennium begins in 2000 or 2001. Rippling beneath the surface is a trans-Atlantic mutual snub over how dates should be written: "old continentally", as in 13th April 2000 (13/04/2000 for short); the American way, as in April 13th, 2000 (04/13/2000); or as recommended by the ISO (International Organization for Standardization), 2000, April 13 (2000-04-13). I find it necessary that we find a solution to this problem, because even serious authorities on the subject are misleading the general public.

Leaving the Y2K problem aside to consider the Millennium of Confusion, I often wonder whether the real bug is not in the human mind itself. There have been many arguments in the media that the 3rd Millennium started in 2001 and not in 2000. The reason normally given is that our present calendar system started not from (or with) zero but from year one, apparently because zero was not in use when the monk Dionysius the Short drew up our present calendar system. In one heated panel discussion, the two opposing sides agreed on the "zero source" of the problem but proposed two different solutions. One group insisted that we should simply accept 2001 as the start of the Millennium. The other group wanted a year zero to be "inserted" at the beginning of our era so that the millennium started in 2000. They further justified their position with calculations showing that Dionysius the Short actually "mis-fixed" Christ's birthdate a few years later than the actual one.
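The practical side of both disputes can be sketched in a few lines of Python; the example below is my own illustration, not taken from any source in the article. It prints the same date in the three competing conventions and shows why two-digit years (the Y2K bug in miniature) break ordering while ISO-formatted dates do not:

```python
from datetime import date

d = date(2000, 4, 13)

# The three conventions mentioned above, as strftime patterns.
print(d.strftime("%d/%m/%Y"))  # "old continental": 13/04/2000
print(d.strftime("%m/%d/%Y"))  # American:          04/13/2000
print(d.isoformat())           # ISO 8601:          2000-04-13

# The Y2K bug in miniature: two-digit years sort wrongly past 1999.
two_digit = ["98", "99", "00", "01"]
print(sorted(two_digit))       # ['00', '01', '98', '99'] - 2000 sorts before 1998!

# ISO-formatted full dates sort lexicographically in true time order.
dates = ["1999-12-31", "2000-01-01", "1998-06-15"]
print(sorted(dates))           # ['1998-06-15', '1999-12-31', '2000-01-01']
```

This is one reason the ISO ordering (largest unit first) is attractive: plain text sorting and chronological sorting coincide.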
In any case, they continued, one always starts abstract things like time or distance from zero. "Not true" was the riposte: the week starts with a first day, not with a day zero!

To make my position clear from the outset, I sympathise with the year-2000 group but feel that their explanation for a solution is faulty. I am inclined to believe that the monk Dionysius did not make a mistake! It is our understanding of the concepts of zero, of mental-scale demarcations, and of counting and naming virtual objects that is mixed up. The obvious reason for the later addition of zero to our number system is to facilitate the artificial (mental) segmentation of infinite or intangible things, with a starting mark and further divisions, for their easy count-quantization. Note the phrase "starting mark". One could start anywhere and "draw" the demarcations before counting the segments, without any need for zero or even for explicitly written numerals. But this requires a real-time recount each time we need the quantity of the thing, so there is a clear advantage in storing a reminder of the count at the end-demarcation of each segment for easy and instant retrievability (i.e. a need for scale construction). We need a digit zero at the origin of the scale (before there is any segment) for reasons of consistency, of starting (or transition) point recognizability, and for synchronization calls in parallel counting/processing. Zero represents any (shiftable) infinitesimal reference point in space or time as the start-demarcation of a real or mental counting scale. As such, it initiates the count by helping our perception, but it is not part of the count itself. Unlike the concept of zero, "zero anything" is non-existent (if we want to preserve the original meaning of the word). The confusion comes about because people conflate, or are unaware of the difference between, a mental scale (i.e. an automated segment count at demarcations) and the reference-naming or listing of the scale's intervals.
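This distinction between demarcations and counted segments is the classic "fencepost" situation familiar from programming. A minimal sketch (the variable names are my own, chosen for illustration):

```python
# A scale from 0 to n has n + 1 demarcations but only n segments.
# The zero marks the origin of the scale; it is not itself counted.
n = 5
demarcations = list(range(0, n + 1))                  # [0, 1, 2, 3, 4, 5]
segments = list(zip(demarcations, demarcations[1:]))  # [(0, 1), (1, 2), ...]

print(len(demarcations))  # 6 marks, including the zero at the origin
print(len(segments))      # 5 counted segments: zero initiated, but was not counted
```

The count of segments is always one less than the count of marks, which is exactly the author's point: zero is a demarcation, never a member of the count.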
This is because named intervals are more concrete than the scale itself. Nobody tries to conceptualize the magnitude and meaning of an infinitesimal point of time; it is normally marked for us externally by a clock scale. On the other hand, we associate a particular interval with an idea of its (processing) size and with particular events. Therefore, intervals are almost like concrete things that we name and refer to.

Such tricky mix-ups can be observed in our communication of clock time to others. Before the clock strikes one, we do not have an hour yet, nor do we call it hour zero. We have a fraction of an hour, in minutes and seconds. In other words, there is no hour count, but there is a fractional reading. To list, name or refer to that interval, we can call it the first hour even though an hour has not yet passed. This gives us the advantage of being able to pre-reference something (before it materialises) thanks to our anticipation of the count. When the first hour has passed, we have both a count and a reading. The count is retained in memory, while the reading is given by a perceived "process front" (e.g. the position of the clock hands). Even though we say, for example, 1:30 a.m., thus mixing count and reading, the interval's name is not the first but the second hour. This conjunction of a completed count with a reading (apart from an associated completed interval, e.g. the first) is a source of confusion as to which interval we are situated within. We usually tend to "de-numerate" our time-telling by saying "half past one" for the second interval. The possibility of confusion is worsened by, for example, "quarter to one", which lies entirely within a different interval. If one does not hear the preposition clearly, there is a great possibility of misunderstanding the time. Thus, the English language is syntactically flawed for telling time. In Germanic languages and in Russian, things are a little different: an interval that is not current is never mentioned in reading the time.
When they see 1:30, it is read as "half of the second (hour)"; 1:45 is communicated as "quarter before two (hours)". Thus it is more difficult to confuse the interval, even if one does not hear the short, often noun-incorporated, preposition properly.

Coming back to the year-dating of our era and the Millennium-mark problem, I believe it is just such a confusion that makes us misunderstand Dionysius' calendar construction of our present era. Here we have no "clock scale", and it would be impractical to draw one up for hundreds, if not thousands, of years. It is therefore only a mental scale. To represent it on paper, Dionysius gave a listing of the year counts from accounts of history. This can also be done by simply giving a number sub-range, e.g. 1-600. Zero is neither a count nor an interval name but an instant in time. Translating this retrospective listing of years into a contemporary, ongoing calendar scale is what causes the confusion. Technically, years have no names apart from ordinal ones (1st, 2nd, etc.), and a year like 1986 is actually an instantaneous count-change at the end of the 1986th year (i.e. December 31, 1985)! The Y2K roll-over illustrates this instantness better: you do not count something before you have finished processing it. Counting is posterior, while naming is current. People have been writing the counted year on calendars published for the next, unfolding year. Though the practice is all right (count plus reading), it gives the false impression that the printed number is a reference to (or a name for) the current year. We try to read dates as days, weeks and months "OF that year" instead of adding those fractions TO the year counted. This is just like the time-telling problem, where we avoid the deception only through a clear perception of an actual clock scale. Dionysius had a better grasp of numeracy, but unfortunately this "agility" has diminished with our present wave of digitisation.
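The clock-reading rule argued above (the count names what has been completed, so a reading of h:mm places us inside the (h+1)-th interval) can be sketched as follows; the function names are mine, not from the article:

```python
def current_interval(hour_count: int) -> int:
    """Given the completed hour count shown on the clock (e.g. the 1 in 1:30),
    return the ordinal number of the interval we are currently inside.
    The count is posterior: h completed hours means we are in hour h + 1."""
    return hour_count + 1

def ordinal(n: int) -> str:
    # Minimal English ordinal suffixes, sufficient for the hours 0-12 used here.
    suffix = {1: "st", 2: "nd", 3: "rd"}.get(n, "th")
    return f"{n}{suffix}"

# 1:30 - one full hour has passed, so we are inside the second hour,
# which is why German reads 1:30 as "half of the second (hour)".
print(ordinal(current_interval(1)))  # 2nd
print(ordinal(current_interval(0)))  # 1st - before the clock strikes one
```

The same "count plus one" step is what the article applies to years: a printed 1999 is a completed count, and the interval we live through next is the 2000th.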
To make sure that (year) one is taken not as the first demarcation of his calendar scale but as the first count, he coined the phrase "(in the) year of our Lord" (Anno Domini, A.D., in Latin). This equates to the year zero or zero-th year which people wanted, though both would be misnomers. So after Anno Domini we then have the count of years: 1, 2, 3, etc. Other historians used to tag this phrase onto all year counts and translated it correctly, e.g. 600 A.D. = 600 years "AFTER (the year of) the Birth of Christ". This means that the 600 years (a count, with units) have already passed (not "year 600", a name without units), and if we want an ordinal name for the year of interest, it will be the 601st year!

The confusion has not been helped by our everyday language. When we say "in 1999", we are trying to name the year with what is on our calendars, instead of saying something like "in the year after 1999" (i.e. in the 2000th year, ending 31st December with a count roll-over)! So there is a mismatch between our writing of years and the year intervals of interest. We all learned in school history that the 19th century ran from 1800 to 1899. Clearly we have not overcome our initial puzzlement at this, even though we accept that there was no zero-th century. The irrelevance of zero as a count or a name, rather than as the initiating demarcation of a process, can be appreciated during a count-down, when it is replaced by the actual action: 3, 2, 1, FIRE!

One suggestion for rectifying these apparent dating errors is to construct a new calendar scale, the Common Era (C.E.), that includes a year zero pushed closer to the actual birth of Christ. All B.C. years in history would then have 1 year subtracted from them. This suggestion, just like those at the beginning of this article, is not optimal. We should simply accept the 3rd Millennium to have started in 2000. After all, what is in a name that we should throw consistency to the dogs?
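The two competing conventions for naming centuries reduce to a one-line difference in integer arithmetic. A sketch of both (the function names are my own):

```python
def strict_ordinal_century(year: int) -> int:
    # Counting from year 1 with no year zero: the Nth century spans
    # years 100*(N-1)+1 through 100*N, so 1801-1900 is the 19th century.
    return (year - 1) // 100 + 1

def colloquial_century(year: int) -> int:
    # The schoolbook convention: the "1800s" (1800-1899) are called
    # the 19th century, which quietly assumes a year zero.
    return year // 100 + 1

print(strict_ordinal_century(1800))  # 18 - strictly, still the 18th century
print(colloquial_century(1800))      # 19 - the schoolbook answer
print(strict_ordinal_century(2000))  # 20 - 2000 closes the 20th century...
print(strict_ordinal_century(2001))  # 21 - ...and 2001 opens the 21st
```

The whole millennium quarrel is the disagreement between these two functions at years like 1800 and 2000; the same formulas, with 1000 in place of 100, give the two candidate millennium starts.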
If convenient, we should also resume tagging all year counts with a prepositional phrase when we refer to the following, unfolding year interval. We could use the other expansion of C.E., which is A.P. (Aera Principium, our principal era). A.P. could also be "Latinized" as "Anno Posteri", so that A.P. 1999 would be read as "(in the) year after 1999".

As for the complete writing of dates, the ISO recommendation is ideal. Our number system usually starts with the higher-order digits or units, with those to the right standing for sub-units: thus years, months, days. This is what we do in writing time: hours, minutes, seconds. Again, the differences in writing dates may have arisen from practical considerations. When the current year is understood in a date, we often write just the month and day, the year perhaps being added later at the back. In that case, the American system is closer to the standard. On the Old Continent, even the month is often understood to be the current one; we hear questions like "what day (number) is today?" It is therefore understandable that the day is written down first, with the month and the year following as afterthoughts.

[A search on the Internet will turn up a host of pages whose logic on this problem is plainly suspicious, e.g. http://www.astronomyboy.com/millennium/ ]

Cosmas Taabazuing
Rue du Cloitre, 1020 Bruxelles, Belgique
GSM: 0032.496.957.106
Disclaimer: "The views/contents expressed in this article are the sole responsibility of the author(s) and do not necessarily reflect those of Modern Ghana. Modern Ghana will not be responsible or liable for any inaccurate or incorrect statements contained in this article."
Reproduction is authorised provided the author's permission is granted.