Probably the first practical use to which early civilizations put their astronomical knowledge was the devising of calendars, which were essential for organizing the social, economic, and religious life of their society. Prehistoric stone circles, such as those at Stonehenge in Britain and Carnac in France, for example, almost certainly functioned as calendars.
There is, however, a fundamental problem in devising an accurate calendar: the principal units of time (the day, month, and year) are not simple multiples or fractions of one another. In simple terms, a day is the time taken by the earth to complete one rotation on its axis. A month was originally regarded as the time taken for the moon to make one orbit of the earth, or the period from one full moon to the next. A year is the time taken for the earth to complete one orbit around the sun. But there are several ways of measuring and defining each of these basic units. As a result, numerous different calendars have been developed through the ages, and even today, a number of cultures use calendars that differ from the usual Western kind, such as the Islamic calendar, the Jewish calendar, and several Oriental calendars.
In many calendars, the moon serves to divide the year into months, and the phases of the moon are often used to mark religious festivals. The word month itself is derived from moon. The Chinese of the Shang dynasty (c.1766-c.1122 B.C.) used a calendar based on a 30-day lunar month, and even in the modern Islamic calendar, the beginning and end of Ramadan (the month of fasting) are fixed by sightings of the new moon. The way in which the Christian Church calculates the date of Easter also depends on the lunar cycle. Priests of the Mayan civilization, which flourished in Central America from about A.D. 250 to the ninth century, devised several elaborate calendars, including calendars of 360 and 365 days, as well as tables that tracked the cycle of the planet Venus. The first calendar that related days, months, and years was the Metonic calendar. This was based on a 19-year cycle, equal to 235 lunar months, after which the moon’s phases recur on the same days of the year. Devised by the ancient Greek astronomer Meton in about 432 B.C., this calendar was later adopted by the Persians. It is still used today to fix the date of Passover in the Jewish calendar.
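The near-coincidence that makes the Metonic cycle work is easy to check numerically. The day counts below use modern mean values for the tropical year and synodic month, which are assumptions supplied for illustration rather than figures from the text:

```python
# Check the coincidence behind the Metonic cycle:
# 19 tropical years is almost exactly 235 lunar (synodic) months.
TROPICAL_YEAR = 365.2422   # mean tropical year, in days (modern value)
SYNODIC_MONTH = 29.53059   # new moon to new moon, in days (modern value)

years_span = 19 * TROPICAL_YEAR     # length of 19 years, in days
months_span = 235 * SYNODIC_MONTH   # length of 235 lunations, in days

print(round(years_span, 2))   # 6939.6
print(round(months_span, 2))  # 6939.69
```

The two spans differ by less than a tenth of a day, which is why after 19 years the moon's phases fall on nearly the same calendar dates again.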
The calendar most widely used today is based principally on the sun. A day consists of 24 hours and a normal year of 365 days, divided into 12 months of between 28 and 31 days each. This leads to certain discrepancies, the most significant resulting from the fact that the solar, or tropical, year is about 365.242 days long. Compensation is made by adding an extra day to February every fourth year, creating a “leap year” of 366 days (except century years not divisible by 400, which remain ordinary years).
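The compensation rule described above can be sketched as a short function; this is a minimal illustration of the Gregorian leap-year rule, not part of the original text:

```python
def is_leap_year(year: int) -> bool:
    # Gregorian rule: every fourth year is a leap year, except
    # century years, unless the century year is divisible by 400.
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

print(is_leap_year(1996))  # True  (divisible by 4)
print(is_leap_year(1900))  # False (century year, not divisible by 400)
print(is_leap_year(2000))  # True  (century year, divisible by 400)
```

Dropping three leap days every 400 years in this way keeps the average calendar year at 365.2425 days, very close to the tropical year.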
As civilization developed, people began to travel for trade and exploration, and it became increasingly important to be able to navigate accurately, particularly when a ship was out of sight of land. By the third century B.C., there was great competition among the merchants and traders who plied the Mediterranean Sea in sailing ships. Sailors navigated mainly by using the positions of the stars. As a result, astronomy came to be used for navigational purposes, principally because it is relatively easy to calculate latitude by measuring the altitude of stars whose declinations are known.
Today a navigator can calculate his latitude by measuring the altitude (in degrees above the horizon) of the Polestar, Polaris, which is less than 1° away from the north celestial pole. But when people first began to travel extensively 2,000 years ago, Polaris was a considerable distance from the celestial pole. It is only in the last 300 or 400 years that the precession of the earth’s axis has brought Polaris usefully near the pole. Instead, the early navigators had to calculate latitude from the position of Kochab, which was then the nearest star to the celestial pole.
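The relations such navigators exploit can be sketched numerically. For a star crossing the meridian between the zenith and the southern horizon, latitude equals 90° minus the star's meridian altitude plus its declination; for Polaris, which now sits within about 1° of the pole, the altitude alone approximates latitude. The figures below are illustrative, not historical observations:

```python
def latitude_from_meridian_altitude(altitude_deg: float,
                                    declination_deg: float) -> float:
    # For a star culminating south of the zenith (northern observer):
    #   latitude = 90 - meridian altitude + declination
    return 90.0 - altitude_deg + declination_deg

def latitude_from_polaris(polaris_altitude_deg: float) -> float:
    # The altitude of the celestial pole equals the observer's latitude.
    # Polaris currently lies within about 1 degree of the pole, so its
    # altitude approximates latitude to roughly that accuracy
    # (ignoring refraction and the star's small offset from the pole).
    return polaris_altitude_deg

# Illustrative: the sun at the equinox (declination 0) culminating
# 38.5 degrees above the horizon implies a latitude of 51.5 degrees N.
print(latitude_from_meridian_altitude(38.5, 0.0))  # 51.5
```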
The beginning of cartography
With increased trade and travel by land and by sea, maps also became increasingly important. The early maps of the Mediterranean Sea (around which many of the ancient civilizations developed) are relatively accurate with regard to latitude, but many are grossly in error concerning longitude. This is because differences in longitude are best calculated from the differences in time at which the sun reaches its highest point. The early sailors did not have clocks sufficiently accurate for this purpose. They therefore had to use dead reckoning to estimate longitude, which they calculated from estimates of their speed and the number of hours that they had been under sail. In fact, the determination of longitude remained largely a matter of guesswork until the first accurate chronometers were constructed in the eighteenth century.
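Both methods mentioned above reduce to simple arithmetic. The earth turns 360° in 24 hours, so each hour of difference between local noons corresponds to 15° of longitude; dead reckoning multiplies estimated speed by time under sail. The example values are hypothetical:

```python
def longitude_difference_deg(noon_offset_hours: float) -> float:
    # The earth rotates 360 degrees in 24 hours, i.e. 15 degrees
    # per hour of difference between local noons at two places.
    return 15.0 * noon_offset_hours

def dead_reckoning_distance(speed_knots: float,
                            hours_under_sail: float) -> float:
    # Distance run = estimated speed x time under sail.
    # Result in nautical miles (1 knot = 1 nautical mile per hour).
    return speed_knots * hours_under_sail

# Illustrative: local noon falling 2 hours later than at a reference
# port places the ship 30 degrees of longitude to its west.
print(longitude_difference_deg(2.0))        # 30.0
print(dead_reckoning_distance(5.0, 24.0))   # 120.0
```

Without an accurate clock, the noon-offset figure itself had to be guessed, which is why the time-difference method was useless at sea before the chronometer.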
Fortunately for the early sailors, the Mediterranean Sea is relatively safe. The Atlantic Ocean, however, is much more hazardous, and accurate navigation is much more important. Despite this difficulty, the Phoenicians successfully navigated out of the Mediterranean and sailed as far as Britain, where there is evidence that they went to trade before the Roman occupation.
By about A.D. 140, a navigators’ fraternity had developed. In each major port, scribes were employed to sketch charts of the ports and to copy out tables of star declinations and other astronomical information, from which latitudes could be calculated. For instance, it was well known that the stars of Ursa Major, the Big Dipper, appeared just to dip into the sea at their lowest point when viewed from the latitude of Alexandria. But early sailors soon learned not to rely on the planets and the moon for navigational purposes. They thought that the “restless stars” (planets) were deceptive because they appeared to move about the night sky, and the moon was regarded as a “wanton woman and a mystery.”