You might have seen the cartoon: two cavemen sitting outside their cave knapping stone tools. One says to the other: ‘Something’s just not right – our air is clean, our water is pure, we all get plenty of exercise, everything we eat is organic and free-range, and yet nobody lives past 30.’
This cartoon reflects a very common view of ancient lifespans, but it is based on a myth. People in the past were not all dead by 30. Ancient documents confirm this. In the 24th century BCE, the Egyptian vizier Ptahhotep wrote verses about the disintegrations of old age. The ancient Greeks classed old age among the divine curses, and their tombstones attest to survival well past 80 years. Ancient artworks and figurines also depict elderly people: stooped, flabby, wrinkled.
This is not the only type of evidence, however. Studies of extant traditional peoples who live far from modern medicines and markets, such as Tanzania’s Hadza or Brazil’s Xilixana Yanomami, have demonstrated that the most likely age at death is far higher than most people assume: about 70 years. One study found that although rates of death differ across populations and periods, especially with regard to violence, there is a remarkable similarity between the mortality profiles of various traditional peoples.
So it seems that humans evolved with a characteristic lifespan. Mortality rates in traditional populations are high during infancy, decrease sharply to remain roughly constant until about 40 years, then rise again to peak at about 70. Most individuals remain healthy and vigorous right through their 60s or beyond, until senescence sets in: the physical decline in which, if one cause fails to kill, another will soon strike the mortal blow.
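The shape of this profile can be made concrete with a small simulation. The sketch below is purely illustrative: the hazard rates are invented, not fitted to any real population, but they follow the pattern just described, and the resulting distribution of ages at death peaks in the mid-to-late 60s, close to the figure reported for traditional peoples.

```python
import numpy as np

# Invented, illustrative hazard rates following the profile described
# above: high in infancy, low and flat to about 40, then rising steeply.
ages = np.arange(0, 101)
hazard = np.where(ages < 5, 0.05,                  # infancy and early childhood
         np.where(ages < 40, 0.01,                 # low and roughly constant
                  0.01 * 1.08 ** (ages - 40)))     # rising in later life
hazard = np.clip(hazard, 0.0, 1.0)

# Probability of surviving to each age, and of dying at each age.
alive = np.cumprod(1 - hazard)
alive_before = np.concatenate(([1.0], alive[:-1]))
deaths = alive_before * hazard

# The most likely age at death once infancy is survived.
adult = ages >= 5
print('modal adult age at death:', ages[adult][np.argmax(deaths[adult])])
```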
So what is the source of the myth that those in the past must have died young? One source is what we dig up. When ancient human remains are found, archaeologists and biological anthropologists examine the skeletons and attempt to estimate their sex, age and general health. Markers of growth and development, such as tooth eruption, provide relatively accurate age estimates for children. With adults, however, estimates are based on degeneration.
We are all able to label people instinctively as ‘young’, ‘middle-aged’ or ‘old’ based on their appearance and the situations in which we encounter them. Biological anthropologists do something similar, using the skeleton rather than, say, hair and wrinkles. We term this ‘biological age’ because our judgment is based on the physical (and mental) condition that we see before us, which relates to the biological realities of that person. These will not always correlate with an accurate calendar age, as people are all, well, different. Their appearance and abilities will be related to their genetics, lifestyle, health, attitudes, activity, diet, wealth and a multitude of other factors. These differences accumulate as the years increase, meaning that once a person reaches the age of about 40 or 50, the variation is too great to allow any one-size-fits-all accuracy in determining calendar age, whether it is done by eye on a living person or by the peer-preferred method of skeletal ageing. The result is that those older than middle age are frequently given an open-ended age estimate, such as 40+ or 50+ years, meaning that they could be anywhere between 40 and 104, or thereabouts.
The very term ‘average age at death’ also contributes to the myth. High infant mortality brings down the average at one end of the age spectrum, and open-ended categories such as ‘40+’ or ‘50+ years’ keep it low at the other. We know that in 2015 the average life expectancy at birth ranged from 50 years in Sierra Leone to 84 years in Japan, and these differences are related to early deaths rather than differences in total lifespan. A better method of estimating lifespan is to look at life expectancy only at adulthood, which takes infant mortality out of the equation; however, the inability to estimate age beyond about 50 years still keeps the average lower than it should be.
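A toy calculation makes the point. The numbers below are invented purely for illustration: a notional population in which three in ten die in infancy and the rest die at around 70. The mean age at death comes out near 50, even though nobody in this population actually dies at 50, while the adult-only figure recovers the lived reality.

```python
# Invented numbers for illustration: 300 infant deaths, 700 deaths at ~70.
ages_at_death = [0] * 300 + [70] * 700

mean_at_birth = sum(ages_at_death) / len(ages_at_death)
adult_deaths = [a for a in ages_at_death if a >= 15]
mean_at_adulthood = sum(adult_deaths) / len(adult_deaths)

print(mean_at_birth)       # 49.0: 'average age at death', dragged down by infants
print(mean_at_adulthood)   # 70.0: life expectancy among those who reached adulthood
```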
Archaeologists’ age estimates, therefore, have been squeezed at both ends of the age spectrum, with the result that individuals who have lived their full lifespan are rendered ‘invisible’. This means that we have been unable to fully understand societies in the distant past. In the literate past, functioning older individuals were mostly not treated much differently from the general adult population, but without archaeological identification of the invisible elderly, we cannot say whether this was the case in non-literate societies.
My colleague Marc Oxenham and I wanted to understand early societies more fully, so we developed a method for bringing the invisible elderly to light. The method is applicable only to cemetery populations that saw little change over the life of the cemetery, and no great inequality among the inhabitants, so that it can be assumed the people ate similar foods and used their teeth in similar ways. One such cemetery is Worthy Park near Kingsworthy, Hampshire, where Anglo-Saxons buried their loved ones some 1,500 years ago. It was excavated in the early 1960s.
We measured the wear on the teeth of these people, and then seriated the population from those with the most worn teeth – the oldest – to those with the least worn. We did this for the whole population, not just the elderly, to act as a control. We then matched them against a known model population with a similar age structure, and allocated the individuals with the most worn teeth to the oldest ages. By matching the Worthy Park teeth to the model population, the invisible elderly soon became visible. Not only were we able to see how many people lived to a grand old age, but also which ones were 75 years or older, and which were a few years past 50.
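In outline, the procedure is a rank-for-rank matching. The sketch below is a simplified illustration rather than the actual published workflow: the wear scores and the model age distribution are both hypothetical stand-ins, and the real method involves far more care in scoring wear and in choosing the model population.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical tooth-wear scores for 100 adults from a cemetery:
# a higher score means more worn teeth, presumed to mean an older age.
wear_scores = rng.uniform(0, 10, size=100)

# Hypothetical model population with a similar age structure: a stand-in
# for the known reference distribution of adult ages at death.
model_ages = np.sort(np.clip(rng.normal(55, 15, size=100), 18, 95))

# Seriate by wear, then give each individual the age holding the same
# rank in the model: the most worn teeth receive the oldest ages.
order = np.argsort(wear_scores)        # indices from least to most worn
estimated_ages = np.empty_like(model_ages)
estimated_ages[order] = model_ages

print('estimated ages of the three most worn:',
      np.sort(estimated_ages)[-3:].round(1))
```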
Seeing the invisible elderly has led to other discoveries. It has often been suggested that more men than women lived to older age in the past because of the dangers of pregnancy and childbirth, but our study suggests otherwise. We applied our method to two other Anglo-Saxon cemeteries as well – Great Chesterford in Essex and the one on Mill Hill, in Deal, Kent – and found that, of the three oldest individuals from each cemetery, seven were women and only two were men. Although not conclusive proof, this suggests that older age spans for women might be part of the human condition.
We also looked at the treatment of the elderly in their graves. Anglo-Saxon men were often buried with weapons, while women were buried with brooches and jewellery including beads and pins. This suggests that men were identified by their martial qualities, while women were admired for their beauty. Men also maintained or increased their status in their graves well into their 60s, while women’s ‘value’ peaked in their 30s and declined thereafter. Intriguingly, the class of item most likely to be found in the graves of the elderly rather than younger individuals was the grooming tool. The most common of these tools was tweezers, and most were buried with old men. Did this mean that old men were concerned about their looks? Or that old women were too far from beauty for tweezers or other grooming items to help? Findings such as these provide a glimpse into the lives of people of the past, a glimpse that was impossible without identifying the invisible elderly.
The maximum human lifespan (approximately 125 years) has barely changed since we arrived. It is estimated that if the three main causes of death in old age today – cardiovascular disease, stroke, and cancer – were eliminated, the developed world would see only a 15-year increase in life expectancy. While an individual living to 125 in the distant past would have been extremely rare, it was possible. And some things about the past, such as men being valued for their power and women for their beauty, have changed little.
Christine Cave
This article was originally published at Aeon and has been republished under Creative Commons.