They’ve had


POX!




M.B. O’Hare


Contact DiG-Press via email for more information:

© M. B. O’Hare, 2018.

May be used for educational purposes without written permission but with a citation and copyright notice linked to this source.




Whatever happened to the Bubonic Plague?


Chickenpox May Be The Reason Why We Don’t Die in Our Millions from the Plague!


The Many ‘Typhoid Marys’


Typhus: Filling in the Gaps


Cholera: The Disease that Inspired Bram Stoker to Write Dracula?


Scarlet Fever Returns: but it is a lot less deadly


Killer Plagues of Our Modern Era Plummet from Natural Causes?


Don’t Count Your Children Before They’ve Had the Pox



Fig. 1: Physician attire for protection from the Black Death by Paul Fürst; Source: (CC-BY), commons.wikimedia. Vaccine added by author.



Whatever happened to the Bubonic Plague?

BBC History

‘there were hardly enough living to care for the sick and bury the dead’
…scarcely a tenth of mankind was left alive.
James, 2011, ‘Black Death: The lasting impact’  

A plague tsunami swept across Europe in 1348, wiping out more than a third of its population. By late summer the Black Death had descended upon Irish shores, striking hardest in the urban centres and taking its greatest toll in the dead of winter. A second wave then spread its tentacles out to the most desolate hills in search of fresh victims. The plague knew no social boundaries: nobles, clergy, merchants and peasants alike either survived the disease – gaining immunity – or succumbed and died.

‘1348: A Medieval Apocalypse – The Black Death in Ireland’

1348 was one of the darkest years in European history. The most deadly of all diseases – the Black Death – swept across the continent reaching Ireland in the late summer. Within twelve months over one-third of the population had died. Towns and villages were abandoned.
Dwyer, 2016. Book Synopsis 

…But rats and their fleas may not be guilty after all…


Fig. 1: The Grime Reaper Rat atop a pile of rat corpses – an illustration of what archaeologists should have found, but didn’t, suggesting that rats and their fleas may be innocent of causing the rapid and widespread devastation of the Plague during the Middle Ages.

Turning to the archaeology, it turns out there is a serious lack of evidence for great heaps of dead black rats.

‘Black Death? Rats and fleas finally in the clear’

…Archaeologists and forensic scientists … have examined 25 skeletons … Analysis of the bodies and of wills registered in London at the time has cast serious doubt on “facts” that every school child has learned for decades: that the epidemic was caused by a highly contagious strain spread by the fleas on rats…
Mortality continued to rise throughout the bitterly cold winter, when fleas could not have survived, and there is no evidence of enough rats… In sites beside the Thames, where most of the city’s rubbish was dumped and rats should have swarmed, and where the sodden ground preserves organic remains excellently, few black rats have been found.
Thorpe, 2014

As noted above, the plague was spectacularly lethal during the winter months, when the fleas supposed to have spread it via black rats could not have survived. Judging by the historical records, the same was true in Ireland.

‘Unheard of mortality’ The Black Death in Ireland

The plague raged in Dublin between August and December, setting a pattern for the terror it would spread through other parts of the country…
Thorpe 2014, ‘Black Death? Rats and fleas finally in the clear’

So, if the rats and their fleas now appear to be off the hook – meaning no amount of cleaning up rat infestations could have curtailed the spread – what caused such a devastatingly rapid and severe spread of this great pestilence?


Fig. 2: Rats may be innocent after all! Designed by the author.

A clue might lie in the following. If you are old enough to remember singing the little rhyme below – with no idea whatsoever what you were singing about – you might be interested to learn what it was all about.

Ring a’ ring a rosies,

A pocket full of posies,

A tishoo, a tishoo,

We all fall down.

Bugl, 2008, 8, ‘History of Epidemics and Plagues’ 

Apparently, according to Bugl in the History of Epidemics and Plagues, the common interpretation is that the ‘rosies’ refer to rosary beads (the religious item used for prayer – presumably in the hope that this would provide protection from the plague and imminent death), while the children holding hands to form a circle presumably represents the ring.

It seems that the ‘pocket full of posies’ suggests wildflowers used to mask the odour of plague victims, though further research on the medical interventions of the era indicates that flowers and herbs (posies) may also have been carried as protection. But perhaps the most interesting part of the rhyme – the one that gives us the big clue – is the ending (the fun part for children who knew no better, as they all collapse to the ground in a giggle). The mock sneezing of ‘A tishoo, a tishoo’ strongly points to the pneumonic (lung) form of transmitting the plague rather than the bubonic means of transmission, as everyone falls down after being sneezed on.

As the article excerpt below indicates, pneumonic and bubonic plague are essentially the same disease, simply manifested in different ways within the body.

Risk of Person-to-Person Transmission of Pneumonic Plague

Bubonic plague never spreads directly from one person to another. The bacteria may reach the lungs of people through hematogenous spread… Pneumonic plague is the only form of plague that can be transmitted from human to human.
Kool and Weinstein, 2005

Therefore, the sneezing (pneumonic) form can originate from the bubonic plague when the bacteria make their way into the lungs via the bloodstream (hematogenous meaning ‘spread by the blood’; see the definition below), as suggested above. This lends greater support to a means by which the Plague could have spread with such initial devastation – it now has a way to pass directly from person to person. In other words, we do not necessarily need rats and their fleas to explain the wildfire spread of this disease, at least during the Middle Ages.



hematogenous:

1. Producing blood.

2. Originating in or spread by the blood.

Stedman’s Medical Dictionary (2002)

We can now begin to imagine how the Plague could have arisen and gained a foothold – at least initially via infected rats in its bubonic form (hence the term Black Death in some cases). If it then found a way of entering the bloodstream – which, judging by modern studies, we now know is medically possible and common enough – the pneumonic form (spread by sneezing and so on) would have been far more easily transmitted than the rat/flea-borne bubonic form, which would also have been more easily spotted.

We could, therefore, suggest that the flea-infested rats may have been the spark, but the uninitiated population (new virgin hosts for the pathogen) provided the kindling, and it all went up like a tinderbox as growing European shipping commerce fanned the flames of the burgeoning metropolitan centres of the Middle Ages.

However, the good news is that the devastation appears to have burnt itself out almost as swiftly as it spread. Seemingly, it had consumed just about everything in its path, and those left standing had become immune because of it.

The Black Death in Ireland


Today we have the benefit of hindsight. We know, as fourteenth-century people suspected, that the mortality caused by the bubonic plague of the Black Death was the worst demographic disaster in the history of the world. We also know that the mortality came to an end in the first outbreak soon after 1350; contemporaries could not have known this would happen – so far as they were concerned everyone might well die…
Kelly, 2001, ‘Unheard-of Mortality’ 

As the plague was virtually unheard of after the 1350s in Ireland and most other parts of Europe, one might wonder where it went – and whether we would survive the Plague if it rekindled itself. As it turns out, it did come back within recent historical memory, as the following account of Ireland’s preparations for its re-emergence documents:

1900: Ireland’s last bubonic plague scare

While bubonic plague evokes images of the Middle Ages, Ireland has had more than one brush with the dreaded disease. As recently as the year 1900, ports across Ireland prepared for an imminent outbreak of the Black Death…
…The last great plague scare in Ireland began after the illness broke out in Glasgow in August 1900. Ireland with its constant and frequent traffic with the Scottish port was immediately at risk of infection… As the death toll in Glasgow reached 13 by September 8th 1900, petty politics in Ireland hamstrung preparations to prevent an outbreak… Nevertheless in spite of such attitudes all vessels arriving in Ireland from Glasgow continued to be subjected to rigorous checks. Meanwhile the Glaswegian authorities, not only isolated those who contracted the disease but also those who lived in close proximity to them. This drastically reduced contagion and by the end of September there was a dwindling number of new cases.
Dwyer  2016, ‘1900: Ireland’s last bubonic plague scare’

Fortunately, as indicated above, new cases quickly faded. Perhaps this was down to the containment measures, but more recent outbreaks suggest otherwise. Modern-day incidents and more recent historical outbreaks are rather puzzling, as the Plague, even in its pneumonic form, no longer appears to have the great impact it once had. Even when it does escape into an unsuspecting public – as it has done on a number of occasions in recent times – it has thankfully turned out to be surprisingly tame.

Plague has received much attention because it may be used as a weapon by terrorists. Intentionally released aerosols of Yersinia pestis would cause pneumonic plague. In order to prepare for such an event, it is important, particularly for medical personnel and first responders, to form a realistic idea of the risk of person-to-person spread of infection…
The disease resulting from direct infection of the airways is usually called primary pneumonic plague. This form would also occur after an intentional release of aerosolized Yersinia pestis…
Since 1925, person-to-person transmission of pneumonic plague has not been documented in the United States. From 1925 to 2003, there were 447 cases of plague reported to the CDC, and 48 developed into secondary pneumonic plague. Thirteen cases of primary pneumonic plague were reported in the same period; 5 of these were caused by cats with plague pneumonia, 1 was associated with caring for a sick dog, and 3 cases were laboratory-acquired.
In 4 cases, the origin of the infection remained unknown (CDC; unpublished data) … None of the contacts of these 61 patients with pneumonic plague seem to have developed the disease.
Kool and Weinstein,  2005  
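The case counts in the excerpt above can be cross-checked with a little arithmetic; this sketch simply restates the figures quoted from Kool and Weinstein (2005) and shows where the total of 61 pneumonic patients comes from:

```python
# Figures quoted from Kool and Weinstein (2005) for the U.S., 1925-2003
total_cases = 447           # all plague cases reported to the CDC in the period
secondary_pneumonic = 48    # cases that developed into secondary pneumonic plague

# Reported origins of the primary pneumonic cases
primary_sources = {"cats": 5, "sick dog": 1, "laboratory": 3, "unknown": 4}
primary_pneumonic = sum(primary_sources.values())

pneumonic_patients = secondary_pneumonic + primary_pneumonic
print(primary_pneumonic)   # 13, matching the reported primary pneumonic cases
print(pneumonic_patients)  # 61, the patients whose contacts never fell ill
```

In other words, fewer than one in seven of the reported cases ever reached the transmissible pneumonic form, and none of those passed it on.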

One of these cases is worth reviewing, as it demonstrates just how tame the plague appears to have become compared with its Middle Ages form. Although its only victim unfortunately died, it was discovered only after the fact that she had had the true pneumonic form of the Plague (she was initially thought to have ordinary pneumonia, Plague being so rare these days). By the time the penny dropped and the medical staff realised what had happened, she had already exposed quite a number of people, including small infants at the day-care centre where she worked. The case is described in the article excerpted below:

Discover Magazine

‘Will the Black Death Return?’

On October 2, 1980, a 47-year-old woman from South Lake Tahoe, California, lost her 9-month-old pet cat to an acute infection. Three days later, the woman’s own temperature shot up, but she still went to her job at a day-care center. The fever worsened; she developed chest pains and shortness of breath. Two days later she drove herself to the hospital. The diagnosis was pneumonia, and she was treated with tetracycline. Shortly afterward the woman died.
Not until four days later did anyone realize that the woman had died of plague…
…Fearing that treatment might arrive too late, doctors rushed prophylactic antibiotics to the children and staff at the day-care center… Luckily, no one exposed to the woman fell ill.
Orent, 2001

Under the conditions of the Middle Ages, those around the infected and those caring for the sick died in large numbers, and the Plague spread rapidly to hundreds more, then thousands, and ultimately millions. Yet in our more recent era it is, thankfully, a completely different story.

Of course, this raises a question: surely the plague must have changed somehow? We know those clever bacteria can adapt very rapidly, unlike our relatively fixed genetic code. Perhaps this wasn’t the once-deadly plague at all, or perhaps it had grown tamer over the centuries? This very question has recently been answered by extracting DNA of the original, once-deadly Plague:

‘Black Death? Rats and fleas finally in the clear’

By extracting the DNA of the disease bacterium, Yersinia pestis… [T]o their surprise, the 14th-century strain, the cause of the most lethal catastrophe in recorded history, was no more virulent than today’s disease. The DNA codes were an almost perfect match…
Thorpe, 2014

Just think about that for a moment. This is the same plague that killed millions only a few hundred years ago, and yet now, even if some mad terrorist released it into the public, we might not even notice that it was the pneumonic form of the Plague at all, as no one would have black bubonic-type lumps to say otherwise. This is really good news for most of us – no doubt not everyone would escape, but we simply shouldn’t worry too much about such an event in this day and age.

However, this still doesn’t explain how we became so resistant to such previously deadly contagions as the Plague in the first place – especially given that it hasn’t changed one jot, genetically speaking, since its devastation of the Middle Ages. A clue lies in the following excerpt.

Scientific American

How Black Death Kept Its Genes but Lost Its Killing Power

The newly sequenced genome of the plague-causing bacterium Yersinia pestis suggests human adaptations are what have kept this disease in check… The global population has likely built up some immunity from centuries of exposure to the pathogen.
Harmon, 2011 ‘How Black Death Kept Its Genes but Lost Its Killing Power’ [Video], 

Seemingly, we have become immune over the centuries, and the mechanism may lie in a very unlikely source: another, friendlier pathogen – some familiar viral characters that most of us have had experience with over the generations. This is where Chickenpox and its relatives enter the saga of whatever happened to the Plague.

References to Part One

[1] James, T. (2011) ‘Black Death: The lasting impact’, BBC, (17th Feb 2011). [Available online at]
[2] Dwyer, F. (2016) ‘1900: Ireland’s last bubonic plague scare’, Irish History Podcast, (5th Jan 2016). [Available online from]
[3] Thorpe, V. (2014) ‘Black Death? Rats and fleas finally in the clear’, (30th March 2014). [Available online at]
[4] Thorpe, V. (2014) ‘Black Death? Rats and fleas finally in the clear’, (30th March 2014). [Available online at]
[5] Bugl, P. (2008) ‘History of Epidemics and Plagues’, p. 8. [Available online as PDF]
[6] Kool, J.L. and Weinstein, R.A. (2005) ‘Risk of Person-to-Person Transmission of Pneumonic Plague’, Clinical Infectious Diseases, Vol. 40, Issue 8, (15 April 2005), pp. 1166–1172. [Available online]
[7] Stedman’s Medical Dictionary (2002) Definition of ‘hematogenous’, American Heritage, Houghton Mifflin Company, U.S.A. [Available online at Link]
[8] Kelly, M. (2001) ‘Unheard-of Mortality… The Black Death in Ireland’, History Ireland, Issue 4 (Winter 2001), Medieval History (pre-1500), Vol. 9. [Available online at]
[9] James, T. (2011) ‘Black Death: The lasting impact’, BBC, (17th Feb 2011). [Available online at]
[10] Dwyer, F. (2016) ‘1900: Ireland’s last bubonic plague scare’, Irish History Podcast, (5th Jan 2016). [Available online from]
[11] Orent, W. (2001) ‘Will the Black Death Return?’, Discover Magazine, (1st Nov 2001). [Available online at]
[12] Thorpe, V. (2014) ‘Black Death? Rats and fleas finally in the clear’, (30th March 2014). [Available online at]
[13] Harmon, K. (2011) ‘How Black Death Kept Its Genes but Lost Its Killing Power’, Scientific American, (12th Oct 2011). [Video, available online]



Chickenpox May Be The Reason Why We Don’t Die in Our Millions from the Plague!

Recent molecular studies have revealed something rather surprising about the fate of the once far more lethal Plague. In essence, scientists are finding that mice can combat the deadly form of the Plague with the help of common viruses most of us are already infected with – and it is quite possible, even likely, that we do the same.

The Good Thing About Herpes

     The herpes family of viruses can have a surprising upside–it can protect against the bubonic plague and other bacterial contagions, at least in mice. …Nearly all humans become infected with multiple herpes virus family members during childhood. These germs not only include the herpes simplex viruses, which lead to cold sores and possibly genital herpes, but also the diseases responsible for chickenpox and “mono,” as well as several less well-known ailments. Herpes infections have bedeviled animals for more than 100 million years…
…The scientists discovered latent infections with these viruses could protect mice from bacterial infections, including Yersinia pestis, which causes bubonic plague… findings detailed in the May 17 issue of the journal Nature. The herpes viruses spur the immune system to boost levels of a protein hormone called interferon gamma “that in effect puts some immune system soldiers on yellow alert, causing them to patrol for invaders with their eyes wide open and defense weapons ready,” Virgin said. As a result, the bacteria grew more slowly and were less likely to kill the mice.
Choi, 2007

The Herpes family of viruses is, as noted above, very common, and most of us already have its members living inside us. Essentially, the host (that’s us) carries these viruses in a latent state, and this latent infection keeps the immune system on ‘yellow alert’: it boosts levels of interferon gamma, which slows the growth of invading bacteria such as the Plague and staves off their worst effects, giving our own immune defences time to respond. The more the immune system is primed in this way, the more efficient it becomes – and that priming comes, apparently, from simple exposure to things such as Chickenpox.

Basically, the Plague bacteria have to adapt or suffer the consequences of our mighty defence. They usually learn to behave themselves, becoming less invasive and quite tame. This immunity comes, essentially, from allowing a typically benign childhood disease such as Chickenpox to circulate naturally, so that all ages keep boosting their immune systems, as you will see in the article below.

As Chickenpox belongs to the family that seems to be implicated in why we no longer die in our millions from the plague, it may be important to keep it around; it now appears quite important to let it circulate naturally, for the reasons outlined below. The article relates to the UK’s consideration of introducing a Chickenpox vaccine into the childhood schedule, looking to the experience of the U.S., where the vaccine has been part of the infant and childhood schedule for some time. The situation in Ireland is similar to that of the UK, as we have never given a routine Chickenpox vaccine.

Chickenpox, chickenpox vaccination, and shingles

Chickenpox in the United Kingdom, where vaccination is not undertaken, has had a stable epidemiology for decades and is a routine childhood illness. Because of vaccination, chickenpox is now a rarity in the USA. In the UK vaccination is not done because introduction of a routine childhood vaccination might drive up the age at which those who are non‐immune get the illness (chickenpox tends to be more severe the older you are), and the incidence of shingles may increase. The United Kingdom is waiting to see what happens in countries where vaccination is routine.
…We know that exposure to chickenpox can significantly prevent or delay shingles (by … boosting of immunity)… Increased annual chickenpox rates in children under 5 are associated with reduced shingles in the 15–44 age group. Having a child in the household reduced the risk of shingles for about 20 years, the more contact with children the better, and general practitioners and paediatricians have a statistically significant lowering of risk,.. possibly because of their contact with sick children (teachers did not have a significantly reduced rate)…
If there is less chickenpox in children then there will be no boosting of immunity by exposure to chickenpox for middle and older aged people and thus there will be more shingles, at least until all the elderly have been vaccinated as children but this assumes that immunity conferred by vaccination is lifelong… The greater the chickenpox vaccination rates the higher the initial incidence of shingles would be until everyone was vaccinated (in other words until those of us my age who harbour varicella zoster virus in our nervous ganglia die off). It may be that a less than 100% cover by vaccination might reduce the combined chickenpox and shingles morbidity by allowing the virus to circulate in the population with only minor increases in the age of chickenpox while boosting immunity to shingles…
The extent of decline in vaccination induced immunity to chickenpox over future years is not, of course, known and neither is the proportion of those vaccinated in the USA from 1995 that will become susceptible to “geriatric chickenpox.”
Welsby, 2006, 351-52 


Remember that, based upon the above evidence, it now appears that no amount of capturing rats and cleaning up flea infestations would have reduced the death toll of the disease, even in our modern era, as the evidence now strongly supports the innocence of the black rat – along with its flea companions – as the harbinger of death and destruction during the Middle Ages.

Furthermore, our medical interventions cannot account for the rapid decline of deaths in such a short space of time in most regions, as back then we had neither a vaccine with which to try to eradicate the disease nor antibiotics. And the plague doctors going around with prodding sticks and bird masks, their beaks filled with herbs and poisons, probably did nothing other than protect the doctors themselves from catching the disease – if they did even that; we don’t know. Seemingly, the plague resolved itself naturally: there are still cases in parts of the U.S. every year that typically do not end in death, and their contacts are not falling down in their millions after going ‘A tishoo… a tishoo…’, thinking it was a bad cold or a flu that had turned into pneumonia.

It looks like Nature has tamed this once far deadlier beast. Although it is, genetically speaking, the same disease that killed millions in the Middle Ages, our exposure over many generations to natural viral infections from the Herpes Family has made our defences against the Plague fairly robust. This seems a reasonable trade-off: providing a home for something as benign (for the most part) as cold sores (just don’t go kissing any newborn infants) or childhood Chickenpox beats having the Plague.

Perhaps, therefore, we are very fortunate here in Ireland – and indeed in the UK and the other regions of Europe that do not currently include the Chickenpox vaccine and its boosters in the childhood schedule – as most children still get the disease naturally, gaining life-long immunity (incredibly long compared with the much more variable and significantly shorter immunity that, the author argues, the vaccines studied to date can offer).

Exposure to the real Herpes Family diseases seems a much more beneficial option all round. Having something like Chickenpox not only boosts natural resistance to the latent virus erupting later in life as painful Shingles but, by allowing the disease to circulate naturally, also exposes children at the appropriate age – reducing complications and conferring life-long immunity against ever having Chickenpox again. Perhaps most importantly, we may be fortunate in allowing the Herpes Family of viruses (of which Chickenpox is one member) to circulate naturally, as it seems to be doing a fairly good job of keeping the more lethal impact of the Plague at bay.

So, if the Plague does escape from the lab again, as it has done on a number of occasions in the past, try to find some children having a pox party and see if they’ll let you in.

[1] Choi, C.Q. (2007) ‘The Good Thing About Herpes’, Live Science, (16th May 2007). [Available online from livescience]
[2] Welsby, P.D. (2006) ‘Chickenpox, chickenpox vaccination, and shingles’, Postgraduate Medical Journal, Vol. 82, [967], pp. 351–352. [Available online]


The Many ‘Typhoid Marys’

Typhoid Fever

Typhoid fever is passed from person to person through poor hygiene, such as incomplete or no hand washing after using the toilet. Persons who are carriers of the disease and who handle food can be the source of epidemic spread of typhoid. One such individual gave her name to the expression “Typhoid Mary,” a name given to someone whom others avoid.
Gale Encyclopedia of Medicine, 3rd ed. (2006), ‘Typhoid Fever’

As it turns out, Mary Mallon, the silent carrier of Typhoid Fever, was not actually as deadly as her infamy would lead us to believe. It is simply that her case was the most publicised at the time.


“Mary Mallon was born in 1869 in Cookstown, Co Tyrone. She emigrated from Ireland to the United States in 1884.

Mallon became the focus of one of the best-known episodes in the history of communicable disease when U.S. health officials identified her as a healthy carrier of the organism causing typhoid fever. Mallon, who refused to acknowledge her role in spreading the disease as a cook, is known to have infected at least 53 people, resulting in three deaths. Unable to stop her from cooking for others, New York City authorities confined her for 26 years…”
Stair na hÉireann/History of Ireland (2014)

I am sure that if anyone could have explained to Mary back then (though no one knew it at the time) that she simply must not serve anyone uncooked food – no matter how delicious her recipe for frozen peaches was – she could still have maintained herself through work, rather than spending more than a quarter of her life in captivity under threats of having her gallbladder removed.

Mary mostly cooked for wealthy families, some of whom she inadvertently infected, and, as you will see below, she also infected other well-scrubbed and well-fed individuals: doctors, nurses and medical staff at a maternity hospital – precisely the people who would have had the least environmental exposure to such a pathogen.

Refusing Quarantine: Why Typhoid Mary Did It

Health officials lost track of her for a few years, but found her again in the midst of another typhoid outbreak, this time at a Manhattan maternity hospital where 25 people, mostly doctors and nurses, were infected. Mary had been cooking there under a fake name, but fled before health officials could catch her. They traced her to a house in Queens, where they had to sneak in through a second-story window, using a ladder, to apprehend her, according to the Times report on the event.
Latson, J (2014) , Time Magazine (Nov., 11th 2014)

In some ways, as tragic as the few deaths Mary inadvertently caused were, consider how many of the people she infected (particularly the most hygienic health professionals, who had probably managed to avoid such an infection up to that point in their lives) survived to tell the tale, leaving them immune to the disease for life. Moreover, her spate of silently spreading the disease to previously unexposed individuals should be viewed against the bigger picture of the devastation the disease itself had caused among populations both in Ireland, her homeland, and across the Atlantic.


Fig. 1: Chart of the annual number of deaths in Ireland from Typhoid Fever since records for this disease began. Note the dramatic decline from the late 19th Century through to the mid-20th Century. Source: Chart generated using annual statistics reports since records began – “Annual Reports on Marriages, Births and Deaths in Ireland, from 1864 to 2000”, courtesy of An Phríomh-Oifig Staidrimh, Central Statistics Office (CSO), link. © Copyright

When the annual death rates from Typhoid Fever in the U.S. and Ireland are compared, we find a near-identical decline over a similar timeframe; the main difference is simply one of scale. With the U.S. population being much greater than that of a tiny country like Ireland, the proportion of Typhoid deaths per head of population was similar, but the actual total number of deaths per year in the U.S. ran into the tens of thousands.

For instance, a graph [4] (Figure 16, p. 82, Vital Statistics Rates: death rates for Typhoid Fever, Death-registration States, 1900–32, and United States, 1933–60), based upon rates per 100,000 population, shows that at its height around 1900 there were some 32 deaths per 100,000 per year – in a population of tens of millions, that is a great many deaths. The rate fell below 20 per 100,000, dropping dramatically from 1910 onwards, and the downward trend continued through the 1920s until deaths from Typhoid Fever had become very rare by the 1960s.
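As a rough illustration of the scale involved, a per-100,000 rate converts to an absolute death count as follows. (The 1900 U.S. population figure of roughly 76 million is an assumption drawn from census data, not from the graph itself.)

```python
def deaths_from_rate(rate_per_100k: float, population: int) -> int:
    """Convert a mortality rate per 100,000 into an absolute annual death count."""
    return round(rate_per_100k / 100_000 * population)

# Peak typhoid rate (~32 per 100,000) against a ~76 million 1900 U.S. population
print(deaths_from_rate(32, 76_000_000))  # 24320, i.e. roughly 24,000 deaths a year
```

The same formula applied to Ireland’s far smaller population yields proportionally similar rates but much smaller absolute numbers, which is the point made above.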

In Ireland (see Fig. 1), deaths per year from Typhoid Fever had become almost a rarity by the 1920s and 30s, with a slight peak in the 1940s, and were practically unheard of after this period. The Irish had seemingly won their battle against the disease, as had the Americans. In other words, in the scheme of things, Typhoid Mary was a drop in the Atlantic Ocean when it comes to responsibility for the spread of Typhoid Fever in the States.

By the time Mary Mallon landed as an Irish immigrant on U.S. shores, deaths from Typhoid Fever were already on the brink of free-fall, suggesting that the vast majority of people were well on their way to becoming immune to the disease. Almost everyone had most likely already been exposed to Typhoid, whether they knew it or not, and had become immune for life. This is reflected in the statistical data, which indicate that generations had already become resilient and hardened to the disease, making it increasingly difficult for the pathogen to gain a foothold in these populations as a whole. Was Mary alone in not remembering that she had had the disease – or perhaps she was correct when she said she never had any symptoms at all?

As it turns out, she was not the only ‘Typhoid Mary’ – apparently there were many others just like her, though of course they weren’t all called Mary. Although her case is certainly the most famous, being a silent, or latent, carrier of Typhoid was not actually that uncommon.

Typhoid and Paratyphoid Fever

About 1 in 300 people infected with typhoid fever may have a low-grade infection.
They may not develop any significant symptoms, and then become carriers of the disease.
Tidy, C. (2018)

And that is not taking into account those who were infected but did not know it for some time afterwards, since symptoms develop late in this disease, as the rest of the article excerpted above also indicates. So you can perhaps begin to imagine how much of the population in both Ireland and the United States had already been exposed to the disease and had built up community immunity from having survived it.

This begins to make sense of the steep decline in deaths in both nations over a similar course of time, and of how the pathogen was dramatically losing its grip on the population as a whole – it was running out of options to colonise new hosts, and fresh victims now remained only in small pockets of previously unexposed individuals. But what if we imagined for a moment that Mary was put on a boat and set out to sea and, to keep her company, all the other known silent carriers were gathered up too? 'Typhoid Mary' and her merry band of exiles would not have been especially lethal back home, where so many had already built up resistance; but had they happened upon a previously naive native population with no immune resistance to, or experience of, these foreign pathogens, their arrival would have been devastating, at least initially.

The vessel carrying this bunch of 'Typhoid Mary' types would not be that different from what has been all too commonly documented historically, particularly when explorers and colonisers of the so-called New Worlds of the Americas and Southern Latitudes arrived as silent carriers, showing no signs of disease or particular ill-health themselves. Little did anyone suspect what had happened until the previously uninfected Indigenous Peoples began dying in great numbers: old pathogens from previously exposed and now resilient populations had found new virgin territory to colonise.

Charles Darwin, the great evolutionist, directly quotes and comments on such incidents in relation to his own experience during his voyage to the South Seas on the Beagle.


Australia in 1836:

The Rev. J. Williams, in his interesting work, … says, that the first intercourse between natives and Europeans, “is invariably attended with the introduction of fever, dysentery, or some other disease, which carries off numbers of the people.” Again he affirms, “It is certainly a fact, which cannot be controverted, that most of the diseases which have raged in the islands during my residence there, have been introduced by ships; … and what renders this fact remarkable is, that there might be no appearance of disease among the crew of the ship which conveyed this destructive importation.”
This statement is not quite so extraordinary as it at first appears; for several cases are on record of the most malignant fevers having broken out, although the parties themselves, who were the cause, were not affected.
Darwin, C (1839)
In: Alice Bergfeld, Rolf Bergmann and Peter v. Sengbusch, BOTANY Online – The Internet Hypertextbook, 2004.

Historically, this was an all too common phenomenon, but thankfully it would seem that most of the world has now had experience with many of these much deadlier infectious contagions, and they are now incredibly rare, at least in our more developed nations. Interestingly, there is an indication from the literature in general that these same diseases are becoming rarer in less developed nations too. It looks like Nature is on the case and, as you will see below, there is strong evidence to suggest that they too will soon be looking back at their mortality statistics and seeing a similar pattern of plummeting deaths to the one we have thankfully experienced in the developed world.

For example, the article excerpt below describes the historical devastation wrought by Europeans – who were like Typhoid Mary and her band of silent carriers, but on plague-like proportions – as these new settlers and adventurers began to colonise the Americas from the 16th Century onwards. However, it also offers hope, as it describes the natural resilience that can come about from exposure to such pathogens circulating in general.

Rationalizing epidemics: meanings and uses of American Indian mortality since 1600

[When] Europeans encountered new populations, in Hispaniola and Mexico in the 1500s, in New England and Quebec in the 1600s, and even in Alaska and the Amazon in the 1900s, they witnessed terrible mortality. Epidemics of smallpox, measles, and influenza took the highest toll. These diseases, endemic in Europe, had not been present in the Americas before European arrival. Europeans, exposed as children, developed immunity that protected them as adults. American Indians, without this immunity from prior exposure, and stressed by the chaos of European colonization, were dangerously vulnerable. They died in great numbers…
 Jones, D.S. (2004)
Rationalizing Epidemics: Meanings and Uses of American Indian Mortality since 1600. Cambridge: Harvard University Press, p. 26

The above excerpt reveals that the Europeans themselves, through exposure as children, had become rather resilient to such previously devastating plagues, to the point where they had all become 'Typhoid Mary' types. And although this was initially devastating to the previously unexposed natives, there is actually a positive side to this story: if the Europeans had become immune generations before, then surely so could the natives themselves. The Europeans would once have been in the same boat, generations earlier. The article excerpt below lends some support to this idea:

Immune Aspects of First Contact Epidemics

Isolated island populations were clearly subject to disastrous outcomes when new infectious diseases were first introduced, but this often did not extend to subsequent epidemics by the same pathogen…
Mathematical models make it clear that whatever the reason the extreme mortality rapidly decreased after the first-contact epidemics on Pacific islands, it was not due to Darwinian selection of disease-resistance genes. The time interval of only 2–3 generations is simply too short to involve such putative disease-resistance genes.
Shanks, G.D. (2016)
Am J Trop Med Hyg. 2016 Aug 3; 95(2): 273–277. doi: 10.4269/ajtmh.16-0169
Lethality of First Contact Dysentery Epidemics on Pacific Islands

We have only in more recent times begun to gain insight into just how adaptable and responsive the immune system actually is. Seemingly, just about every living thing can rapidly respond to and defend itself from danger and threats, particularly disease, without having to wait around for millions of years in the hope of ending up with the lucky genes that will save it in the end. In other words, as indicated in the excerpt above, adaptation and resistance to disease can be handed down through generations, and this goes directly against our current dogma of genetically-driven adaptation, as the following excerpt's title highlights:

Your Immune System Is Made, Not Born

New research dispels the belief that the strength of the body’s defense system is genetically programmed
…genes themselves need instructions for what to do … Those instructions are found not in the letters of the DNA itself but on it, in an array of chemical markers and switches, known collectively as the epigenome, that lie along the length of the double helix. These epigenetic switches and markers in turn help switch on or off the expression of particular genes. Think of the epigenome as a complex software code, capable of inducing the DNA hardware to manufacture an impressive variety of proteins, cell types, and individuals.
Landhuis, E. (2015)

You see, Nature may not be in the business of only allowing the fittest to survive, but of ensuring that we all survive, via a very different, flexible, non-genetic epigenetic code – and one of the ways this is done is via our mothers.

Immune Priming: Mothering Males Modulate Immunity

Non-genetic transfer of immunity from mother to offspring is a well-recognized phenomenon known as transgenerational immune priming. Mammals, for instance, exchange immunological information on abundance and composition of pathogens to offspring via the placenta and antibody-rich mother’s milk…
The paradigm is that offspring who are destined to be raised in a similar disease environment to their mothers will benefit from a maternal enhancement of offspring immunity that reflects the current environmental challenges.
Keightley M.C., Wong B.B.M, and G J. Lieschke (2013)

It also now looks like we can inherit this hard-fought-for immunity not just from our mothers directly, but from their mothers, and perhaps from generations of mothers before them, as suggested by the following study – at least in pigeons.

Grandmothers can pass immunity to their grandchildren, at least in pigeons

At the moment of birth, a newborn leaves behind its safe protective environment and enters a world teeming with bacteria, parasites, viruses, and infectious agents of all sorts. However, the babies do have one trump card: antibodies and immune compounds passed across the placenta from their mothers. These short-lived molecules can dip into mom’s immunological experience to protect the newborn until the immune system gets up to speed. Now, a new study in pigeons suggests that some baby birds owe their early immunity not just to their mothers, but to their grandmothers as well.
…previous research has suggested that these early maternal immune compounds may have “educational effects” on the newborn’s developing immune profile—that they may somehow be priming the system to be on the lookout for common local diseases or parasites…
Shultz. D (2015)

In other words, it now looks quite likely that our immune systems can memorise past battles with pathogens – epigenetic imprinting that is generational; i.e., these adaptations can be inherited along with our genes, and we can pass the experience and expertise gleaned from those battles on to our offspring.

You see, your immune system – as we are coming to appreciate in more recent times, as we gain more insight into the tiny biochemical and molecular world beyond the relatively fixed and unchanging genes – is just as clever and flexible as the pathogens themselves. Most of us have heard of antibiotic-resistant strains of bacteria, and of how bugs can become superbugs through their rapid adaptation to our efforts to eradicate them. Well, our immune systems also adapt rapidly to pathogens; it is just that we are more complex and bigger than bacteria, so it might take us a few generations for the whole community to become robustly immune, as suggested by the mathematical modelling study highlighted earlier.

Unfortunately, there are casualties, particularly when a pathogen that our immune systems are not familiar with gains a foothold, as illustrated by the large number of deaths from Typhoid Fever in the earliest days of record-keeping, seen in the Irish chart and discussed for the U.S. Our immune systems are built to adapt to and withstand all sorts of pathogens, as illustrated so clearly not just in the mortality charts for Typhoid, but in those for all the other highly infectious diseases that were once significantly more deadly, as the record clearly shows.

Therefore, it seems that the more the immune system is exposed to a pathogen – directly, or indirectly as it silently circulates in the natural background (bear in mind that the Typhoid-causing pathogen never went extinct; it is still with us today) – the more adept our immune system becomes at defending us. In effect, each exposure, even when we are unaware of the pathogen circulating, trains and educates our immune systems to deal with just about everything in the end. And now it looks like this immunity is not only lifelong for those who survive the ordeal – nor, to gain general resistance, need the disease even be experienced directly – but that it may be inherited as well.

And just one more point on this long-term, non-genetic generational immunity: it means there is hope for us even today. If for some reason one of these much deadlier contagions did erupt into our modern-day world (perhaps via an intentional bioweapon), we may not have to die in our millions to build up community immunity all over again. This is supported by the following excerpt, which suggests that the disease does not even need to be circulating in an obvious way for us to build up and retain resistance to it. In other words, we may have now forgotten those diseases, but our immune systems have not.

Rethinking the Origin of Chronic Diseases

Some modern-day diseases reflect the capacity of organisms to “memorize” responses to external signals and transmit them across generations; …
 the original causative agent may not be extant today, but “memory” of the infection has persisted.
Shoja, M.M  et al, (2012)
BioScience, Volume 62, Issue 5, 1st May 2012,

So perhaps now you might see Mary Mallon's story through a slightly different lens. When we realise what exposure to pathogens means in the broader scheme of things, then, although it was tragic for the few who lost their lives, there is actually a positive aspect underlying this means of infection via a healthy silent carrier of the disease.

It seems that we have our ancestors to thank for taking the greatest hit for us. Recall that each person whom Mary infected (remember, most survived) not only gained life-long immunity, but gained something of even greater value: generational immunity that they could then pass on to their own offspring for the future. We are that future.


References to Part Three

[1]  Gale, T. (2006) Typhoid fever facts, information, pictures, Gale Encyclopedia of Medicine, 3rd ed. [Available online]
[2]  Stair na hÉireann/History of Ireland (2014)  1938 – Death of Mary Mallon (Born in Cookstown, County Tyrone), Also known as Typhoid Mary [Available online]
[3] Latson, J. (2014) Refusing Quarantine: Why Typhoid Mary Did It, Time Magazine (Nov. 11th 2014) [Available online]
[4] Grove, R.D. and Hetzel, A.M. (1968) Vital Statistics Rates in the United States, 1940-1960, U.S. Department of Health, Education, and Welfare, Public Health Service, Washington D.C.: National Center for Health Statistics (Figure 16, p. 82, Vital Statistics Rates, Death rates for Typhoid Fever: Death-registration States, 1900-32, and United States, 1933-60) [Available online]
[5] Tidy, C. (2018), Typhoid and Paratyphoid Fever [Available online]
[6] Bergfeld, A, Bergmann R. and Sengbusch, P.V. (2004), Charles Darwin (1839), CHAPTER XIX :Australia in 1836,  BOTANY Online – The Internet Hypertextbook 2004.[Available online]
[7] Jones, D.S. (2004)  Rationalizing Epidemics: Meanings and Uses of American Indian Mortality since 1600. Cambridge: Harvard University Press, p. 26
[8] Shanks, G.D. (2016) Immune Aspects of First Contact Epidemics, Lethality of First Contact Dysentery Epidemics on Pacific Islands, Am J Trop Med Hyg. Vol. 95 [2], pp. 273–277. doi: 10.4269/ajtmh.16-0169 [Available Online]
[9] Landhuis, E. (2015), Your Immune System Is Made, Not Born, Discovery Magazine (Jan., 29th 2015)  [Available online]
[10] Keightley, M.C., Wong, B.B.M. and Lieschke, G.J. (2013) Immune Priming: Mothering Males Modulate Immunity, Current Biology, Vol. 23 [2], pp. 76-78 [Available online] doi:10.1016/j.cub.2012.11.050
[11] Shultz, D. (2015) Grandmothers can pass immunity to their grandchildren, at least in pigeons (Nov. 10th 2015) [Available online]
[12] Shoja, M.M. et al. (2012) Rethinking the Origin of Chronic Diseases, BioScience, Vol. 62 [5] [Available online]


Typhus: Filling in the Gaps

Typhus’ Rise to Prominence

Typhus deaths Ireland

Fig. 1: Chart of the annual number of deaths in Ireland officially recorded from Typhus Fever since records began for this disease. Note the large spike at the beginning. Source: Chart generated using annual statistics reports since records began – “Annual Reports on Marriages, Births and Deaths in Ireland, from 1864 to 2000” courtesy of An Phríomh-Oifig Staidrimh, Central Statistics Office (CSO), link. © Copyright

The death toll from Typhus may have been significantly greater prior to 1864 in Ireland. This is indicated by the rather large spike at the beginning of the graph above (Fig. 1). This spike accounts for some 2000 officially recorded deaths from Typhus and may just be the tip of the iceberg, or, in other words, the very tame tail-end of a much more devastating disease of the decades preceding it.

For instance, as Peterson has suggested  –

“…Mortality is incredibly high under epidemic conditions, nearing 100%…”

(2018, 156)


– and now we can perhaps begin to make a stab at the real impact of Typhus before 1864, by applying this estimate to the figures given below for what seems to be Ireland's first major encounter with epidemic Typhus, back in 1816.


Typhus had been recognized in Ireland as early as 1652, but it was not until 1816 that a major epidemic of the disease produced 700,000 cases out of a population of 6,000,000. Three more major epidemics thereafter, in 1821 and 1836, accompanied harvest failures.
Conlon, J. M., (2007, 13)

In an attempt to fill in the historical gaps, we can take the proposed fatality rate for Typhus (see Peterson, 2018, 156) and apply it to the roughly 700,000 cases estimated for the initial devastation of the 1816 Irish epidemic. In the context of a total population of about six million, this means that Typhus may have been much more deadly than previously recognised: more than one person in every ten of the entire population had seemingly been infected, and most of them probably did not survive. This begins to echo the scale of deadly impact within a naive population unfamiliar with a particular pathogen, as discussed previously.
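As a back-of-envelope check, the figures quoted above can be combined directly. Everything here is taken from the cited sources (Conlon's 700,000 cases, a population of about six million, and Peterson's "nearing 100%" epidemic mortality); treating the fatality rate as exactly 100% is an assumption made to give an upper bound.

```python
# Back-of-envelope estimate for the 1816 Irish typhus epidemic,
# using only the figures quoted in the text above.
cases = 700_000          # Conlon (2007)
population = 6_000_000   # contemporary Irish population
fatality = 1.0           # Peterson (2018): "nearing 100%" -- upper bound

infected_share = cases / population
max_deaths = cases * fatality

print(f"{infected_share:.1%} of the population infected")  # 11.7%
print(f"up to {max_deaths:,.0f} deaths implied")           # up to 700,000
```

So roughly one person in ten was infected, with the death toll somewhere below the 700,000 upper bound, depending on the true fatality rate.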

However, as also discussed thus far, as devastating as a population's initial encounter with an unfamiliar pathogen may be, it also appears that these same populations rapidly adapt to even the deadliest of pathogens and, within a few short generations, become effectively resistant – and ultimately immune – via general background exposure (a rebalance achieved, perhaps?). (See the previous discussion here.)

Typhus, it seems, is no different. Encouragingly, when we look at the official data of annually recorded deaths in Ireland from this once mighty pathogen, we can clearly see that the annual number of deaths dropped dramatically from around 2,000 to an average of 500 after the large spike of 1864, rose a little again, and then became significantly less fatal to the population as a whole throughout the rest of the 19th Century. Finally, deaths from Typhus had become a rarity by the first quarter of the 20th Century, never to be heard of again in this country (Fig. 1).

Once again, we seem to be looking at a natural rise and fall pattern of deaths from a previously much more deadly contagion. This is well supported by the historical accounts of the disease as you will see throughout the discussion that follows. But first, we will explore some other proposals, that are commonly offered as explanations for such devastating epidemic outbreaks and their ultimate resolution.

One suggestion is given above (see Conlon, 2007, 13), in terms of the disease coming in most fiercely on the back of harvest failures. This would seem a reasonable suggestion; however, when we drill down into the details it does not quite stand up to scrutiny, as history has shown and as argued by Dirks (1993) and Carmichael (1983):

“It should be pointed out that recent research has questioned the widely held assumption that malnutrition inevitably leads to increased susceptibility to infection.” (in Mokyr, J., and Ó Gráda, C., 2002, 21). [3].

This insight is further reiterated by other researchers who have concluded something similar regarding this widely held assumption, and also support the general proposal presented here in pointing out that the pattern of disease can best be explained from a biological perspective:

Famine, Mortality, and Epidemic Disease in the Process of Modernization

Chambers carried his case to the point of questioning the medical concept that under-nutrition and malnutrition interfere with the ability to resist infectious contagion, acknowledging, however, that nutritional deficiencies will exacerbate endemic illnesses …
Since European famines were not invariably followed by serious epidemics, it is possible that plague and similar crises of public health were essentially biological in origin and not directly related to problems of subsistence.
Thus Fernand Braudel has stated that “every disease has its own autonomous life, independent of the endless correlations” historians suggest, and that correlations with economic crises are “at most only minor accidents in a history linked with other factors”…
Post, J.D., (1975, 14)

As highlighted above, outbreaks of the major contagious diseases within such widely disparate regions do not necessarily correlate with famine, crop failures or malnutrition in general, when viewed together and within the broader context of the impact and spread of the disease itself.

For instance, famines or crop failures are noted a number of times as occurring within Ireland at around the same time as various epidemic eruptions, including such diseases as Typhus (see Creighton, 1894) [5]. However, as highlighted above, this may not actually be causal to either the disease's rise or its demise, but merely a coincidental historical observation.

The United States, for example, did not have any particular crop failures when it experienced its first major Typhus outbreak in the 1830s, at around the same time as Ireland was experiencing its own Typhus epidemic; nor did the many parts of Europe that experienced near-simultaneous Typhus outbreaks have corresponding widespread crop shortages.

Now take London (see excerpt below), where it also seems that, although a major Typhus epidemic was underway at around the same time as the Irish epidemic of the 1830s, crop failure was not the main factor in its eruption or spread.

Health and Hygiene in the Nineteenth Century

In the 1830s the “new fever,” typhus, was isolated. During its worst outbreak, in 1837-38, most of the deaths from fever in London were attributed to typhus, and new cases averaged about sixteen thousand in England in each of the next four years.
Douglas, L., (Victorian Web, 11th Oct., 2002)

Another example showing that famine was not a major factor in, or a direct cause of, the rise of Typhus comes from Scotland, which experienced the same type of epidemic Typhus in the same general period and, like Ireland, suffered fairly large death tolls; yet, as the excerpt points out:

A History of Epidemics in Britain, Volume II

“The increase of fever in Glasgow,” says Cowan, “during the seven years prior to 1837, had taken place, not in years of famine or distress, but during a period of unexampled prosperity, when every individual able and willing to work was secure of steady and remunerating employment.
Creighton, C., (1894, 191)

The evidence presented above lends good support to the idea that Typhus was what we would call pandemic, i.e., it hit a great many nations, on different continents of the world, at about the same time and with a similar virulence. Thus, a localised crop failure in one region cannot possibly explain the similar eruptions in other nations not suffering a similar plight.

Since famine and the resultant malnutrition are not necessarily linked directly to massively increased deaths from highly infectious diseases such as Typhus, another, perhaps more indirect, culprit has been sought: the common body louse has been implicated in the disease and, by association, in its spread, as indicated below:

Epidemic Diseases of the Great Famine

It is now known that the vector of fever was not famine, nor social distress… but pediculus humanus, the human body louse.
Geary L., (Spring 1996, History Ireland Magazine)

Does that mean that crowded conditions and louse infestations were the cause? Perhaps not entirely either, as another historical reference suggests.

A History of Epidemics in Britain, Volume II

The lesson of the history is unmistakable: with all the inducements to typhus from neglect of sanitation in the midst of rapidly increasing numbers, there was surprisingly little of the disease…
Creighton, C., (1894, 167)

It is therefore fairly unlikely that this pattern of disease within populations was directly caused by any of the main factors that we would normally blame for such spread and high mortality from these contagions.

The idea that neither famine, starvation, nor overcrowding and generally lousy conditions can fully explain the initial plague-like devastation of Typhus is further supported by the evidence emerging from historical accounts of what is often referred to as the Great Irish Famine of the mid-1840s (which was much more socially and politically complex than a simple crop failure, but that is another story).

For instance, the estimates of deaths from Typhus at this time in the so-called famine-relief workhouses (which kept fairly comprehensive records for the main famine years) are rather low relative to the high mortality from all the other famine-related causes of death. Considering that the workhouses would have been hotbeds for body lice, the following figures are all the more surprising.


Approximately 190,000 Irish citizens died from typhus contracted in the louse-infested workhouses they were forced to inhabit….
Conlon, J. M., (2007, 13)

No other circumstances should have been more conducive to encouraging the ravages of Typhus and other widely infectious diseases than these. But the figures given above – almost 200,000 deaths from Typhus over the course of the Great Famine which, broadly speaking, averages out at fewer than 50,000 deaths in each of the famine years – are still significantly lower than the estimated death toll of the initial 1816 epidemic (perhaps as much as half a million in a single outbreak), when living conditions were significantly better than during the Famine years of the mid-1840s onwards.
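The averaging above can be made explicit. The 190,000 total is Conlon's figure; treating the main famine period as roughly four years (c. 1845-1849) is an assumption made here for the sake of the arithmetic.

```python
# Averaging the recorded workhouse typhus deaths over the main famine years.
# 190,000 total deaths is Conlon's (2007) figure; the four-year span is an
# assumed rounding of the c. 1845-1849 famine period.
total_typhus_deaths = 190_000
famine_years = 4

avg_per_year = total_typhus_deaths / famine_years
print(f"~{avg_per_year:,.0f} typhus deaths per famine year")  # ~47,500
```

Even on this generous four-year reading, the annual toll stays well below the scale implied by the estimates for the single 1816 outbreak.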

In other words, across the later outbreaks of Typhus – the widespread pandemics of the 1830s and onwards to the Great Irish Famine era (even at the height of famine, pestilence and recorded louse infestations), and certainly after this point – Typhus, once a harbinger of near-certain death to an estimated one in every ten of a relatively healthy and hardy population, had become increasingly tame over the decades that followed, even during the nation's most dreadful famine years of the mid-to-late 1840s.

Indeed, there are historical accounts that support the taming of this once deadlier disease, such as the following:

A History of Epidemics in Britain, Volume II

It was from 1830 to 1834 that a change in the reigning type of fever began to be remarked in London, Dublin, Edinburgh and Glasgow, the new type becoming more and more evident as fevers became more prevalent in the ‘ thirties ‘ and ‘forties.’
Relapsing Fever the common type in 1847, …
Mayo and Galway, and in Gweedore, Donegal, not more than one in a hundred cases of relapsing fever proved fatal. In Limerick the mortality was ” very small.” In many places it is given at three in the hundred cases, in some places as high as six in the hundred. When deaths occurred, they were often sudden and unexpected…
Creighton, C., (1894, 189)

Relapsing Fever, as it turns out, is actually the less lethal form of Typhus, and it is notable that this shift was near-simultaneous – from the 1830s onwards – throughout many regions, including Ireland. The fatalities indicated in the excerpt above for those who contracted the relapsing form of the fever – between one and six per hundred cases, and the cases themselves do not seem to have been that many – are significantly fewer than those estimated for the initial 1816 epidemic.

Interestingly, although communities were finally starting to breathe a sigh of relief as Typhus (relapsing fever) was seemingly becoming less deadly, after the mid-19th Century people on the ground – so to speak – were beginning to feel its replacement in the form of another contagion: namely, Enteric Fever, which is another name for Typhoid Fever (see the previous article here), as referenced below:

A History of Epidemics in Britain, Volume II

The cases of enteric fever increased decidedly after 1865… ….The disappearance, during the last twenty years, of typhus and relapsing fevers from the observation of all but a few medical practitioners in England, Scotland and Ireland, is one of the most certain and most striking facts in our epidemiology.
…Still more recently, the relative proportions of typhus and enteric fever have been reversed, so that there have been years with little or no typhus but with a good deal of enteric fever… Typhus declined, and typhoid rose…
Creighton, C., (1894, 202)

Typhus and Typhoid deaths Ireland compared                              

Fig. 2: Chart of the annual number of deaths in Ireland from Typhus Fever (black) compared to the annual number of deaths officially recorded from Typhoid Fever (light grey and translucent) in Ireland since records for these diseases began. Source: Chart generated using annual statistics reports since records began – “Annual Reports on Marriages, Births and Deaths in Ireland, from 1864 to 2000” courtesy of An Phríomh-Oifig Staidrimh, Central Statistics Office (CSO), link. © Copyright

Figure 2 above clearly shows this rise in deaths from Enteric Fever (Typhoid) as deaths from Typhus decline. Now we know why deaths from Typhoid Fever were only recorded from 1881 onwards: Typhoid had not much hope of getting in while Typhus was wreaking havoc. It also puts the comparatively small annual number of deaths from Typhoid in Ireland into sharp relief when we begin to calculate the estimated mass devastation that Typhus seems to have caused at its height, before official records began. The graph above (Fig. 2) also clearly illustrates the dramatic and ultimate decline in deaths from both diseases in the end.

This historical investigation into the fate of Typhus has revealed that this once mighty disease soon lost its grip throughout much of the 19th Century, in Ireland and elsewhere. This is strongly indicated by the historical estimates, as well as by the documented shift from the more deadly form of Typhus to what came to be known as relapsing fever (the less deadly form). The evidence points to natural resistance to the pathogen built up through exposure over generations, even in the greatest years of hunger and destitution, and it finds further support in the excerpts that follow.

A History of Epidemics in Britain, Volume II

… The best illustrations of the greater severity and fatality of typhus among the well to do come from Ireland, in times of famine…
But it may be said here, so that this point in the natural history of typhus fever may not be suspected of exaggeration, that the enormously greater fatality of typhus (of course, in a smaller number of cases) among the richer classes in the Irish famines, who had exposed themselves in the work of administration, of justice, or of charity, rests upon the unimpeachable authority of such men as Graves, and upon the concurrent evidence of many…
Creighton, C., (1894, 189)

This historical excerpt begins to explain why the death toll was not far greater during the Irish Famine of the 1840s: it seems that most of the Irish who bore the brunt of malnutrition, and of just about everything else that could be thrown at them, were essentially resilient to Typhus’ worst effects (though those who survived then had to face the onslaught of the unfamiliar pathogen implicated in Typhoid).

On the other hand, as reiterated below, those who were the least exposed to such conditions – the most hygienic, best-fed and generally healthiest individuals – the professionals and philanthropists trying to care for the destitute Irish, were the most vulnerable to Typhus’ worst effects.

Epidemic Diseases of the Great Famine

“During the Great Famine, relapsing fever was the prevalent disease among the general population, while the higher social classes tended to contract the more deadly typhus fever, especially those who were most exposed to infection, notably clergymen, doctors, members of relief committees and those connected with the administration of the poor law. The mortality rate from typhus was also more pronounced among the middle and upper classes than it was among the poor, who may have developed some immunity through long-term exposure”.

Geary L., (Spring 1996, History Ireland Magazine)

It seems that the less well-washed populations, those most exposed to body lice, built up the most formidable resistance to the disease. Unfortunately, it was the people with the least exposure, and therefore the poorest immunity – often those with the best of intentions – who ended up contracting the decidedly deadlier form of Typhus far more frequently than those they were trying to help.

The key point of this historical investigation is that exposure to such pathogens is evidently quite important for building up familiarity and, ultimately, immunity – particularly as our ancestors have seemingly already taken the hit for us. Thus, by not being too squeaky clean, and by allowing a little clean dirt into our lives, we are actually building up robust resistance to all sorts of pathogens – whether we know it or not.

Remember, just like the pathogen that appears to trigger Typhoid, or the one implicated in the Plague, these microbes never actually went anywhere. They still exist all around us today and can, very occasionally, find a little chink in our armour – particularly under extreme conditions, which have sometimes erupted in later history into sporadic epidemics, though never on the scale of earlier times. Nor has the common body louse, with its own particular parasite that is supposed to be the main cause of Typhus, gone extinct – it, too, still exists today.

So, obviously, it helps not to invite the return of such critters to test just how robust our ancestral immunity may actually be. But as we generally do not – thankfully – hear much about Typhus in our own developed nations these days, perhaps we are more generationally resistant to it than we realise. At this stage, after two major world wars and all the travelling we have done over the past few generations, I would say that our immune systems are pretty much familiar with the Typhus pathogen by now. The fact that we do not often hear about the disease is probably down to our immune systems doing such a good job all these years. Long may it continue. And presumably the less developed nations currently fighting their own battles with such pathogens, as we did not so many generations ago, will soon follow suit with their own more robust immunity.

Hopefully, this has filled in some of the gaps in our knowledge regarding Typhus.

References for Part Four

[1] Peterson, K.D., (2018) Typhus, Entomology Group, Insects, Disease, and History, Montana State University, p. 156. [Available online]
[2] Conlon, J.M., (2007) The Historical Impact of Epidemic Typhus, Entomology, Montana Education, History of the Bug, p. 13. [Available online]
[3] Mokyr, J., and Ó Gráda, C., (2002) Famine Disease and Famine Mortality: Lessons from the Irish Experience, 1845-50, in: Dyson, T., and Ó Gráda, C., (eds.) Famine Demography: Perspectives from the Past and Present, Oxford University Press, p. 21. [Available online]
[4] Post, J.D., (1976) Famine, Mortality, and Epidemic Disease in the Process of Modernization, The Economic History Review, Vol. 29, [1], pp. 14-37. [Available online at JSTOR.ORG]
[5] Creighton, C., (1894) A History of Epidemics in Britain, Volume II, Cambridge University Press. [Available online]
[6] Douglas, L., (2001) Health and Hygiene in the Nineteenth Century, The Victorian Web (11th Oct., 2002). [Available online]
[7] Creighton, C., (1894) A History of Epidemics in Britain, Volume II, p. 191, Cambridge University Press. [Available online]
[8] Conlon, J.M., (2007) The Historical Impact of Epidemic Typhus, Montana Education, History of Bugs, p. 13. [Available online as PDF]
[9] Geary, L., (1996) Epidemic Diseases of the Great Famine, 18th–19th Century History, History Ireland Magazine, Vol. 4, [1] (Spring 1996). [Available online]
[10] Creighton, C., (1894) A History of Epidemics in Britain, Volume II, p. 167, Cambridge University Press. [Available online]
[11] Creighton, C., (1894) A History of Epidemics in Britain, Volume II, p. 167, Cambridge University Press. [Available online]
[12] Creighton, C., (1894) A History of Epidemics in Britain, Volume II, p. 189, Cambridge University Press. [Available online]
[13] Creighton, C., (1894) A History of Epidemics in Britain, Volume II, p. 202, Cambridge University Press. [Available online]
[14] Geary, L., (1996) Epidemic Diseases of the Great Famine, 18th–19th Century History, History Ireland Magazine, Vol. 4, [1] (Spring 1996). [Available online]


Cholera: The Disease that Inspired Bram Stoker to Write Dracula? And a Tale of Two Pathogens 

The Lancaster County Cholera Epidemic of 1854 …

Cholera, which was endemic to India, escaped the subcontinent in 1817, striking Moscow in September 1830. It then spread westward across Europe, reaching England in 1831 and North America in 1832. The pandemic would return to Europe and America in 1849, 1854, and 1866, each time filling the population with terror and revulsion; the mystery surrounding the cause of the disease only exacerbated the situation. Its effects were both rapid and devastating, and death was agonizing to those who succumbed to the disease…
Osborne, J.D., (2009)

A disease that becomes pandemic (one that impacts nations worldwide, often with deadly effect) is obviously one that can be passed from person to person. And Ireland, on the edge of Europe, was certainly not immune to Cholera’s devastation, as highlighted in the following excerpt:

Why does cholera have such a reputation?

Cholera was once one of the biggest killers in Irish society. The 1832 epidemic killed 30,000 people… Folk memories of the cartloads of bodies during the 1832 epidemic helped inspire Bram Stoker to write the novel ‘Dracula’.
MedMedia Group (2018)

30,000 deaths in a single Cholera outbreak, during the 1830s pandemic in Ireland (the first and, seemingly, the worst), may not match the death toll estimated for the initial impact of Typhus in its early days (which could have been well over half a million in a population of only six million), but it is still a massive blow to a community of several million. Take, for example, just one well-documented experience within a single county of Ireland, as outlined in the following:

The Sligo epidemic that stoked Bram’s interest in all things

Stoker’s mother, Catherine Thornley, came from Sligo town, and witnessed at first hand the devastating cholera epidemic that swept the county in 1832. Bram — or Abraham, as he was christened — would avidly listen to Catherine’s sobering accounts of what she had witnessed in Sligo before he was born…
And Stoker experts believe Catherine’s vivid descriptions of the suffering she had seen stayed with young Stoker and helped fuel his macabre novel later on in life…
It’s not known how it started but the first signs of the disease were noted shortly after a heavy thunderstorm. A market was being held that day and thanks to the large concentration of people in a comparatively contained space, it struck with a brutal swiftness.
A farmer was infected as he mounted his horse on one side of the town and was dead by the time he reached the other. Another man who attended the funeral of an employee in the morning became ill during the burial and was dead by evening. One family saw six of its members die in the course of a single night. The death rate was so rampant that carpenters ran out of wood for making simple coffins and the dead had to be wrapped in pitched sheets and rolled into mass graves.
Local legend has it that some people were buried alive, so great was the haste to dispose of the corpses. The scenes at night around Sligo only served to heighten the sense of dread in the town. Tar barrels were lit in the streets in a misguided attempt to purify the air…
Doctors valiantly attempted to stem the outbreak, and had to contend with widespread ignorance about the disease. There was also suspicion that the medics themselves may have brought the disease upon the town — they had conducted tests on the water to see if the epidemic had started there, and word spread that the water had been tampered with. Even when five of the doctors contracted cholera and died, the allegations continued.
Some 15,000 people were forced to flee the county and it is thought more than 1,500 people died from the epidemic. The events of 1832 would scar Sligo for generations, and the suffering of those who survived would be exacerbated by the Great Potato Famine, which struck just 13 years later.
Meagher, J., (Independent, Apr. 22nd, 2012)

The most revealing part of this narrative is perhaps the swiftness of the spread of Cholera – seemingly from person to person – with no particular trigger such as the locals all drinking contaminated water (note that Cholera is supposed to be a waterborne disease). We have other historical documentation supporting this very point, as seen in the excerpt below. It relates to the later pandemic of 1854, when Cholera broke out in a small pocket of Columbia, Pennsylvania, U.S.

The Lancaster County Cholera Epidemic of 1854 and the Challenge to the Miasma Theory of Disease

September 6, 1854, when two German immigrants, sick with cholera, were left at the railroad terminus in Columbia while their party continued west. The men died the next day. Four Columbians who had tried to aid them came down with cholera and died shortly thereafter…
By September 9, cholera had spread to almost every section of the town, and 30 people had died, many of whom had visited the stricken immigrants. Physicians had no doubt that the disease that they were witnessing was cholera. The virulence of the epidemic that struck Columbia caused Jackson to observe that two-thirds of the victims died within five hours of showing symptoms of the disease…
Although only 127 victims died in Columbia—out of a population of five thousand—Dr. Wilson Jewell of the College of Physicians of Philadelphia and president of the Philadelphia Board of Health estimated that if a similar outbreak had occurred in Philadelphia, it would have killed 75 people an hour.
Osborne, J.D., (2009)

This third recorded Cholera pandemic, of 1854, strongly suggests that no amount of cleaning up sewers could have stopped the disease in its tracks – at least not at a national or transnational level – once it had found a means of jumping from person to person, as also implied by the title of the article excerpted above. Supporting this concept, as noted at the beginning, the very nature of pandemics goes against the idea that Cholera spread according to the level of hygiene within any of our far-flung and emerging nations.

In other words, our commonly and firmly held belief – that Cholera, and other old diseases like it, came to prominence and ultimately declined according to our level of cleanliness – does not appear to fit the historical facts.

No better example of this belief exists than the now-famous case of the Broad Street pump incident, outlined below:

John Snow and the Broad Street Pump: On the Trail of an Epidemic

The first cases of cholera in England were reported in 1831, about the time Dr. Snow was finishing up his medical studies at the age of eighteen…
Dr. Snow believed sewage dumped into the river or into cesspools near town wells could contaminate the water supply, leading to a rapid spread of disease.
In August of 1854 Soho, a suburb of London, was hit hard by a terrible outbreak of cholera. Dr. Snow himself lived near Soho, and immediately went to work to prove his theory that contaminated water was the cause of the outbreak.
“Within 250 yards of the spot where Cambridge Street joins Broad Street there were upwards of 500 fatal attacks of cholera in 10 days,”
…Officials contended there was no way sewage from town pipes leaked into the pump and Snow himself said he couldn’t figure out whether the sewage came from open sewers, drains underneath houses or businesses, public pipes or cesspools.
The mystery might never have been solved except that a minister, Reverend Henry Whitehead, took on the task of proving Snow wrong.
… Reverend Whitehead interviewed a woman, who lived at 40 Broad Street, whose child had contracted cholera from some other source.  The child’s mother washed the baby’s diapers in water which she then dumped into a leaky cesspool just three feet from the Broad Street pump, touching off what Snow called “the most terrible outbreak of cholera which ever occurred in this kingdom.”
A year later a magazine called The Builder published Reverend Whitehead’s findings along with a challenge to Soho officials to close the cesspool and repair the sewers and drains because “in spite of the late numerous deaths, we have all the materials for a fresh epidemic.”  It took many years before public officials made those improvements.
Tuthill, K., (2003)

Although the Broad Street outbreak may have greatly exacerbated the ongoing pandemic of 1854 at a local level, within this tiny area of the London suburb, I think we can safely say that this episode cannot account for the pattern of deaths from Cholera – in that same 1854 pandemic – experienced simultaneously throughout our diverse nations.

Recalling that it took years for the contaminated sewer leak at Broad Street, London, to be repaired, it is also worth noting that such hygiene measures and attempts to control the spread of Cholera were not always successful, or even necessarily viewed as a good thing, as encapsulated in the following excerpt from a letter to ‘The Times’ around the time of the 1854 pandemic.

Letter to the ‘TIMES’ 1854

‘ We prefer to take our chance with cholera than be bullied into health. There is nothing a man hates so much as being cleansed against his will or having his floor swept, his hall whitewashed, his dung heaps cleared away and his thatch forced to give way to slate. It is a fact that many people have died from a good washing.’
Child, J and Shuter, P. (1992, 101)

Obviously, some people were willing to take their chances with the disease itself, which, by all historical accounts, appears to have been growing significantly less deadly as time went on – certainly from the time Cholera first impacted our developing nations around the 1830s.

Perhaps that is why people were beginning to get a little irked by this zealous cleanliness. Many people on the ground likely knew quite well that the Cholera which had swept across their homelands not so long ago (certainly within living memory) was becoming relatively tame compared to the earlier days, when a deadlier Cholera of plague-like proportions spread devastatingly from person to person. People may well have noticed that the limp corpses of entire families, tossed onto cartloads of putrefying bodies to be buried – alive or dead – amidst the tar-filled sulphur piles, were no longer a common sight in the Cholera outbreaks of 1854.

An indication of the relative tameness of later Cholera pandemics is already clear when we look to the preceding pandemic, recorded a few years earlier in the late 1840s and generally known as Asiatic Cholera. What is interesting about this particular pandemic, in the context of Ireland, is that it erupted on the heels of what is commonly referred to as the Great Hunger (or Irish Famine) of the 1840s.

One would think that this would have greatly increased the numbers dying from the disease, given the population’s very poor nutritional status and overall dreadful living conditions. As it turns out, however, starvation, famine and general impoverishment and squalor do not appear to have increased the death toll as much as one might expect – although they certainly didn’t help matters, as indicated in the excerpt that follows:

History Ireland

Epidemic Diseases of the Great Famine

The arrival of Asiatic cholera as a pandemic in 1848-49 exacerbated the situation. This fearsome disease added to the physical and mental suffering of the beleaguered population and increased the overall mortality.
Geary, L., (1996)

Instead, it actually looks like Cholera was losing its grip – even at the height of the Great Irish Famine of the mid-1840s – as indicated by the data in the following excerpt, based upon the national census over the course of the main famine years:

Famine Disease and Famine Mortality: Lessons from the Irish Experience, 1845-50

… the census reported a total of 1,376 cholera deaths in the years 1841-47 (plus a further 2,502 in 1848).
Mokyr, J., and Ó Gráda, C., (1999, 7)

Therefore, compare this Asiatic Cholera pandemic of the late 1840s with the first major Cholera pandemic of the early 1830s: an estimated 30,000 deaths from a single outbreak in Ireland alone, against the census figures – with all their inaccuracies and difficulties – of a few thousand deaths from Cholera over the worst famine years ever recorded in this small nation. We truly begin to see that starvation, poverty and dreadfully stressful conditions all round do not necessarily lead directly to increased mortality from a Cholera circulating as part of a broader pandemic.

Moreover, this significant taming of Cholera is clearly seen in the statistics recording deaths from the final major Cholera pandemic, which occurred simultaneously throughout our developing nations in 1866. This is the first Cholera pandemic captured by the official death statistics for Ireland, which began two years previously (Fig. 1). The figures for the 1866 pandemic in Ireland show a significant decline in the number of deaths over several decades; the disease re-erupts briefly for another decade – though far less deadly – and is then suddenly never heard of again (Fig. 1).


Fig. 1: Chart of the annual number of deaths in Ireland from Cholera since records for this disease began. Source: Chart generated using annual statistics reports since records began – “Annual Reports on Marriages, Births and Deaths in Ireland, from 1864 to 2000” courtesy of An Phríomh-Oifig Staidrimh, Central Statistics Office CSO, link. © Copyright
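For anyone wishing to reproduce a chart like Fig. 1 from the published annual reports, the tabulation step can be sketched as follows. This is a minimal illustration only: the yearly figures below are hypothetical placeholders (the real values must be transcribed from the CSO reports), and the helper names `peak_year` and `last_fatal_year` are my own, not from any official source.

```python
# Hypothetical annual Cholera death counts for Ireland -- placeholder
# figures only; the actual numbers come from the CSO "Annual Reports on
# Marriages, Births and Deaths in Ireland, from 1864 to 2000".
cholera_deaths = {
    1864: 0,     # official records begin
    1866: 1193,  # placeholder for the 1866 pandemic year
    1876: 12,
    1886: 0,
    1900: 45,    # placeholder for the brief later re-eruption
    1910: 0,     # no further deaths recorded annually
}

def peak_year(series):
    """Year with the highest recorded number of deaths."""
    return max(series, key=series.get)

def last_fatal_year(series):
    """Last year in which any deaths at all were recorded."""
    return max(year for year, deaths in series.items() if deaths > 0)

print(peak_year(cholera_deaths))       # the pandemic year stands out
print(last_fatal_year(cholera_deaths)) # when the disease drops off the register
```

From such a table, the chart itself is simply the years plotted against the counts; the same tabulation works for any of the diseases in the annual reports.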

Once again, we may be looking at natural resistance to the disease, built up over generations of exposure, as indicated in the more modern assessment of Cholera excerpted below.

Etiology and Epidemiology of Cholera

When cholera first appears in epidemic form in an unexposed population, it can affect all age groups. In contrast, in areas with high rates of endemic disease, most of the adult population have gained some degree of natural immunity because of illness or repeated asymptomatic infections.
CDC (1999, 39)

In other words, Cholera, and diseases like it that are typically viewed as hygiene-preventable, appear to be only marginally impacted by any of our regional hygienic responses and attempts to intervene in the natural circulation of the pathogen – except perhaps at a local level.

The historical archives and the statistical data presented above indicate a very similar natural patterning, across national boundaries and over corresponding timeframes, for the fate of Cholera. We see the near-simultaneous rise of Cholera to deadly prominence in pandemic proportions, rapidly spreading from person to person; yet within a few short generations the massive devastation, epitomised by the macabre accounts of the initial pandemic of the 1830s, just as quickly lost its lethal grip on our fledgling nations, and Cholera became a relatively benign disease once again.

Cholera, a waterborne pathogen, still exists. It never went extinct; instead, it seems that we have built up natural resistance and immunity to it through generational exposure, and this pattern is indicative of a natural biological interplay between the pathogen itself and us as its host.

This brings us to another interesting pattern, seen in the following section, which lends further support to the ultimate demise of Cholera from its much deadlier impact in the earlier part of the 19th Century: the interplay between the pathogens themselves, competing with each other for us as their particular host.

A Tale of Two Pathogens


Fig. 2: Chart of the annual number of deaths in Ireland from Cholera and Dysentery since records for these diseases began. Note the large spike at the beginning (when records for both diseases first began). The diseases decline somewhat thereafter, only to erupt again during the 1880s (Dysentery first, with Cholera then taking prominence from 1900 for almost a decade), but both stop abruptly within the first decade of the 20th Century. Source: Chart generated using annual statistics reports since records began – “Annual Reports on Marriages, Births and Deaths in Ireland, from 1864 to 2000” courtesy of An Phríomh-Oifig Staidrimh, Central Statistics Office CSO, link. © Copyright

Notice from Figure 2 that there is a dramatic drop in deaths from Cholera (dark grey) just after the 1866 pandemic (shortly after official records of deaths from this disease began). Note that deaths from Dysentery (light grey) fill the space where Cholera seems somewhat suppressed by Dysentery’s predominance over a number of decades. Note also the slight gap in the graph, showing very few deaths from either Cholera or Dysentery compared with the loftier heights both reached, as significantly deadlier diseases, before records began. As it turns out, this void of pathogen destruction appears to be filled by deaths from Typhus (see the previous discussion of that disease) (Fig. 2).

Taking a closer look at Dysentery, we can see from Fig. 2 that the deaths registered from this disease stop abruptly in 1900. Like Cholera, and all the other diseases discussed thus far in the context of our now developed nations, the historical records clearly show that Dysentery had a much deadlier beginning in the developing nations of our past.

Now, if we inspect the fate of Cholera (Fig. 2), we can perhaps see what stopped Dysentery in its tracks. Even so, Cholera’s final reign lasted only a further decade before it, too, suddenly became an awful lot less deadly (note that these figures do not extend beyond 1910, as no further deaths were being officially recorded in the annual register).

This interplay between the rise and fall of the two diseases is supported by a historical reference from the closing decade of the 19th Century, seen in the excerpt below. Note, however, that when this observation was written, little did the author know that even Cholera would, like its older ‘country’ cousin, be a thing of the past just over a decade later.

A History of Epidemics in Britain, Volume II

Dysentery, the old ” country disease,” has steadily declined to about a hundred deaths in the year, while the considerable mortality from diarrhoea, nearly two thousand deaths in a year, is nearly all from the cholera infantum or summer diarrhoea of children in the large towns.
Creighton, C., (1894, 296)

As investigated in the previous sections on Typhus and Typhoid, a similar interplay of rise and fall is seen in the graphs, as the predominance of one disease over another presumably reflects their attempts to opportunistically colonise us as their hosts. This pattern strongly suggests a biological cause – the vying for dominance between the pathogens themselves – rather than any hygiene measures that our particular developing nations may (or may not) have implemented at any given time, in any particular region, to try to prevent these pathogens’ bids for dominance.

Irrespective of this interesting interplay of pathogens fighting amongst themselves for the top-dog position, it seems that our own mighty immune systems won the day and managed to bring them all under control in the end.

References to Part Five
[1] Osborne, J.D., (2009) The Lancaster County Cholera Epidemic of 1854 and the Challenge to the Miasma Theory of Disease, Edward Hand Medical Heritage Foundation. [Available online]
[2] MedMedia Group (2018) Why does cholera have such a reputation? Irish Health. [Available online]
[3] Meagher, J., (2012) The Sligo epidemic that stoked Bram’s interest in all things, Independent (Apr. 22nd, 2012). [Available online]
[4] Osborne, J.D., (2009) The Lancaster County Cholera Epidemic of 1854 and the Challenge to the Miasma Theory of Disease, Edward Hand Medical Heritage Foundation. [Available online]
[5] Tuthill, K., (2003) John Snow and the Broad Street Pump: On the Trail of an Epidemic, Cricket Magazine, Vol. 31, [3], pp. 23-31 (Nov. 2003), Carus Publishing Company. [Republished in full online with permission]
[6] Child, J., and Shuter, P., (1992) Letter to the ‘TIMES’ 1854, in: Understanding History, Vol. 2, Heinemann, p. 101. [Available online on Google Books]
[7] Geary, L., (1996) Epidemic Diseases of the Great Famine, 18th–19th Century History, History Ireland Magazine, Vol. 4, [1] (Spring 1996). [Available online]
[8] Mokyr, J., and Ó Gráda, C., (1999) Famine Disease and Famine Mortality: Lessons from the Irish Experience, 1845-50, p. 7. [Available online as PDF]
[9] CDC (1999) Etiology and Epidemiology of Cholera, in: Laboratory Methods for the Diagnosis of Epidemic Dysentery and Cholera, Centers for Disease Control and Prevention, Atlanta, Georgia, Chapt. 5, p. 39. [Available online as PDF]
[10] Creighton, C., (1894) A History of Epidemics in Britain, Volume II, p. 296, Cambridge University Press. [Available online]




Scarlet Fever Returns: but it is a lot less deadly

We don’t know much about Scarlet Fever from the earlier era (pre-1800s). We do understand, however, that it was fairly widespread and existed in some form alongside another fairly similar disease – Diphtheria – as documented in the excerpt below, taken from Charles Creighton’s 1894 history of epidemics.

A History of Epidemics in Britain



Scarlatina and diphtheria have to be taken together in a historical work for the reason that certain important epidemics of the 18th century, both in Britain and in the American colonies, which were indeed the first of the kind in modern English experience, cannot now be placed definitely under the one head or the other, nor divided between the two.

Creighton, C.  (1894, 678)

Creighton also documents regular and widespread outbreaks of throat infections, often fatal, throughout Ireland, Britain and parts of the U.S. over the course of the 18th Century (the 1700s) that were akin to what we would call Scarlet Fever. However, it was not really until the 19th Century (the 1800s), when specific epidemics begin to erupt around the same time within Ireland, Scotland, England and Wales, that Scarlet Fever comes to be understood as a more distinct disease, as indicated in the excerpt that follows.

A History of Epidemics in Britain

Vol. II.


… The general prevalence of malignant scarlet fever in the first years of the 19th century is farther shown by the accounts from Ireland, which were recalled by Graves in a clinical lecture of the session 1834-35, during the prevalence of a scarlet fever as malignant as that of thirty years before…

“In the year 1801,” he says, “in the months of September, October, November and December, scarlet fever committed great ravages in Dublin, and continued its destructive progress during the spring of 1802. It ceased in summer, but returned at intervals during the years 1803-4, when the disease changed its character; and although scarlatina epidemics recurred very frequently during the next twenty-seven years, yet it was always in the simple or mild form, so that I have known an instance where not a single death occurred among eighty boys attacked in a public institution.

The epidemic of 1801-2-3-4, on the contrary, was extremely fatal, sometimes terminating in death (as appears by the notes of Dr Percival kindly communicated to me) so early as the second day. It thinned many families in the middle and upper classes of society, and even left not a few parents childless. Its characters seem to have answered to the definition of the scarlatina maligna of authors.”

The long immunity from malignant scarlatina which Graves asserts for Ireland after 1804, is made probable also for England and Scotland after 1805…

It is not until 1831 that we begin to hear much of malignant scarlatina again. But it is clear that scarlet fever was common enough all through that interval, probably in its milder form. It was now the usual epidemic trouble of schools.

Creighton, C.  (1894, 722-3)

Supporting this observation of Scarlet Fever taking a turn for the worse – becoming a much more lethal contagion of pandemic proportions throughout many of our developing nations – and bearing in mind that we also had to contend with other consistently deadly contagions of the era, such as Measles, Diphtheria and Whooping Cough (Pertussis), Scarlet Fever eventually became so deadly that it began to supersede all of these other contagions, as documented in the excerpt below.

Scarlet fever–past and present


In the early nineteenth century, the clinical presentation of the disease appears to have changed for the worse. Lethal epidemics were seen in Tours, France, in 1824; in Dublin, Ireland, in 1831; and in Augusta, Georgia, during 1832-33. Similarly, in Great Britain, the fatality rate from scarlet fever increased from between 1 and 2 % to more than 15% in 1834. From 1840 until 1883, scarlet fever became one of the most common infectious childhood disease to cause death in most of the major metropolitan centers of Europe and the United States, with case fatality rates that reached or exceeded 30% in some areas–eclipsing even measles, diphtheria, and pertussis.
Smith, T.C.(2011)

In other words, you can imagine how parents felt: not only were they losing their children to all the other highly lethal contagions of childhood, but now they had to contend with one of the lesser killers rising to prominence as an even greater plague of destruction, in the form of a particularly virulent Scarlet Fever.

Scarlet Fever knew no social boundaries. When Scarlatina came to visit, it did not matter how poor or well-off you were; she could still knock on your family’s door. For instance, as highlighted in Creighton’s (1894) [4] discussion of the devastation of Scarlet Fever in the Ireland of the early 1800s, even some of the more affluent families were thinned by its impact. And indeed, the devastation often reached some of the best-known figures of historical renown, as highlighted in the following excerpt:

Scarlet fever–past and present 


Children were always the worst affected, and proved to be highly susceptible. Charles Darwin lost two children to scarlet fever in the 1850s. Scarlet fever is also believed to have caused the 19-month old Helen Keller to lose her hearing and sight. John Rockefeller lost a two-year old grandson to scarlet fever, which is why Rockefeller University remains one of the world’s leading biomedical research centers in the world today.

Smith, T.C. (2011)

Scarlet Fever was so common and so dreaded in childhood that it worked its way into children’s literature such as Little Women and The Velveteen Rabbit. Below is a short excerpt from the latter.


The Velveteen Rabbit, or How Toys Become Real tells the story of a stuffed rabbit made of velveteen…

And then, one day, the Boy was ill.
His face grew very flushed, and he talked in his sleep, and his little body was so hot that it burned the Rabbit when he held him close.  Strange people came and went in the nursery, and a light burned all night and through it all the little Velveteen Rabbit lay there, hidden from sight under the bedclothes, and he never stirred, for he was afraid that if they found him some one might take him away, and he knew that the Boy needed him…
Presently the fever turned, and the Boy got better.  He was able to sit up in bed and look at picture books, while the little Rabbit cuddled close at his side.  And one day, they let him get up and dress
The Boy was going to the seaside tomorrow.  Everything was arranged, and now it only remained to carry out the doctor’s orders.  They talked about it all, while the little Rabbit lay under the bedclothes, with just his head peeping out, and listened.  The room was to be disinfected, and all the books and toys that the Boy had played with in bed must be burnt.
“Hurrah!” thought the little Rabbit.  “Tomorrow we shall go to the seaside!”
…Just then Nana caught sight of him.
“How about this old Bunny?” she asked.
“That?” said the doctor.  “Why, it’s a mass of scarlet fever germs!– Burn it at once.”

Williams, M. (1922, pp. 33-36)


Thankfully, the boy (and seemingly the rabbit) survives the ordeal with Scarlet Fever, leaving the reader with some hope that the disease could be overcome without lasting harm. And most certainly hope was needed: at the coal face, so to speak, there was a great deal of pessimism, as we felt wholly helpless to prevent this slaughter of the innocent. Below are some quotes from the period which reveal this bewilderment, together with the hope that medical advances would one day hold the key to ending such senseless destruction of life.

Chapter 5

The Historiography of Social Medical Improvement

… the situation in February 1885 remained bleak:

The prevention of scarlet fever is as yet an unsolved problem. I trust such men as Pasteur and Koch will turn their attention to it; my only hopes of a satisfactory answer lie in that direction…

Davies was not alone in his pessimism…:

Yet, as knowledge and administrative resources now stand, official powers of preventing this murderous disease are, practically speaking, insignificant; and such general advice as may be given for individual preventive purposes has so little likelihood of being applied except in select cases, that, as regards the main mass of sufferers, it may seem almost insincere and derisory…

Bristol Historical Resource (2000)

Creighton draws our attention to the remarkable pattern of this disease: it rose from relative meekness to lethal, virulent proportions for several decades from the mid-19th Century, and he also describes how it began to return to its milder form by the 1880s within England and Wales. Bear in mind that Ireland, Scotland and many other regions follow this pattern of peaks and troughs for the same period.

A History of Epidemics in Britain

Vol. II

The enormous number of deaths from scarlatina during some thirty or forty years in the middle of the 19th century will appear in the history as one of the most remarkable things in our epidemiology. There can be no reasonable doubt that this scarlatinal period was preceded by a whole generation with moderate or small mortality from that disease, just as it is now being followed by annual death-rates which are less than a half, perhaps not more than a third, of the average during forty years before 1880.

Creighton, C.  (1894, 72)

As tabulated by Creighton (1894), although Scarlet Fever’s reign of terror continued to ebb, the statistics also demonstrate that Measles, and to some extent Diphtheria, began to pick up the reins where Scarlet Fever left off, up to the time of Creighton’s review published in the last decade of the 19th Century. This pattern is illustrated in the following graph for the era under discussion (Figure 1).


Fig. 1 Graph generated using tabulations for individual annual death statistics in England and Wales from 1837 to 1880 compiled by Creighton, C. (1894) A History of Epidemics in Britain, Volume II, From the Extinction of Plague to the Present Time, p. 722-3, Cambridge University Press, Cambridge.

As we move into the 20th Century, we discover something that the earlier statisticians could not have foreseen. Scarlet Fever ceases to be a major threat to children and becomes a relatively benign disease of childhood.


Fig. 2: Chart of the annual number of deaths in Ireland from Scarlet Fever, from when records began, extending beyond the period when deaths from this disease were no longer registered. Source: chart generated using the statistics reports since records began – “Annual Reports on Marriages, Births and Deaths in Ireland, from 1864 to 2000” – courtesy of: An Phríomh-Oifig Staidrimh, Central Statistics Office (CSO). © Copyright

Scarlet fever–past and present 


Historical data suggest at least three epidemiologic phases for scarlet fever. In the first, which appears to have begun in ancient times and lasted until the late eighteenth century, scarlet fever was either endemic (always present at a low level) or occurred in relatively benign outbreaks separated by long intervals.

In the second phase (~1825-1885), scarlet fever suddenly began to recur in cyclic and often highly fatal urban epidemics. In the third phase (~1885 to the present), scarlet fever began to manifest as a milder disease in developed countries, with fatalities becoming quite rare by the middle of the 20th century.

In both England and the United States, mortality from scarlet fever decreased beginning in the mid-1880s. By the middle of the twentieth century, the mortality rate from scarlet fever again fell to around 1%.

Smith, T.C. (2011)

A drop in the death rate from Scarlet Fever to around 1 per cent by the middle of the 20th Century is fairly spectacular.

The pattern seen in the above graph for deaths from Scarlet Fever in Ireland (Fig. 2) corresponds very closely with the historical documentation for the same timeframe charted for England and Wales and elsewhere. Moreover, Scarlet Fever cases (infections) remained very prevalent at a time when deaths from the disease had become very rare in our more modern era (the 1950s and 60s). This is clearly illustrated by the steep decline in deaths from Scarlet Fever alongside the correspondingly high incidence of cases recorded in official statistics for infants and children in England and Wales, even up to the 1940s (given in Figure 14 of ‘A Century of Changes in the Mortality and Incidence of the Principal Infections of Childhood’, Gale, A.H., Medical Officer, Ministry of Education, 1945) [10].

Bearing in mind that Scarlet Fever childhood outbreaks (epidemics) in our more modern era no longer produced corresponding mortality, it is not until we review the death statistics over the earlier part of the 20th Century throughout our developed nations that we really grasp just how dramatic the decline in deaths from Scarlet Fever truly was.

For instance, this is clearly seen in the graphs presented in Figure 4.15 (Scarlet Fever), covering the turn of the 20th Century until modern times, with the overwhelming majority (over 99 per cent) of the decline in deaths occurring in the period 1901-45 (see ‘Atlas of Epidemic Britain: A Twentieth Century Picture’, Smallman-Raynor, M., and Cliff, A., 2012) [11].

This dramatic decline of Scarlet Fever as a major killer – also within England and Wales – is clearly seen in the graphs generated in Thomas McKeown’s research as illustrated in ‘The Role of Medicine’ (1979)  [12].

This near-simultaneous decline in deaths from Scarlet Fever (but not in its cases of infection) is also illustrated by the graphs generated for the United States for a similar timeframe, in Figure 1: Deaths per 100,000 from Rheumatic Fever and Scarlet Fever, in ‘Health at Older Ages: The Causes and Consequences of Declining Disability among the Elderly’ (Costa, D.L., 2009) [13].

Also see the graphs in ‘Mortality in the United States 1900 – 1950’ (Gordon, T., 1953, figure 3) [14], and ‘Vital Statistics Rates in the United States, 1940 – 1960’ (Grove, R.D. and Hetzel, A., 1968) [15], for very similar patterns of decline in deaths from Scarlet Fever, with essentially the same endpoint at which this disease ceased to be a major killer – again, within the United States.

Anyway, everyone, it seems, began to breathe a great sigh of relief, as parents were now counting more of their children with increasing confidence after they had had Scarlet Fever. But then Scarlet Fever returned, and it would seem that we had forgotten just how tame this pathogen had become: we simply equated the return of the disease in our modern world with a return of the fatalities of the 19th and earlier 20th Century. It was a global comeback out of seemingly nowhere, and its impact left everyone scratching their heads.


What is happening!

After decades of decline, scarlet fever is once again on the rise in the UK and other places around the world, and doctors are scrambling to figure out why.

Beginning in 2014, the infection started to steadily rise, and in 2016, over 19,000 cases from 620 outbreaks were reported, mostly in schools and nurseries. This represents a seven-fold increase since 2011.

Starr, M., (2017)

You can begin to imagine the fear as we realised that this was the very same disease, with the same symptoms, as the Scarlet Fever of old. Would our children start dying from the disease in their thousands, as in the tragic stories of the dark days of the Victorian era emblazoned on our imaginations?

 Scarlet fever: the disease in the UK

The Pharmaceutical Journal

…it may not sound terrible based on those symptoms, but it was responsible for 36,000 registered deaths in the first decade of the 20th century in England and Wales, and was a leading cause of child mortality.

There’s no vaccine for scarlet fever. Once contracted, it’s treated quite easily with a course of antibiotics, which – at least partially – contributed to the disease’s decline in developed countries after about 1945.

Marshall, S.  (2006)

This was a true pandemic, as it spread almost worldwide. It was horrifying for the poor parents of infants and children who caught the disease, as they remembered stories of the massive death toll of the dreaded strawberry tongue.

Scarlet Fever, a Disease of Yore, Is Making a Comeback

The reason for the sudden surge remains a mystery,

Scientific American

Scarlet fever, a disease that struck fear into the heart of parents when cases surged in the days of yore, appears to be making an unexpected and puzzling comeback in parts of the world. England and Wales have seen a substantial rise in scarlet fever cases starting in 2014.

The number of cases tripled from 2013 and continued to increase in 2015 and 2016, with England and Wales last year recording the highest number of cases there in a half-century, British scientists reported Monday in the journal Lancet Infectious Diseases.

Similar and in some cases even larger surges of scarlet fever have been reported in recent years in South Korea, Vietnam, China, and Hong Kong. Hong Kong, which saw a tenfold rise in cases, continues to report increased annual counts five years after the resurgence was first noticed.

The reason for the sudden and surprising increase is a mystery. And the authors of a commentary that accompanied the article urge other countries to be on the lookout for similar spikes in cases.

Branswell, H. (2017)

However, as the pandemic continued to sweep throughout our now-developed nations, its impact was nowhere near as devastating as we had begun to anticipate. Different explanations for the mildness of the modern pandemic were considered, as seen in the following.

 Scarlet fever: the disease in the UK

The Pharmaceutical Journal

…The most obvious reason for a resurgence in a bacterial infection would be a new strain of the disease that spreads more easily and is possibly antibiotic-resistant – but molecular genetic testing has ruled this out.

Instead, tests showed a range of already established strains of the bacteria, leaving researchers still looking for a possible cause.

Meanwhile, the 2016 statistics put incidence at 33.2 cases per 100,000 people, with 1 in 40 cases being admitted to hospital (although around half of those get discharged the same day).

Marshall, S. (2006)

This posed the obvious question: if it was not the Scarlet Fever strain of old – the pathogen that killed so many – then what on earth was going on? The fact that it was essentially the same disease, and yet, amazingly, there were virtually no deaths and only brief and typically uneventful hospital visits, left researchers even more puzzled than before, as indicated in the following excerpt:

Scarlet Fever, a Disease of Yore, Is Making a Comeback

The reason for the sudden surge remains a mystery,

Scientific American

“The strains didn’t give us the answer. We were really pinning our hopes on those, because that’s the most obvious answer,” she noted. “We’re left thinking what on earth it could be. We don’t have an answer at the moment.” Even though scarlet fever does not have to be reported to the CDC, Lamagni said a surge in the United States would be hard to miss. “If they were seeing what we’re seeing, they would know about it. It is unusual,” she said.

Branswell, H. (2017)

We now know that the pathogen didn’t change genetically – it is still the same pathogen that killed thousands annually back in the day. Perhaps the answer to this quandary regarding Scarlet Fever’s return in our fully modern era points to another cause. It looks very likely that our immune systems have become highly educated regarding this particular pathogen. Our genes didn’t change to accommodate it, as such a genetic change would take too long, as discussed earlier – perhaps, then, our fairly rapid adaptation is more indicative of an epigenetic phenomenon, leaving a long-term impression upon our existing genes that can be transmitted across generations.

Could it be that, as with the great plagues (Typhus, Typhoid, Dysentery and Cholera) discussed previously, it was our familiarity and resistance to the pathogen that had changed over many generations, with this memory imprinted in our ancestors’ epigenome? This would provide us with very long-term resilience to the pathogen, and it would be perfectly natural, if the disease returned in our modern era, for it to behave in the way that it did – giving us all a much-needed booster, lest our immune systems forget. See it as a type of fire drill: a way of keeping our systems up to date. After all, as the excerpt below indicates:

Scarlet fever: the disease in the UK

The Pharmaceutical Journal

Figures suggest that up to 40 per cent of the population are asymptomatic carriers, with low infectivity and little risk of developing complications.

Marshall, S. (2006)

With that number of asymptomatic carriers (or silent carriers of Scarlatina), it seems that the pathogen is always circulating away in the background – simply not, in general, expressing itself to any great degree.

It would appear that we simply became more resilient to its attacks via exposure over the generations, as seems to be the case with all the other once-deadlier contagions discussed thus far. Fortunately, with all the generations of exposure prior to this pandemic event, we were ready for Scarlatina’s return, even if we didn’t know it at the time. With no vaccine at hand, everyone assumed that the deaths and horrors of the Victorian era would descend upon our children.

I suppose nobody knew to look back at the death statistics and watch, in real-time, as across all our developing nations this disease had already become, even by the mid-1940s, a relatively benign and natural rite of passage in childhood. Bear in mind, too, that Measles went the same way shortly after.

Next Episode: Part Seven: (Don’t Count Your Children ’til they’ve had the POX) Killer Plagues of Our Modern Era – Plummet from Natural Causes..?

References to Part Six
[1]  Creighton, C. (1894) A History of Epidemics in Britain, Volume II, From the Extinction of Plague to the Present Time, Vol. II, p. 678, Cambridge University Press, Cambridge.
[2]  Creighton, C. (1894) A History of Epidemics in Britain, Volume II, From the Extinction of Plague to the Present Time, Vol. II, p. 722-3, Cambridge University Press, Cambridge.
[3] Smith, T.C. (2011) Scarlet fever–past and present, Aetiology Blog (July 6th 2011) [Available online]
[4] Creighton, C. (1894) A History of Epidemics in Britain, Volume II, From the Extinction of Plague to the Present Time, Vol II. Cambridge University Press, Cambridge.
[5] Smith, T.C. (2011) Scarlet fever–past and present, Aetiology Blog (July 6th 2011) [Available online]
[6] Williams, M. (1922) The Velveteen Rabbit, or How Toys Become Real, Doubleday & Company, Inc., New York. [Available on Project Gutenberg]
[7] Bristol Historical Resource (2000) Chapter 5: The Historiography of Social Medical Improvement, in (eds) Ian Archer, Spencer Jordan, Keith Ramsey, Peter Wardley and Matthew Woollard [Available online at the BHR website]
[8] Creighton, C. (1894) A History of Epidemics in Britain, Volume II, From the Extinction of Plague to the Present Time, Vol. II, p. 72, Cambridge University Press, Cambridge.
[9] Smith, T.C. (2011) Scarlet fever–past and present, Aetiology Blog (July 6th 2011) [Available online]
[10] Gale, A. H. (1945). A Century of Changes in the Mortality and Incidence of the Principal Infections of Childhood. Archives of Disease in Childhood, Vol. 20, [101], pp. 2–21.
[11] Smallman-Raynor, M, and Cliff, A (2012), Atlas of Epidemic Britain: A Twentieth Century Picture, Oxford University Press, Oxford. p.50, figure 4:18 (Measles); p. 52, figure 4:24 (Whooping Cough); p.49, figure 4:15 (Scarlet Fever).
[12] Mc Keown, T (1979) The Role of Medicine: Dream, Mirage, or Nemesis? Basil Blackwell, Oxford [Available online as PDF]
[13] Costa, D. L., (2009) Health at Older Ages: The Causes and Consequences of Declining Disability among the Elderly in, (eds.,) David M. Cutler and David A. Wise, Selection from a published volume from the National Bureau of Economic Research, University of Chicago Press [Available online at the National Bureau of Economic Research NBER]
[14] Gordon, T. (1953) Mortality in the United States, 1900-1950. Public Health Reports, Vol. 68 [4], pp. 441–444.
[15] Grove, R.D., and Hetzel, A.M. (1968) Vital Statistics Rates in the United States, 1940 – 1960, Department of Health, Education and Welfare, Public Health Service, National Center for Health Statistics. [Available online]
[16] Starr, M., (2017) The Once-Deadly Scarlet Fever is making a Weird Comeback around the World – What is happening! Irish Times, (November 29th 2017)
[17] Marshall, S., (2006) Scarlet fever: the disease in the UK, The Pharmaceutical Journal (July Issue 2006).
[18] Branswell, H. (2017) Scarlet Fever, a Disease of Yore, Is Making a Comeback – The reason for the sudden surge remains a mystery, Scientific American (November 28th 2017).
[19] Marshall, S., (2006) Scarlet fever: the disease in the UK, The Pharmaceutical Journal (July Issue 2006).
[20] Branswell, H. (2017) Scarlet Fever, a Disease of Yore, Is Making a Comeback – The reason for the sudden surge remains a mystery, Scientific American (November 28th 2017).
[21]  Marshall, S., (2006) Scarlet fever: the disease in the UK, The Pharmaceutical Journal (July Issue 2006).



 Killer Plagues of Our Modern Era – Plummet from Natural Causes..?

Killer Plagues of Children become tamer throughout the 20th Century


Fig. 1: Chart of the annual number of deaths in Ireland recorded from the combined major killers impacting mostly infants and children: Scarlet Fever, Whooping Cough, Measles, Diphtheria & Polio. Source: chart generated using the statistics reports since records began – “Annual Reports on Marriages, Births and Deaths in Ireland, from 1864 to 2000” – courtesy of: An Phríomh-Oifig Staidrimh, Central Statistics Office (CSO). © Copyright

Parents were now counting more of their children, not just after Scarlet Fever, but after just about all the major diseases that plagued our young. As with Scarlet Fever, seasonal epidemics continued to erupt in our more modern era, but with increasingly fewer deaths and disabilities from contagions that were once every bit as deadly as Scarlet Fever (Measles and Whooping Cough in particular – see Fig. 1).

Basically, in the more modern era, when deaths from these diseases had dropped dramatically, having and spreading the infection no longer spelt death and destruction for so many as it had done previously. The world was looking good, and we were no longer quite so helpless, tossed about like corks on a raging ocean of contagion.

In the end, it seems from the above graph (Fig. 1) that all of these once-deadlier diseases of childhood essentially became resolved either by, or shortly after, the middle of the 20th Century. Childhood killers such as Scarlet Fever, Measles, Diphtheria and Whooping Cough (Pertussis) now looked very different to most parents living in the middle of the 20th Century compared to how they would have appeared to parents living in the middle of the 19th Century.

And once again, this much-welcomed decline in deaths and disabilities from such once-deadly plagues of childhood appears to be a near-universal phenomenon within our respective developed nations. For instance, this pattern of significantly declining death rates from the major childhood contagions is highlighted for other regions over a fairly similar timeframe in studies such as ‘The Development of Infant Mortality in Iceland, 1800–1920’, Loftur Guttormsson and Ólöf Garðarsdóttir (2002) [1].

Similarly, on the other side of the world, we see a closely matched trend for approximately the same timeframe, as detailed in ‘Death registration and mortality trends in Australia 1856–1906’ by Michael Willem de Looper (2014), Abstract, p. iv [2]; and the later data for Australia correspond very closely with the same timeframe in other regions, as evident from the study entitled ‘Epidemiologic Transition in Australia: The last hundred years’ (Booth, H., Tickle, L., Zhao, J., 2016) [3].

Furthermore, from the long-term data that we have, namely for England and Wales, we can see a closely matched pattern of decline in deaths, over a similar timeframe, from the very same childhood contagions as recorded in this present study for Ireland (Fig. 1) – only scaled relative to our respective population sizes – as presented in figures 4.15 (Scarlet Fever), 4.24 (Whooping Cough/Pertussis) and 4.18 (Measles), showing annual death rates per 100,000 of the population in England and Wales from 1901 – 2000, in ‘Atlas of Epidemic Britain: A Twentieth Century Picture’, Smallman-Raynor, M., and Cliff, A. (2012) [4]. Also see ‘Causes of Death: A Study of a Century of Change in England & Wales’, specifically addressing the declining deaths from the major childhood diseases (Baillie, L. and Hawe, E., 2012) [5].

Again, several closely matched graphs relating to deaths from these very same childhood contagions are presented in ‘Mortality in the United States 1900 – 1950’ (Gordon, T., 1953) [6], and ‘Vital Statistics Rates in the United States, 1940 – 1960’ (Grove, R.D. and Hetzel, A., 1968) [7].

Other sources from the U.S. with closely matched patterns of declining deaths from major childhood infections over the very same course of time include ‘Annual summary of vital statistics: trends in the health of Americans during the 20th century’ (Guyer, B., Freedman, M.A., Strobino, D.M. and Sondik, E.J., 2000), in Pediatrics [8]; and the raw statistics for many of these childhood diseases in the United States, which can be generated from the tabulations given in ‘Vaccination and the Control of Seven Infectious Diseases in the US – 1900-1970’ (Blood, B., 2000-2013) [9], essentially reveal the same pattern.

And regarding this same essential pattern of a significant decline in deaths from many of the same diseases in the United States and Canada, we see further confirmation, in statistical terms, in the death rates for Diphtheria and many of the other major killers highlighted below – i.e. the death rates for these diseases drop significantly in magnitude from the 1900s onwards.



Toronto, Canada.


Prior to the present century this disease was the major scourge of infancy and childhood. The mortality rates in the United States, for consecutive ten-year periods from 1900 to 1940, were as follows: 40, 21, 15, 5 and 1.
For the city of Toronto, for ten-year periods from 1885 to 1945, the rates were as follows: 132, 66, 34, 19, 8 and 3.

These figures show a steady reduction in mortality which began over 60 years ago…

A similar general decline in incidence and mortality rates for other infectious diseases, notably scarlet fever, whooping cough, measles, mumps, rheumatic fever and typhoid fever, has also been recorded.


Killer Plagues of Adults & Children also become tamer throughout the 20th Century

It would appear that not only do all the major childhood killers of the 1800s and earlier part of the 1900s show a dramatic decline throughout much of the first half of the 20th Century, but so too do the other great killers that attacked adults and children alike. Figure 2 clearly shows this pattern, along with the very tail end of some of the older contagions discussed earlier.


Fig. 2: Chart of the annual number of deaths in Ireland recorded from the combined major killers of all ages: TB, Influenza, Typhus, Cholera, Dysentery & Typhoid Fever. Source: chart generated using the statistics reports since records began – “Annual Reports on Marriages, Births and Deaths in Ireland, from 1864 to 2000” – courtesy of: An Phríomh-Oifig Staidrimh, Central Statistics Office (CSO). © Copyright

For instance, when we compare the deaths from TB (Tuberculosis) with the relatively low death rate corresponding to the very tail end of other great contagions such as, Cholera, Dysentery and Typhus (Fig. 2), we can begin to gain a greater historical perspective on the longer-term pattern of the behaviour of these pathogens.

Compare, for example, the greatest spike in deaths from Influenza, shown in white (corresponding to the great Spanish Flu of 1918/19), which pales into insignificance (statistically speaking, of course) when assessed against the annual number of deaths from something as deadly as TB.

However, as Figure 2 clearly illustrates, even TB and Influenza declined in deadliness in the end, just like the older great killers before them. These two diseases are relatively modern killers: within the chart we can see Influenza’s rise to deadly prominence, the peaking of both TB and Influenza, and ultimately their demise as killers.

By comparing, over the same timeframe, the rise and fall in the annual number of deaths from each disease, we begin to detect a similar story in each: the earlier a disease rises to deadly prominence, the sooner it resolves itself. Conversely, the more recently a disease rose to deadly prominence, the later in time it appears to become resolved.

Furthermore, when the death tolls of many of these diseases are compared over time, it also looks as though the reign of one pathogen will often be predominant whilst keeping another’s in check. And seemingly, once a void opens up – i.e. as the older (earlier) diseases begin to recede or become significantly less deadly – a previously suppressed, or even a less familiar, pathogen appears to try to fill this void of destruction (Fig. 2), giving us an insight into pathogen/host or pathogen/pathogen behaviour over the course of time.

It should be noted that most studies do not typically compare mortality charts between regions over time to detect similarities, or assess a range of contagions on a single graph to explore possible pathogen interactions – this is fairly peculiar to this present study. However, the important point that most other studies of this kind do highlight is the striking decline in deaths from these same diseases – some of the greatest killers recorded – over the same general timeframe, as documented and illustrated in the following broader studies of many of the major killers throughout the 20th Century, which correspond fairly closely with the Irish data (see Fig. 2).

This can be seen in studies such as ‘Twentieth Century Mortality Trends in England and Wales’, Griffiths, C. and Brock, A. (2003), Office for National Statistics [11]; and, similarly, a number of comparative graphs are presented for the same region in Thomas McKeown’s publication, ‘The Role of Medicine: Dream, Mirage, or Nemesis?’ (1979) [12].

Again, for the major killers for which death statistics exist over the course of time, the same diseases clearly become significantly less deadly across the board in the U.S., as illustrated in graphs plotting deaths throughout the earlier part of the 20th Century in ‘Mortality in the United States 1900 – 1950’ (Gordon, T., 1953) [13]. This pattern is also reflected in ‘Trends in Infectious Disease Mortality in the United States During the 20th Century’ (Armstrong, G.L., Conn, L.A., Pinner, R.W., 1999) [14]. Once again, supporting data can be seen in the graphs presented and discussed in ‘Infectious Diseases and Human Population History’ (Dobson, A.P. and Carper, E.R., 1996), BioScience [15].

This begins to raise the important question of what caused such a closely corresponding, near-universal decline in deaths from once much deadlier diseases in the first place. As outlined earlier in this study, some scholars have suggested causes relating to a Darwinian-type genetic selection, an idea which doesn’t seem to stand up to scrutiny over the long term. More often, others have suggested hygiene and nutrition, along with population trends and economic and social improvements, as the cause of this pattern of decline in some of the deadliest contagions known (recall, for example, the earlier discussion of the widespread belief that the Black Death, or Plague, of the Middle Ages was spread by black rats and their fleas, and that by cleaning up the rat infestations the Plague disappeared – a belief which, on closer inspection, may not hold).

Some of these hypotheses have been offered as alternatives because, as the data strongly support, our medical interventions were virtually non-existent, came too late, or cannot be correlated with the statistics in any way that shows a direct or significant causal impact on the decline in mortality or morbidity from such plague-like contagions at a population level. This aspect is certainly supported in the present study, as you will see as you continue.

However, as I hope is now becoming clear from the evidence presented thus far, digging deeper into the historical archives and making a broader comparative assessment of the statistical data, combined with more recent molecular insights, points strongly to a rather different hypothesis: generational resilience to these pathogens via exposure as a direct causal factor in the overall decline – a decline seen near-simultaneously throughout, and between, our newly-emerging modern nations over a surprisingly similar timeframe.

That said, there is one aspect of the nutritional hypothesis that should be highlighted in this context, as it derives directly from clinical studies which clearly demonstrate the profound role that certain essential vitamins and trace minerals can have on many of the specific infections documented above. It warrants further investigation and is worth reviewing, at least briefly here, as this type of therapy also played an important, if short-lived, role in the history of our medical interventions.

Intervention via Vitamin Therapy?

In ‘Vitamin C in the Prophylaxis and Therapy of Infectious Diseases’ (1951), W. J. McCormick, M.D. proposes that the decline in deaths from a broad range of once much deadlier infectious diseases was caused by the correction of nutrient deficiencies arising from increasing urbanisation: as populations settled into this new lifestyle and dietary stability emerged, he argues, deaths from highly infectious diseases would decline concurrently with these developments. He postulated this alternative on the grounds that much of our medical intervention came either too late, or not at all, to account for such a decline.

McCormick’s hypothesis was directly inspired by highly successful clinical therapies and studies applying vitamin treatments, which addressed underlying vitamin deficiencies and supported the body’s own immune system, accelerating recovery and limiting the complications that still occasionally arose from such diseases – particularly those that still impacted children. For example, McCormick treated many conditions with high-dose vitamin C, including the worst effects of Scarlet Fever, as summarised below.



Toronto, Canada.


(The Author’s Experience)
In the author’s private practice during the past ten years, over 5,000 tests for vitamin-C status have been made… In many cases of deficiency, where the dietary intake indicates a subnormal intake of vitamin C over a lengthy period, the correlated clinical history shows repeated occurrence of infectious processes,
Several cases of scarlet fever were given vitamin-C therapy, intravenously and orally, 2,000 mg. daily. In each case the fever dropped to normal in a few hours and the patients were symptom-free within three or four days.

McCormick was practising at the tail end of an era that saw the very promising development of such vitamin therapies, which had been gaining momentum from around the first few decades of the 20th Century onwards, as documented in summary below regarding Vitamin A therapy.

The Historical Evolution of Thought Regarding Multiple Micronutrient Nutrition

Progress with Vitamin A

In Denmark from 1910–1920, Carl Bloch and Olaf Blegvad … observed high mortality in children who were hospitalized with vitamin A deficiency. The mortality rate of vitamin A-deficient children was reduced by ~54% by treating the children with cod-liver oil and whole milk, two rich sources of vitamin A. In the late 1920s, vitamin A was recognized to have an effect on immunity to infection, and vitamin A became known as the antiinfective vitamin …

Largely through the influence of Mellanby, vitamin A underwent a period of intense clinical investigation. Between 1920 and 1940, at least 30 trials were conducted to determine whether vitamin A could reduce the morbidity and mortality from infectious diseases, including respiratory disease, measles, puerperal sepsis, and tuberculosis …

By the 1930s, it was established that vitamin A supplementation could reduce morbidity and mortality in young children. In 1932, Joseph Ellison… showed that vitamin A supplementation reduced the mortality of vitamin A-deficient children with measles by nearly 60%. Vitamin A became a mainstream preventive measure; cod-liver oil was part of the morning routine for millions of children and was acknowledged in saving the lives of children from poor families in England…

Semba, R.D. (2012)
The Journal of Nutrition

Cod-Liver Oil – Anyone old enough to remember it?

A clear illustration of how these clinical studies filtered out to the broader public can be seen in a sample of the type of advertisements of the era such as that for Cod Liver Oil as a protection against some of the worst effects and complications that could still occasionally arise from having such common childhood diseases:


Fig. 4: Vintage advertisement for ‘Squibb’s’ cod-liver oil. Source: Masterjohn, C. (2015) Did Cod Liver Oil Contribute to the Historical Decline in Measles Mortality and Mortality From Other Infectious Diseases? 

See the highlights from the poster below:

• whooping cough, measles, mumps, chicken pox, scarlet fever

…may do greater harm than most mothers think. But the children have lighter cases, they recover quicker and are less likely to be left with some permanent injury, if they build up good general resistance in advance to fight them…
There is a way to prevent the “common” diseases from resulting seriously. … “resistance-building” Vitamin A! Vitamin A is the important factor which increases their fighting power in time of illness. It helps to set up a defense against the attacking disease germs
In fact, good cod-liver oil is one of the richest sources of Vitamin A mothers can give. Don’t wait until your child catches one of the “common” diseases. Give him Squibb Cod-Liver Oil now!
 Vintage advertisement for ‘Squibb’s’ cod-liver oil

Many clinical studies from the era show that different vitamin therapies were given to patients suffering the ill effects of a broad range of highly infectious diseases. However, high-dose vitamin C therapy seems to have been the most often applied, and it takes up the greatest bulk of the medical literature on intervention in the contagions of the era. High-dose Vitamin C therapy is consistently shown to be powerfully effective in reducing the duration and severity of illness, and the deaths and disabilities that could sometimes arise, across a broad range of both viral and bacterial diseases – but only when applied in a timely manner and in fairly high doses.

The literature indicates that high doses (unlike Vitamin A, which is more preventative) were necessary at the time of infection, because many of these diseases depleted the body’s Vitamin C, and the immune system therefore required a great deal of it to fight the infection. For instance, alongside McCormick’s application of high-dose Vitamin C to a vast range of viral and bacterial infections, including Scarlet Fever, the use of high-dose Vitamin C therapy against the worst effects of Whooping Cough (Pertussis) is clearly demonstrated to be rather successful, as documented as follows:

From the Department of Physiology and Pharmacology, University of Manitoba
In a previous communication two of us (M.J.O. and B.M.U.) gave an account of the treatment of 10 cases of whooping cough with ascorbic acid (synthetic vitamin C). While the small number of cases forbade any statistical conclusions they nevertheless did show that this treatment had an almost specific effect in decreasing the intensity and duration of the disease.
At the time of forwarding the above paper we believed this to be an entirely new system of treatment, but we have since discovered that Otani 7 had published his results in treating 81 cases of whooping cough with ascorbic acid, and we take this opportunity of acknowledging his priority and confirming his results. His method of treatment was the intravenous injection of the same brand of ascorbic acid (Redoxon – Hoffmann-La Roche) as we have used orally, and his patients were drawn from hospital clinics, while ours were treated in the home.
He does not give much detail in the paper but his general conclusions are matched by ours. In hospital work the intravenous method may be ideal, but where oral use is possible and efficient, as it is here, we believe the greater simplicity and reduced cost (about one-fifth that of the intravenous method) of our method is more suited to general practice.
In the present communication we present the results in 17 additional cases of whooping cough treated by oral administration of ascorbic acid. An attempt was made to gain a more accurate idea of dosage and utilization of the vitamin by studying its urinary excretion before and during treatment. …
 Ormerod, M. J., UnKauf, B. M., & White, F. D. (1937).

And again,  the highly effective use of high dose Vitamin C treatment in combating complications from Diphtheria infections is summarised in another medical paper from around the late 1940s. Note the title includes Poliomyelitis (Polio) which we will review shortly.

Journal of Southern Medical & Surgery

The Treatment of Poliomyelitis and Other Virus Diseases with Vitamin C

Harde et al. reported that diphtheria toxin is inactivated by vitamin C in vitro and to a lesser extent in vivo. I have confirmed this finding, indeed extended it. Diphtheria can be cured in man by the administration of massive frequent doses of … (vitamin C) given intravenously and/or intramuscularly. To the synthetic drug, by mouth, there is little response, even when 1000 to 2000 mg. is used every two hours.
This cure in diphtheria is brought about in half the time required to remove the membrane and give negative smears by antitoxin. This membrane is removed by lysis when “C” is given, rather than by sloughing as results with the use of the antitoxin. An advantage of this form of therapy is that the danger of serum reaction is eliminated. The only disadvantage of the ascorbic acid therapy is the inconvenience of the multiple injections.
Klenner, F.R., (1949, 211)

Although deaths from Polio were minuscule compared to the much more massive deadliness of something like Measles in its heyday (see Fig. 1) – and as tragic as it was on the ground for the individuals directly impacted – what made Polio so terrifying was perhaps its high visibility in terms of its greater fallout: paralysis. It is presumably this that prompted, with some urgency, the only available treatment at the time that could stave off the worst effects of Polio at its height in the 1940s and 50s across our now fairly modern nations.

Hence, we have quite a large number of studies using high-dose vitamin C therapy to treat the disease, often with great success, along with many other viral infections – some examples are given in the following references and excerpts. In many ways, treatment against the worst effects of Polio was seen as a stop-gap until other medical technologies could be made available, as indicated in the opening of the excerpt that follows:

Journal of Southern Medical & Surgery

The Treatment of Poliomyelitis and Other Virus Diseases with Vitamin C

Since immunization against poliomyelitis comparable to that against other bacterial diseases is still a matter of the future, it suggested itself that some antibiotic could be found that would destroy this scourge …
These results were so consistently positive that we did not hesitate to try its effectiveness against all types of virus infections.
The frequent administration of massive doses of vitamin C was so encouraging in the early days of the 1948 epidemic of poliomyelitis that a review of the literature was begun…
Klenner, F.R., (1949, 211)
Journal of Southern Medical & Surgery

High-dose Intravenous Vitamin C as a Successful Treatment of Viral Infections

 In the poliomyelitis epidemic in North Carolina in 1948, 60 cases of this disease came under his care. The treatment employed was vitamin C in massive doses. It was given like any other antibiotic every two to four hours. The initial dose was 1000 to 2000 mg, depending on age. This schedule was followed for 24 hours. After this time the fever was consistently down, so the vitamin C was given 1000 to 2000 mg every six hours for the next 48 hours. All patients were clinically well after 72 hours.
Klenner, F.R., (1949, 211-12)

Below is a recommended list of further clinical studies from the general pre-Polio-vaccine era, mainly using high-dose Vitamin C therapies, with an emphasis upon Polio, as this was a relatively new emergent contagion at the time.

Orthomolecular Medicine News Service

The Forgotten Research of Claus W. Jungeblut, M.D.

Robert Landwehr. The origin of the 42-year stonewall of vitamin C. Journal of Orthomolecular Medicine, 1991, Vol 6, No 2, p 99-103.
Klenner FR. The use of vitamin C as an antibiotic. Journal of Applied Nutrition, 1953, Vol. 6, p 274-278.
Klenner FR. The treatment of poliomyelitis and other virus diseases with vitamin C. Southern Medicine and Surgery, July, 1949, p 209.
Dr. Jungeblut’s 22 research reports were published in the Journal of Experimental Medicine and are available at:
…Other key papers regarding vitamin C include:
Jungeblut CW. Inactivation of poliomyelitis virus in vitro by crystalline vitamin C (ascorbic acid). J Exper Med, 1935. October; 62:517-521
Jungeblut CW. Vitamin C therapy and prophylaxis in experimental poliomyelitis. J Exp Med, 1937. 65: 127-146.
Jungeblut CW. Further observations on vitamin C therapy in experimental poliomyelitis. J Exper Med, 1937. 66: 459-477.
Jungeblut CW, Feiner RR. Vitamin C content of monkey tissues in experimental poliomyelitis. J Exper Med, 1937. 66: 479-491.
Jungeblut CW. A further contribution to vitamin C therapy in experimental poliomyelitis. J Exper Med, 1939. 70:315-332.
Orthomolecular Medicine News (2013)

However, as much as these vitamin therapies were shown to be life-saving and of immense value to those patients who received timely intervention, they unfortunately became somewhat overshadowed by other emerging medical interventions of the era. Most notably, one intervention that is still very much with us today – antibiotics – seems to have played a particular role in subduing the more natural therapies.


Antibiotic vs. Vitamin Interventions?

There is some indication that vitamin therapies and antibiotics were at loggerheads in the early days of their applied treatments, presumably due to their fundamentally different approaches to treating disease, as suggested by the following excerpts.

Vol. CIII, April, 1951, No. 4

Massive Doses of Vitamin C and the Virus Diseases

It has been reported that one of the mold-derived drugs, in addition to being a good antibiotic, is a super-vitamin. Conversely, we argue that vitamin C, besides being an essential vitamin, is a superantibiotic.
Hippocrates declared the highest duty of medicine to be to get the patient well. He further declared that, of several remedies, physicians should choose the least sensational. Vitamin C would seem to meet both these requirements.
Klenner, F.R. (1951, 101, 107)

 Or to put it another way…

The author’s experience leads to the conclusion that the principle of trying to eradicate disease by concentrating our attack against the associated micro-organisms by means of toxic antibiotics is fundamentally unsound. If we wish to eliminate a desert or swamp we do not proceed to cut down the sage brush and cactus of the former or the lush characteristic verdure of the latter. Instead, we change the condition of the soil. By irrigation we make the desert blossom like a rose, and by drainage we change the flora of the swamp.
The late Dr. Alexis Carrel …has said: “Microbes and viruses are to be found everywhere, in the air, in the water, in our food… Nevertheless, in many people they remain inoffensive… This is natural immunity… But natural immunity does not exclusively derive from our ancestral constitution. It may come also from the mode of life …
 McCormick, W.J  (1951) 

It would also appear, judging by the excerpt above, that these therapies may have been perceived as simply too simplistic to be taken seriously by the increasingly sophisticated medical industry of the time. However, vitamin therapies had one major advantage over antibiotics: antibiotics could only treat bacterial disease, different antibiotics had to be developed for particular bacterial infections, and no antibiotic could tackle viral or fungal infections – whereas the vitamin therapies seemed able to tackle just about any microbial type, viral or otherwise, and all variants thereof.

Basically, antibiotics seemed to appeal more to the emerging medical authorities of the era, and they supported Alexander Fleming’s new discovery instead. However, Penicillin only became more widely available after World War II, around the mid-1940s. As this coincides very closely with deaths from the great contagions having already declined, since 1900, to one or a few per cent of their former levels in most regions, antibiotics cannot have been responsible for the overall decline in deaths prior to their wider availability. Furthermore, as noted above, it was some years later still that other specific antibiotics were developed to counteract particular bacterial infections, and viral infections were not treatable with antibiotics at all.

About Antimicrobial Resistance

Brief History of Antibiotics

Penicillin, the first commercialized antibiotic, was discovered in 1928 by Alexander Fleming.  While it wasn’t distributed among the general public until 1945, it was widely used in World War II for surgical and wound infections among the Allied Forces.  It was hailed as a “miracle drug” and a future free of infectious diseases was considered.  When Fleming won the Nobel Prize for his discovery, he warned of bacteria becoming resistant to penicillin in his acceptance speech.
CDC (2015)

As indicated above, Fleming’s warning has unfortunately proved quite correct: we are now battling superbugs, bred through the over-use of such treatments against often quite trivial infections, and those bacterial critters have apparently adapted to just about anything we can throw at them. It is worth reiterating that vitamin therapies were never in the game of annihilation, but rather of supporting our own immune systems against a whole spectrum of pathogens and their adaptive strains. Isn’t it fortunate, then, that vitamin therapies are beginning to re-emerge and be taken seriously again in our modern era of the rise of the Superbugs?

Vaccine Interventions?

This brings us to the development of the vaccines of the era, which seemed to hold great promise for controlling these scourges of both viral and bacterial type – all that was required was a vaccine for each disease. As some diseases were already beginning to be tackled with vaccines, the initial success must have been somewhat amplified by the fact that deaths were already plummeting – some almost in free fall – and ultimately coming to a fairly abrupt end, as illustrated in the charts above (Figs. 1 and 2) and fully reflected in the data from other regions across the world.

This observation was highlighted in McKeown’s thesis of the 1970s [27]. Similar sentiments are echoed some decades earlier in the 1951 article by McCormick, who points to the evidence of his own era: the availability of vaccines for a couple of diseases could not account for the major decline in deaths prior to their availability, and certainly could not account for the plummeting death rates of diseases for which no vaccine had yet been developed. It was this fact that prompted McCormick, McKeown and other scholars to posit an alternative explanation for the rather dramatic decline in deaths from so many once deadlier infectious diseases.



Toronto, Canada.


The usual explanation offered for this changed trend in infectious diseases has been the forward march of medicine in prophylaxis and therapy; but, from a study of the literature, it is evident that these changes in incidence and mortality have been neither synchronous with nor proportionate to such measures.
…Likewise, the decline in diphtheria, whooping cough and typhoid fever began fully fifty years prior to the inception of artificial immunization and followed an almost even grade before and after the adoption of these control measures.
In the case of scarlet fever, mumps, measles and rheumatic fever there has been no specific innovation in control measures, yet these also have followed the same general pattern in incidence decline…
…On this same subject McKinnon … says: “Quite obviously then, all the factors mentioned are not adequate in themselves to explain the recorded decline. Some other factor or factors must have been operating during this period and it is necessary to cast farther afield in search of them…
We should ascertain whether natural resistance to infections could be conferred on man by definite conditions of life. Injections of specific vaccine or serum for each disease, repeated medical examinations of the whole population, construction of gigantic hospitals, are expensive and not very effective means of preventing diseases and of developing a nation’s health. Good health should be natural.”
McCormick, W.J. (1951)

In the end, it seems it was the belief that we could prevent disease from occurring in the first place, using these newly emerging vaccine therapies, that overrode any alternative; and it is this fact, more than any other I believe, that very likely pushed vitamin therapies into the recesses of history.

However, even though it was early days when McCormick noted the few available vaccines coming online to combat particular diseases, the updated statistics right up to our present day (see the above graphs, Figs. 1 and 2) show that all of our older diseases – Scarlet Fever, the Plague, Dysentery, Cholera, Typhus Fever and Typhoid – became less deadly over the generations without any medical intervention at all (there were tentative moves towards vaccines for Scarlet Fever and Typhoid, but they never came to much). All the other great contagions of our more modern era went the same way, becoming essentially relatively tame, with much of the death and destruction essentially over as our more modern vaccine era begins.

However, there is one glaring contagion we have not yet addressed in terms of a vaccine made available relatively early: Jenner’s famous Cowpox vaccine against Smallpox, which will be addressed in the following section. First, though, we will finish with a short discussion of why I would tend to argue the case for natural resistance to disease via exposure over generations, rather than support the micro-nutrient hypothesis proposed by McCormick, or the other related hypotheses offered as the ultimate cause of the rise and fall of once deadlier contagions if it was not, for the most part, caused by our medical interventions.

The Cause of the Near-Universal Decline in Deaths from Major Contagions 

This present study leans towards the natural generational resistance hypothesis for all the reasons stated previously. However, several other scholars would essentially support McCormick’s type of proposal for the cause of the major decline in all the great contagions of old, offered alongside overall improvements in health, diet, wealth and cleanliness. Their reasoning is that we cannot explain this common phenomenon via our medical interventions, which essentially came too late, or not at all, or were too sporadic to explain the pattern of decline across our developing nations, as outlined in a number of Thomas McKeown’s publications [29].

These scholars are of course not alone in identifying the lack of association between the timing, degree and availability of medical interventions and the dramatic decline in deaths – a decline which, for the most part, cannot be correlated with our interventions, and which they therefore cannot have been the major cause of.

This issue has been discussed as it relates to the United States by McKinlay, J.B. and McKinlay, S.M. (1977) in ‘The Questionable Contribution of Medical Measures to the Decline of Mortality in the United States in the Twentieth Century’ [30], where the authors open their study with the following:

The Questionable Contribution of Medical Measures to the Decline of Mortality in the United States in the Twentieth Century

Legislators, practitioners, and the public may deem it “heretical,” but analysis of United States data shows that introduction of specific medical measures and expansion of services account for only a fraction of the decline in mortality since 1900.

Now, regarding the nutrition/micro-nutrient hypothesis offered to explain this decline in deaths, without our medical intervention, from some of the deadliest diseases ever recorded: it must be conceded that diseases directly linked to vital mineral/vitamin deficiencies – Scurvy and Rickets being two examples that come to mind – appear at first to be exceptions to this pattern. However, this is probably because they are not contagious, unlike all the other great epidemic infectious diseases, and are specific and limited to the individual affected.

Among the maladies currently having a resurgence are:
“Rickets affects bone development in children and is caused by vitamin D deficiency,” says Dr Jeff Foster, GP at Spire Parkway Hospital, Solihull. With improvements in nutrition, the condition was considered all but eradicated.
However, 4,638 children admitted to NHS hospitals in 2013 and 2014 were found to be suffering from vitamin D deficiency, compared to 1,398 cases in 2009 and 2010.

Otherwise, going against this nutritional interpretation of the data presented thus far, take for example some of the great contagions carried inadvertently to perfectly well-nourished indigenous peoples, described vividly in our history books: suddenly they had to face, for the first time, outsiders – often seemingly healthy carriers, invaders who had already become immune over generations of exposure. We know what happened then; but, as discussed earlier, although these peoples were fairly decimated, within a few short generations they too seemed to become naturally immune when faced with the same disease – sometimes in later populations who had not even directly experienced the disease in their own lifetimes.

Now – if such a tribe (prior to first exposure) were fit and healthy, eating nothing but good vitamin-filled fruit if they were tropical dwellers; or full of vitamin A from eating mostly raw fish if they were coastal dwellers in the more northerly regions, apparently getting their vitamin C or its equivalent indirectly (I heard that somewhere, but cannot immediately recall the details, other than that they are not lacking in such immune support); or hunters on the great plains eating nothing but good grass-fed nutritional game – living in harmony with nature and doing everything right according to nature’s rules – then how could the nutritional hypothesis account for either the rise of these diseases, or the subsequent immunity and decline in deaths from them?

In other words, it seems unlikely that every nook and cranny of our great nations – large and small, geographically and climatically diverse, sometimes separated by thousands of miles of ocean – would all manage to consume comparable levels of vitamin-rich foods and trace minerals (or be comparably lacking in them, whichever may be the case), achieving the very same nutritional status at the exact same points in history, so as to produce this almost universal effect across the entire earth and over centuries, as illustrated by the graphs and overall statistics generated historically for our nations. The Irish famine is a case in point, as discussed previously. It seems that your ancestral level of exposure was more of a factor in whether you survived or died back in the day than your diet was.

Final Episode: Part Eight: Don’t Count Your Children ’til they’ve had the POX! Conclusion…

References to Part Seven
[1] Guttormsson, L. and Garðarsdóttir, Ó. (2002) The Development of Infant Mortality in Iceland, 1800–1920, Hygiea Internationalis, An Interdisciplinary Journal for the History of Public Health, Vol. 3 [1], pp. 151–176. [Available online as PDF] DOI: 10.3384/hygiea.1403-8668.0231151
[2] De Looper, MW (2014) Death registration and mortality trends in Australia 1856–1906, Abstract, p. iv. PhD Thesis: The Australian National University [Available online as PDF]…/De%20Looper%20Thesis%202015.pdf
[3] Booth, H, Tickle, L, Zhao, J (2016) Epidemiologic Transition in Australia: The last hundred years, Canadian Studies in Population Vol. 43, [1–2]: pp. 23–47.
[4] Smallman-Raynor, M, and Cliff, A (2012), Atlas of Epidemic Britain: A Twentieth Century Picture, Oxford University Press, Oxford. p.50, figure 4:18 (Measles); p. 52, figure 4:24 (Whooping Cough); p.49, figure 4:15 (Scarlet Fever).
[5] Baillie, L. and Hawe, E. (2012) Causes of Death: A Study of a Century of Change in England & Wales, OHE (Office of Health & Economics).
[6] Gordon, T. (1953) Mortality in the United States, 1900-1950. Public Health Reports, Vol. 68 [4], pp. 441–444.
[7] Grove, R.D., and Hezel, A.M (1968) Vital Statistics Rates in the United States, 1940 – 1960, Department of Health, Education and Welfare Public Health Service, National Center for Health Statistics. [Available online]
[8] Guyer B, Freedman MA, Strobino DM, Sondik EJ. (2000) Annual summary of vital statistics: trends in the health of Americans during the 20th century, Pediatrics. Vol. 106, [6]: pp. 1307-17.
[9] Blood, B. (2000-2013) Vaccination and the Control of Seven Infectious Diseases in the US (1900-1970). (Compilation of statistics from the Historical Statistics of the United States, Colonial Times to 1970.) [Available online]
[10] McCormick, W.J (1951) Vitamin C in the Prophylaxis and Therapy of Infectious Diseases, Archives of Pediatrics, Vol. 68, [1].
[11] Griffiths C and Brock A (2003) Twentieth Century Mortality Trends in England and Wales. Health Statistics Quarterly, Issue 18, pp. 5–17. [Available online as PDF]–18–summer-2003/twentieth-century-mortality-trends-in-england-and-wales.pdf
[12] Mc Keown, T (1979) The Role of Medicine: Dream, Mirage, or Nemesis? Basil Blackwell, Oxford [Available online as PDF]
[13] Gordon, T. (1953) Mortality in the United States, 1900-1950. Public Health Reports, Vol. 68 [4], pp. 441–444.
[15] Armstrong, G.L., Conn, L.A., Pinner, R.W. (1999) Trends in Infectious Disease Mortality in the United States During the 20th Century, JAMA, Vol. 281 [1]: pp. 61-66. DOI: 10.1001/jama.281.1.61
[15] Dobson, A.P. and Carper, E.R (1996) Infectious Diseases and Human Population History: Throughout history the establishment of disease has been a side effect of the growth of civilization, BioScience, 46, Issue [2,] pp. 115–126, DOI: 10.2307/1312814 [Available online as PDF]
[16] McKinlay, J.B. and McKinlay, S.M. (1977) The Questionable Contribution of Medical Measures to the Decline of Mortality in the United States in the Twentieth Century. The Milbank Memorial Fund Quarterly. Health and Society. Vol. 55 [3], pp. 405-428. [Available online at JSTOR]
[17] Semba, R.D. (2012) The Historical Evolution of Thought Regarding Multiple Micronutrient Nutrition, The Journal of Nutrition, Vol. 142, [1], Progress with Vitamin A. [Available online from]
[18] Masterjohn, C. (2015) Did Cod Liver Oil Contribute to the Historical Decline in Measles Mortality and Mortality From Other Infectious Diseases?, (April 6th 2015) [Available online from]
[19] Ormerod, M. J., UnKauf, B. M., & White, F. D. (1937). A Further Report on the Ascorbic Acid Treatment of Whooping Cough, Department of Physiology and Pharmacology. Canadian Medical Association Journal, Vol. 37 [3] pp. 268–272.
[20] Klenner, F.R. (1949) The Treatment of Poliomyelitis and Other Virus Diseases with Vitamin C, Journal of Southern Medical & Surgery, Vol. 111 [1], p. 211. [Available online as PDF from]
[21] Klenner, F.R. (1949) The Treatment of Poliomyelitis and Other Virus Diseases with Vitamin C, Journal of Southern Medical & Surgery, Vol. 111 [1], p. 211. [Available online as PDF from]
[22] Klenner, F.R. (1949) The Treatment of Poliomyelitis and Other Virus Diseases with Vitamin C, Journal of Southern Medical & Surgery, Vol. 111 [1], pp. 211-12. [Available online as PDF from]
[23] Orthomolecular Medicine News Service (2013) Vitamin C and Polio: The Forgotten Research of Claus W. Jungeblut, M.D., Andrew W. Saul (ed.), Orthomolecular Medicine News Service (August 7th, 2013)
[24] Klenner, F.R. (1951) Massive Doses of Vitamin C and the Virus Diseases, Journal of Southern Medical & Surgery, Vol. 113 [4], pp. 101-107. [Available online as PDF from]
[25] McCormick, W.J (1951) Vitamin C in the Prophylaxis and Therapy of Infectious Diseases, Archives of Pediatrics, Vol. 68, [1].
[26] CDC (2015) About Antimicrobial Resistance, Brief History of Antibiotics, CDC website [Available online]
[27] McCormick, W.J (1951) Vitamin C in the Prophylaxis and Therapy of Infectious Diseases, Archives of Pediatrics, Vol. 68, [1].
[28] McKeown, T. (1979) The Role of Medicine: Dream, Mirage, or Nemesis? Basil Blackwell, Oxford [Available online as PDF]
[29] McCormick, W.J (1951) Vitamin C in the Prophylaxis and Therapy of Infectious Diseases, Archives of Pediatrics, Vol. 68, [1].
[30] McKeown, T. (1979) The Role of Medicine: Dream, Mirage, or Nemesis? Basil Blackwell, Oxford [Available online as PDF]
[31] O’Connor, M., (2018) Deadly Victorian diseases make a return from obscurity, but why now? RICKETS, TB and scurvy sound like the stuff of history books but official figures show such Dickensian diseases have recently been making a comeback. (January 28th 2018).



Don’t Count Your Children Before They’ve Had the Pox & Conclusion

Smallpox rose in cycles of deadly epidemics across and between our developing nations as urban centres swelled and commerce interconnected whole continents, giving the pathogen free rein within the population as a whole. Relative to many of the other great contagions discussed thus far – apart from the Plague itself – Smallpox is a fairly old disease, rising to dominance as a major killer from the 1600s onwards, as noted below.

Furthermore, as the historical record also shows, its victims were at first mostly adults who had not previously encountered the disease to any great extent. But by the time it was circulating almost universally, it began to strike those least familiar with the pathogen – the children, for the most part – and as more and more survived Smallpox’s assaults, few grew up unprotected:

A History of Epidemics in Britain

1894, Vol. II

When it first rose to prominence in England, from the reign of James I. onwards, it attacked adults in a large proportion; of which fact the evidence, although not statistical, is sufficient.
But, as the disease became nearly universal and ubiquitous, it was so commonly passed in infancy or childhood, that few grew to maturity without having had it.
The number of adult cases diminished in proportion as the disease became more nearly universal.
Creighton, (1894, 623)

Essentially, over the generations nearly everyone had Smallpox as a child, and once you had the disease – if you survived it, of course – you were immune for life. Hence, you grew up with full immunity to the infection, but only because of direct exposure to Smallpox in the first place.

Edward Jenner and the history of smallpox and vaccination

It was common knowledge that survivors of smallpox became immune to the disease. As early as 430 BC, survivors of smallpox were called upon to nurse the afflicted… Man had long been trying to find a cure for the “speckled monster.”
 Riedel, S., (2005, Variolation and Early Attempts of Treatment) [2]

Such is the viral cycle, described matter-of-factly in the following:

The Origin of the Variola Virus

As a result of a smallpox epidemic, the majority of the sensitive population either becomes immune or dies, and the epidemic fades. … Such a viral lifecycle requires a considerable concentration of sensitive hosts.
 Babkin, I.V., and Babkina, I.N., (2015, Smallpox in Ancient Times: Historical Data)

Now, unfortunately, this type of immunity via exposure to the real thing was very often a hard-earned protection. Not only did great numbers die of the Pox – particularly in the earlier stages of this pathogen’s reign – but, if you did survive, you typically bore the distinctive scarring of the attack and were often left pock-marked for life, and many also ended up blind.

Edward Jenner and the history of smallpox and vaccination

‘The Speckled Monster’
The symptoms of smallpox, or the “speckled monster” as it was known in 18th-century England, appeared suddenly and the sequelae were devastating. The case-fatality rate varied from 20% to 60% and left most survivors with disfiguring scars. The case-fatality rate in infants was even higher, approaching 80% in London and 98% in Berlin during the late 1800s…
…In the 18th century in Europe, 400,000 people died annually of smallpox, and one third of the survivors went blind.
 Riedel, S., (2005, Variolation and Early Attempts of Treatment)

The nature of Smallpox in Ireland at the time when it was rising to deadlier prominence, particularly as the 17th Century (the 1600s) unfolded, is exemplified in the case of a famous blind harpist – Turlough O’Carolan – whose music is still played by traditional Irish musicians to this day.

‘Plucking the Strings of Genius’

Irish Times
A national composer with an international stature, the ‘blind harper’ Turlough O’Carolan …was born in Nobber, Co Meath, in 1670 … Blinded by smallpox at the age of 14, he looked to his art and travelled the island of Ireland on horseback, his harp slung over his shoulder.
Battersby, E., (2006)

In the world that O’Carolan grew up in, he had perhaps not even witnessed the great epidemics that were yet to rise to prominence by the earlier 19th Century – Cholera, Typhus and Dysentery, to name but a few. Like these younger contagions, which by all historical accounts appear to have naturally resolved themselves, Smallpox followed a similar arc of rise, peak and ultimate fall over the course of generations. We can glimpse this in Charles Creighton’s ‘A History of Epidemics in Britain’ (1894), which maps the rise of Smallpox as a major killer, particularly towards the mid 18th Century (the 1700s), and its subsequent decline beginning some time after the 1770s, drawing on the few historical sources we have for this period.

A History of Epidemics in Britain

1894, Vol. II

Authentic accounts of smallpox in Ireland in the 18th century are not easy to find, but it is clear from such notices of it as do exist that it could be widely prevalent and malignant in type…
Rutty, of Dublin, under the year 1745, says: “The smallpox was brought to us by a conflux of beggars from the north, occasioned by the late scarcity there; whose children, full of the smallpox, were frequently exposed in our streets.”
His next mention of smallpox is in the winter of 1757-58, when the disease “kept pace in malignity,” with the prevalent spotted or typhus fever. Amidst numerous entries of fevers of all kinds (typhus, agues, miliary fevers), as well as scarlatina and angina, these are the only two references to smallpox in Rutty’s Dublin annals from 1726 to 1766.
The annals kept by Sims of Tyrone overlap those of Rutty by a few years; and his first reference to smallpox is under the year 1766, which was a year of almost universal smallpox in England. Towards the close of 1766 and in the spring of 1767 the smallpox caused unheard-of havoc, scarcely one-half of all that were attacked escaping death. The disease had appeared the year before along the eastern coast, and proceeded slowly westward with so even a pace that a curious person might with ease have computed the rate of its progress. It had not visited the country for some years, and was not seen again until 1770, when it was less severe than in 1766-7…
Creighton, (1894, 543 – 544)

The rise of Smallpox as a major killer spans O’Carolan’s own era until he becomes an old man, and it is only well after his death that Smallpox begins to become notably less lethal according to the available statistics. The most pronounced decline in deaths from the contagion – even amongst children – occurs around the first third of the 19th Century (the 1800s), by which point it looks as if Smallpox’s reign of terror is finally coming to an end. This pattern is clearly documented in the statistical observations for Dublin, as seen in the following.

The population of Ireland 1700-1900: a survey

… the available data for Dublin are striking: smallpox accounted for twenty per cent of all reported deaths there between 1661 and 1745, but only three per cent in the 1830s …
Ó Gráda, C., (1979, 288)

This is a pattern seen throughout many other developing nations as indicated by the statistics given below for London in England over a similar course of time.

The decline of adult smallpox in eighteenth-century London

In the case of London, the Bills of Mortality indicate that smallpox was probably the single most lethal cause of death in the eighteenth century, accounting for 6–10 per cent of all burials. However by the 1840s smallpox was a minor cause of death, suggesting that the decline of smallpox mortality played a major role in the reduction of all-cause mortality, at least in urban areas.
 Davenport, R., Schwarz, L., & Boulton, J. (2011, Abstract)

The point is that deaths from Smallpox had already started to decline significantly by the 1770s, and from the 1830s-40s onwards Smallpox had dropped to become a fairly marginal cause of death. If we recall that this is the period that sees a massive rise in other major killers such as Cholera and Typhus, then we begin to get a glimpse of the pathogens competing among themselves. But, of course, we also know, as discussed previously, that even these great contagions ultimately receded without us ever having intervened medically, or otherwise, at a timely enough scale to account for the closely matched patterns of declining deaths and ultimate resolution across our diverse developing nations.

Similarly, as these deadly infectious diseases receded, they began to be replaced by other opportunistic pathogens that rose to deadly prominence, mainly attacking children. Scarlet Fever, for which no vaccine was ever available for wide-scale use, is one such case; when its deadliness began to recede, Measles and other pathogens seemed to take its place. In time, even these diseases stopped killing children, and only at the very end, when all was done and dusted, did we step in with our medical interventions.

Bearing this in mind, it is not surprising that as Smallpox receded significantly, other contagions were seen to displace it. This is clearly illustrated in the statistical records leading up to the mid-19th Century (the mid-1800s), as highlighted in the excerpt from Charles Creighton’s (1894) study below:

A History of Epidemics in Britain

1894, Vol. II

In the great Irish famine of 1846-49, comparatively little is heard of smallpox. It would appear to have been less diffused through the country than in former famines. … In the workhouses and auxiliary workhouses during the ten years 1841-51, smallpox is credited with 5016 deaths, while measles has 8943, fever 34,644…
Creighton, (1894, 220)


Fig. 1: Chart showing relative deaths from smallpox, measles, and fevers (including Scarlet and Typhoid etc.) recorded within workhouses over the ten-year period spanning the Great Hunger (1841-1851), generated from Charles Creighton, ‘A History of Epidemics in Britain’, 1894, Vol. II, p. 220.

Although the statistics given above (prior to official national mortality registration from 1864) are perhaps not fully representative of the entire population of Ireland, they come close to capturing the main impact of disease at ground level. A figure of just over 5,000 deaths from Smallpox spanning ten years – on average, some 500 individuals registered as dying from Smallpox each year – is significant in the present discussion, particularly when placed in its historical context: the era in which Ireland experienced the greatest devastation of starvation, dire destitution and social upheaval ever recorded, either before or since.
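The decade averages quoted above are easy to verify. As an illustrative sketch (not part of the original sources), the following few lines of Python tabulate Creighton’s workhouse totals as percentage shares and crude annual averages, with a simple text bar mirroring the proportions charted in Fig. 1:

```python
# Workhouse deaths in Ireland, 1841-51, as reported in Creighton (1894, Vol. II, p. 220).
deaths = {"smallpox": 5016, "measles": 8943, "fever": 34644}

total = sum(deaths.values())          # combined deaths from the three causes
for cause, n in deaths.items():
    share = 100 * n / total           # percentage share of the combined total
    per_year = n / 10                 # crude annual average over the decade
    bar = "#" * round(share / 2)      # one character per ~2 per cent
    print(f"{cause:>9}: {n:6d} ({share:4.1f}%) ~{per_year:6.1f}/yr {bar}")
```

Running this confirms the figure of roughly 500 smallpox deaths per year, and shows fevers accounting for over 70 per cent of the three causes combined.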

Now, consider the previous discussion of Scarlet Fever and how it was receding as a killer plague of children around this era, only to be usurped by another killer contagion of children that rose to deadly prominence soon after – namely Measles. All of these once-deadlier infections declined dramatically soon after their greatest period of devastation, ultimately becoming relatively benign childhood infections as the 20th Century progressed.

The tabulation above (Fig. 1), showing how much deadlier fevers, and indeed Measles, were relative to Smallpox, captures this point in time (the early 1840s to the early 1850s) when the diseases themselves were seemingly vying for top-dog position within us, their hosts. It therefore looks as though our immune systems were becoming more astute to their wily ways, having built up a formidable defence against their worst effects through generations of exposure.

Again, we know from statistical accounts that this pattern of declining deaths from Smallpox is reflected in other regions over a similar timeframe, as documented in detail in Creighton’s study of epidemics (1894). For instance, the historical records for London show Smallpox continuing to decline significantly as a major cause of death after the early 1850s, corresponding closely to the decline in Ireland over the same period when set against the heights of devastation from Smallpox epidemics indicated in the records of earlier generations.

Old and New Bills of Mortality; Movement of the Population; Deaths and Fatal Diseases in London During the Last Fourteen Years

Smallpox was less fatal in the latter septennial period than the former. In the fourteen years it destroyed 12,093 lives in London. In one year (1844,) it killed as many as 1,804; last year (1853,) was less fatal than any other, for the number who died from this foul disease was only 217.
 Angus, J. (1854, 127)

However, medical inventions were on the horizon, and many professionals believed that if they could do something, they would do something. The people on the ground, though, particularly the parents, were perhaps noticing that more of their children were actually surviving their encounters with Smallpox; they were now contending with other great plagues that were becoming much more deadly, such as Scarlet Fever, only to see Measles take their children instead as it assumed the predominant killer role. These parents had other things on their minds.

Allergy & Immunity-Vaccination – Smallpox

It had been known for centuries that survivors of smallpox outbreaks were protected from subsequent infection. Attempts to ward off the disease by inducing a minor form of it was called variolation. This involved inhalation of the dried crusts from smallpox lesions or inoculation of the pus from a lesion into a scratch on the skin. These were potentially hazardous procedures, yet deemed acceptable at the time as smallpox caused such severe mortality and morbidity… His [Jenner’s] ideas were initially greeted with violent opposition, but in time Jenner achieved world fame, and his technique became universally known as vaccination from the Latin name for the cow, vacca. The practice of variolation was forbidden by Act of Parliament in 1840.
University of Dundee (n.d.)

It is of some interest that the Act of Parliament supporting the use of Jenner’s vaccine in Britain (1840) coincides broadly with the continuing and greatest decline in recorded deaths from Smallpox through to the 1850s in London, Ireland and elsewhere, by all the historical accounts documented in Creighton’s in-depth study of 1894.

It is also important to point out that, on examining the records and historical accounts of the preceding era (prior to Jenner’s vaccine), it becomes clear that the use of inoculation (variolation) was far too sporadic – never really catching on in a systematic manner throughout our nations – to have significantly affected our respective Smallpox mortality statistics.

On the other hand, Jenner’s Cowpox vaccine – the launch of our modern vaccine era – became particularly popular within the great urban centres and armies throughout much of the developing world, and was almost universally made compulsory across our respective nations. However, as you will see as we proceed, the timing of its introduction, wider use and, ultimately, compulsory use diverged greatly between our diverse and far-flung regions. Yet the pattern of declining mortality rates is remarkably similar, with Smallpox reaching its ultimate resolution around the same time – towards the end of the 19th Century – across most of our emerging modern nations.

For instance, compulsion came rather late in Britain and Ireland compared to many regions of Europe, and there was a difference of around a decade between England & Wales (including London) and Ireland in the implementation of such legal requirements (see Brunton 2012, 106) [12]. Yet, as the historical accounts highlighted above strongly indicate, both regions follow a very close pattern of declining deaths from Smallpox many generations before compulsion, and, perhaps unexpectedly, this pattern of closely matched mortality statistics for Smallpox relative to population size continues into the post-compulsion era. This curiosity will be discussed shortly.

The Failure of Poor Law Vaccination 1840–50

Chapter Seven Ireland

Public vaccination in England and Wales is generally assumed to have provided the blueprint for the development of the service in Ireland. At first glance, this analysis appears to fit the facts. In both countries, vaccination was first provided free of charge to the whole population through the poor law under the 1840 Vaccination Act. This was followed up by the introduction of compulsory vaccination—in England and Wales in 1853, in Ireland in 1863. Looked at in more detail, it is clear that after 1840, public vaccination followed a distinctive path in Ireland.
 Brunton, D. (2012, 106)

Regarding compulsory vaccination – wherever and whenever it was implemented, it often came with penalties for parents who did not conform. The Irish case will suffice to illustrate the situation experienced across other nations.

Anti-vaccine sentiment will not be easily eradicated

Irish Times

 According to historian Deborah Brunton, around 70 per cent of infants were vaccinated annually. Rates did fluctuate, often according to the perceived risk of catching smallpox, but remained higher than the rates in England and Wales…
A series of reforms to the provisions eventually resulted in compulsory vaccination from 1863…
Parents were given a period of six months within which to vaccinate a newborn child. Failure to vaccinate could result in a fine of 10 shillings. Doctors were also paid for each person they successfully vaccinated. This compulsion was further strengthened in 1879 by increasing the fine to 20 shillings.

Adelman, J. (2017, 9th March) 


Interestingly, most contemporary discussions of the literature on the topic would tend to stress the timely success of the Cowpox vaccination across our various nations and imply that it was the vaccines that ultimately clamped down on the disease and even caused the final eradication of Smallpox across our industrialised nations. So let us first examine the graphs and historical accounts of Smallpox in the post-compulsory vaccination era that are available from official statistics to begin assessing this proposition.


Fig. 2: Graph generated using data from Table VI in ‘Fifty Years Vital Statistics in Ireland’, William J. Thompson (1919), p. 594, representing deaths from Smallpox per 100,000 of the respective populations of these different regions. Source:

Judging by the historical accounts of the era, Scotland appears to have seen the implementation of its compulsory vaccination laws slightly earlier or at least around a similar time as Ireland. As noted above, England and Wales (inclusive of London) see compulsory vaccination being implemented from 1853, whereas, Ireland implements its compulsory laws around a decade later (1863).

However, the data in Figure 2 show a surprisingly close match in the pattern of annual recorded deaths from Smallpox between these different nations, ending at around the same time in the earliest part of the 20th Century. Certainly, nobody would expect anywhere near so tightly matched a spike of deaths, relative to the respective population sizes, to follow so soon after the differently timed implementations of the respective compulsory vaccination laws.

The graph in Figure 2 commences in 1866, when deaths from Smallpox began to be recorded officially in a more standardised way across England/Wales, Scotland and Ireland, and highlights the shared devastation of the 1871/72 Smallpox epidemic. A number of factors make this particular episode quite unusual compared with earlier epidemics, and it is worth highlighting some of them, as they will aid the overall discussion of natural resolution versus vaccine-induced eradication of the Smallpox pathogen as a major killer.

For instance, Creighton (1894) draws our attention to some of the aspects of the 1871/72 epidemic that set it apart from the more typical Smallpox epidemic that preceded it in the following.

A History of Epidemics in Britain

1894, Vol. II

The great epidemic of 1837-40 was the last in England which showed smallpox in its old colours. The disease returned once more as a great epidemic in 1871-72, after an interval of a whole generation (in which there had been, of course, a good deal of smallpox); but the epidemic of 1871-72 was different in several important respects from that of 1837-40
…It was a more sudden explosion, destroying about the same number in two years (in a population increased between a third and a half) that the epidemic a generation earlier did in four years. It was an epidemic of the towns and the industrial counties, more than of the villages and the agricultural counties; it was an epidemic of London more than of the provinces..
Creighton, (1894, 615)

Although Creighton may not have realised it at the time of his study (1894), the Smallpox epidemic of the early 1870s was actually the last significant Smallpox outbreak recorded across most of our industrialised nations. Again, it seems that this unusual pattern of deaths from Smallpox being mainly confined to urban centres is reflected elsewhere as exemplified by the Irish statistics.

For instance, zooming in on a small nation like Ireland, which reflects the pattern seen elsewhere during the early-1870s Smallpox eruption, we get a true sense of the significant impact this final epidemic had on the urban population in particular.

A History of Epidemics in Britain

1894, Vol. II

Registration began in Ireland in 1864, and showed little smallpox for the first few years. The next great epidemic, of 1871-72, showed the incidence upon the large towns, and the comparative immunity of the country population, even more strikingly than in England. In a total mortality of 3913 during the two years of 1871 and 1872, the three counties of Dublin, Cork and Antrim had the following enormous share, which fell mostly to the three cities of Dublin, Cork and Belfast:
   Dublin Co.   1825
   Cork Co.       1070
   Antrim           510
3405 deaths in 3913 for all Ireland.
Creighton, (1894, 621)

Creighton highlights yet another unusual aspect of the Smallpox epidemic of the earlier 1870s in terms of the significant age shift compared to the more typical epidemic that preceded it as documented in the following.

A History of Epidemics in Britain

1894, Vol. II

In England at large smallpox in 1839 was still distinctively a malady of the first years of life. It was not until youths and adults began to have smallpox in large numbers in the epidemic of 1871-72 that the doctrine of re-vaccination was generally apprehended in England.
Creighton, (1894, 612)

Could it be, therefore, that as the vaccine’s protection unexpectedly wore off, we effectively had a cohort of grown-up children – adolescents and young adults for the most part – who were the least protected, leaving them the most vulnerable during the final major Smallpox epidemic of the early 1870s? Remember that the vaccine would have been successful in protecting these young people as children against circulating Smallpox; but perhaps it was too successful, in that their immune systems never had the opportunity to deal directly with the real pathogen, owing to the artificial barrier of the Cowpox vaccine now made compulsory for every infant.

Bear in mind, too, that the infants and smaller children who still had protection from the vaccine during this final significant Smallpox epidemic of the early 1870s would, for the most part, have survived. Thus, although the overall death toll may have been fairly severe compared with the preceding epidemic in our respective regions, in statistical terms these mortality figures are not representative of our nations as a whole, being largely confined to the greater urban centres.

Now, this vaccine waning was not confined to these regions, as the excerpt below shows with reference to Sweden, where Jenner’s vaccine was introduced significantly earlier than in Britain or Ireland (as early as 1801, with compulsion following a short fifteen years later, in 1816). The excerpt also highlights the fact that Sweden’s last Smallpox epidemic occurred during the early 1870s, corresponding quite closely with that reported for Britain and Ireland. Furthermore, the final resolution of deaths from Smallpox documented in Sweden corresponds rather closely with the final decline seen in the rarity of Smallpox deaths in Ireland, Scotland, Wales and England.

Inoculation to Vaccination: Smallpox in Sweden …

       Before vaccination, 95 percent of smallpox deaths were those of children, but after 1801 it became as common among adults. Due to problems with re-vaccination, adults faced a much greater risk of infection during the last epidemic of 1873-75 than during the previous century. After the 1880s smallpox became an uncommon disease and smallpox deaths were rare.
 Skold, P., (1996, Abstract)

All in all, the need for revaccination would appear to have been an almost universal phenomenon, as documented for a whole range of diverse nations by Creighton (1894) below.

A History of Epidemics in Britain

 Vol. II


In other parts of the Continent of Europe the frequency of smallpox in adults was not less remarked than in France in the second quarter of the 19th century. English writers had been able at one time to point to foreign countries for the success of infantile vaccination. Sweden and Denmark were for a long time classical illustrations; then it was Germany’s turn. “…In the German States, vaccination has become universal, and in them as well as in various other countries the smallpox is almost unknown.” When we next find German experience appealed to, it is to enforce the need of re-vaccination: “In 1829,” said Gregory, “the principal Governments of Germany took alarm at the rapid increase of smallpox, and resorted to re-vaccination as a means of checking it…”
Dr Gregory, in his speech at the Medical and Chirurgical Society of London in December, 1838, urged the need of re-vaccination not only by the example of Germany, but also by the experience of Copenhagen, where a thousand cases of smallpox had been received into the hospital (it was nearly always adults that were taken to the general hospitals) in twenty-one months of 1833-34, nine hundred of them being of vaccinated persons…
     “In Prussia, 300,000 had been re-vaccinated, and the same number in Würtemberg. In Berlin nearly all the inhabitants had undergone re-vaccination[…].” It was about the same time that a second vaccination became obligatory in the armies of Prussia, Würtemberg, Baden and other German States, and among the pupils of schools when they reached the age of twelve years.
 Creighton, C, (1894, 612)

However, returning to what would appear to be a more natural state for much of our populations, whether protected by Jenner’s vaccine or not: by assessing the available historical accounts discussed above and employing comparative statistics, we can clearly see that natural immunity was becoming the prevalent state in London and Ireland by the time compulsory vaccination was implemented (Fig. 3).


Fig. 3: Reproduced from Figure 5.4, ‘Deaths from smallpox per 1000 deaths from all causes in London, from 1629 to 1900. Data from Guy (1882) and the Registrar General’s Statistical Review of England and Wales’, in F. Fenner, D. A. Henderson, I. Arita, Z. Jezek, I. D. Ladnyi (1988), Smallpox and its Eradication, World Health Organization (WHO). Original source: 9241561106.pdf. The Irish graph (green) is superimposed on the London data, showing the closely matched pattern of annual deaths from Smallpox since records began in 1864, one year after compulsory vaccination. The small graph below the main graph shows the stand-alone Irish data, derived from ‘Annual Reports on Marriages, Births and Deaths in Ireland, from 1864 to 2000’, courtesy of An Phríomh-Oifig Staidrimh, the Central Statistics Office (CSO).

The data above (Fig. 3) show the annual number of Smallpox deaths since official records began in Ireland (in 1864), a year after the introduction of compulsory vaccination against Smallpox. Although the Irish graph superimposed upon the long-term London data is not to scale, it serves to demonstrate the tightly matching pattern of deaths around the final major epidemic of 1871/72.
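The "superimposed, but not to scale" comparison described above can be sketched numerically: rescaling each mortality series to its own peak lets the pattern of rises and falls be compared directly, even though the two populations differ enormously in size. A minimal sketch in Python, using illustrative placeholder figures rather than the actual London or Irish data:

```python
def rescale_to_peak(series):
    """Rescale a list of annual death counts so the worst year equals 1.0."""
    peak = max(series)
    return [round(v / peak, 3) for v in series]

# Hypothetical annual smallpox death counts (illustrative placeholders only,
# not the real London or Irish statistics)
london = [3000, 7700, 1200, 400, 150]   # e.g. years around a major epidemic
ireland = [450, 1100, 180, 60, 20]

london_scaled = rescale_to_peak(london)
ireland_scaled = rescale_to_peak(ireland)

# Both series now peak at 1.0, so their shapes can be overlaid and the
# timing of the rise and decline compared, ignoring absolute magnitudes.
print(london_scaled)
print(ireland_scaled)
```

This is only one way of making two series of very different magnitude visually comparable; the published graphs appear to have simply been drawn on independent vertical scales.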

However, the unusual spike in Smallpox deaths recorded in both Ireland and London during the final great epidemic of 1871/72 (Fig. 3), in the post-compulsory-vaccination era, should be viewed in the light of the discussion above: the sudden realisation that vaccination did not afford the same life-long protection as the real disease prompted the urgent call for revaccination. More importantly, this should be tempered with the fact that, outside the great urban centres, most of our respective nations already enjoyed relatively robust natural immunity, as suggested by Creighton's (1894) historical account of the unusual nature of the 1871/72 Smallpox epidemic in England, Ireland and beyond.

In other words, by the time the compulsory laws and the further legislation to enforce them were in place, Smallpox had already seen its heyday as a major killer, as the historical archives clearly demonstrate. Its final resolution as a major killer is most likely explicable by more natural means, since the decline in deaths occurred near-simultaneously throughout our respective nations, irrespective of when and how each implemented its various medical interventions using the new Jenner vaccine.

For instance, the longer-term data from London, spanning around 300 years, clearly illustrates the major rise and fall of Smallpox's reign in terms of its deadliness. You can clearly see the rise of Smallpox from the 1600s through to the post-1770 era, and the dramatic decline in deaths after the first third of the 1800s, as documented in historical accounts and statistics. If we had similarly long-term official statistics for Ireland, would its deaths from Smallpox match London's peaks, troughs and overall decline, relative to Ireland's significantly smaller population in each era?

Based upon the historical accounts and available statistics outlined earlier, it is very likely that the fate of Smallpox in Ireland did follow a fairly similar pattern to that illustrated for London in Figure 3. Indeed, if we had longer-term statistics for the rest of England, Wales and Scotland, it seems reasonable, based upon similar historical accounts, that all of our nations would show the same overall rise, peak and ultimate resolution of Smallpox as a major killer. Much of this significant decline in deaths from Smallpox would therefore have occurred well before the wider use of Jenner's vaccine, and irrespective of the timing of our respective laws enforcing its use.

Even where fairly long-term statistics are available from regions with a significantly earlier introduction of Jenner's vaccine, with compulsion following shortly thereafter, as is the case for Sweden (Fig. 4), it still looks as though the population as a whole was becoming rapidly resistant to Smallpox's worst effects many generations before our efforts to resolve the disease by artificial means.


Fig. 4: Graph generated, and vaccine information added, from data presented in Figure 6, p. 76 of Sköld, P. (2002), The History of Smallpox and its Prevention in Sweden. Dates taken from Sköld, P. (1996, Abstract), Inoculation to Vaccination: Smallpox in Sweden in the Eighteenth and Nineteenth Centuries.

As noted in Sköld's study (see the links in Figure 4), the old form of inoculation was never sufficiently widespread within Sweden to have impacted upon the overall mortality rates for Smallpox prior to Jenner's Cowpox vaccine. Further support for the decline in deaths being a predominantly biological (non-genetic) process, rather than the result of artificial intervention, lies in the fact that our efforts to eliminate the worst effects of the disease cannot be specifically correlated with the overall pattern of declining Smallpox death rates when we directly compare long-term statistics from two quite different regions, such as London and Sweden, as seen in Figure 5.

This comparative graph strongly indicates that even though Sweden introduced vaccination significantly earlier than London (Britain), first in 1801, followed by compulsory vaccination in 1816 as noted above, the pattern of decline in deaths from Smallpox (implying increasingly widespread population immunity) is surprisingly closely matched (although not to scale), despite their disparate vaccination policies.

[Graph: smallpox deaths in Sweden and London compared]

Fig. 5: London graph reproduced from data in Figure 5.4, deaths from smallpox per 1,000 deaths from all causes in London, 1629 to 1900. Data from Guy (1882) and the Registrar General's Statistical Review of England and Wales, in F. Fenner, D. A. Henderson, I. Arita, Z. Jezek and I. D. Ladnyi (1988), Smallpox and its Eradication, World Health Organization (WHO). Source: 9241561106.pdf. Superimposed upon it (black line, not to scale) are deaths from Smallpox in Sweden for the corresponding period, generated from data given in Figure 6, p. 76 of Sköld, P. (2002), The History of Smallpox and its Prevention in Sweden.

In other words, if Jenner's vaccine (particularly once it was made compulsory and almost universally available to all infants) had played a significant role in reducing the number of deaths from Smallpox and in its ultimate resolution (eradication), one would most certainly expect the respective patterns of mortality in London and Sweden (Fig. 5) to reflect this intervention in some way.

For example, Sweden should have seen a significantly earlier resolution of deaths from Smallpox, owing to its earlier implementation of vaccination policies compared with London. Similarly, the particularly steep declines in deaths at shared corresponding points in time, seen from the later mid-1700s onwards (the period for which we have official Swedish death records), are not what would be expected if these regions' respective vaccine policies had truly impacted upon the overall population statistics. Each major decline in deaths from Smallpox should instead have come quickly after each region made the vaccine almost universally available to its infant population.

However, as you can see from the comparative graph above (Fig. 5), the mortality patterns (although not to scale) are fairly closely matched and this goes against the idea that the widespread use of Jenner’s vaccine was the direct cause of this decline.

Do note that the unusual epidemic of the early 1870s, although much more pronounced in the London mortality statistics than in the final epidemic recorded for Sweden (also dating to the early 1870s), represents mostly urban deaths; the rest of our respective populations had essentially remained relatively immune during this time. Sweden, being a fairly sparsely populated region in this era, may also be more representative of this increasingly robust immunity to Smallpox's worst effects. Indeed, all of the aforementioned regions, Ireland, England/Wales, Scotland and now Sweden, see a final resolution of Smallpox as a major killer by the later 19th Century, and deaths become quite rare across most of our emerging modern nations as we enter the 20th Century.

Such closely matched patterns of deaths from Smallpox, across such vastly different nations with such widely different timing in the implementation of their respective vaccination or revaccination policies, strongly imply that, in the larger scheme of things, the artificial (and seemingly rather short-term) immunity afforded by Jenner's vaccine had little bearing on the overall reduction of deaths from Smallpox or on its ultimate demise. As the vaccines wore off, more and more of the population would have been exposed naturally to the disease, and more and more of the population would therefore have become naturally immune anyway.

Furthermore, it would seem rather strange if Smallpox were the exception to the general rule of natural pathogen/host accommodation over time through exposure. Can we now add Smallpox to the story of natural generational immunity?

It is beginning to look very likely that parents across our near-modern nations were indeed counting more of their children, not only because they had been exposed to, or infected by, the actual POX (Smallpox, perhaps not Jenner's cowpox), but also because they had been exposed to just about everything else during the great age of epidemics, resulting in the end in a fairly robust immunity against some of the deadliest contagions known to humankind, including the Pox.


[1] Creighton, C., (1894) A History of Epidemics in Britain, From the Extinction of Plague to the Present Time, Vol. II, Cambridge University Press, p. 623.
[2] Riedel, S., (2005) Edward Jenner and the history of smallpox and vaccination, Baylor University Medical Center Proceedings, Vol. 18, [1], pp. 21–25.
[3] Babkin, I. V., and Babkina, I. N., (2015) The Origin of the Variola Virus, Viruses, Vol. 7, [3], pp. 1100–1112. [doi: 10.3390/v7031100]
[4] Riedel, S., (2005) Edward Jenner and the history of smallpox and vaccination, Baylor University Medical Center Proceedings, Vol. 18, [1], pp. 21–25.
[5] Battersby, E., (2006) Plucking the Strings of Genius, Irish Times, (April 24th 2006).
[6] Creighton, C., (1894) A History of Epidemics in Britain, From the Extinction of Plague to the Present Time, Vol. II, Cambridge University Press, pp. 543–544.
[7] Ó Gráda, C., (1979) The population of Ireland 1700–1900: a survey, Department of Political Economy, University College Dublin, p. 288.
[8] Davenport, R., Schwarz, L., & Boulton, J., (2011) The decline of adult smallpox in eighteenth-century London, The Economic History Review, Vol. 64, [4], Abstract, pp. 1289–1314.
[9] Creighton, C., (1894) A History of Epidemics in Britain, From the Extinction of Plague to the Present Time, Vol. II, Cambridge University Press, p. 220.
[10] Angus, J., (1854) Old and New Bills of Mortality; Movement of the Population; Deaths and Fatal Diseases in London During the Last Fourteen Years, Journal of the Statistical Society of London, Vol. 17, [2], p. 127. [Available online from JSTOR, published by Wiley for the Royal Statistical Society.]
[11] University of Dundee, (n.d.) Allergy & Immunity – Vaccination – Smallpox, medical museum exhibitions.
[12] Brunton, D., (2008) The Politics of Vaccination: Practice and Policy in England, Wales, Ireland, and Scotland, 1800–1874, Boydell and Brewer, p. 106.
[13] Adelman, J., (2017) Anti-vaccine sentiment will not be easily eradicated, Irish Times, (March 9th 2017).
[14] Creighton, C., (1894) A History of Epidemics in Britain, From the Extinction of Plague to the Present Time, Vol. II, Cambridge University Press, p. 615.
[15] Creighton, C., (1894) A History of Epidemics in Britain, From the Extinction of Plague to the Present Time, Vol. II, Cambridge University Press, p. 621.
[16] Sköld, P., (1996) Inoculation to Vaccination: Smallpox in Sweden in the Eighteenth and Nineteenth Centuries, Population Studies: A Journal of Demography, Vol. 50, [2], Abstract, pp. 247–262.
[17] Creighton, C., (1894) A History of Epidemics in Britain, From the Extinction of Plague to the Present Time, Vol. II, Cambridge University Press, p. 612.
[18] Creighton, C., (1894) A History of Epidemics in Britain, From the Extinction of Plague to the Present Time, Vol. II, Cambridge University Press, p. 612.


Are We Essentially Immune Because Our Ancestors Had the Pox and Just About Everything Else?

This study focussed upon ancestral natural immunity, a theme which emerged from investigating the decline in deaths from some of the deadliest contagions known to our emerging modern societies. Using graphs plotted from the official annual death registers, specifically for Ireland since records began (which has not been done to date), these become a vital source for illustrating and making comparisons with directly corresponding data for the same diseases from other developed nations over similar timescales.

The results show a surprisingly tight correspondence in the pattern of rises, peaks and troughs, falling to virtually zero deaths by the time our nations became fully modern, so that these once-great killers lived on only in folk memory. This could only be described as a fully natural process, as our respective interventions, if any at all, cannot possibly account for the near-simultaneous and almost universal pattern of declining mortality rates identified throughout this study across so many diverse and far-flung nations.

From Ireland to Iceland, Britain to Berlin and from Australia to the Americas, the overarching pattern of this phenomenon would appear to be that as each major killer of the past rises to dominance, it peaks and then rapidly declines in its deadliness until full resolution of the disease is achieved. We know that these pathogens still exist today – they never went anywhere and they certainly didn’t become extinct as there are simply too many of them. They all essentially still have the same genomes that once caused much deadlier outcomes. So what changed?

All in all, the historical sources and statistical data across our far-flung and diverse emerging modern nations all seem to tell the same essential story of our incredibly adaptive immune system and its ability to combat and tame some of the deadliest contagions known to humanity.

This shared pattern of declining death rates over closely matched timeframes has been consistently demonstrated across our industrialised nations wherever a study of this type has been carried out. And although this is a fairly well-recognised phenomenon among many historical geographers and health statisticians, there has never really been a full consensus on its cause, except perhaps a general acknowledgement that our medical interventions, when we drill down to the details, cannot account for the significant decline in contagious diseases at a larger population level, as discussed throughout this present study.

Some other possible causes for the almost universal decline of seemingly all of the greatest killers of the past have been proposed, such as the hygiene hypothesis, the improved-nutrition hypothesis, economic factors and social reforms, or combinations thereof. However, when we examine the historical context for each disease, from Dysentery to Cholera or Typhus Fever to Typhoid, this study found that none of these proposals stands up to closer scrutiny. It is certainly difficult to see how all of our nations would suddenly find themselves tackling the very same diseases with the very same interventions (cleaning up sewers, delousing, improving infrastructure, finding fresh vegetables, eliminating poverty, etc.) to the same degree and at exactly the same time, so as to bring about the near-universal and near-simultaneous pattern of decline seen across our great nations.

The one thing that many scholars do agree upon when such an investigation is carried out, as noted above, is that our various medical interventions could not, for the most part, have made any significant impact upon the overall deaths from many of these pathogens at a population level, mostly because, as discussed previously, most diseases had already stopped killing in great numbers, particularly among our children, by the time any such interventions even came on the horizon. The only contagion of old that is often cited as having been eradicated by medical intervention, via vaccination, is of course Smallpox. But hopefully, by now, you can begin to see how even this great killer of the past may, like all the others, have become fully resolved by natural means.

However, there is one hypothesis that would appear, for the most part, to match most closely the evidence explored within this present study. I have excerpted a summary of this highly influential work from a study exploring the impact of Smallpox before vaccination in Britain, as follows.

Smallpox transmission and control in Britain before vaccination

In his extremely influential work ‘Plagues and Peoples’ (1977) William McNeill outlined what has become the dominant model of infectious disease patterns in historical populations. McNeill argued that early human populations, living in small groups at low densities, could not have sustained many of the major human pathogens, especially those that conferred long-lasting immunity on survivors. However as populations grew and came into more frequent contact then the opportunities for disease transmission increased. …
Urbanisation provided large dense populations with birth rates sufficient to provide a regular supply of immunologically naive hosts that enabled pathogens to sustain continuous chains of transmission. Thus diseases that initially caused sporadic outbreaks in small groups gradually became capable of persisting in populations without requiring re-introduction, a process called endemicisation.
Trade, migration and exploration brought previously isolated populations into contact and promoted the exchange and globalisation of pathogens. Thus the rise of human populations was accompanied by an increasing burden of infectious diseases.
In the case of immunising diseases then this burden fell increasingly upon the young. McNeill also argued however that although sudden contacts with new diseases could cause dramatic mortality crises, the more gradual process of endemicisation did not result in remorseless rises in mortality. Rather increasing exposure to infectious diseases was accompanied by a process of accommodation between host and pathogen that favoured the evolution of avirulence. Therefore as immunising diseases were reduced to diseases of childhood they also became milder.
 Davenport, R.,  Newton, G.,  Satchell, M., and  Shaw-Taylor, L. (n.d.)

William McNeill's hypothesis does indeed appear to match the pattern we observe in the historical record as discussed thus far, with the addition of the more recent molecular findings supporting very long-term generational transference of resistance, as discussed throughout this present study.

This idea is further supported when we observe from the historical record just how rapidly a population can be devastated when first exposed to a pathogen it has not previously experienced to any great extent, irrespective of how healthy it might be, or how much fruit or raw fish it has at its disposal. As discussed earlier in this study (recall Typhoid Mary?), more recent molecular studies are beginning to show that even these naïve (previously unexposed, isolated) populations could rapidly build up surprisingly robust resistance within a few short generations, each successive generation seeing fewer and fewer fatalities. This was, of course, too fast to be explicable by our modern Darwinian form of genetic inheritance, and it certainly does not follow the usual survival-of-the-fittest thinking either.

All in all, it now looks like our immune systems may be much more responsive and adaptive than previously thought. Thanks to a more recently recognised understanding of non-genetically inherited immunity and maternal priming, we may have built up relatively robust resistance, rather rapidly over the generations, to all sorts of pathogens. The good news is that our immune systems also appear to have incredibly long memories, so we are now in the fortunate position of having benefitted from the battles of our long-forgotten ancestors. But there is one little caveat: Nature would appear to require us to be naturally exposed to such pathogens, not artificially shielded from them, because even if we have forgotten about them, our immune systems seemingly have not. They do, however, need a little reminder now and then.

Therefore, although it is quite unusual to place a dedication at the end of a study, I felt it highly appropriate to acknowledge all who went before us and faced on our behalf, some of the deadliest contagions known to humankind.



This Study is dedicated to:

All those who gave their lives & suffered

Untold disabilities on the front line of our

Developing nations,

Who fought with such valour

– Unknowingly – defending our

Future immunity

Against the greatest scourges of humanity.


[1] Davenport, R., Newton, G., Satchell, M., and Shaw-Taylor, L., (n.d.) Smallpox transmission and control in Britain before vaccination, Cambridge University research projects: migration, mortality, medicalisation (PDF).

For updates on this series, make contact on Facebook at the Natural Immunity Community, or fill out the contact form if you haven't already signed up for the series. FROM THE POX TO THE POLIO SERIES STARTING SOON!



