
May I take this opportunity to wish my American readers a very happy Thanksgiving!

My 2013 Thanksgiving post looked at the development of this tradition and explained why the holiday lapsed between the 17th century and the mid-19th century. It all came down to a 1621 letter about the first Thanksgiving, which surfaced in 1841. The post describes what happened at that initial celebration, including what was eaten and the sporting competition which followed.

Other posts for this day have explored the historical significance as many Americans know it: British Calvinists and American Indians together, George Washington’s First Thanksgiving Proclamation, Abraham Lincoln’s Proclamation of Thanksgiving, a biblical perspective with a call for personal priorities and, lest we forget, a USMC chaplain’s poem remembering the troops who are serving the United States at this time. May we remember them in our prayers.

Besides the iconic feast at Plymouth, other American regions (e.g. Virginia and Florida) also held feasts of Thanksgiving. That said, none has captured the imagination or spirit of the holiday as vividly as that in Massachusetts in 1621.

Wherever you are in the world today, have a great time with family and friends!

I have scheduled this post, for reasons stated below, to appear before the final regular episode of Downton Abbey, which airs tonight in the UK.

This sixth and final series will be shown in the US (PBS) starting on January 3, 2016.

As most of my readers are American, I would be grateful if anyone commenting from the British Isles could avoid spoilers. Thank you in advance!

Setting expectations

Not surprisingly, before series six started, a number of newspaper articles appeared.

Michelle Dockery, who plays Lady Mary, told InStyle why the series is ending in 1925:

I think, collectively, everyone felt this was the right time. And I think if we had kept going, we’d have gone into maybe, possibly the [G]eneral [S]trike and then onwards. And then you’re into the 30s. And then it becomes kind of Gosford Park territory. And then there’s a whole other kind of shift, a new era, a new decade. So then, when can you stop?

Whilst there is plenty of scope for a sequel series, or perhaps a film — possibly set in the 1950s when many estates were on their knees — Dockery said of the possibility of reprising her role:

… I think the show is an ensemble, so there has to be a collective decision in that, I think. I don’t think you could just grab two characters and create a movie. I think it has to be the show. So, we’ll see.

Executive Producer Gareth Neame told The Guardian that ITV wanted the series to continue. So did PBS, according to Masterpiece chief Rebecca Eaton. Carnival and Masterpiece had mooted the idea of seven series; however, discussions with the cast revealed that six and a final Christmas special (timed for the British) would be the limit.

Neame hinted at a satisfying conclusion, despite the new postwar era with its melancholic undertones.

The genius and writer behind the show — Sir Julian Fellowes — is now working on a series set in early 20th-century New York. The Gilded Age centres on the robber barons. Neame is collaborating on it with him.

Jim Carter, who plays Mr Carson, told The Telegraph that the final series and concluding special bring viewers down gently:

It’s just life changing. And none of the maids want to live in (the house), they want to live in the village, so they can see their boyfriends. They want to work in shops. Nobody wants to work in service any more. That way of life – we’re saying goodbye to it. And this series is slowly and effectively – very effectively, the Christmas special is a heartbreaker of an episode. Not because of tragedy, but because you’re saying goodbye to a way of life, and these characters that people have grown very fond of.

Just as scriptwriters and directors can build viewers up for the next episode or series, they can also prepare one for The End. Series six effectively does this, as Carter/Carson says.

Sir Julian Fellowes

In 2012, before the third series aired in the US, Vanity Fair featured an interview with Sir Julian Fellowes.

Fellowes, some would say, is a late bloomer. He worked for years as a character actor and novelist prior to writing scripts in the 1990s. Most screenwriters have a hard time not only breaking into the industry but also staying in it. Since film began, directors — Alfred Hitchcock, to name but one — have been notorious for chopping and changing scriptwriters.

Fortunately for Fellowes, he happened to meet Ileen Maisel over 20 years ago. Maisel had just opened the Paramount Pictures office in London. She envisaged developing John Fowles’s Daniel Martin into a movie and was impressed by Fellowes’s knowledge of the novel.

When that project did not come to fruition, Maisel introduced Fellowes to actor/director-producer Bob Balaban. (I remember when a young Balaban played character roles in 1960s US sitcoms. I’m getting old!) Balaban and Maisel wanted to involve Fellowes in another project, a film adaptation of Anthony Trollope’s The Eustace Diamonds.

That, too, foundered, but an impressed Balaban introduced Fellowes to none other than Robert Altman. The meeting took place on the cusp of the 21st century. The film was Gosford Park. Neither Balaban nor Altman knew much about country houses, hence Fellowes’s presence:

“So I got Julian in a room with Robert,” Balaban said, “and Julian starts talking, and he knows everything that happens in a British house of that kind. Both Bob and I were floored.” On the wrong side of 50, at least in industry terms, Fellowes had won his first screenwriting job, with one of the best directors in the history of the medium.

“I am that rare person who owes everything to one guy, and that guy is Bob Altman,” Fellowes said. “He fought the studio to keep me on, and he never once said, ‘This is my 18th film and I’m a world-famous director. Who the Sam Hill are you?’ It was just two overweight men talking and occasionally arguing.”

That the toking, anarchy-fostering maverick auteur worked so harmoniously and fruitfully with the necktied monarchist is a testament to the character of both men.

Fellowes knew whereof he spoke then — and still does. Members of his family are listed in Burke’s Landed Gentry (not to be confused with Burke’s Peerage); Julian’s birth was similarly listed. His father, Peregrine, was a civil engineer and diplomat. He worked for Shell Oil and the Foreign Office.

Peregrine had a difficult upbringing. His father died in 1915 in the Great War. His mother became interested in dating, so Peregrine was left to be cared for by maiden aunts, one of whom was the inspiration for Lady Violet:

The eldest of them, Isie, is the model for Maggie Smith’s dowager characters in both Gosford Park and Downton Abbey.

“Aunt Isie had this sort of acerbic wit, yet she was kind,” Fellowes said. “Lots of those lines Maggie has, like ‘Bought marmalade! Oh dear, I call that very feeble,’ and ‘What is a weekend?’—they came straight from her.”

Fellowes is not terribly different in some respects. When Vanity Fair’s interviewer David Kamp took coffee with him, Kamp held the bowl of the cup rather than the handle:

Don’t think he didn’t clock this, the slightest Violet-ish wince of “Oh, dear” in his eyes.

When the two were at Ealing Studios in west London, where many of the interior scenes were filmed, Kamp saw how historically accurate Fellowes was:

“Liz,” he said, addressing Liz Trubridge, one of the show’s producers, “we’ve got to get the glasses of water off the table. They’re having tea. They wouldn’t have water there. A glass of water is a modern thing.” The water glasses were removed, and the scene, now more period-authentic, resumed shooting.

Whilst politically he is Conservative, Fellowes intelligently embraces the present and honours tradition. That blend of perspectives has helped him to propel Downton Abbey to an iconic status among television series of the early 21st century.

It is interesting that Fellowes’s favourite television programme is Coronation Street, Britain’s longest-running televised soap opera, set in a working-class area of Northern England. Four actors from Corrie, as we call it, are or were in Downton. They are Anna (Joanne Froggatt), Lady Violet’s maid Denker (Sue Johnston), Thomas (Rob James-Collier) and O’Brien (Siobhan Finneran).

My predictions

I debated whether to make my predictions public.

On the one hand, I could be wrong. It would not be the first time.

On the other, if I were correct, I would have been annoyed not to publish them beforehand!

So, here goes.

Please note that I have not seen the ITV1 trailer (coming attractions) for the final episode nor have I read spoilers, which are everywhere at the moment.

I predict that by the end of the concluding special (Christmas here, 2016 in the US):

1/ Lady Mary will remarry. Her husband will be someone she — and we — have known for a very long time: someone who knows her, whom she can trust and confide in, and who will be a good father to young George. He and Mary can also run the estate in tandem and in full agreement with each other. In other words, Tom.

2/ Lady Edith will meet with or hear from Michael Gregson (ably played by Charles Edwards), the father of her child, Marigold. He will turn out not to have been killed by the Nazis. He will reveal — or someone else will — that he was in hiding all these years, perhaps working as a spy. We will either see them get engaged or be left with the understanding that they will be soon.

3/ We will either see or be left with the impression that Anna delivered a healthy first child, much to Bates’s delight.

4/ We will discover that Baxter is Thomas’s mother and that Thomas knows who his father is. We will understand how and why Thomas bears a grudge against both.

The Thomas Question

What will happen to the odious Thomas? He has made many of the nicer servants’ lives a misery over the years, especially when O’Brien worked there.

Given that homosexuality was, at the time, illegal and considered the height of moral depravity, it is no mystery that Carson, in particular, views him with disdain.

I doubt he will be made head butler at Downton.

But what is the point of the character? We can but wonder why he has not yet met with either a Damascene conversion or a dramatic death.

It will be hard for him to shake his dodgy reputation.

Isolated, lonely, angry, he could commit or attempt suicide — also illegal at the time.

I don’t have an answer other than to link his future — or demise — with Baxter in some way.

Final memory

Along with countless millions of others around the world, I shall miss Downton Abbey greatly.

Even the title sequence was endearing — absolutely perfect:

Sincere thanks to Julian Fellowes, Rebecca Eaton, Masterpiece, Carnival Productions as well as all the many actors, actresses and crew members who made several Sunday nights a year sheer televisual pleasure!

Yesterday’s post had the first part of a two-part series on American soil deficiency in 1936.

The source material was included, at the request of a US Senator of the day — Duncan Fletcher (D-Florida) — in Senate Document #264 of the 74th Congress, 2nd Session (1936).

The original document is on the US Senate website. (An HTML version is here.) It is an article from a family news magazine, The Cosmopolitan, which much later became the title we know today.

The article, called ‘Modern Miracle Men’, was written by Rex Beach about Dr Charles Northen, a physician who went into soil replenishment to better nourish man and beast. He was based in Orlando, Florida, and could have been resident in Fletcher’s constituency. The article says that Northen was considered

the most valuable man in the State.

Yesterday’s post excerpted and summarised Northen’s findings about the poor mineral quality of America’s soil in the 1930s. It had significantly declined since the 19th century and, in many parts of the country, food and meat had little nutritional value.

Today’s excerpts and summary discuss the second half of the article. Emphases in bold are mine.

I cannot help but think we are in no better shape today with regard to the food we consume.

Why no one cared — or cares?

Northen was decried for his research.

The article points out that the medical establishment had been wrong before: in the late 19th century, the Medical Society of Boston condemned the use of bathtubs!

Similarly, physicians and other experts were — are? — wrong to ignore soil deficiencies. In the 1930s, textbooks were still using outdated analyses from decades before, when soil was still rich in nutrients.

Although Northen was able to demonstrate that soil samples can vary greatly even in a local area, his peers scoffed: ‘So what?’

Northen’s work on various farms and orchards was exemplary. Once he had carefully mineralised the soil, grass grew better, fruit trees were pest-free and abundant, and livestock were healthier. All those fresh products then went into the human food chain, improving the lives of the lucky Americans who ate them.

Northen’s wisdom — interview

Beach, who owned a farm, ended the article by reproducing part of the interview Northen gave him.

Although Northen was elderly at the time, he was a goldmine of statistics, experience and knowledge. As we’ll find out, Beach turned around his own soil with Northen’s help.

“Sick soils mean sick plants, sick animals, and sick people. Physical, mental, and moral fitness depends largely upon an ample supply and a proper proportion of the minerals in our foods. Nerve function, nerve stability, nerve-cell-building likewise depend thereon. I’m really a doctor of sick soils.”

“Do you mean to imply that the vegetables I’m raising on my farm are sick?” I asked.

“Precisely! They’re as weak and undernourished as anemic children. They’re not much good as food. Look at the pests and the disease that plague them. Insecticides cost farmers nearly as much as fertilizers these days.

“A healthy plant, however, grown in soil properly balanced, can and will resist most insect pests. That very characteristic makes it a better food product. You have tuberculosis and pneumonia germs in your system but you’re strong enough to throw them off. Similarly, a really healthy plant will pretty nearly take care of itself in the battle against insects and blights — and will also give the human system what it requires.”

“Good heavens! Do you realize what that means to agriculture?”

“Perfectly. Enormous saving. Better crops. Lowered living costs to the rest of us. But I’m not so much interested in agriculture as in health.”

“It sounds beautifully theoretical and utterly impractical to me,” I told the doctor, whereupon he gave me some of his case records.

For instance, in an orange grove infested with scale, when he restored the mineral balance to part of the soil, the trees growing in that part became clean while the rest remained diseased. By the same means he had grown healthy rosebushes between rows that were riddled by insects.

He had grown tomato and cucumber plants, both healthy and diseased, where the vines intertwined. The bugs ate up the diseased and refused to touch the healthy plants! He showed me interesting analysis of citrus fruit, the chemistry and the food value of which accurately reflected the soil treatment the trees had received.

There is no space here to go fully into Dr. Northen’s work but it is of such importance as to rank with that of Burbank, the plant wizard, and with that of our famous physiologists and nutritional experts.

“Healthy plants mean healthy people,” said he. “We can’t raise a strong race on a weak soil. Why don’t you try mending the deficiencies on your farm and growing more minerals into your crops?”

I did try and I succeeded. I was planting a large acreage of celery and under Dr. Northen’s direction I fed minerals into certain blocks of the land in varying amounts. When the plants from this soil were mature I had them analyzed, along with celery from other parts of the State. It was the most careful and comprehensive study of the kind ever made, and it included over 250 separate chemical determinations. I was amazed to learn that my celery had more than twice the mineral content of the best grown elsewhere. Furthermore, it kept much better, with and without refrigeration, proving that the cell structure was sounder.

In 1927, Mr. W. W. Kincaid, a “gentleman farmer” of Niagara Falls, heard an address by Dr. Northen and was so impressed that he began extensive experiments in the mineral feeding of plants and animals. The results he has accomplished are conspicuous. He set himself the task of increasing the iodine in the milk from his dairy herd. He has succeeded in adding both iodine and iron so liberally that one glass of his milk contains all of these minerals that an adult person requires for a day.

The article goes on to say that lack of iodine causes goiters.

Goiters were a huge health problem then. My maternal grandmother, who was raising a large family in that era, was preoccupied by goiter, even though no one in her family had any, thankfully. But she always impressed upon us grandchildren that eating enough iodine-rich foods and using iodised salt was essential.

She was not wrong. As the article states, the Great Lakes Region, the Northwest and South Carolina had significant numbers of people with goiter. Milk was a good way of supplying iodine. The aforementioned Mr Kincaid raised a Swiss heifer calf, taking care to mineralise her pasture and provide her with a balanced diet. She went on to become the third all-time champion of her breed, supplying 21,924 pounds of milk and 1,037 pounds of butter in one year!

Illinois farmers then began following Kincaid’s example. Fertiliser companies were quick to promote the mineral content of their products. Minerals were also made into colloidal form for inexpensive yet efficient soil correction.

Dangers then and now

The article concludes with more ailments caused by depleted soil. Some of them, such as heart disease, can be fatal. Others, like arthritis, can be debilitating.

On a wider scale, without these essential minerals in our food, we become increasingly susceptible to infection.

Northen suggested that the American populace of the 1930s clamour for food from good soil that would naturally supply their nutritional needs. He also urged them to insist that doctors and health departments establish standards of nutritional value.

He said that farmers and growers would eagerly respond to higher soil nutrition because it would mean better quality crops, better yield and happier customers.

After all, he reasoned, it is easier and less costly to cure sick soil than sick people.

It makes sense. Yet, is that what happened?

Tomorrow: ‘Sick soil’ in North America and the UK

Last week I mentioned the late Joe Vialls and his investigations.

I don’t agree with everything Vialls wrote, but he looked at every aspect of a topic. His research into health matters was spot on.

One of his articles concerns potassium deficiency, which I’ll write about this week. At the end of that article are ‘Verbatim Unabridged extracts from the 74th Congress 2nd Session, Senate Document #264, 1936’.

Its contents came from a popular American magazine of the day, The Cosmopolitan, a very different iteration of the current title.

‘Dr Z’ of the eponymous medical reports says in ‘Senate Document #264 debunked’ that what you will read below is rubbish. It was heartening to see that so many of his readers took exception to what he wrote.

Dr Z did a poor job of debunking. One of the glaring errors was not even bothering to look up Cosmopolitan in a search engine.

Dr Z says, rather irresponsibly:

these are verbatim unabridged extracts of an article from Cosmopolitan magazine in 1936 and probably have about even less scientific credibility as an article from Cosmo would have today.

Had he done a few minutes of research, he would have read that Helen Gurley Brown launched the current Cosmo in 1965. He looks old enough to have known that.

From its inception in 1886, The Cosmopolitan was, for years, a family-friendly magazine with investigative journalism, short stories and fashion spreads. In the 1950s, it was transformed into a literary magazine and, finally, a decade later, became the single women’s publication we recognise today.

What Dr Z does provide, albeit dismissively, is useful information as to how the extracts from The Cosmopolitan‘s article came to appear in a Senate document:

It’s not research, it wasn’t commissioned by and had absolutely nothing to do with the government other than the fact that Senator Duncan Fletcher, Democrat of Florida, asked that it be put into the Congressional Record (two weeks before his death of a heart attack at the age of 77).

This is the original document, still on the US Senate website. The title page says that it was presented by Fletcher. It is a reprint of Rex Beach’s article about the work of Dr Charles Northen, a physician who went into soil replenishment to better nourish man and beast. He was based in Orlando, Florida, and could have been resident in Fletcher’s constituency. The article says that Northen was considered

the most valuable man in the State.

Poor soil = poor nutrition

The Depression produced hardship; however, as Beach revealed, Northen found it relatively inexpensive to replenish soil with the missing minerals necessary for health.

Also, whilst we today wonder how our forebears of the 19th century survived without calling the doctor except in a severe emergency, food had much more nutritional value in those days.

Excerpts and a summary of Beach’s article follow. If you prefer a version other than the PDF, an alternative format is here. I’ve added sub-headings for easier navigation. Emphases in bold below are mine.

Food poor and more needed

Do you know that most of us today are suffering from certain dangerous diet deficiencies which cannot be remedied until the depleted soils from which our foods come are brought into proper mineral balance?

The alarming fact is that foods — fruit and vegetables and grains — now being raised on millions of acres of land no longer contain enough of certain needed minerals, are starving us — no matter how much of them we eat!

This talk about minerals is novel and quite startling. In fact, a realization of the importance of minerals in food is so new that the textbooks on nutritional dietetics contain very little about it. Nevertheless it is something that concerns all of us, and the further we delve into it the more startling it becomes.

You’d think, wouldn’t you, that a carrot is a carrot–that one is about as good as another as far as nourishment is concerned? But it isn’t; one carrot may look and taste like another and yet be lacking in the particular mineral element which our system requires and which carrots are supposed to contain. Laboratory tests prove that the fruits, the vegetables, the grains, the eggs and even the milk and the meats of today are not what they were a few generations ago. (Which doubtless explains why our forefathers [and foremothers] thrived on a selection of foods that would starve us!) No one of today can eat enough fruits and vegetables to supply their system with the mineral salts they require for perfect health, because their stomach isn’t big enough to hold them! And we are running to big stomachs.

No longer does a balanced and fully nourishing diet consist merely of so many calories or certain vitamins or a fixed proportion of starches, proteins, and carbohydrates. We now know that it must contain, in addition, something like a score of mineral salts.

It is bad news to learn from our leading authorities that 99 percent of the American people are deficient in these minerals, and that a marked deficiency in any one of the more important minerals actually results in disease. Any upset of the balance, any considerable lack of one or another element, however microscopic the body requirement may be, and we sicken, suffer, shorten our lives.

Northen ridiculed

Following a wide experience in general practice, Dr. Northen specialized in stomach diseases and nutritional disorder. Later, he moved to New York and made extensive studies along this line, in conjunction with a famous French scientist from Sorbonne. In the course of that work he convinced himself that there was little authentic, definite information on the chemistry of foods, and that no dependence could be placed on existing data.

He asked himself how foods could be used intelligently in the treatment of disease, when they differed so widely in content. The answer seemed to be that they could not be used intelligently. In establishing the fact that serious deficiencies existed and in searching out the reasons therefore, he made an extensive study of the soil. It was he who first voiced the surprising assertion that we must make soil building the basis of food building in order to accomplish human building.

“Bear in mind,” says Dr. Northen, “that minerals are vital to human metabolism and health–and that no plant or animal can appropriate to itself any mineral which is not present in the soil upon which it feeds.

When I first made this statement I was ridiculed, for up to that time people had paid little attention to food deficiencies and even less to soil deficiencies. Men eminent in medicine denied there was any such thing as vegetables and fruits that did not contain sufficient minerals for human needs. Eminent agricultural authorities insisted that all soil contained all necessary minerals. They reasoned that plants take what they need, and that it is the function of the human body to appropriate what it requires. Failure to do so, they said, was a symptom of disorder.

“Some of our respected authorities even claimed that the so-called secondary minerals played no part whatever in human health. It is only recently that such men as Dr. McCollum of Johns Hopkins, Dr. Mendel of Yale, Dr. Sherman of Columbia, Dr. Lipman of Rutgers, and Drs. H.G. Knight and Oswald Schreiner of the United States Department of Agriculture have agreed that these minerals are essential to plant, animal, and human feeding.

“We know that vitamins are complex substances which are indispensable to nutrition, and that each of them is of importance for the normal function of some special structure in the body. Disorder and disease result from any vitamin deficiency.

“It is not commonly realized, however, that vitamins control the body’s appropriation of minerals, and in the absence of minerals they have no function to perform. Lacking vitamins, the system can make some use of minerals, but lacking minerals, vitamins are useless.”

What mineral deficiency means

“The truth is that our foods vary enormously in value, and some of them aren’t worth eating, as food. For example, vegetation grown in one part of the country may assay 1,100 parts, per billion, of iodine, as against 20 in that grown elsewhere. Processed milk has run anywhere from 362 parts, per million, of iodine and 127 of iron, down to nothing.

“Some of our lands, even unhappily for us, we have been systematically robbing the poor soils and the good soils alike of the very substances most necessary to health, growth, long life, and resistance to disease. Up to the time I began experimenting, almost nothing had been done to make good the theft.

The more I studied nutritional problems and the effects of mineral deficiencies upon disease, the more plainly I saw that here lay the most direct approach to better health, and the more important it became in my mind to find a method of restoring those missing minerals to our foods.

“The subject interested me so profoundly that I retired from active medical practice and for a good many years now I have devoted myself to it. It’s a fascinating subject, for it goes to the heart of human betterment.”

The results obtained by Dr. Northen are outstanding. By putting back into foods the stuff that foods are made of, he has proved himself to be a real miracle man of medicine, for he has opened up the shortest and most rational route to better health.

He showed first that it should be done, and then that it could be done. He doubled and redoubled the natural mineral content of fruits and vegetables. He improved the quality of milk by increasing the iron and the iodine in it. He caused hens to lay eggs richer in the vital elements.

By scientific soil feeding, he raised better seed potatoes in Maine, better grapes in California, better oranges in Florida, and better field crops in other States. (By “better” is meant not only an improvement in food value but also an increase in quantity and quality.)

Before going further into the results he has obtained, let’s see just what is involved in this matter of “mineral deficiencies”, what it may mean to our health, and how it may affect the growth and development, both mental and physical, of our children.

We know that rats, guinea pigs, and other animals can be fed into a diseased condition and out again by controlling only the minerals in their food.

A 10-year test with rats proved that by withholding calcium they can be bred down to a third the size of those fed with an adequate amount of that mineral. Their intelligence, too, can be controlled by mineral feeding as readily as can their size, their bony structure, and their general health.

Place a number of these little animals inside a maze after starving some of them in a certain mineral element. The starved ones will be unable to find their way out, whereas the others will have little or no difficulty in getting out. Their dispositions can be altered by mineral feeding. They can be made quarrelsome and belligerent; they can even be turned into cannibals and be made to devour each other.

A cage full of normal rats will live in amity. Restrict their calcium, and they will become irritable and draw apart from one another. Then they will begin to fight. Restore their calcium balance and they will grow more friendly; in time they will begin to sleep in a pile as before.

Many backward children are “stupid” merely because they are deficient in magnesia. We punish them for OUR failure to feed them properly.

Certainly our physical well-being is more directly dependent upon the minerals we take into our systems than upon the calories or vitamins or upon the precise proportions of starch, protein, or carbohydrates we consume.

It is now agreed that at least 16 mineral elements are indispensable for normal nutrition, and several more are always found in small amounts in the body, although their precise physiological role has not been determined. Of the 11 indispensable salts, calcium, phosphorous, and iron are perhaps the most important.

Regional statistics from the 1930s

The article goes on to list some of the mineral deficiencies around the United States in the 1930s, which I shall summarise below:

  • Calcium is essential for proper nerve and cell functions. Yet, a Columbia University study showed that 50% of Americans were calcium ‘starved’. A study of patients in a New York hospital showed that, out of 4,000, only 2 had adequate calcium in their bodies.
  • A city in the Midwest had calcium-poor soil. Of 300 children examined, 90% had bad teeth. Sixty-nine per cent had nose and throat problems, swollen glands and either enlarged or diseased tonsils. Over a third had poor eyesight, joint problems and anaemia.
  • Calcium and phosphorus need to be consumed together for either to work properly. Children require the same amount as adults. Adequate phosphates in the bloodstream prevent tooth decay. Livestock died when one or the other mineral was deficient in the soil they grazed on.
  • Our blood requires iron, yet our bodies cannot process it unless we have adequate amounts of copper. Florida’s cattle were dying of ‘salt sickness’. When the soil of their pastures was examined, it lacked iron and copper. As the grass they consumed lacked these elements, there was no way anyone eating the beef of the surviving cattle could obtain these necessary nutrients.
  • A lack of iodine disrupts thyroid function and can cause goiters. Humans only need a tiny amount each day — fourteen-thousandths of a milligram — yet the Great Lakes region was a ‘goiter belt’ and pockets of the Northwest showed severe iodine deficiencies.

Then, as now, medical specialists giving vitamin and mineral supplements to people seemed to be the way forward. Ironically, we need only trace amounts a day yet cannot manage to get even that. However, the body absorbs minerals best when they are present in food rather than in tablets, capsules or liquids, because in food they are colloidal — in fine suspension — and easily assimilated by the body.

Tomorrow: Why the medical establishment didn’t — doesn’t? — care

October 31 is widely celebrated in North America.

Hallowe’en has not managed to recover its roots in Europe, despite efforts by marketers and the media to encourage trick-or-treating.

In England, at least, households not wishing to participate keep their hallway and front door lights off. Generally speaking, trick-or-treaters respect this gesture and stay away.

Although I run the risk of over-simplifying the origins of Hallowe’en — All Hallows Eve/Evening, hence the traditional contraction — I may expand on it next year at this time. My pagan readers are welcome to contribute in the comments, which will stay open for a fortnight.


During the Middle Ages, a tradition called mumming developed, whereby a group of people dressed up and went door-to-door, or to a venue such as a pub, to perform a short skit or play. They did this at various times throughout the year.

So far, historians have only been able to find scripts from plays which date back to the 18th century, when mumming reached its peak. It continued through the 19th century, at least in the British Isles, then faded out.

The scarcity of written records makes it difficult for researchers to pinpoint the exact origin of mumming. Wikipedia says:

Early scholars of folk drama, influenced by James Frazer‘s The Golden Bough, tended to view these plays as debased versions of a pre-Christian fertility ritual, but some modern researchers discount this view preferring a late mediaeval origin (for which there is no evidence either).[3]

That said:

Mummers and “guisers” (performers in disguise) can be traced back at least to the Middle Ages, though when the term “mummer” appears in medieval manuscripts it is rarely clear what sort of performance was involved. In 1296, for example, the festivities for Christmas and for the marriage of Edward I’s daughter included “fiddlers and minstrels” along with “mummers of the court”.[2] At one time, in the royal courts, special allegorical plays were written for the mummers each year — for instance at the court of Edward III, as shown in a 14th-century manuscript, now in the Bodleian Library, Oxford.[citation needed]

In any event — apart from mumming — the Middle Ages also saw the rise of souling, the practice of poor children and adults going door-to-door offering to pray or sing a Psalm for the dead in return for a soul cake. This took place on Hallowmas, which had pagan origins (emphases mine below):

The custom of trick-or-treating at Halloween may come from the belief that supernatural beings, or the souls of the dead, roamed the earth at this time and needed to be appeased.

It may have originated in a Celtic festival, held on 31 October–1 November, to mark the beginning of winter. It was Samhain in Ireland, Scotland and the Isle of Man, and Calan Gaeaf in Wales, Cornwall and Brittany. The festival is believed to have pre-Christian roots. The Church made the date All Saints’ Day in the 9th century. Among Celtic-speaking peoples, it was seen as a liminal time, when the spirits or fairies (the Aos Sí), and the souls of the dead, came into our world and were appeased with offerings of food and drink. Similar beliefs and customs were found in other parts of Europe.

It is suggested that trick-or-treating evolved from a tradition whereby people impersonated the spirits, or the souls of the dead, and received offerings on their behalf. S. V. Peddle suggests they “personify the old spirits of the winter, who demanded reward in exchange for good fortune”.[2] Impersonating these spirits or souls was also believed to protect oneself from them.[3]

At least as far back as the 15th century, there had been a custom of sharing soul cakes at Hallowmas.[4] People would visit houses and take soul cakes, either as representatives of the dead, or in return for praying for their souls.[5] It was known as “souling” and was recorded in parts of Britain, Flanders, southern Germany and Austria.[6] Shakespeare mentions the practice in his comedy The Two Gentlemen of Verona (1593), when Speed accuses his master of “puling [whimpering or whining] like a beggar at Hallowmas.”[7] The wearing of costumes, or “guising”, at Hallowmas, had been recorded in Scotland in the 16th century[8] and was later recorded in other parts of Britain and Ireland.[9]

The Soul — Souling — Cake

The Semper Eadem blog, which concerns all things Elizabethan, has a recipe for souling cakes, for those who are interested in making these for friends or family.

The recipe post explains:

A Soul Cake (or Souling Cake) is a small round cake, like a biscuit, which is traditionally made for All Souls’ Day (the 2nd November, the day after All Saint’s Day) to celebrate the dead

Traditionally each cake eaten would represent a soul being freed from Purgatory. The practice of giving and eating soul cakes is often seen as the origin of modern day Trick or Treating, which now falls on Halloween (two days before All Souls’ Day). The tradition of ‘souling’ and giving out Soul Cakes on All Souls’ Day originated in Britain and Ireland hundreds of years ago, from giving out bread on All Souls’ Day during the devout Middle Ages …

Soul cakes and breads were often made by drawing a cross shape into the dough before baking, signifying their purpose as Alms for the dead.

The recipe given is one from the Victorian era when many ingredients that were very expensive in the Middle Ages became more widely available. However, when the tradition first started:

Indeed, any spice at this time, sugar included, would have been a prized commodity that primarily only the wealthy could afford. To go from door to door, praying for the souls of the departed in return for these sweet treats, would have been viewed by generations of poor children as quite a good trade-off.

The Reformation

The Reformation went hand in hand with the printing press. Even if one could not read, one could at least go to church to hear the Bible read in one’s own language, which made it comprehensible to many for the first time.

As a result, where Protestantism took root, the government and Reformers frowned upon earlier syncretic practices. In England:

Henry VIII changed the perceptions of the kingdom forever when he broke from Rome. A guiding force in his reformation of the Catholic Church was the destruction of what he and his chief minister Thomas Cromwell scorned as “superstition.” Saints’ statues were removed; murals telling mystical stories were painted over; shrines were pillaged; the number of feast days was sharply reduced so that more work could be done during the growing season. “The Protestant reformers rejected the magical powers and supernatural sanctions which had been so plentifully invoked by the medieval church,” writes Keith Thomas. The story in The Crown is told from the perspective of a young Catholic novice who struggles to cope with these radical changes.

Yet somehow Halloween, the day before All Saints’ Day, survived the government’s anti-superstition movement, to grow and survive long after the Tudors were followed by the Stuarts

Recent practice

Trick-or-treating still exists in parts of the British Isles and elsewhere in Europe. Ancient traditions live on, even if they are not widespread.


An Irishwoman, Bernadette, wrote on a 2009 Telegraph blog that, where she lives, October 31 is a religious rather than secular celebration:

Round here, all the kids dress up as saints, have their mates round, run riot, prize for the best re-enactment of the life story of the saint you’ve come as, Mass, Adoration, pizzas….. which takes us nicely into All Saints Day. Come on — who celebrates Hallowee’en anymore as ghosts witches and ghoulies ? It’s so passé, dear. Keep up. Catholics have moved on a bit recently.


Scotland has the practice of guising — disguising.

I have only seen it once, around Guy Fawkes’ (Bonfire) Night (November 5), when I was approached on Princes Street in Edinburgh one evening by a little girl and her mother. The little girl was in ancient dress, held out a small bag and said:

Penny for the guy.

I gave her a couple of copper coins, she thanked me nicely and we all went on our way.

Another Telegraph reader, johnofcroy, shared his childhood memories:

As a boy growing up in Scotland we used to dress up at Halloween as “guisers”, carry a hollowed out turnip and call on the neighbours when, in exchange for a song or dance, we would be given some sweets. This was in the sixties when American trick or treat culture was totally unknown to us. So although there may be no English tradition of guising at Halloween there most certainly was a long Scottish tradition.

Northern England

An English reader, crownarmourer, recalled going around with his friends carrying a moggy — a jack o’lantern:

and asking for cash not candy for years in my home village in the North East of England.

Miserable Southerners may not have any old customs but we did and still do …

Hans Castorp wrote:

… The distant origins of ‘trick or treat’ came from these islands, probably the Celtic fringes where the autumnal feast, clearly pagan, was Beltane, much condemned by Scottish divines. (It looks like it was originally a pagan autumn equinox which was transferred to the eve of All Saints Day after Christianisation. Anyone got detail on this?)

The remnants of this in the non-Celtic north of England (Yorkshire, Lancashire, Cumbria etc) is ‘Mischief Night’ which involves acts of hooliganism by teenagers against unpopular neighbours. Again, a threat against neighbours as with ‘T or T’ but a rather more serious one and police are or were often invoked to deal with it

Parts of the American Midwest

This I did not know. It appears as if guising is alive and well in pockets of the Midwest.

From Wikipedia:

Children of the St. Louis, Missouri area are expected to perform a joke, usually a simple Halloween-themed pun or riddle, before receiving any candy; this “trick” earns the “treat”.[52] Children in Des Moines, Iowa also tell jokes or otherwise perform before receiving their treat.


From the same Wikipedia link:

In Portugal children go from house to house in All Saints day and All Souls Day, carrying pumpkin carved lanterns called coca,[57] asking every one they see for Pão-por-Deus singing rhymes where they remind people why they are begging, saying “…It is for me and for you, and to give to the deceased who are dead and buried[…]”[58] or “[…]It is to share with your deceased […]”[59] If a door is not open or the children don’t get anything, they end their singing saying “[…]In this house smells like lard, here must live someone deceased”.

Pão-por-Deus translates as ‘Bread for God’. Records of this tradition go back to the 15th century.

In the nearby Azores:

the bread given to the children takes the shape of the top of a skull.[60]

After the ‘begging’ is complete:

the Magusto [feast for the dead] and big bonfires are lit with the “firewood of the souls”. The young people play around smothering their faces with the ashes. The ritual begging for the deceased used to take place all over the year as in several regions the dead, those who were dear, were expected to arrive and take part in the major celebrations like Christmas and a plate with food or a seat at the table was always left for them.[62]

Politically incorrect

In closing, a group of leftists have criticised American Hallowe’en celebrations as being politically incorrect. They allege the costumes (e.g. cowboys and Indians) reopen old historic wounds. A brief, sometimes entertaining, video has just appeared on YouTube criticising those who want to do away with Hallowe’en for reasons of ‘offence’:


I was amazed to find out about all the ancient and modern commemorations for the dead which take place all over the world, and not always around the end of October and the beginning of November.

Next year, I intend to write a piece on Day of the Dead, which became popular in the US after I left. It is a newish tradition celebrated by St Mark’s Episcopal Church in Manhattan. A church should not be taking part in a syncretic tradition, even if their altar to the dead is in a nearby tent.

As far as most Westerners are concerned, there is no greater evil than tobacco, especially where athletic prowess is concerned.

An issue of Tobacco & the Elderly Notes from 1998 exemplifies the anti-tobacco stance with a feature which deplores past sports stars advertising cigarettes. Yet, my post yesterday showed that a number of top athletes enjoyed their smokes and still went on to break records during their satisfying careers.

I wonder what the editors of Tobacco & the Elderly Notes would think of the increasing drug use among high school, college and professional athletes? Is that a better proposition than tobacco?

I used to support the legalisation of cannabis until I saw what it did to a friend of ours. He never quite recovered from his use of skunk during the 1990s. What started out as recreational led to divorce and estrangement from his child, by now an adolescent. Even now that he’s gone straight, he’s still irritable, excitable and paranoid.

For those my age and older, it’s impossible to get the old strains, which are now so last century. Every variety of cannabis on the market today has some psychotropic element to it. It’s no longer a case of a happy or sleepy high. It’s affecting people like our friend adversely.

Furthermore, use of other drugs, including K2, is on the increase among athletes.

College athletes and marijuana

Recreator has a series of excellent graphics and NCAA statistics which everyone should have a look at.

It will surprise many.

The survey is taken every four years. Results published are from the 2013 survey.

More than 20% of athletes smoked dope in 2005 (21%), 2009 (23%) and 2013 (22%).

One quarter of male athletes smoked it in 2013 versus 17% of women athletes.

Use by NCAA division is as follows: Division I, 16%; Division II, 20%; and Division III, a significant 29%.

Statistics for marijuana use by sport revealed that 46% of lacrosse players smoke. Next are swimming (32%) and soccer (31%), followed by football (23%) and, finally, basketball (19%).

A 2012 article in Time on American football players states (emphases mine):

What is surprising is the frequency, proliferation and seeming constancy of the confessed drug use … ESPN The Magazine’s Sam Alipour begins with a detailed scene of an Oregon football player, fresh off this year’s Rose Bowl victory, kicking back by rolling a joint. The unnamed player (there are many unnamed sources in the article, which isn’t surprising given the content) estimates that about half of the team smokes marijuana on a regular basis. The magazine also cited interviews with 19 current and former Ducks going back a decade and a half who put that number at between 40 and 60%.

Even more unusual, the article states, is that, generally speaking, some football players get high before practice — or a game.

It used to be that such activity could harm one’s chances for a professional career. Today, it’s less of a problem:

For many athletes, the only downside to being caught using marijuana is a drop in their draft status, but there is an interesting catch-22 in which NFL scouts and executives assume that because so many athletes have used marijuana, they don’t believe those who claim they haven’t.

An article on draft picks, also from 2012, looked at the same ESPN The Magazine report that Time did. We learn the following:

Four out of 10 draft-eligible prospects from the 2012 class failed at least one school-administered drug test for marijuana; two in 10 failed multiple times, per a CBS Sports report from April.

“About 70 percent” of prospects at the combine admitted to using marijuana, per an ESPN report.

NFL players

The article considered the ESPN report alongside three marijuana-related arrests of Detroit Lions players that year:

Lomas Brown, now an ESPN analyst, claims at least 50 percent of NFL players likely smoke marijuana, according to a report in the Detroit News

“I just don’t think you’ll be able to curb this,” Brown told the newspaper.

In Brown’s eyes, this is actually an improvement. Brown claims up to 90 percent of players league-wide smoked marijuana when he began his career with the Lions in 1985.

K2 — undetectable — popular with youths and pros

Three years ago I wrote about the dangers of the synthetic drug K2, which is widely available and legal. It is sold in filling stations and malls.

K2 looks like a little packet of potpourri and all packets say ‘not for consumption’.

ThePostGame has an excellent article on the increased popularity of K2 with athletes, from high school to professional level.

K2 is smoked and mimics cannabis. It is also undetectable in drug tests:

According to the American Association of Poison Control Centers, there were 14 cases of K2 exposure in the 48 states plus the District of Columbia in 2009. In 2010, that number exploded to 2,888. Already this year, there have been nearly 1,000. In the last four months alone, 151 Navy sailors have been accused of using or possessing the drug.

The U.S. Naval Academy expelled eight midshipmen last month for using K2.

Jay Schauben, director of the Florida Poison Control Center, warns:

The possible side effects include significant hallucination, cardiac effects, seizures, rapid heart rate, hypertension, severe agitation, passing out, and panic attacks.

Anyone who takes K2 is playing Russian roulette.

Secondary school use

ThePostGame‘s article opens with a profile of an 18-year-old K2 user who committed suicide. David was a notionally all-American boy living with his mother and father. He ended his life just after attending a high school graduation party. His parents had no reason to believe their son was using any sort of unusual substance until his girlfriend spoke with them a few days later.

My aforementioned post from 2012 recapped a drowning incident involving a 19-year-old high school football player in Florida who took K2 with a friend. He didn’t want to go home and the friend left him alone, never imagining the youth would drown himself in a nearby creek.

University use

Athletes at university level like K2 because drug tests cannot detect it. Consequently, it is being heavily marketed on campuses all over the US:

“We’re receiving more reports of its use in the athlete population,” says Frank Uryasz, director of the National Center of Drug-Free Sport … “We’re getting reports from colleges, where athletes are asking about it.”

One such report to the Drug-Free Sport hotline, from an NCAA athletic trainer, reads:

“Three student-athletes were breaking apart cigarettes, mixing it with K2, rolling it back up into papers and then smoking. One young man, who had NO past medical history, had a seizure and lost consciousness. He was found outside the dorm by campus security convulsing. His heart rate was elevated above 200 for enough time that he was admitted for 24 hours of observation … When asked why he did it: “I didn’t think it would be that much of a rush, I had no control over my body in that I could see but could not talk or speak.”

Just because they are young and fit does not mean that university athletes are immune to harm from K2, especially when combined with another substance:

Performance-enhancing drugs may add yet another layer of risk. “If you combine these products and steroids, I can’t begin to predict the negative consequences,” says Anthony Scalzo, director of toxicology at St. Louis University. “If you add these stresses to the heart, someone’s probably going to have a heart attack from it.”

NFL use

One pro explained his drug-taking strategy:

“I go straight weed in the off-season,” one NFL veteran told ThePostGame on condition of anonymity. “Then, in-season, when they test, I go to [K2].”

It is highly possible that, within the next few years, we will see unexpected deaths among fit athletes — including professionals — who use K2.

And people rail against tobacco and nicotine!

During the tenth month of the year, the NHS runs an anti-smoking campaign called Stoptober.

For the past few years the smoking community in the UK has written essays against the demonisation of tobacco under the hashtag #octabber, ‘tab’ being British slang for ‘cigarette’.

Although Octabber might not be running this year — I am using the tag for my own reference — plenty of us are unhappy with the endless denormalisation and demonisation of smokers in Britain and elsewhere.


Non-smokers are probably unaware of all the anti-smoking campaigns that take place under the banner of ‘public health’ — financed by smokers through sin tax on their pack of 18 (no longer 20 for many manufacturers). Never mind that only a fifth of Britons smoke today. The fight here for Tobacco Control, as elsewhere in the West, must continue until no one smokes.

Dick Puddlecote succinctly described how it worked in 2014 — ‘How Stoptober Really Views Smokers’:

The UK’s first state-funded anti-smoking organisation ASH (motto: “denormalising you with your money since 1972”) claim that they “do not attack smokers or condemn smoking”. It’s a debatable point, but the huge tax sponging industry they have spawned don’t seem to share the same mission statement, it appears.

My three aunts — two nurses and one personal assistant (executive secretary, for my US readers) — smoked 20 a day. All stopped when they retired. Two are still alive and one died a few years ago, a great grandmother who outlived her non-smoking husband. They all had children. All the children are healthy. Half of them are grandparents. All have led responsible lives.

More from Dick Puddlecote:

I’m sure you’re very reassured that {cough} highly-respected politicians believe ‘professionals’ who post like this on Twitter are model citizens and should be shovelled skiploads of your hard-earned cash.

He ended his post with another 2014 tweet from the NHS’s #Stoptober:

Remember, if you’re doing Stoptober, everyone is behind you! And this time it isn’t because your breath smells like Fireman Sam’s jockstrap.

Rather crude, wouldn’t you say?

However, people will always smoke.

Attack on vaping

Countless numbers of ex-smokers around the world have taken up vaping.

Some of these people wanted to stop smoking cigarettes. Vaping gives them the same inhaling experience and they can often enjoy a few puffs in places where tobacco smoking is prohibited.

Not surprisingly, vaping has been gaining in popularity.

However, vaping has been under attack by health ministries around the world. A China Daily Hong Kong article from October 13, 2015 states that Brazil banned e-cigarettes in 2009, although they are still readily available on the black market. Canada followed suit in the same year but restricted the ban to e-cigarettes containing nicotine. In 2013, Spain banned vaping devices from public places.

The China Daily article describes the popularity of e-cigarettes among young people. In the United States, vaping among high school students increased exponentially between 2013 and 2014. The US National Youth Tobacco Survey data showed that, in 2013, approximately 660,000 secondary school students vaped. In 2014, their number increased to 2 million. Among middle school students a similar increase occurred; there were 450,000 young vapers in 2014 versus 120,000 in 2013.

In Britain — as in France — health ministries wish to curb vaping altogether, not just among young people. Never mind that thousands of adults have been able to stop smoking or switch from cigarettes to vaping, which, as the name implies, involves vapour, not smoke.

In the UK, the government is openly against vaping. For their efforts, vapers are under attack, as Christopher Snowdon wrote recently in The Spectator. With an indoor ban to come in Wales, he writes (emphases mine):

Banning vaping indoors is such a criminally stupid and negligent idea that even the prohibitionists at Action on Smoking and Health are opposed to it. The unintended consequences are utterly predictable. Once people who have switched from smoking to vaping are thrown outside, they may come to the conclusion that they might as well smoke. Meanwhile, smokers who might switch to vaping have one less incentive to do so. The negative effect on health is plain to see, even if we ignore the glaring fact that none of this is the government’s business.


Vapers have every right to be outraged by this evidence-free attack on a habit that is not only harmless to bystanders but positively beneficial to them personally as erstwhile smokers. This is the important point to remember about so-called ‘e-cigarette campaigners’. They used to be smokers. You know how some ex-smokers can seem a little self-righteous and pleased with themselves? Vapers have taken that sense of triumph and channelled it into promoting – or, at least, protecting – the product that helped them quit.

Vapers did the notionally correct thing, obeying Public Health, only to find themselves on the wrong side now:

As smokers, vapers spent years being taxed, demonised and kicked into the street. Anti-smoking campaigners would never put it in such blunt terms, but their objective is to make smokers’ lives so miserable that they decide to quit smoking. Vapers did quit smoking, often to their own surprise. They did exactly what was asked of them, but instead of being embraced by their old tormentors, they found themselves with another battle to fight.

Our media and medical communities are full of warnings about the ‘dangers’ of e-cigarettes. Whilst Britain’s ASH might side with vapers, the daddy of Tobacco Control, Stanton Glantz

has helped bring about the banning of not only the use, but also the possession, of e-cigarettes on his campus in San Francisco.

Many smokers and vapers predicted this backlash a few years ago. It was only a matter of time.

It’s odd that ex-smokers inhaling vapour can cause such ill feeling. Under such restrictions, we should not be surprised if they take up tobacco again.

My American readers will already be acquainted with Ben Carson’s marvellous life story.

Therefore, this post is intended mainly for readers in other countries.

Carson is the only black candidate for the presidency at present. He is a Republican.

The Revd Michael Ashcraft and Mark Ellis wrote an outstanding article on Carson for their news site Godreports (H/T: Pastor Ashcraft’s Mustard Seed Budget).

What follows is a summary and excerpts from ‘God answered Ben Carson’s prayer before chemistry exam with a powerful dream’ and other sources. Emphases mine below.

Detroit origins and conversion

Ben Carson was born in Detroit, Michigan, on September 18, 1951, to Sonya (née Copeland) and the Revd Robert Solomon Carson, a Seventh-day Adventist minister.

Ben Carson has remained a member of the sect, although he attends churches of other denominations. For him:

it’s the relationship with God that’s most important.

His DNA reveals that he is 20% European and 80% Makua. The Makua are a Bantu ethnic group found today mainly in northern Mozambique and neighbouring Tanzania.

When Ben was eight years old, his parents divorced. His mother raised him and his brother Curtis alone. An article in the Telegraph reveals that she worked three jobs when many of her neighbours relied on welfare:

“She thought you could do it on your own and not be beholden to anybody else,” he says. “She got a lot of arguments from a lot of people who said, come on, you can sit at home, the government will give you money.

“Well the interesting thing is most of those people never went anywhere and their kids never went anywhere. And you know I became a brain surgeon, my brother became a rocket scientist.”

As Americans will know, Detroit is not an easy city in which to live. Ben could have ended up in big trouble. When he was a teenager he and a friend:

were arguing over a choice of radio stations. Things got heated and Ben took out a pocketknife and lunged the knife blade toward his friend’s stomach.

The blade struck his friend’s belt buckle and broke in half, which saved the friend from harm and Ben from becoming a murderer. Frightened by what had nearly happened, Ben ran home and locked himself in the bathroom with a Bible.

Ben turned to the wisdom of Proverbs, reading “A gentle answer turns away wrath, but a harsh word stirs up anger” (15:1); or “An angry man stirs up dissension, and a hot-tempered one commits many sins” (29:22), and a final admonition, “Pride goes before destruction, a haughty spirit before a fall” (16:18).

Humbled by the power of God’s Word, Carson realized that if left to his own devices, his anger would drive him toward ruin. Instead he prayed that God would help him control his temper instead of letting it control him, and God answered his prayer.

The article continues:

Carson said two stories inspired him in his life: Up from Slavery, an autobiography of Booker T. Washington, and the account of Joseph’s life in the Old Testament of the Bible.

If you’ve never read Up from Slavery, I highly recommend it. Like Carson, I read it in high school and was blown away. Would that Booker T Washington were recommended as a role model for inner-city youth rather than rappers and gang leaders! Washington founded the Tuskegee Institute — now a university — in Alabama in 1881 for former slaves and their children. The book recounts that story and more:

He reflects on the generosity of both teachers and philanthropists who helped in educating blacks and native Americans. He describes his efforts to instill manners, breeding, health and a feeling of dignity to students. His educational philosophy stresses combining academic subjects with learning a trade (something which is reminiscent of the educational theories of John Ruskin). Washington explained that the integration of practical subjects is partly designed to reassure the white community as to the usefulness of educating black people.

This book was first released as a serialized work in 1900 through The Outlook, a Christian newspaper of New York. This work was serialized because this meant that during the writing process, Washington was able to hear critiques and requests from his audience and could more easily adapt his paper to his diverse audience.[1]

Washington was a controversial figure in his own lifetime, and W. E. B. Du Bois, among others, criticized some of his views. The book was, however, a best-seller, and remained the most popular African American autobiography until that of Malcolm X.[2] In 1998, the Modern Library listed the book at No. 3 on its list of the 100 best nonfiction books of the 20th century.

But I digress.

In high school Carson joined the JROTC programme, becoming a military cadet. His spiritual and temporal self-discipline helped him get accepted to Yale University, where he majored in psychology. Afterwards, he returned home to earn his medical degree at the University of Michigan Medical School, then moved back to the East Coast for a residency at the renowned Johns Hopkins Hospital in Baltimore.

Stellar medical career

Carson became a neurosurgeon with an interest in paediatric medicine. He has performed detailed, exhausting surgery on conjoined twins and became a pioneer in this regard:

Carson went on to become the first surgeon to successfully separate conjoined twins joined at the head. He also revived an extreme form of brain surgery in which part or all of one hemisphere of the brain is removed to control severe pediatric epilepsy.

The twins conjoined at the head underwent 22 hours of surgery.

He has always prayed before an operation:

“Even when I don’t operate, I pray because I feel that God is the ultimate source of all wisdom,” said Carson.

“Quite frankly, as a neurosurgeon, there’s a lot of emphasis on technical ability, but I believe that that’s something that can be taught, but wisdom comes from God and I think that it’s something that you have to seek.”

In another difficult operation on conjoined twins from Zambia, Carson:

reached a particularly challenging point in the operation that involved separating a tangle of blood vessels. Carson began to feel frustrated and fatigued.

“I began praying desperately that God would take over and simply use me to accomplish what only he could do,” he noted later.

“Despite the exhaustion that had almost paralyzed me a short time earlier, I now sensed a remarkable steadiness in my hands. I felt a strange calm, an almost detached awareness – as if I were merely watching my hands move and someone else had actually taken over the surgery.”

… “I don’t know that I have ever experienced anything quite like what happened in the operating room that day. When I separated the very last vein connecting Joseph and Luka, the stereo system at that very moment began playing the “Hallelujah Chorus” from Handel’s Messiah. I suspect every single person in that OR felt goose bumps and knew that something remarkable had taken place. And it was not our doing,” he said, giving all the praise and glory to God.


Carson retired from practising medicine in 2013.

He has had a number of lucrative engagements since then. He has written for The Washington Times and is a commentator for Fox News. He has published his autobiography and has written other books for Zondervan Books, a Christian publisher.

He has also served on the boards of directors of Kellogg’s and Costco. He resigned from the latter earlier this year.

He also rejoined the Republican Party in 2014 after a hiatus of many years during which he was an independent.

Carson is a much sought-after public speaker at churches and for citizens’ organisations, one of them being the 2013 Values Voters Summit in Washington, DC. He told the audience that ObamaCare is:

“the worst thing that has happened in this nation since slavery,” further adding that it is a form of slavery because it “[makes] all of us subservient to the government.”

Insurance companies have also attracted his wrath. In 2009:

Carson said that he found the “concept of for-profits for the insurance companies” absurd. He continued, “The first thing we need to do is get rid of for-profit insurance companies. We have a lack of policies and we need to make the government responsible for catastrophic health care. We have to make the insurance companies responsible only for routine health care.”[62][63]

Race for the White House

Although Carson trails Donald Trump, he has been firmly in second place among the GOP candidates for weeks (as of this writing).

On September 15, the Telegraph reported:

Ben Carson, the retired neurosurgeon, says he is “not the slightest bit” worried about sharing the stage with Mr Trump, and a New York Times/CBS News poll putting him on 23 per cent to Mr Trump’s 27 per cent suggested his confidence is not misplaced.

“The Donald” has been publicly firing salvoes of insults at Dr Carson, accusing him of lacking energy and implying he is not “smart” enough to lead.

“That is the claim of someone who doesn’t really know me,” Dr Carson told the Telegraph. “Someone who hasn’t seen me standing at the operating table for 10, 12, 18 hours doing complex surgeries, dealing with complex situations that come up at the last moment.”

Too right!

On September 16, CNN reported:

Trump is still leading GOP candidates and has 22% of likely voters in the New Hampshire Republican primary, according to the WBUR survey. But Carson has garnered 18% of support, bringing him within 4 points of Trump. Fiorina has 11%, pushing Bush out of the top 3. The former Florida governor and Ohio Gov. John Kasich both polled at 9%.

On October 20, 2015, Real Clear Politics showed Carson with a 4.8-point lead — 48.2 to 43.4 — over Hillary Clinton, the top Democratic Party contender. (By the time you look at the chart, the figures will have changed.)

The Telegraph article and Wikipedia have more on Carson’s political views. On IS, Carson says:

“You can’t stick your head in the ground and hope they will go away. Going and singing kum-bah-yah by the fire does not work,” he said, describing how he thinks the current administration has handled the jihadist takeover of Iraq and Syria.

“You know, Isil are looking like winners, so of course people want to be on the winning team. If you really want to defeat them you have to make them not look like winners – that means take that land back from them, and you are not going to be able to do that by dropping bombs in the desert.”

No doubt Carson’s wife Candy is a source of great support to him in his campaign.

Ben Carson is one to watch. Although I do not intend to comment much on the 2016 US elections, I do hope that he wins the Republican nomination and would be delighted to see him in the White House.

The worried well are, by and large, Westerners overcome by health fears.

Many of these fears are driven by preventive health programmes — interventions — present not only in doctors’ offices but also in places of employment.

Denmark’s puzzling statistics

Some of these can actually harm one’s health. My reader from The Last Furlong has a report from Denmark which says that the country’s public health programmes have actually increased the number of hospitalisations, oddly enough, since its smoking ban in 2007. Soon afterward:

the number of hospital admissions exploded.

By 2012, there were

a staggering 1.33 million annual hospital admissions – 150,000, or 13%, more than in 2006. This is double the rate of increase compared to the corresponding period before the smoking law.

Is this a mere coincidence? Or are Danes fretting more about their health?

Another curious rise is in the number of Danish patients admitted for heart disease between 2006 and 2012. Surely, with healthier lifestyles being mandated, admissions should have continued to decline.

Then there is the public health intervention encouraging people to exercise more. The result is that more Danes, especially women, have been admitted to hospital with joint injuries and bone fractures caused by the perceived need for rigorous physical workouts in the name of better health.

The Danish report concluded:

The plan to reduce medical expenses by means of patronage has not worked as intended. The “healthy” Denmark, on the contrary, has been a regular disease factory.

The figures make a total failure of the idea that the state should interfere in people’s lifestyle to prevent disease.

Preventive [medicine] makes healthy people sick – and pharmaceutical companies happy.

Preventive interventions dubious

Huffington Post has an interesting article on preventive medicine by Allen Frances, Professor Emeritus at Duke University. Dr Frances begins with a quote from Aldous Huxley:

Medical science is making such remarkable progress that soon none of us will be well.

Isn’t that the truth!

Frances says (emphases mine):

The evidence is compelling that we in the developed countries (especially the US) are overtesting for disease, overdiagnosing it, and overtreating. Wasteful medical care of milder or nonexistent problems does more harm than good to the individual patient, diverts scarce medical resources away from those who really need them, and is an unsustainable drain on the economy.

Westerners, especially Americans, might have noticed that screening advice and frequency have changed over the years. One example is prostate cancer screening:

It used to be recommended that men of a certain age be tested yearly. It is now recommended that the test not be done at all unless a man has a family history or other special risk factors.

Why the big change? Definitive long term studies prove that the test doesn’t save lives and instead ruins them by triggering invasive interventions with painful complications. Screening is usually too late to stop fast spreading tumours and too good at identifying slow growing ones that don’t count and are better left alone. If they live long enough, the majority of men will develop an incidental and benign prostate cancer before they die from something else. Picking up these tumours early causes great grief for no return.

But, surely, early screening encourages disease prevention? Frances disputes that line of thinking:

The reality is that getting there too early misidentifies too many people who are not really at risk and then subjects them to needless and harmful tests and treatments.

Along with that comes the psychological stress, not only for the patient but also for his nearest and dearest.
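
To see why screening a mostly healthy population ‘misidentifies too many people’, it helps to run the numbers. The little Python sketch below is purely illustrative (the prevalence, sensitivity and specificity figures are my own assumptions, not Dr Frances’s), but it shows how false positives can swamp true ones when a disease is rare:

# Illustrative arithmetic only; these rates are assumptions,
# not data from Dr Frances's article.
population = 100_000   # hypothetical group of people screened
prevalence = 0.005     # assume 0.5% actually have the disease
sensitivity = 0.90     # assume the test catches 90% of true cases
specificity = 0.91     # assume 9% of healthy people test positive anyway

sick = population * prevalence
healthy = population - sick

true_positives = sick * sensitivity             # 450 people
false_positives = healthy * (1 - specificity)   # 8,955 people

# Chance that a positive result reflects real disease
# (the positive predictive value)
ppv = true_positives / (true_positives + false_positives)
print(f"True positives:  {true_positives:,.0f}")
print(f"False positives: {false_positives:,.0f}")
print(f"Chance a positive result is real: {ppv:.1%}")

On these assumed figures, the chance that a positive result is real comes out under 5%: roughly 19 out of 20 positives would be false alarms, which is precisely the grief described above.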

As for the radiation from certain tests, he tells us:

If we do enough CT scans we can find structural abnormalities in just about everyone. But most findings are incidental and don’t have any real clinical meaning. Paradoxically, lots of otherwise healthy people will get dangerous cancers from the CT radiation that served no useful purpose.

Other questionable procedures

Dr Kenny Lin is a family physician and public health professional who practises medicine in the Washington, DC, area. He teaches at Georgetown University School of Medicine, Uniformed Services University of the Health Sciences, and the Johns Hopkins University Bloomberg School of Public Health. His website is called Common Sense Family Doctor.

He advocates a cautious, informed approach to batteries of medical tests.

With regard to routine blood tests:

In 2007, I co-authored an editorial in the journal American Family Physician about this topic. We wrote:

“‘Big-ticket’ tests [such as CT (Computed tomography) scans and MRIs] are easy targets for those seeking to reduce waste in health care. But what about the seemingly innocuous practice of performing routine tests such as a complete blood count (CBC) or urinalysis? … These tests would be useful only if they provided additional diagnostic information that would not otherwise be obtained during a history and physical examination. In fact, large prospective studies performed in the early 1990s concluded that these tests rarely identify clinically significant problems when performed routinely in general outpatient populations. Although the majority of abnormal screening test results are false positives, their presence usually mandates confirmatory testing that causes additional inconvenience, and occasionally physical harm, to patients.”

Don’t misunderstand me. There are certain situations in which targeted screening tests can provide valuable information for the early detection of diseases. To learn more about which tests are recommended for you or your family members, I recommend that you visit the excellent website … But the next time you go to a doctor’s office and he or she proposes to check some “routine blood work,” be sure to ask what these tests are for and what would happen if any of them turn out to be positive, so that you can make an informed choice about what’s right for you.

As for mammography:

the only reliable measure of a screening test’s superiority is whether or not it leads to fewer deaths. For 3D mammography, there’s absolutely no proof that it does.

I recognize that for women or loved ones of women who believe their lives to have been saved by mammography, no amount of scientific evidence that I or anyone else can marshal will change their minds …

So how can we counter the prevailing narrative of the Task Force [recommending fewer mammographies] as a group of cold-hearted scientists who are more concerned about population-level data than the individual lives of the women we love? We can tell the human story of the guideline developers – half of whom are women over the age of 40 who have personally faced the mammography decision at some point themselves – but we can do much more than that. We can tell a representative story of the hundreds of thousands (or millions, perhaps) of women who experienced serious emotional or physical harm as a result of screening mammography

He goes on to recount a case that Dr Louise Aronson wrote about for the Journal of the American Medical Association. Mammogram results for this patient, Elizabeth, revealed ‘gross’ abnormalities. Not surprisingly, she was called back for more mammograms over the next few weeks. She was beside herself with worry, as was her family. She could barely concentrate at work, and that year the family Thanksgiving gathering was sombre, to say the least. Aronson wrote:

Meanwhile, her physicians were at war: based on the x-ray films, the radiologists argued she had metastatic cancer with a less than 50% chance of 5-year survival, while her surgeons, based on the biopsy pathology, contended she had a rare, mostly benign condition. Fortunately, the surgeons were right. Still, sorting that out took weeks, and because the condition was associated with increased cancer risk, they insisted on bilateral surgery to remove all of the suspicious areas. So Elizabeth’s mammogram didn’t find cancer, but it did lead to the permanent mutilation of her breasts, huge medical bill copays, significant lost time from work, months of extreme stress, and ongoing anxiety about her disfigurement and risk of cancer.

Was it worth it?

Then there are the CT scans for lung cancer, which are just as contentious as mammograms. Many of Dr Lin’s readers fiercely defend them; however, his post warns that the risks may outweigh the benefits in some cases:

1. The risk of developing cancer from the CT scan itself isn’t trivial. A recent analysis published in the Archives of Internal Medicine found that a single chest CT scan exposed patients to the radiation equivalent of more than 100 chest X-rays, and that at age 60, an estimated 1 in 1000 women or 1 in 2000 men would eventually develop cancer from that single scan. (Participants in the lung cancer screening study actually underwent three consecutive annual CT scans.)

2. False alarms are extremely common. In the NCI’s lung cancer screening study, researchers found that 1 in 3 patients had at least one false-positive result after undergoing two CT scans. Of those patients, 1 in 14 needed an invasive lung biopsy to be sure they were cancer-free.

3. Even if screening catches lung cancer early, there’s no guarantee your prognosis will be better. This is due to “overdiagnosis,” or the unnecessary diagnosis of a condition (typically cancer) that will never cause symptoms in a patient’s lifetime, either because it’s so slow-growing or the patient dies from some other cause … because there’s no way of knowing at the time of diagnosis if a lung cancer will be fatal, inevitably many patients will be needlessly subjected to the side effects of treatment.

4. Finally, it’s highly likely that a CT scan for lung cancer will find some other abnormality that will require further investigation. You might think this is a good thing, but studies show that most of these abnormalities turn out to be false alarms, too

Finally, there are the private screening companies that send you a nice letter about the package of tests they can perform on you. The target market is the 50+ age group, and, even here in the UK, we receive such solicitations.

Dr Lin warns:

1. “Blocked arteries” / stroke screening is most likely a carotid ultrasound scan, which doesn’t help because most patients with asymptomatic carotid artery blockages will not suffer strokes. Although the screening test is “non-invasive and painless,” the confirmatory test, angiography, is not (it actually causes a stroke in a small number of patients) and unnecessary carotid endarterectomy can lead to death.

3. “Hardening of the arteries in the legs,” or screening for peripheral vascular disease with an arterial-brachial index, hasn’t been proven to prevent heart attacks but will certainly lead to many false positive results.

He discusses three other tests of dubious value and concludes:

In a nutshell, that’s why companies like Life Line have no business portraying these services as “preventive health screenings,” in my church or any other community setting. (I’ve sent an e-mail to my pastor recommending that they be dis-invited for the reasons I’ve outlined above.) It’s one thing to draw blood for a cholesterol test and take someone’s blood pressure (which will cost a whole lot less than $149), and quite another to offer these other procedures which are, at the very least, a waste of money and quite possibly harmful.

Solutions to excessive testing

Dr Frances makes the following recommendations, excerpted below:

  • Tame and shame Big Pharma. Stop the direct to consumer advertising that is allowed only in the US and New Zealand. Prohibit all Pharma contributions to professional associations and consumer groups. Regulate and make transparent all the marketing ploys used to mislead doctors. Force the publication of all clinical research trial data.
  • Recognize that all existing medical guidelines that define disease thresholds and make treatment recommendations are suspect. They have been developed by experts in each field who always have an intellectual conflict of interest (and often enough also have a financial conflict of interest) that biases them toward overdiagnosis and overtreatment in their pet area of research interest …
  • Employers, insurance companies, and government payors should be smarter consumers of health services and should stop paying for tests and treatments that do more harm than good and are not cost effective.
  •  Consumers should be smarter consumers and not buy into the idea that more is always better.
  •  Medical journals need to be more skeptical of the medical research enterprise and should look toward the harms, not just the potentials, of each new purported advance.

He concludes by reminding us that there are many really ill people who cannot get the healthcare they need and deserve.

Big Pharma, he says, is every bit as big a monster as Big Tobacco. On that point, I would disagree. Big Pharma is much more dangerous than ‘Big Tobacco’ — I use the term advisedly — will ever be.

Big Pharma probably kills more people around the world than tobacco. If statistics were honest, we could find out the truth. Unfortunately, we’ll have to wait a few more decades. By then, tobacco will no doubt be back in style!

Yesterday’s post discussed the community college massacre in Oregon, which took place on October 1, 2015.

Matthew Vadum wrote a full analysis of the shootings, including a detailed profile of the interests of the killer, Chris Harper-Mercer, who later shot himself.

It is tragic not only that this took place but that it occurred near a small town in Oregon — Roseburg. Oddly, in 2006, Roseburg High School was the scene of a shooting.

Therefore, small-town America might not necessarily guarantee personal or family safety.

More reasons for this — unrelated to the Oregon shooting — have come to light in recent years.

In 2012, WND produced a documentary on 35 training camps and communes around the United States:

Jamaat ul-Fuqra, known in the U.S. as “Muslims of America,” has purchased or leased hundreds of acres of property – from New York to California – in which the leader, Sheikh Mubarak Gilani, boasts of conducting “the most advanced training courses in Islamic military warfare.”

In a recruitment video captured from Gilani’s “Soldiers of Allah,” he states in English: “We are fighting to destroy the enemy. We are dealing with evil at its roots and its roots are America.”

The documentary has filmed evidence:

In the video, producers visited some camps, attempted to visit others and interviewed neighbors and local police officials. It also includes excerpts of the “Muslims of America” recruitment video.

The recruitment video shows American converts to Islam being instructed in the operation of AK-47 rifles, rocket launchers and machine guns and C4 explosives. It provides instruction in how to kidnap Americans, kill them and how to conduct sabotage and subversive operations.

Jamaat ul-Fuqra’s attacks on American soil range from bombings to murder to plots to blow up U.S. landmarks. A 2006 Department of Justice report states Jamaat ul-Fuqra “has more than 35 suspected communes and more than 3,000 members spread across the United States, all in support of one goal: the purification of Islam through violence.” In 2005, the Department of Homeland Security predicted the group would continue to carry out attacks in the U.S.

“Act like you are his friend. Then kill him,” says Gilani in the recruitment video, explaining how to handle American “infidels.”

WND explains that the group has been implicated in the death of Wall Street Journal correspondent Daniel Pearl, who was kidnapped and beheaded in 2002 while attempting to interview Gilani. In 2003, Jamaat ul-Fuqra member and Al Qaeda adherent Iyman Faris pleaded guilty to a plot to blow up the Brooklyn Bridge.

I have not seen the documentary, but it also features interviews with law enforcement officials:

“What we are witnessing here is kind of a brand-new form of terrorism,” says FBI Special Agent Jody Weis in the documentary. “These home-grown terrorists can prove to be as dangerous as any known group, if not more so.”

Conservative Tribune features a map and a list of towns where Jamaat ul-Fuqra is — or was — operating. Whilst some of its communes are in Philadelphia and Houston, the majority are in smaller communities.

The group is not listed among proscribed organisations in the United States.
