Alice’s son, Michael Foote; his wife, Rachel Brodie; and their little son, River, live on their mountainside farm in northern Vermont. The upper part of their land is level, with a vegetable garden, berry patch, and pasture for visiting horses. The property is wooded below that, falling to a stream that supported a beaver family until a neighbor shot them. Last Christmas, they gave jars of honey to friends and family with this letter.  RJN


Christmas, 2017

Dear Family and Friends,

Here, from my bees on Swamp Road to you, is a jar of pure, unfiltered, naturally crystallized honey.

Rachel thinks the rest of this letter makes me sound like an old Vermonter.  I take that as a compliment, so here we go!

Then: My brother Jesse and I received a grant while undergrads to buy bees, and we started the hives at Dartmouth College. The next summer, we drove them to Scout camp to offer the beekeeping merit badge as well as to continue our study.

Now: It’s been a fun little adventure to get to this point, from buying a box of bees to actually having honey in containers. This is my fourth year keeping bees. The first year was a disaster. I tried to be a little too creative, testing an alternative hive method before I knew enough to be doing that.

Top bar hive: My first hive, an alternative method, didn’t last the winter.

My first hive did not survive the winter, succumbing to an overwintering mouse.  If a mouse gets into a hive in winter, it can exhaust the honey reserve.  The mouse was cute, but seeing the pile of dead bees in spring was heartbreaking.

I started with the basics the second year, using a tried-and-true structure for raising bees. I purchased mail-order bees (Italian and Carniolan) and dumped them into a couple of hives I had built from kits. I was always worried about them. When it was cold out, I assumed they were cold. I wrapped them in insulation and anxiously pressed my ears up to the hives to hear the telltale buzz of life. When it rained for several days, I assumed they would be in need of food, so I fed them sugar water and a pollen substitute. I didn’t let them just be bees. Still, whether or not my parenting style had anything to do with it, my bees did thrive that year.

In my third year, I purchased a Vermont mongrel hive that had been bred to thrive in northern climates. I liked the Italian bees, but the Carniolan bees had been grumpy, so I was looking for a more docile, better-adapted bee. I relaxed a bit, and still the hives did very well, each producing about 200 pounds of honey.

Apiary: A photo of three of my hives in the bee yard.

Going into my fourth and most recent year, I attempted to split some of my hives in two and start the summer with eight. All but one split thrived, and I took about 150 pounds of honey in all. Not much, but a newly split hive must build its comb as well as store honey.

Splitting: I made two hives from one, simply by splitting it in two. The queenless hive made a new queen to become “queen right.”

Swarming is the process by which a new honey bee colony is formed when the queen bee leaves the colony with a large group of worker bees. In the prime swarm, about 60% of the worker bees leave the original hive location with the old queen.

Swarm: One of my hives sent out a swarm, which I then caught and put in a new hive. Unfortunately, it didn’t stay.


I tried to capture a couple of swarms this year.  A swarm of bees lingers near its original hive for only a short period before moving to its new home.  I was late.


Very active: A warm summer day, lots of food to collect.

Luckily, in this neck of the woods, I don’t have the problem of the well-publicized colony collapse disorder, in which hives oddly become active in the middle of winter, leave, and die. From my reading, I understand that saturation of the environment with pesticides and other chemicals, including neonicotinoids, is to blame. Neonicotinoids are a major factor in the decimation of pollinators everywhere and in the build-up of chemicals in our own bodies. Our nearby town of Richmond is aware of these chemicals and fairly progressive, to the benefit of my bees. Their honey is probably safer for us to eat than some produced elsewhere.

Future: I hope to try my hand at queen-rearing this coming spring to boost my hive numbers.  (You grow a queen, give the queen a couple frames of bees, and your hive takes off.)  I’m also hoping to plant a half-acre of Anise Hyssop for the bees to give their honey a hint of anise flavor.

I plan to pursue organic certification eventually, but, for the time being, I’m doing as much as I can to be environment- and health-conscious, as in buying hive wood from a responsible lumber yard down the road and using organic sugar feed when possible.

I should be clear–there’s no money in this business, but I love it.  I love to work outside, and I find the bees fascinating:  their complex social structure, their numbers (more than 50,000 in a hive), their communication systems (dancing, wiggling, pheromones, electric fields), and their ability to make wax, propolis, royal jelly and … honey!  I could watch my bees all day long as they go back and forth with little baskets on their legs filled with pollen.  I can’t wait until River is old enough to join me.

I rarely get stung, mostly because the bees are gentle. Still, when I’m opening their hives, I make sure to put on protective gear and use a smoker. Smoke makes the bees think a fire is coming, and they move into the hive to eat honey in case they need to leave. After eating, the bees are pretty lazy and have a hard time bending their bodies to sting. I still run away when they get angry, and I have no shame doing the bee dance, an awkward combination of flailing, running, and yelling, when a bee gets under my mask. When I do get stung, I just bear it and feel tough.

Rachel supports my bee work and hasn’t complained about the cost of building an apiary, though she grows a little tired of finding everything sticky in the kitchen. I try to keep life at home and at work from losing out to the bees: I sneak out during River’s naps and get up early to do hive maintenance from March through October. In winter I can enjoy dreaming about what I will do with the bees the next year.

I aspire to sell honey on the roadside this spring and to guests in our rental unit to earn enough money to build a little bee shed so that I don’t have to do all my honey extraction in the house.

Let me know when you need more honey!  Happy holidays.  Bee well.

More info

Note: My honey, like all honey, naturally crystallizes, preserving flavor and quality (it is considered premium quality because it is not blended with other substances), yielding richer taste in cooking, and spreading well. Because I don’t filter or heat the honey, crystallization is quicker. Filtering honey removes a lot of the pollens and propolis that add to the nutritional value, and heating denatures the proteins.


To liquefy honey, heat it in a jar in a pot of hot water and stir frequently until it is liquid. For storage, honey is best kept at 50 degrees to prevent fermentation, though mead, the very old alcoholic drink made with fermented honey, seems to be gaining popularity.

Emphasis added, RJN







A newly discovered cache of internal documents reveals that the sugar industry downplayed the risks of sugar in the 1960s.  Luis Ascui/Getty Images

50 Years Ago, Sugar Industry Quietly Paid Scientists To Point Blame At Fat

National Public Radio
In the 1960s, the sugar industry funded research that downplayed the risks of sugar and highlighted the hazards of fat, according to a newly published article in JAMA Internal Medicine.

The article draws on internal documents to show that an industry group called the Sugar Research Foundation wanted to “refute” concerns about sugar’s possible role in heart disease. The SRF then sponsored research by Harvard scientists that did just that. The result was published in the New England Journal of Medicine in 1967, with no disclosure of the sugar industry funding.
The sugar-funded project in question was a literature review, examining a variety of studies and experiments. It suggested there were major problems with all the studies that implicated sugar, and concluded that cutting fat out of American diets was the best way to address coronary heart disease.

The authors of the new article say that for the past five decades, the sugar industry has been attempting to influence the scientific debate over the relative risks of sugar and fat.

“It was a very smart thing the sugar industry did, because review papers, especially if you get them published in a very prominent journal, tend to shape the overall scientific discussion,” co-author Stanton Glantz told The New York Times.

Money on the line
In the article, published Monday, authors Glantz, Cristin Kearns and Laura Schmidt aren’t trying to make the case for a link between sugar and coronary heart disease. Their interest is in the process. They say the documents reveal the sugar industry attempting to influence scientific inquiry and debate.

The researchers note that they worked under some limitations — “We could not interview key actors involved in this historical episode because they have died,” they write. Other organizations were also advocating concerns about fat, they note.

There’s no evidence that the SRF directly edited the manuscript published by the Harvard scientists in 1967, but there is “circumstantial” evidence that the interests of the sugar lobby shaped the conclusions of the review, the researchers say.

For one thing, there’s motivation and intent. In 1954, the researchers note, the president of the SRF gave a speech describing a great business opportunity.

If Americans could be persuaded to eat a lower-fat diet — for the sake of their health — they would need to replace that fat with something else. America’s per capita sugar consumption could go up by a third.
But in the ’60s, the SRF became aware of “flowing reports that sugar is a less desirable dietary source of calories than other carbohydrates,” as John Hickson, SRF vice president and director of research, put it in one document.

He recommended that the industry fund its own studies — “Then we can publish the data and refute our detractors.”

The next year, after several scientific articles were published suggesting a link between sucrose and coronary heart disease, the SRF approved the literature-review project. It wound up paying approximately $50,000 in today’s dollars for the research.

One of the researchers was the chairman of Harvard’s Public Health Nutrition Department — and an ad hoc member of SRF’s board.

“A different standard” for different studies

Glantz, Kearns and Schmidt say many of the articles examined in the review were hand-selected by SRF, and it was implied that the sugar industry would expect them to be critiqued.

In a letter, SRF’s Hickson said that the organization’s “particular interest” was in evaluating studies focused on “carbohydrates in the form of sucrose.”

“We are well aware,” one of the scientists replied, “and will cover this as well as we can.”

The project wound up taking longer than expected, because more and more studies were being released that suggested sugar might be linked to coronary heart disease. But it was finally published in 1967.

Hickson was certainly happy with the result: “Let me assure you this is quite what we had in mind and we look forward to its appearance in print,” he told one of the scientists.

The review minimized the significance of research that suggested sugar could play a role in coronary heart disease. In some cases the scientists alleged investigator incompetence or flawed methodology.

“It is always appropriate to question the validity of individual studies,” Kearns told Bloomberg via email. But, she says, “the authors applied a different standard” to different studies — looking very critically at research that implicated sugar, and ignoring problems with studies that found dangers in fat.

Epidemiological studies of sugar consumption — which look at patterns of health and disease in the real world — were dismissed for having too many possible factors getting in the way. Experimental studies were dismissed for being too dissimilar to real life.

One study that found a health benefit when people ate less sugar and more vegetables was dismissed because that dietary change was not feasible.

Another study, in which rats were given a diet low in fat and high in sugar, was rejected because “such diets are rarely consumed by man.”

The Harvard researchers then turned to studies that examined risks of fat — which included the same kind of epidemiological studies they had dismissed when it came to sugar.

Citing “few study characteristics and no quantitative results,” as Kearns, Glantz and Schmidt put it, they concluded that cutting out fat was “no doubt” the best dietary intervention to prevent coronary heart disease.

Sugar lobby: “Transparency standards were not the norm”

In a statement, the Sugar Association — which evolved out of the SRF — said it is challenging to comment on events from so long ago.

“We acknowledge that the Sugar Research Foundation should have exercised greater transparency in all of its research activities, however, when the studies in question were published funding disclosures and transparency standards were not the norm they are today,” the association said.

“Generally speaking, it is not only unfortunate but a disservice that industry-funded research is branded as tainted,” the statement continues. “What is often missing from the dialogue is that industry-funded research has been informative in addressing key issues.”

The documents in question are five decades old, but the larger issue is of the moment, as Marion Nestle notes in a commentary in the same issue of JAMA Internal Medicine:

“Is it really true that food companies deliberately set out to manipulate research in their favor? Yes, it is, and the practice continues. In 2015, the New York Times obtained emails revealing Coca-Cola’s cozy relationships with sponsored researchers who were conducting studies aimed at minimizing the effects of sugary drinks on obesity. Even more recently, the Associated Press obtained emails showing how a candy trade association funded and influenced studies to show that children who eat sweets have healthier body weights than those who do not.”
As for the article authors who dug into the documents around this funding, they offer two suggestions for the future.

“Policymaking committees should consider giving less weight to food industry-funded studies,” they write.

They also call for new research into any ties between added sugars and coronary heart disease.





Cracker Barrel


Cracker Barrel storefront

Cracker Barrel is a chain of inexpensive restaurants with attached stores selling souvenirs, tchotchkes, and gift items. They serve breakfast, lunch, and dinner. I’ve eaten in one twice, and I’d despised them until I read the customer review below, titled “A Tip for Seniors.”                               RJN



To Cracker Barrel


I like to tease Alice about eating at Cracker Barrel restaurants, and I ran across this piece, printed as a customer review, while looking for ammunition.

At first I noticed it is nicely written, then that it is touching, then that it is a poem. I’ve only adjusted the length of lines and applied the italic font.  RJN


I visit Cracker Barrel at least once a week.

 This is one place that they will let Seniors

order off the kids menu and get the free drink too.


You can get a complete meal with drink and

either a biscuit or cornbread for under $5.00.

It’s cheaper than cooking at home for one or two people.


In the winter, I go up there a lot and ask for a table

by the fireplace. Cheaper than getting a duralog

for my fireplace and sitting alone.


Gloriousglo, Indiana


Slicing Meat Shaped Modern Humans



Chew On This: Slicing Meat Helped Shape Modern Humans

From NPR’s All Things Considered.

Katherine Du/NPR

Miss Manners and skilled prep cooks should be pleased: Our early human ancestors likely mastered the art of chopping and slicing more than 2 million years ago. Not only did this yield daintier pieces of meat and vegetables that were much easier to digest raw, with less chewing — it also helped us along the road to becoming modern humans, researchers reported Wednesday.

And our ancestors picked up these skills at least 1.5 million years before cooking took off as a common way to prepare food, the researchers say.

Chewing, it turns out, takes a lot of time and energy, say Katherine Zink and Daniel Lieberman, evolutionary biologists at Harvard University. They recently set about measuring precisely how much effort is required to chew raw food, and to what degree simple stone tools might have eased the toil.

“Every time I go out to dinner, I watch people chew,” Lieberman tells us. “And sometimes, I actually count how many times they chew.”

Nom Nom: Chimpanzee skull (top), A. afarensis jaws (center) and human jaws. It's likely that tool use and meat-eating reduced the evolutionary pressure to have big, powerful jaws and sharp teeth, the researchers behind a new Nature study say.

John Reader/Science Source

It’s not just a hobby. Lieberman’s interest gets to some basic questions of how humans evolved.

Scientists have long known that Homo erectus, an ancestor of modern humans who lived about 2 million years ago, had already evolved to have a bigger body and brain than earlier hominins, and would have needed much more daily energy to survive. But the jaw and teeth of H. erectus were much like ours today — significantly smaller and less powerful than those of Australopithecus afarensis, or other hominins of earlier epochs.

A diet that included cooked meat would have provided that ready energy without the need for sharp canines and big grinders. But the research evidence is pretty clear that cooking didn’t become common until about 500,000 years ago, Lieberman says. So, how did H. erectus get the needed calories?

To test a long-held hypothesis that simple food processing might be the answer, Zink and Lieberman invited some Harvard colleagues to what Zink calls “a lab café,” and served them small portions of carrots, beets, jewel yams and goat meat. The food was served variously as roasted or raw; sliced, pounded or left in hunks.

“If I were to give you raw goat,” Lieberman says, “you’d chew, and nothing would happen.” Like a lot of wild game, goat meat tends to be stringy, he says. Chewing a big piece makes it more elastic, but it doesn’t readily break into pieces.

“But if you cut goat into smaller pieces,” he says, “your ability to chew it would improve dramatically.”

All the volunteers (14 for the vegetables and just 10 for the goat meat) wore a number of small sensors pasted to their faces, to detect and count contractions of various muscle fibers as they chewed the bite of food to the point of swallowing. The scientists then translated those contractions into a measure of muscular effort, and also checked to see how well the food was broken up.

Their results, published in the journal Nature, suggest that when eating a diet made up of one-third meat, if early humans pounded the vegetables before eating them, and sliced the meat, they would need to chew 17 percent less often and 26 percent less forcefully than if they started with larger slabs of the food. Every little flex of the jaw and grinding of the teeth adds up: Over the course of a year, Lieberman says, simply having a sharp stone to slice meat would reduce the number of “chews” needed by 2.5 million.

“I think it’s amazing,” he says, “to think that the simple stone tool could have a massive effect on how effectively we chew a piece of meat.”

It’s possible, he and Zink think, that the benefits of meat-eating and food processing favored the transition to smaller teeth and jaws.

But it seems more likely, they write in their study, that tool use and meat-eating simply reduced the evolutionary pressure to have big, powerful jaws and sharp teeth, “thus permitting selection to decrease facial and dental size for other functions, such as speech production, locomotion, thermoregulation, or, perhaps even changes in the size and shape of the brain.”

The Time Traveler’s Cookbook: Several years ago, as part of our Meat Week coverage, we put together a tongue-in-cheek cookbook, based on archaeological digs and actual historical texts, tracing humanity’s changing relationship with meat.






5 myths about gluten




By Alessio Fasano, Chicago Tribune, 12/22/15

When I founded our celiac center nearly 20 years ago, writers couldn’t spell “celiac,” and very few people had ever heard the word “gluten.” One of our primary goals has been to advance awareness of celiac disease to improve the quality of life for people with gluten-related disorders, and I’ve been amazed to see what has happened in 20 years. Most people have now heard of gluten, but many have a pretty poor understanding of what it is and how it fits into a healthy diet.

An ancient and complex protein, gluten is a major component of wheat. It helps bread to rise and gives it a characteristic chewy texture. Similar proteins called secalin and hordein are found in barley and rye. We lump the three together as the only proteins we can’t digest and call this gluten. For people with celiac disease, a lifelong disorder, these proteins wreak havoc on the small intestine. For the rest of us, it’s a different story.

1 Our bodies are not meant to process gluten, so no one should eat it.

Many people now vilify wheat as unfit for human consumption. Eating it “raises blood sugar levels, causes immunoreactive problems, inhibits the absorption of important minerals, and aggravates our intestines,” in the words of prominent bioethicist and futurist blogger George Dvorsky. This “sensational science” is fertile terrain for TV shows such as “Dr. Oz” and books that identify gluten as the villain of the 21st century. Gluten has been blamed for many diseases outside gluten-related disorders, and therefore some people have suggested that it should be completely banned from the human diet.

It is true that our bodies do not have the proper enzymes to break down the complex proteins found in gluten. The immune system spots gluten as an invader and goes into battle mode to get rid of it. But here’s the key: In most people, the immune system is able to “clean up” the gluten invasion, and then it’s back to business as usual.

For the approximately 1 percent of humans with celiac disease, the immune system can’t handle the cleanup. Instead, it goes into overdrive, producing autoantibodies that attack the tissue in the small intestine, leading to inflammation and tissue destruction. This leads to malabsorption of nutrients, which causes myriad symptoms, gastrointestinal and otherwise, in people with this autoimmune disorder.

Other people affected by wheat allergy or non-celiac gluten sensitivity may also find that their bodies react inappropriately after they eat gluten-containing grains. But epidemiological studies, including our 2003 study in the United States, show that the vast majority of us tolerate gluten without any problem. The fact that about 1 percent of the population is affected by celiac disease, while almost 100 percent of humankind is exposed to gluten-containing grains, is evidence that these grains are safe for most people. After all, our species has evolved during the past 10,000 years eating gluten-containing grains.

2 Cutting gluten from your diet is beneficial, even if you don’t have celiac disease.

Approximately 1 in 4 U.S. consumers think that going gluten-free is good for everyone, according to the NPD Group, a market research organization. The same group reports that about 11 percent of U.S. households eat gluten-free. These people are probably following advice such as: “Eliminating wheat is the easiest and most effective step you can take to safeguard your health and trim your waistline,” from William Davis, the physician of “Wheat Belly” fame.

But only about 400,000 Americans have been diagnosed with celiac disease. This is a small fraction of the approximately 3 million people in the United States who have it; the rest remain undiagnosed. Wheat allergy sufferers number about 0.03 percent of the U.S. population. Because medicine has no reliable test for the condition, the number of people with non-celiac gluten sensitivity has not yet been established. Our center recently estimated it at 6 percent of the U.S. population, but it is only our best guess until we develop a biomarker to identify the condition.

For most of us, a gluten-free diet is not a naturally healthier diet. If you give up gluten-containing cookies, cakes and beer, and replace them with gluten-free cookies, cakes and beer, you will not lose weight or feel better. But while avoiding gluten itself won’t help, giving up many of the processed foods that contain it will. If you stop eating fried foods, highly processed foods and foods high in sugar, and replace them with fresh fruits, vegetables, olive oil, and protein from lean meat, eggs, seafood, nuts and beans (essentially, the Mediterranean diet), you will definitely feel better, unless you have an unidentified underlying condition. People who do undertake a gluten-free diet should work with a registered dietitian to make sure they’re getting all the vitamins and micronutrients they need.

3 Gluten sensitivity doesn’t really exist.

About five or six years ago, we began to see a new phenomenon in our clinic: people who reacted poorly to gluten but had none of the diagnostic or histological markers for celiac disease. Eventually, our group published a paper calling the condition “non-celiac gluten sensitivity” or “gluten sensitivity.” As celebrities from Gwyneth Paltrow to Novak Djokovic have espoused the “benefits” of going gluten-free, though, there’s been some resistance to the idea of gluten sensitivity; at this point, enough people have gotten on board with ditching gluten that it’s being mocked as a “fad” diet by people such as cookbook author and culinary historian Clifford Wright. Based on conflicting studies, the existence of gluten sensitivity has been challenged in the press. And until recently, the terms “celiac disease” and “gluten sensitivity” had been used interchangeably in medical literature, as Amy Brown noted in a 2012 article in Expert Review of Gastroenterology & Hepatology.

Many symptoms are similar, but the conditions are very different metabolically. In non-celiac gluten sensitivity, we don’t see the same intestinal inflammation we see in people with celiac disease. Also, some people with gluten sensitivity can tolerate small amounts of gluten, which is never the case with celiac disease.

4 People with celiac disease can eat a little bit of gluten.

A group of scientists from St. Bartholomew’s Hospital in London published a study in 1988 concluding that adult celiac patients could safely consume a low-gluten diet, as opposed to a gluten-free one. Unfortunately, that misconception is still with us. We now know this is not true, since we have pretty solid evidence that traces of gluten can be as harmful as large amounts, even if the clinical consequences don’t materialize until years later. People with celiac disease must avoid gluten at all costs. While eliminating the “big items” (pizza, pasta, cookies, beer, bagels, etc.) is painful but relatively easy, avoiding the traces of gluten found in many processed foods (gluten is a cheap and efficient filler) can be much more difficult. It takes only a tiny crumb of bread to set the autoimmune machinery into motion, creating the intestinal damage that leads to symptoms and nutrient malabsorption.

This is what makes the gluten-free diet so tricky, especially outside the controlled environment of your own kitchen. Think about celiac kids in day care centers or classrooms (craft projects and cupcakes), and celiac diners in restaurants, social settings and traveling away from home. People with celiac disease and parents of children with the condition must be vigilant about what they put in their mouths every day in a way that the rest of us don’t need to be.

What makes this clinical chameleon even trickier is that you can have celiac disease (the intestinal inflammation and malabsorption) and not exhibit any symptoms, gastrointestinal or otherwise, for a long time. Meanwhile, damage to your intestine continues and could lead to the development of related conditions and, in extremely rare cases, intestinal lymphoma. The only way to identify these asymptomatic patients is through blood tests and an intestinal biopsy.

5 If you have celiac disease as a child, you will outgrow it.

This question comes up often in our celiac clinic because of the lingering misconception that celiac disease is a pediatric condition. In the 1930s and ’40s, children diagnosed with this mysterious gastrointestinal disease were fed a banana-based diet. The mortality rate was high, but the lucky children who survived were told that they could resume eating wheat after a period of time. This led to the idea that you could outgrow celiac disease. Years later, with advanced diagnostic tools, many of those “banana babies” were rediagnosed with celiac disease.

In the 1950s, a Dutch pediatrician named Willem-Karel Dicke determined that wheat flour was responsible for the symptoms he saw in his young patients. After watching the mortality rate of children with celiac disease drop during World War II, Dicke suspected that the decline might be related to the scarcity of bread at that time. Still, it would be decades before the notion that you can outgrow celiac disease was challenged. In 1958, Cyrus Rubin determined that pediatric and adult celiac were the same condition. With the development of the first diagnostic tools in the 1970s and blood-screening tests in the 1990s, the diagnostic rates for children and adults increased.

But the real breakthrough came in the 1990s, when researchers determined that celiac disease is not a food allergy or an intolerance, but a gluten-triggered autoimmune disease that patients cannot outgrow. Another milestone was when we determined that people can develop celiac disease at any time in their lives, even into old age. Now we know it is a lifelong condition, and the best medical intervention we have is a gluten-free diet.

Washington Post   Dr. Alessio Fasano is the founder and director of the Center for Celiac Research & Treatment at Massachusetts General Hospital in Boston and the author of “Gluten Freedom.”



Peppers–Why are they hot? Why do we eat them?

British Broadcasting Corporation    The Why Factor

The chilli pepper is a work of ‘evolutionary elegance’. Its complex chemistry can fool our brains. Why do we eat something that causes us pain?

Mike Williams explores the origins and history of chillies, thought to have first grown in the hot and humid climates of Bolivia and northern Brazil before being spread through the world by Portuguese colonists in the 15th century. He finds out that ancient chillies were not hot.


Pepper stand at a market in Texas, with Scoville scale.


Dr Josh Tewkesbury from the University of Washington explains why the chilli pepper developed heat and why human beings are one of the only mammals in the world to actually enjoy eating them. We unlock the pungency and flavour of chillies in curries with chef and writer Roopa Gulatti. And we uncover their power and punch in powder and pepper spray with Dr Anuj Baruah, a biotechnologist in the north-eastern state of Assam, India, who extracted the chemical compound inside the chilli for India’s ministry of defence.

Award-winning science writer and journalist Deborah Blum gives her analysis of the chemistry inside the chilli and its development to explain why she thinks that plants like chillies are ‘formidable military machines’. Finally, Mike tastes one of the world’s hottest – the bhut jolokia – also known as the ghost or poison pepper.

LISTEN to 18-minute radio show


More radio?  LISTEN to 17-minute radio show on ghosts

We join a group of ghost hunters in England on a spooktacular tour of a derelict orphanage; Mike meets the cultural historian Dr Shane McCorristine in the birthplace of the Victorian ghost story; and the psychologist Professor Christopher French explains the mind’s capacity to produce hallucinations.

Corn Souffle — verse




She says I’ve been imagining a corn soufflé !

(this burst of culinary fervor toward my reading nook).

I think I’ll try it out today. Says how are you at stripping

kernels off the cob?  Excellent I say—my kind of job

although I may well cut myself.  A spurt of blood

will add a je ne sais, some color to the food.

Oh never mind she says; you’d create crud,

and I return to safely minding my own book.


Spam, If You Dare

Most of the material here is from Wikipedia where you can find more.

My Gmail page often carries an ad for Spam, with a recipe for Spam Breakfast Burritos and various casserole dishes.

I feel that I’ve known about Spam all my life, though I’m not sure we had it when I was a kid.  I don’t remember ever buying a can, but I think I know what it tastes like.

Spam is a brand of canned precooked meat products made by Hormel Foods Corporation. It was first introduced in 1937 and gained popularity worldwide after its use during World War II. By 2003, Spam was sold in 41 countries on six continents and trademarked in over 100 countries. In 2007, the seven billionth can of Spam was sold.

Here is Spam in Puerto Rico:  Sandwich de Mezcla is a party staple in Puerto Rico containing Spam, Velveeta, and pimientos between two slices of Wonder Bread.

In Hawaii, Burger King restaurants began serving Spam in 2007 to compete with the local McDonald’s chains.  Spam is so popular that it is sometimes referred to as “The Hawaiian Steak”.[24]


   Spam musubi is a popular snack and lunch food in Hawaii

In the South Pacific islands, Spam is blamed for an “obesity crisis“.  I remember hearing of a supply boat’s failure to arrive on time at one of the islands with the essential Spam and mayonnaise.  The crowd at the dock rioted.

In the continental U.S.: Statistics from the 1990s say that 3.8 cans of Spam are consumed every second in the United States, totaling nearly 122 million cans annually. Part of the diet of almost 30% of American households, it is perceived differently in various regions of the country.[17] It is also sometimes associated with economic hardship because of its relatively low cost.[1]

Spam that is sold in North America, South America, and Australia is produced in Austin, Minnesota (also known as “Spam Town USA”) and in Fremont, Nebraska. Austin, Minnesota also has a restaurant with a menu devoted exclusively to Spam, called “Johnny’s SPAMarama Menu”.[18]

In 1963, Spam was introduced to various private and public schools in South Florida as cheap food and even for art sculptures. Due to the success of the introduction, Hormel Foods also introduced school “color-themed” spam, the first being a blue and green variety which is still traditionally used in some private schools of South Florida.[19]

According to its label, Spam’s basic ingredients are pork shoulder meat, with ham meat added, salt, water, modified potato starch as a binder, sugar, and sodium nitrite as a preservative.

Spam has been kidded a lot, as in the famous sketch on Monty Python’s Flying Circus. That sketch is the origin of “spam” as the term for bulk commercial email, according to the Merriam-Webster Unabridged Dictionary.

Nutritional Information for Original Spam
Net weight per package: 340 grams (12 oz.)
Serving size: 100 g
Quantity per 100 g serving:
Energy 1,300 kJ (310 Calories or kilocalories)
Protein 13 g (26% Daily Value or DV)
Total fat 27 g (41% DV)
  – saturated fat 10 g (49% DV)
Carbohydrates 3 g (1% DV)
Sodium 1,369 mg (57% DV)
Cholesterol 70 mg (23% DV)
Vitamins and minerals (% DV): 1% Vitamin C, 1% Calcium, 5% Iron, 3% Magnesium, 9% Potassium, 12% Zinc, and 5% Copper


(Note the 1,369 milligrams of sodium (salt) per small serving.  The “heart healthy” Campbell’s soups in our pantry have 410 milligrams per serving.  Want to know how too much salt can kill you? Click here. Lots of fat in this product, too.)
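As a quick arithmetic check, the label’s percent daily values can be reproduced from the gram amounts, assuming the FDA daily reference values in use in that era (an assumption on my part: 2,400 mg sodium, 65 g total fat, 50 g protein, 300 mg cholesterol, 300 g carbohydrate). A minimal Python sketch:

```python
# Assumed FDA daily reference values of the era (not taken from the label)
daily_values = {
    "protein_g": 50,
    "fat_g": 65,
    "sodium_mg": 2400,
    "cholesterol_mg": 300,
    "carbohydrate_g": 3 * 100,  # 300 g
}

# Amounts per 100 g serving, copied from the label above
per_100g = {
    "protein_g": 13,
    "fat_g": 27,
    "sodium_mg": 1369,
    "cholesterol_mg": 70,
    "carbohydrate_g": 3,
}

# Nutrition labels round down, so truncate the percentage with int()
percent_dv = {k: int(per_100g[k] / daily_values[k] * 100) for k in per_100g}

print(percent_dv)
# sodium: 1369 / 2400 ≈ 57% of a full day's sodium in one 100 g serving
```

Under these assumed reference values, the computed figures (protein 26%, fat 41%, sodium 57%, cholesterol 23%, carbohydrates 1%) match the label.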



Spam is celebrated in Austin, Minnesota, home to the Spam Museum. The museum tells the history of the Hormel company, the origin of Spam, and its place in world culture. The Museum is closed for moving just now.

Chocolate Shortage–Other Problems


The world’s biggest chocolate-maker says we’re running out of chocolate

By Roberto A. Ferdman November 15, 2014

The chocolate deficit is about to go way up.

There’s no easy way to say this: You’re eating too much chocolate, all of you. And it’s getting so out of hand that the world could be headed towards a potentially disastrous (if you love chocolate) scenario if it doesn’t stop.

Those are, roughly speaking, the words of two huge chocolate makers, Mars, Inc. and Barry Callebaut. And there’s some data to back them up.

Chocolate deficits, whereby farmers produce less cocoa than the world eats, are becoming the norm. Already, we are in the midst of what could be the longest streak of consecutive chocolate deficits in more than 50 years. It also looks like deficits aren’t just carrying over from year-to-year—the industry expects them to grow. Last year, the world ate roughly 70,000 metric tons more cocoa than it produced. By 2020, the two chocolate-makers warn that that number could swell to 1 million metric tons, a more than 14-fold increase; by 2030, they think the deficit could reach 2 million metric tons.

The problem is, for one, a supply issue. Dry weather in West Africa (specifically in the Ivory Coast and Ghana, where more than 70 percent of the world’s cocoa is produced) has greatly decreased production in the region. A nasty fungal disease known as frosty pod hasn’t helped either. The International Cocoa Organization estimates it has wiped out between 30 percent and 40 percent of global cocoa production. Because of all this, cocoa farming has proven a particularly tough business, and many farmers have shifted to more profitable crops, like corn, as a result.

Then there’s the world’s insatiable appetite for chocolate. China’s growing love for the stuff is of particular concern. The Chinese are buying more and more chocolate each year. Still, they only consume per capita about 5 percent of what the average Western European eats. There’s also the rising popularity of dark chocolate, which contains a good deal more cocoa by volume than traditional chocolate bars (the average chocolate bar contains about 10 percent, while dark chocolate often contains upwards of 70 percent).

For these reasons, cocoa prices have climbed by more than 60 percent since 2012, when people started eating more chocolate than the world could produce. And chocolate makers have, in turn, been forced to adjust by raising the price of their bars. Hershey’s was the first, but others have followed suit.

Efforts to counter the growing imbalance between the amount of chocolate the world wants and the amount farmers can produce have inspired a bit of much-needed innovation. Specifically, an agricultural research group in Central Africa is developing trees that can produce up to seven times the amount of beans traditional cocoa trees can. The uptick in efficiency, however, might be compromising taste, says Bloomberg’s Mark Schatzker. He likens the trade-off to other mass-produced commodities:

“Efforts are under way to make chocolate cheap and abundant — in the process inadvertently rendering it as tasteless as today’s store-bought tomatoes, yet another food, along with chicken and strawberries, that went from flavorful to forgettable on the road to plenitude.”

It’s unclear anyone will mind a milder flavor if it keeps prices down. And the industry certainly won’t mind, so long as it keeps the potential for a gargantuan shortage at bay.