Electricity from Ocean Waves

 

1st wave of wave-borne power hits

Buoys that tap ocean waves go online in Hawaii

A wave energy device converts the movement of the ocean into electricity at a Navy test site at Kaneohe Bay in Hawaii. (Northwest Energy Innovations)

By Cathy Bussewitz, Associated Press, in the Chicago Tribune

KANEOHE BAY, Hawaii — Off the coast of Hawaii, a tall buoy bobs and sways in the water, using the rise and fall of the waves to generate electricity.

The current travels through an undersea cable for a mile to a military base, where it feeds into Oahu’s power grid — the first wave-produced electricity to go online in the U.S.

By some estimates, the ocean’s endless motion packs enough power to meet a quarter of America’s energy needs and dramatically reduce the nation’s reliance on oil, gas and coal. But wave energy technology lags well behind wind and solar power, with important technical hurdles still to be overcome.

To that end, the Navy has established a test site in Hawaii, with hopes the technology can someday be used to produce clean, renewable power for the fleet and provide electricity to coastal communities around the world.

“More power from more places translates to a more agile, more flexible, more capable force,” Joseph Bryan, deputy assistant secretary of the Navy, said during an event at the site. “So we’re always looking for new ways to power the mission.”

Hawaii would seem a natural site for such technology. As any surfer can tell you, it is blessed with powerful waves. The island state also has the nation’s highest electricity costs — largely because of its heavy reliance on oil delivered by sea — and has a legislative mandate to get 100 percent of its energy from renewables by 2045.

Still, it could be five to 10 years before wave energy technology can provide an affordable alternative to fossil fuels, experts say.

For one thing, developers are still working to come up with the best design. Some buoys capture the up-and-down motion of the waves, while others exploit the side-to-side movement. Industry experts say a machine that uses all the ocean’s movements is most likely to succeed.

Also, the machinery has to be able to withstand powerful storms, the constant pounding of the seas and the corrosive effects of saltwater.

“You’ve got to design something that can stay in the water for a long time but be able to survive,” said Patrick Cross, a specialist at the Hawaii Natural Energy Institute at the University of Hawaii at Manoa, which helps run the test site.

The U.S. has set a goal of reducing carbon emissions by one-third from 2005 levels by 2030, and many states are seeking to develop more renewable energy in the coming decades.

Jose Zayas, a director of the Wind and Water Power Technologies Office at the U.S. Energy Department, which helps fund the Hawaii site, said the U.S. could get 20 to 28 percent of its energy needs from waves without encroaching on sensitive waters such as marine preserves.

“When you think about all of the states that have water along their coasts, there’s quite a bit of wave energy potential,” he said.

Wave energy technology is at about the same stage as the solar and wind industries were in the 1980s. Both received substantial government investment and tax credits that helped them become energy sources cheap enough to compete with fossil fuels.

But while the U.S. government and military have put about $334 million into marine energy research over the past decade, Britain and the rest of Europe have invested more than $1 billion, according to the Marine Energy Council, a trade group.

“We’re about, I’d say, a decade behind the Europeans,” said Alexandra De Visser, the Navy’s Hawaii test site project manager.

The European Marine Energy Centre in Scotland, for example, has 14 grid-connected berths that have housed dozens of wave and tidal energy devices from around the world over the past 13 years, and Wave Hub in England has several such berths. China, too, has been testing dozens of units.

Though small in scale, the test project near Kaneohe Bay represents the vanguard of U.S. wave energy development. It consists of two buoys anchored a half-mile to a mile offshore.

One of them, the Azura, which extends 12 feet above the surface and 50 feet below, converts the waves’ vertical and horizontal movements into up to 18 kilowatts of electricity, enough for about a dozen homes. The company working with the Navy, Northwest Energy Innovations of Portland, Ore., plans a version that can generate at least 500 kilowatts, or enough to power hundreds of homes.
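The sizing arithmetic here can be sanity-checked. A minimal sketch, assuming an average U.S. household draws about 1.2 kilowatts continuously (roughly 10,500 kWh per year — our assumption for illustration, not a figure from the article):

```python
# Rough homes-per-buoy estimate. The 1.2 kW average household draw is an
# assumed figure; actual household demand varies widely by region and season.

def homes_powered(output_kw: float, avg_household_kw: float = 1.2) -> int:
    """Number of average households a given output could supply continuously."""
    return int(output_kw / avg_household_kw)

print(homes_powered(18))   # Azura's 18 kW peak -> 15, "about a dozen homes"
print(homes_powered(500))  # planned 500 kW version -> 416, "hundreds of homes"
```

With those assumptions, 18 kW works out to 15 homes and 500 kW to 416 — consistent with the article's "about a dozen" and "hundreds."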

A Norwegian company developed the other buoy, a 50-foot-wide, doughnut-shaped device called the Lifesaver. Cables anchor the 3-foot-tall ring to the ocean floor. When the sea wobbles the buoy, the cables move, turning a generator’s wheels. It produces an average of 4 kilowatts.

Test sites run by other researchers are being planned or expanded in Oregon and California. One of them, Cal Wave, run by California Polytechnic State University, hopes to provide utility-scale power to Vandenberg Air Force Base.

The Hawaii buoys are barely noticeable from shore, but developers envision dozens of machines working at once, an idea that could run into the same opposition wind turbines have faced from environmentalists, tourist groups and others.

“Nobody wants to look out and see wind turbines or wave machines off the coast,” said Steve Kopf, CEO of Northwest Energy Innovations.

More in Wikipedia

SUGAR VS. FAT=FRAUD

A newly discovered cache of internal documents reveals that the sugar industry downplayed the risks of sugar in the 1960s.  Luis Ascui/Getty Images

50 Years Ago, Sugar Industry Quietly Paid Scientists To Point Blame At Fat

National Public Radio   source
In the 1960s, the sugar industry funded research that downplayed the risks of sugar and highlighted the hazards of fat, according to a newly published article in JAMA Internal Medicine.

The article draws on internal documents to show that an industry group called the Sugar Research Foundation wanted to “refute” concerns about sugar’s possible role in heart disease. The SRF then sponsored research by Harvard scientists that did just that. The result was published in the New England Journal of Medicine in 1967, with no disclosure of the sugar industry funding.

The sugar-funded project in question was a literature review, examining a variety of studies and experiments. It suggested there were major problems with all the studies that implicated sugar, and concluded that cutting fat out of American diets was the best way to address coronary heart disease.

The authors of the new article say that for the past five decades, the sugar industry has been attempting to influence the scientific debate over the relative risks of sugar and fat.

“It was a very smart thing the sugar industry did, because review papers, especially if you get them published in a very prominent journal, tend to shape the overall scientific discussion,” co-author Stanton Glantz told The New York Times.

Money on the line

In the article, published Monday, authors Glantz, Cristin Kearns and Laura Schmidt aren’t trying to make the case for a link between sugar and coronary heart disease. Their interest is in the process. They say the documents reveal the sugar industry attempting to influence scientific inquiry and debate.

The researchers note that they worked under some limitations — “We could not interview key actors involved in this historical episode because they have died,” they write. Other organizations were also advocating concerns about fat, they note.

There’s no evidence that the SRF directly edited the manuscript published by the Harvard scientists in 1967, but there is “circumstantial” evidence that the interests of the sugar lobby shaped the conclusions of the review, the researchers say.

For one thing, there’s motivation and intent. In 1954, the researchers note, the president of the SRF gave a speech describing a great business opportunity.

If Americans could be persuaded to eat a lower-fat diet — for the sake of their health — they would need to replace that fat with something else. America’s per capita sugar consumption could go up by a third.

But in the ’60s, the SRF became aware of “flowing reports that sugar is a less desirable dietary source of calories than other carbohydrates,” as John Hickson, SRF vice president and director of research, put it in one document.

He recommended that the industry fund its own studies — “Then we can publish the data and refute our detractors.”

The next year, after several scientific articles were published suggesting a link between sucrose and coronary heart disease, the SRF approved the literature-review project. It wound up paying approximately $50,000 in today’s dollars for the research.

One of the researchers was the chairman of Harvard’s Public Health Nutrition Department — and an ad hoc member of SRF’s board.

“A different standard” for different studies

Glantz, Kearns and Schmidt say many of the articles examined in the review were hand-selected by SRF, and it was implied that the sugar industry would expect them to be critiqued.

In a letter, SRF’s Hickson said that the organization’s “particular interest” was in evaluating studies focused on “carbohydrates in the form of sucrose.”

“We are well aware,” one of the scientists replied, “and will cover this as well as we can.”

The project wound up taking longer than expected, because more and more studies were being released that suggested sugar might be linked to coronary heart disease. But it was finally published in 1967.

Hickson was certainly happy with the result: “Let me assure you this is quite what we had in mind and we look forward to its appearance in print,” he told one of the scientists.

The review minimized the significance of research that suggested sugar could play a role in coronary heart disease. In some cases the scientists alleged investigator incompetence or flawed methodology.

“It is always appropriate to question the validity of individual studies,” Kearns told Bloomberg via email. But, she says, “the authors applied a different standard” to different studies — looking very critically at research that implicated sugar, and ignoring problems with studies that found dangers in fat.

Epidemiological studies of sugar consumption — which look at patterns of health and disease in the real world — were dismissed for having too many possible factors getting in the way. Experimental studies were dismissed for being too dissimilar to real life.

One study that found a health benefit when people ate less sugar and more vegetables was dismissed because that dietary change was not feasible.

Another study, in which rats were given a diet low in fat and high in sugar, was rejected because “such diets are rarely consumed by man.”

The Harvard researchers then turned to studies that examined risks of fat — which included the same kind of epidemiological studies they had dismissed when it came to sugar.

Citing “few study characteristics and no quantitative results,” as Kearns, Glantz and Schmidt put it, they concluded that cutting out fat was “no doubt” the best dietary intervention to prevent coronary heart disease.

Sugar lobby: “Transparency standards were not the norm”

In a statement, the Sugar Association — which evolved out of the SRF — said it is challenging to comment on events from so long ago.

“We acknowledge that the Sugar Research Foundation should have exercised greater transparency in all of its research activities, however, when the studies in question were published funding disclosures and transparency standards were not the norm they are today,” the association said.

“Generally speaking, it is not only unfortunate but a disservice that industry-funded research is branded as tainted,” the statement continues. “What is often missing from the dialogue is that industry-funded research has been informative in addressing key issues.”

The documents in question are five decades old, but the larger issue is of the moment, as Marion Nestle notes in a commentary in the same issue of JAMA Internal Medicine:

“Is it really true that food companies deliberately set out to manipulate research in their favor? Yes, it is, and the practice continues. In 2015, the New York Times obtained emails revealing Coca-Cola’s cozy relationships with sponsored researchers who were conducting studies aimed at minimizing the effects of sugary drinks on obesity. Even more recently, the Associated Press obtained emails showing how a candy trade association funded and influenced studies to show that children who eat sweets have healthier body weights than those who do not.”
As for the article authors who dug into the documents around this funding, they offer two suggestions for the future.

“Policymaking committees should consider giving less weight to food industry-funded studies,” they write.

They also call for new research into any ties between added sugars and coronary heart disease.

 

Cracker Barrel

 

Cracker Barrel Store Front

Cracker Barrel is a chain of inexpensive restaurants with attached stores selling souvenirs, tchotchkes, and gift items. They serve breakfast, lunch, and dinner. I’ve eaten in one twice, and I despised them until I read this:  RJN

An anonymous online review:

“A Tip for Seniors”

I visit Cracker Barrel at least once a week. This is one place that will let seniors order off the kids’ menu and get the free drink too. You can get a complete meal with drink and either a biscuit or cornbread for under $5.00. It’s cheaper than cooking at home for one or two people. In the winter, I go up there a lot and ask for a table by the fireplace. Cheaper than getting a duralog for my fireplace and sitting alone.

 

source

Secretly Sick Presidents

THE SECRET AILMENTS OF PRESIDENTS

A history of illnesses kept from public

By Joel Achenbach and Lillian Cunningham, The Washington Post, in the Chicago Tribune, 9.13.16

In his second term as president, Dwight Eisenhower looked like an old man. He’d had a serious heart attack in 1955, requiring extensive hospitalization. He later suffered a stroke. In contrast, his successor, John F. Kennedy, seemed vibrant and flamboyant.

The reality was that Eisenhower wasn’t really that old — he was just 62 when he was first elected. And Kennedy wasn’t that vigorous and indeed was secretly afflicted by serious medical problems, including Addison’s disease*, that his aides concealed from the public.

The history of the presidency includes a running thread of illness and incapacity, much of it hidden from the public out of political calculation. A stroke incapacitated Woodrow Wilson in 1919, for example, but the public had no inkling until many months later. And when Grover Cleveland needed surgery in 1893 to remove a cancerous tumor in his mouth, he did it secretly on a friend’s yacht cruising through Long Island Sound.

Presidential history reveals a more subtle trend: Age isn’t what it used to be. American culture has redefined old age, pushing it back significantly as people live longer and expect to be more active into their eighth or ninth decade or beyond.

Hillary Clinton is 68, and Donald Trump is 70. They’re the oldest pair of major party candidates in history. If elected, Clinton would be the second-oldest person to assume the presidency, after Ronald Reagan. Trump would be the oldest.

Health has suddenly become a preoccupation on the campaign trail in the wake of Clinton’s wobbly episode Sunday when she left a 9/11 service in New York City. The Clinton camp initially called it merely a case of overheating. Late in the day, the campaign revealed that, in fact, she was diagnosed with pneumonia on Friday. On Monday, a Clinton spokesperson acknowledged that the campaign could have been more forthcoming on Sunday.

Neither candidate has released detailed medical records.

Clinton’s gender gives her an advantage in one respect: Women in the U.S. outlive men by several years. According to the Social Security Administration’s online life expectancy calculator, a woman of Clinton’s age is likely to live an additional 18.4 years. A man of Trump’s age is likely to live an additional 15.2.
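The life-expectancy comparison is simple arithmetic on the calculator’s outputs; as an illustrative sketch using the figures quoted above (the calculation, not the figures, is ours):

```python
# Expected age at death implied by the SSA calculator figures quoted above.
clinton_expected = 68 + 18.4          # roughly 86.4 years
trump_expected = 70 + 15.2            # roughly 85.2 years
gap_in_remaining_years = 18.4 - 15.2  # roughly a 3.2-year difference

print(round(clinton_expected, 1), round(trump_expected, 1))
```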

Voters will have to determine if the murky health status of Clinton and Trump should be a factor in the November decision. What’s certain is that the campaign trail can be brutal and that the presidency itself can pound away at the health of whoever occupies the Oval Office.

President Cleveland kept his cancer surgery secret in part because cancer at the time was such a dreaded disease. He also didn’t trust reporters or think his medical condition was anyone’s business, Cleveland biographer Matthew Algeo, author of “The President is a Sick Man,” told The Washington Post.

Algeo makes a broader observation: The desire for secrecy led many American presidents to avoid the best doctors. “With presidents, a lot of times they don’t get the best care. You would expect they would, but they’re so paranoid about anyone knowing what’s wrong with them that they employ old family doctors,” Algeo said.

The public had limited information about Franklin Delano Roosevelt’s physical condition and the fact that he used a wheelchair. By the time he ran for a fourth term in 1944, he had heart disease, was constantly tired and had trouble concentrating. Frank Lahey, a surgeon who examined Roosevelt, wrote a memo saying FDR would never survive another four-year term. The memo was not disclosed until 2011.

Roosevelt sailed to another victory and died in April 1945, leaving Harry Truman to close out World War II.

Kennedy suffered from Addison’s disease and had to take steroids and other drugs to ward off the symptoms, but he did so secretly. As the Los Angeles Times reported: “During the 1960 campaign, Kennedy’s opponents said he had Addison’s. His physicians released a cleverly worded statement saying that he did not have Addison’s disease caused by tuberculosis, and the matter was dropped.

“Kennedy collapsed twice because of the disease: once at the end of a parade during an election campaign and once on a congressional visit to Britain.”

 * Addison’s disease is a disorder that occurs when your body produces insufficient amounts of certain hormones produced by your adrenal glands. In Addison’s disease, your adrenal glands produce too little cortisol and often insufficient levels of aldosterone as well.  Read more at source.

NCAA reacts to NC’s LGBT law

NCAA PULLS 7 POSTSEASON EVENTS OUT OF NORTH CAROLINA DUE TO LGBT LAW

Sources: Associated Press and Fox Sports, Sep 12, 2016
The NCAA has pulled seven championship events from North Carolina, including opening-weekend men’s basketball tournament games, for the coming year due to a state law that some say can lead to discrimination against LGBT people.

In a news release Monday, the NCAA says the decision by its board of governors came “because of the cumulative actions taken by the state concerning civil rights protections.”

“This decision is consistent with the NCAA’s long-standing core values of inclusion, student-athlete well-being and creating a culture of fairness,” said Georgia Tech President G.P. “Bud” Peterson, the chair of the board of governors.
The law – known as HB2 – requires transgender people to use restrooms at schools and government buildings corresponding to the sex on their birth certificates. It also excludes gender identity and sexual orientation from local and statewide antidiscrimination protections.

HB2 was signed into law by Republican Gov. Pat McCrory earlier this year. A spokesman with McCrory’s office couldn’t immediately be reached for comment Monday evening.

The only championship events that can be hosted in North Carolina this academic year are those a team earns the right to host on its own campus.

The NCAA said it will relocate the men’s basketball first- and second-round games that were scheduled for March 17 and 19 in Greensboro. The NCAA will also relocate:

– the Division I women’s soccer championship scheduled for Dec. 2 and 4 in Cary, just outside the capital city of Raleigh;

– the Division III men’s and women’s soccer championships set for Dec. 2 and 3 in Greensboro;

– the Division I women’s golf regional championships set for May 8-10 in Greenville;

– the Division III men’s and women’s tennis championships set for May 22-27 in Cary;

– the Division I women’s lacrosse championship set for May 26 and 28 in Cary;

– and the Division II baseball championship from May 27 to June 3 in Cary.

North Carolina athletic director Bubba Cunningham and North Carolina State AD Debbie Yow both issued statements Monday evening saying they were disappointed at the loss of the events.

“We certainly hope there will be resolution in the very near future,” Yow said.

The campaign spokesman for Democrat Roy Cooper, the state’s attorney general and McCrory’s re-election opponent in November, said the law needs to be repealed.

“It seems that almost every day, we learn of a new consequence of HB2,” spokesman Ford Porter said. “… We need to repeal this law and get our state back on track.”

The NCAA’s move leaves the Atlantic Coast Conference football championship game in Charlotte as the marquee college sporting event in the state this year as the men’s basketball tournament starts a two-year stay in Brooklyn, New York.

However, that event also could be in jeopardy. In May, the ACC announced that member schools discussed the law during their annual spring meetings and said it could impact whether the state hosts league championship events.

In April, the NCAA announced it was adopting an anti-discrimination measure that would affect the way the governing body evaluates bids to host sporting events and required sites to “demonstrate how they will provide an environment that is safe, healthy and free of discrimination.”

In a statement Monday night, NCAA President Mark Emmert said the governing body will delay announcements on future championship sites until early next year. That comes as it reviews responses to questionnaires required of prospective site hosts on how they would comply with the NCAA’s anti-discrimination measure.

In announcing its decision Monday, the NCAA stated current North Carolina laws “make it challenging to guarantee that host communities can help deliver” on that requirement.

The NCAA also took special note of ways North Carolina’s law differs from those of other states. The NCAA pointed out that five states — Connecticut, Minnesota, New York, Vermont and Washington — and several cities prohibit travel by public employees and representatives of public institutions to North Carolina. Those barred from traveling could include athletes, coaches and athletic administrators.

Monday’s action by the NCAA is the latest public and business backlash that has arisen since the law was enacted. The NBA moved its 2017 All-Star Game to New Orleans instead of hosting it in Charlotte as originally scheduled because of the law. Duke lost a men’s basketball game from its schedule when Albany backed out due to that state’s travel ban, while the Vermont women’s basketball team has canceled a December trip to play North Carolina in Chapel Hill.

Entertainers like Bruce Springsteen, Pearl Jam and Ringo Starr have canceled plans to play in North Carolina. And PayPal reversed plans to open a 400-employee operation center in Charlotte.

Eating Deer, Elk, and People Spreads Disease

A sign said DEPOSIT DEER AND ELK HEADS HERE at a government building next door to our hotel in Fort Collins, Colorado.  The heads were to be used in the study of chronic wasting disease which is related to mad cow disease and kuru.  RJN

___________________________________________

WHEN PEOPLE ATE PEOPLE, A STRANGE DISEASE EMERGED

In 1962, a local leader in the Eastern Highlands of Papua New Guinea asks Fore men to stop the sorcery that he believes is killing women and children.  Courtesy Shirley Lindenbaum

Most of the world didn’t know anyone lived in the highlands of Papua New Guinea until the 1930s, when Australian gold prospectors surveying the area realized there were about a million people there.

When researchers made their way to those villages in the 1950s, they found something disturbing. Among a tribe of about 11,000 people called the Fore, up to 200 people a year had been dying of an inexplicable illness. They called the disease kuru, which means “shivering” or “trembling.”

Once symptoms set in, it was a swift demise. First, they’d have trouble walking, a sign that they were about to lose control over their limbs. They’d also lose control over their emotions, which is why people called it the “laughing death.” Within a year, they couldn’t get up off the floor, feed themselves or control their bodily functions.

Many locals were convinced it was the result of sorcery. The disease primarily hit adult women and children younger than 8 years old. In some villages, there were almost no young women left.

“They were obsessed with trying to save themselves because they knew demographically that they were on the brink of extinction,” says Shirley Lindenbaum, a medical anthropologist with the City University of New York.

But what was causing it? That answer eluded researchers for years. After ruling out an exhaustive list of contaminants, they thought it must be genetic. So in 1961, Lindenbaum traveled from village to village mapping family trees so researchers could settle the issue.

But Lindenbaum, who continues to write about the epidemic, knew it couldn’t be genetic, because it affected women and children in the same social groups, but not in the same genetic groups. She also knew that it had started in villages in the north around the turn of the century, and then moved south over the decades.

Lindenbaum had a hunch about what was going on, and she turned out to be right. It had to do with funerals. Specifically, it had to do with eating dead bodies at funerals.

In many villages, when a person died, they would be cooked and consumed. It was an act of love and grief.

As one medical researcher described it, “If the body was buried it was eaten by worms; if it was placed on a platform it was eaten by maggots; the Fore believed it was much better that the body was eaten by people who loved the deceased than by worms and insects.”

Women removed the brain, mixed it with ferns, and cooked it in tubes of bamboo. They fire-roasted and ate everything except the gall bladder. It was primarily adult women who did so, says Lindenbaum, because their bodies were thought to be capable of housing and taming the dangerous spirit that would accompany a dead body.

“So, the women took on the role of consuming the dead body and giving it a safe place inside their own body — taming it, for a period of time, during this dangerous period of mortuary ceremonies,” says Lindenbaum.

But women would occasionally pass pieces of the feast to children. “Snacks,” says Lindenbaum. “They ate what their mothers gave them,” she says, until the boys hit a certain age and went off to live with the men. “Then, they were told not to touch that stuff.”

Finally, after urging from researchers like Lindenbaum, biologists came around to the idea that the strange disease stemmed from eating dead people. The case was closed after a group at the U.S. National Institutes of Health injected infected human brain into chimpanzees, and watched symptoms of kuru develop in the animals months later. The group, which won a Nobel Prize for the findings, dubbed it a “slow virus.”

But it wasn’t a virus — or a bacterium, fungus, or parasite. It was an entirely new infectious agent, one that had no genetic material, could survive being boiled, and wasn’t even alive.

As another group would find years later, it was just a twisted protein, capable of performing the microscopic equivalent of a Jedi mind trick, compelling normal proteins on the surface of nerve cells in the brain to contort just like them. The so-called “prions,” or “proteinaceous infectious particles,” would eventually misfold enough proteins to kill pockets of nerve cells in the brain, leaving the cerebellum riddled with holes, like a sponge.

The process was so odd that some compared it to Dr. Jekyll’s transformation to Mr. Hyde: “the same entity but in two manifestations — a ‘kind’, innocuous one and a ‘vicious’, lethal one.”

The epidemic likely started when one person in a Fore village developed sporadic Creutzfeldt-Jakob disease, a degenerative neurological disorder similar to kuru. According to the Centers for Disease Control and Prevention, about one in a million people in the U.S. develop CJD; the difference is that others rarely come into contact with infected human tissue.

Though the Fore stopped the practice of mortuary feasts more than 50 years ago, cases of kuru continued to surface over the years, because the prions could take decades to show their effects.

According to Michael Alpers, a medical researcher at Curtin University in Australia who tracked kuru cases for decades, the last person with kuru died in 2009. His team continued surveillance until 2012, when the epidemic was officially declared over. “I have followed up a few rumoured cases since then but they were not kuru,” he wrote in an email.

When Shirley Lindenbaum visited a South Fore village in 2008, one man said excitedly, “See how many children we have now?” (Courtesy Shirley Lindenbaum)

But while they remain rare, transmissible prion diseases did not die out with the last kuru case, as people have found repeatedly in recent decades. People have developed variant CJD after eating the meat of cattle infected with mad cow disease. Dr. Ermias Belay, a prion disease researcher with the Centers for Disease Control and Prevention, says that’s the only scenario in which there is “definitive evidence” that humans can develop a prion disease after eating the infected meat of another species.

But, he says, there are still a lot of open questions about how and why humans get prion diseases.

For one, it’s still a mystery why animals, including humans, have those proteins in the first place — the Jekylls that can be so easily turned into Hydes. One leading hypothesis, described recently in the journal Nature, is that they play an important role in the protective coating around nerves.

But here’s the bigger question, says Belay: “How many of these diseases actually jump species and affect humans?”

Kuru showed that people could get a prion disease from eating infected people. Mad cow disease showed that people can get a prion disease from eating the meat of infected cattle. But what about other prion diseases in other animals? Could, say, hunters get sick from eating infected deer? That’s what researchers in North America, including Belay, are trying to find out right now.

“Chronic wasting disease in North America is spreading fast,” says Belay. The disease causes infected wild deer and elk to starve to death. “In early 2000, we had about three states that reported CWD in the wild in deer and elk. Today, that number is 21.”

Belay says the disease is “a little bit concerning” because, unlike mad cow disease and kuru, where infectious prions were concentrated in the brain and nervous system tissue, in an animal with chronic wasting disease, the misfolded prions show up all over the body. They can even be found in saliva, feces and urine, which could explain how the disease is spreading so quickly among wild deer and elk.

The CDC is working with public health authorities in Wyoming and Colorado to monitor hunters for signs of prion disease.

“Unfortunately, because these diseases have long incubation periods, it’s not easy to monitor transmission,” says Belay. He says he and his colleagues have yet to find any evidence that hunters have picked up chronic wasting disease from the meat of infected wild animals.

“And that, in itself, is good news for us,” he says.

But, as with kuru, it will take years — maybe even decades — before he can know for sure.

 

War Stories 3

 

This post expands on the notes in War Stories 1 & 2, posted in March 2015.

_______________________________

GUARD DUTY  As I walked a post one night, guarding maybe a warehouse or garbage dump, a jeep came slowly onto my post.  I came to attention, held up my hand, and called “halt”.  The jeep rolled too close to me and stopped.  I called, “Dismount one and be recognized”.

A large man got slowly out of the jeep, started walking toward me, and then ran at me and grabbed my rifle!  I knew I should not give up my weapon.  I knew I could spin it and slam the stock into the lieutenant’s face.  As the lieutenant chewed me out, I remained sure I should not create an incident that might delay my official escape from basic training.

 

WEAPONS  The U.S. did not seem to be at war in 1958 when I was soldiering.  The Korean War was over, in a way, but the authorities were working up the Vietnam War, and the U.S., unhappy with election results in Lebanon, launched an air and land attack.
A man I knew broke both legs in the air drop–that’s another story.

Corporal missile with “erector” designed to pick up, transport, and put in place the 40′ missile.

There were other missiles in development, but our Corporal units in the field in Germany were supposedly prepared to send a nuclear bomb 250 miles.  I don’t remember anything said about radioactive fallout blowing back in our faces.
