"Sick Around America" on FRONTLINE

Monday, March 30, 2009
You may remember the FRONTLINE report, "Sick Around the World." It was the best job I have ever seen anyone do summarizing five different national health care systems--the U.K., Taiwan, Germany, Japan, and Switzerland--in just one hour. I recommended it when it originally ran and I recommend it today. You can still see it here. Now FRONTLINE has aired producer Jon Palfreman's effort to explain

"EGMN: Notes from the Road"--A Refreshing and Interesting Look Inside the World of Docs

I just discovered a relatively new blog that might be of interest to you: "EGMN: Notes from the Road." It comes from the publishers of a number of the periodicals physicians read, including Internal Medicine News, Cardiology News, and Family Practice News. Their unique take on the world of health care and policy blogging is to post from medical meetings, press conferences, and policy gatherings

"Take the $600 Billion in New Revenue from Obama's 'Cap and Trade' Climate Change Proposal and Use it To Pay for Health Care Reform"

Sunday, March 29, 2009
Last week Senate Majority Leader Harry Reid was quoted as raising the possibility that we could take the $600 billion in new revenue projected from a "cap-and-trade" plan to cut greenhouse gas emissions and use some or all of it to help pay the estimated $1.5 trillion cost of comprehensive health care reform. Energy and climate change issues aside, that would be a bad idea--a really bad idea. The

Preventing Tooth Decay

Saturday, March 28, 2009
Meet Sir Edward Mellanby, the discoverer of vitamin D. Along with his wife, Dr. May Mellanby, he identified dietary factors that control the formation and repair of teeth and bones. He also identified the cause of rickets (vitamin D deficiency) and the effect of phytic acid on mineral absorption. Truly a great man! This research began in the 1910s and continued through the 1940s.

What he discovered about tooth and bone formation is profound, disarmingly simple and largely forgotten. I remember going to the dentist as a child. He told me I had good teeth. I informed him that I tried to eat well and stay away from sweets. He explained to me that I had good teeth because of genetics, not my diet. I was skeptical at the time, but now I realize just how ignorant that man was.

Tooth structure is determined during growth. Well-formed teeth are highly resistant to decay while poorly-formed teeth are cavity-prone. Drs. Mellanby demonstrated this by showing a strong correlation between tooth enamel defects and cavities in British children. The following graph is drawn from several studies he compiled in the book Nutrition and Disease (1934). "Hypoplastic" refers to enamel that's poorly formed on a microscopic level.
The graph is confusing, so don't worry if you're having a hard time interpreting it. If you look at the blue bar representing children with well-formed teeth, you can see that 77% of them have no cavities, and only 7.5% have severe cavities (a "3" on the X axis). Looking at the green bar, only 6% of children with the worst enamel structure are without cavities, while 74% have severe cavities. Enamel structure is VERY strongly related to cavity prevalence.

What determines enamel structure during growth? Drs. Mellanby identified three dominant factors:
  1. The mineral content of the diet
  2. The fat-soluble vitamin content of the diet, chiefly vitamin D
  3. The availability of minerals for absorption, determined largely by the diet's phytic acid content
Teeth and bones are a mineralized protein scaffold. Vitamin D influences the quality of the protein scaffold that's laid down. For the scaffold to mineralize, the diet has to contain enough minerals, primarily calcium and phosphorus. Vitamin D allows the digestive system to absorb the minerals, but it can only absorb them if they aren't bound by phytic acid. Phytic acid is an anti-nutrient found primarily in unfermented seeds such as grains. So the process depends on getting minerals (sufficient minerals in the diet and low phytic acid) and putting them in the right place (fat-soluble vitamins).

Optimal tooth and bone formation occurs only on a diet that is sufficient in minerals and fat-soluble vitamins, and low in phytic acid. Drs. Mellanby used dogs in their experiments, which, it turns out, are a good model for tooth formation in humans for a reason I'll explain later. From Nutrition and Disease:
Thus, if growing puppies are given a limited amount of separated [skim] milk together with cereals, lean meat, orange juice, and yeast (i.e., a diet containing sufficient energy value and also sufficient proteins, carbohydrates, vitamins B and C, and salts), defectively formed teeth will result. If some rich source of vitamin D be added, such as cod-liver oil or egg-yolk, the structure of the teeth will be greatly improved, while the addition of oils such as olive... leaves the teeth as badly formed as when the basal diet only is given... If, when the vitamin D intake is deficient, the cereal part of the diet is increased, or if wheat germ [high in phytic acid] replaces white flour, or, again, if oatmeal [high in phytic acid] is substituted for white flour, then the teeth tend to be worse in structure, but if, under these conditions, the calcium intake is increased, then calcification [the deposition of calcium in the teeth] is improved.
Other researchers initially disputed the Mellanbys' results because they weren't able to replicate the findings in rats. It turns out, rats produce the phytic acid-degrading enzyme phytase in their small intestine, so they can extract minerals from unfermented grains better than dogs. Humans also produce phytase, but at levels so low they don't significantly degrade phytic acid. The small intestine of rats has about 30 times the phytase activity of the human small intestine, again demonstrating that humans are not well adapted to eating grains. Our ability to extract minerals from seeds is comparable to that of dogs, which shows that the Mellanbys' results are applicable to humans.

Drs. Mellanby found that the same three factors determine bone quality in dogs as well, which I may discuss in another post.

Is there anything someone with fully formed enamel can do to prevent tooth decay? Drs. Mellanby showed (in humans this time) that not only can tooth decay be prevented by a good diet, it can be almost completely reversed even if it's already present. Dr. Weston Price used a similar method to reverse tooth decay as well. I'll discuss that in my next post.

"Will CIGNA Remake The Health Plan Marketplace?"--CIGNA Embraces Onsite Clinics

Will CIGNA Remake The Health Plan Marketplace? by Brian Klepper. America’s health plans are floundering. If their job has been to provide the nation’s mainstream families with access to affordable care (let’s leave quality out of it for the moment), they have failed miserably, though they were very profitable along the way, at least until Q1 2008. In 2008, the Milliman Medical Index – an estimate of

Anybody Know Where We Can Find a Quick Trillion Dollars?

Thursday, March 26, 2009
"Irrational exuberance" over the chances for health care reform meet the budget realities.The House and Senate Budget committees have begun work on the federal budget.Last week’s CBO report estimated the Obama budget would:Produce a nearly $9.3 trillion deficit over the next decade.Generate annual budget deficits of nearly $1 trillion in each year from fiscal year 2010 to 2019.Increase budget

Skin Texture, Cancer and Dietary Fat

Wednesday, March 25, 2009
Richard and I exchanged a series of e-mails last week in which he remarked that Thai people generally have nice skin, which is something I've also noticed in Thai immigrants to the U.S. I believe you can often tell what kind of fat a person eats by looking at their face, especially as people age or bear children.

People who eat predominantly traditional fats like butter and coconut oil usually have nice skin. It's smoother, rosier and it ages more gracefully than the skin of a person who eats industrial fats like soy and corn oil. Coconut is the predominant fat in the traditional Thai diet. Coconut fat is about 87% saturated, far more than any animal fat*. Coconut oil and butter are very low in omega-6 linoleic acid, while industrial vegetable oils and margarine contain a lot of it.

I saw a great movie last week called "The Betrayal", about a family of Lao refugees that immigrated to the U.S. in the late 1970s. The director followed the family for 23 years as they tried to carve out a life for themselves in Brooklyn. The main fats in the traditional Lao diet are lard and coconut milk. The mother of the family was a nice looking woman when she left Laos. She was thin and had great skin and teeth, despite having delivered half a dozen children at that point. After 23 years in the U.S., she was overweight and her skin was colorless and pasty. At the end of the movie, they return to Laos to visit their family there. The woman's mother was still alive. She was nearly 100 years old and looked younger than her daughter.

Well that's a pretty story, but let's hit the science. There's a mouse model of skin cancer called the Skh:HR-1 hairless mouse. When exposed to UV rays and/or topical carcinogens, these mice develop skin cancer just like humans (especially fair-skinned humans). Researchers have been studying the factors that determine their susceptibility to skin cancer, and fat is a dominant one. Specifically, their susceptibility to skin cancer is determined by the amount of linoleic acid in the diet.

In 1994, Drs. Cope and Reeve published a study using hairless mice in which they put groups of mice on two different diets (Cope, R. B. & Reeve, V. E. (1994) Photochem. Photobiol. 59: 24 S). The first diet contained 20% margarine; the second was identical but contained 20% butter. Mice eating margarine developed significantly more skin tumors when they were exposed to UV light or a combination of UV and a topical carcinogen. Researchers have known this for a long time. Here's a quote from a review published in 1987:
Nearly 50 years ago the first reports appeared that cast suspicion on lipids, or peroxidative products thereof, as being involved in the expression of actinically induced cancer. Whereas numerous studies have implicated lipids as potentiators of specific chemical-induced carcinogenesis, only recently has the involvement of these dietary constituents in photocarcinogenesis been substantiated. It has now been demonstrated that both level of dietary lipid intake and degree of lipid saturation have pronounced effects on photoinduced skin cancer, with increasing levels of unsaturated fat intake enhancing cancer expression. The level of intake of these lipids is also manifested in the level of epidermal lipid peroxidation.
Here's a quote from a study conducted in 1996:
A series of semi-purified diets containing 20% fat by weight, of increasing proportions (0, 5%, 10%, 15% or 20%) of polyunsaturated sunflower oil mixed with hydrogenated saturated cottonseed oil, was fed to groups of Skh:HR-1 hairless mice during induction and promotion of photocarcinogenesis. The photocarcinogenic response was of increasing severity as the polyunsaturated content of the mixed dietary fat was increased, whether measured as tumour incidence, tumour multiplicity, progression of benign tumours to squamous cell carcinoma, or reduced survival... These results suggest that the enhancement of photocarcinogenesis by the dietary polyunsaturated fat component is mediated by an induced predisposition to persistent immunosuppression caused by the chronic UV irradiation, and supports the evidence for an immunological role in dietary fat modulation of photocarcinogenesis in mice.
In other words, UV-induced cancer increased in proportion to the linoleic acid content of the diet, because linoleic acid suppresses the immune system's cancer-fighting ability!

It doesn't end at skin cancer. In animal models, a number of cancers are highly sensitive to the amount of linoleic acid in the diet, including breast cancer. Once again, butter beats margarine and vegetable oils. Spontaneous breast tumors develop only half as frequently in rats fed butter as in rats fed margarine or safflower oil (Yanagi, S. et al. (1989) Comparative effects of butter, margarine, safflower oil and dextrin on mammary tumorigenesis in mice and rats. In: The Pharmacological Effects of Lipids.). The development of breast tumors in rats fed carcinogens is highly dependent on the linoleic acid content of the diet. The effect plateaus around 4.4% of calories, after which additional linoleic acid has no further effect.

Conversely, omega-3 fish oil protects against skin cancer in the hairless mouse, even in large amounts. In another study, not only did fish oil protect against skin cancer, it doubled the amount of time researchers had to expose the mice to UV light to cause sunburn!

Thus, the amount of linoleic acid in the diet as well as the balance between omega-6 and omega-3 determine the susceptibility of the skin to damage from UV rays. This is a very straightforward explanation for the beautiful skin of people eating traditional fats like butter and coconut oil. It's also a straightforward explanation for the poor skin and sharply rising melanoma incidence of Western nations (source). Melanoma is the most deadly form of skin cancer. If you're dark-skinned, you're off the hook:

I believe the other factor contributing to rising melanoma incidence is sunscreen. Most sunscreens block sunburn-causing UVB rays but not melanoma-causing UVA rays. The fact that they allow you to remain in the sun for longer without burning means they increase your exposure to UVA. I've written about this before. Sunscreen also blocks vitamin D formation in the skin, a process that some researchers believe also promotes cancer. I'll end with a couple more graphs that are self-explanatory (source). "PUFA" stands for polyunsaturated fatty acids, and primarily represents linoleic acid:





*Not only do Thais have clear skin, they also have clear arteries. Autopsies performed in the 1960s showed that residents of Bangkok had a low prevalence of atherosclerosis and a rate of heart attack (myocardial infarction) about 1/10 that of Americans living in Los Angeles.

Little Ado About Nothing—Part Deux

Last November, the insurance industry offered to do away with pre-existing condition limitations. This week the health insurance trade associations have also offered to phase out the practice of varying premiums based on health status in the individual market. From their letter to Congress this week: Specifically, by enacting an effective, enforceable requirement that all Americans assume

More Thoughts on the Glycemic Index

Monday, March 23, 2009
In the last post, I reviewed the controlled trials on the effect of the glycemic index (GI) of carbohydrate foods on health. I concluded that there is no convincing evidence that a low GI diet is better for health than a high GI diet, and in fact the long-term trials suggest that a high GI diet may even be better for insulin sensitivity.

Despite the graphs I presented in the last post, for the "average" individual the GI of carbohydrate foods can somewhat affect the glucose and insulin response, even in the context of an actual meal. If you compare two meals of very different GI, the low-GI meal will cause less insulin secretion and less total blood glucose over the course of the day (although the differences in blood glucose may not apply to all individuals).

But is that biologically significant? In other words, do those differences matter when it comes to health? I would argue probably not, and here's why: there's a difference between post-meal glucose and insulin surges and chronically elevated glucose and insulin. Chronically elevated insulin is a marker of metabolic dysfunction, while post-meal insulin surges are not (although glucose surges in excess of 140 mg/dL indicate glucose intolerance). Despite what you may hear from some sectors of the low-carbohydrate community, insulin surges do not necessarily lead to insulin resistance. Just ask a Kitavan. They get 69% of their 2,200 calories per day from high-glycemic starchy tubers and fruit (380 g carbohydrate), with not much fat to slow down digestion. Yet they have low fasting insulin, very little body fat and an undetectable incidence of diabetes, heart attack and stroke. That's despite a significant elderly population on the island.

Furthermore, in the 4-month GI intervention trial I mentioned last time, they measured something called glycated hemoglobin (HbA1c). HbA1c is a measure of the amount of blood glucose that has "stuck to" hemoglobin molecules in red blood cells. It's used to determine a person's average blood glucose concentration over the course of the past few weeks. The higher your HbA1c, the poorer your blood glucose control, the higher your likelihood of having diabetes, and the higher your cardiovascular risk. The low GI group had a statistically significant drop in their HbA1c value compared to the high GI group. But the difference was only 0.06%, a change that is biologically meaningless.
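
To put that 0.06% in perspective, here's a minimal back-of-the-envelope check in Python. It uses the commonly cited ADAG regression for converting HbA1c to estimated average glucose (eAG, in mg/dL); treat the coefficients as an approximation, and the 5.6% baseline as an arbitrary illustrative value, not a number from the trial.

    # Rough check of why a 0.06-point HbA1c difference is biologically trivial.
    # Uses the commonly cited ADAG regression eAG (mg/dL) = 28.7 * HbA1c - 46.7;
    # the 5.6% baseline below is just an illustrative non-diabetic value.

    def estimated_average_glucose(hba1c_percent):
        """Convert an HbA1c percentage to estimated average glucose (mg/dL)."""
        return 28.7 * hba1c_percent - 46.7

    baseline = estimated_average_glucose(5.6)
    low_gi = estimated_average_glucose(5.6 - 0.06)

    print("Baseline eAG: %.1f mg/dL" % baseline)             # ~114.0
    print("Low-GI eAG:   %.1f mg/dL" % low_gi)               # ~112.3
    print("Difference:   %.1f mg/dL" % (baseline - low_gi))  # ~1.7 mg/dL

In other words, the "statistically significant" advantage works out to roughly 1.7 mg/dL of average blood glucose.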

OK, let's take a step back. The goal of thinking about all this is to understand what's healthy, right? Let's take a look at how healthy cultures eat their carbohydrate foods. Cultures that rely heavily on carbohydrate generally fall into three categories: they eat cooked starchy tubers, they grind and cook their grains, or they rely on grains that become very soft when cooked. In the first category, we have Africans, South Americans, Polynesians and Melanesians (including the Kitavans). In the second, we have various Africans, Europeans (including the villagers of the Loetschental valley), Middle Easterners and South Americans. In the third category, we have Asians, Europeans (the oat-eating residents of the outer Hebrides) and South Americans (quinoa-eating Peruvians).

The pattern here is one of maximizing GI, not minimizing it. That's not because high GI foods are inherently superior, but because traditional processing techniques that maximize the digestibility of carbohydrate foods also tend to increase their GI. I believe healthy cultures around the world didn't care about the glycemic index of foods, they cared about digestibility and nutritional value.

The reason we grind grains is simple. Ground grains are digested more easily and completely (hence the higher GI).  Furthermore, ground grains are more effective than intact grains at breaking down their own phytic acid when soaked, particularly if they're allowed to ferment. This further increases their nutritional value.

The human digestive system is delicate. Cows can eat whole grass seeds and digest them using their giant four-compartment stomach that acts as a fermentation tank. Humans that eat intact grains end up donating them to the waste treatment plant. We just don't have the hardware to efficiently extract the nutrients from cooked whole rye berries, unless you're willing to chew each bite 47 times. Oats, quinoa, rice, beans and certain other starchy seeds are exceptions because they're softened sufficiently by cooking.

Grain consumption and grinding implements appear simultaneously in the archaeological record. Grinding has always been used to increase the digestibility of tough grains, even before the invention of agriculture when hunter-gatherers were gathering wild grains in the fertile crescent. Some archaeologists consider grinding implements one of the diagnostic features of a grain-based culture. Carbohydrate-based cultures have always prioritized digestibility and nutritional value over GI.

Finally, I'd like to emphasize that some people don't have a good relationship with carbohydrate. Diabetics and others with glucose intolerance should be very cautious with carbohydrate foods. The best way to know how you deal with carbohydrate is to get a blood glucose meter and use it after meals. For $70 or less, you can get a cheap meter and 50 test strips that will give you a very good idea of your glucose response to typical meals (as opposed to a glucose bomb at the doctor's office). Jenny Ruhl has a tutorial that explains the process. It's also useful to pay attention to how you feel and look with different amounts of carbohydrate in your diet.
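
If you go the meter route, the bookkeeping is simple enough to script. Here's a minimal Python sketch of one way to log post-meal readings and flag anything over the 140 mg/dL threshold mentioned above; the meals and numbers are invented for illustration.

    # Log post-meal glucose readings and flag any above 140 mg/dL,
    # the glucose-intolerance threshold mentioned earlier. All values invented.

    THRESHOLD = 140  # mg/dL

    readings = [
        # (meal, hours after eating, glucose in mg/dL)
        ("oatmeal + fruit",      1.0, 128),
        ("oatmeal + fruit",      2.0, 104),
        ("potato + white bread", 1.0,  98),
        ("potato + white bread", 2.0,  87),
    ]

    for meal, hours, glucose in readings:
        flag = "HIGH" if glucose > THRESHOLD else "ok"
        print("%-22s %.1f h  %3d mg/dL  %s" % (meal, hours, glucose, flag))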

Integrated Nutrition, Lifestyle and Health Database

Saturday, March 21, 2009
Ricardo from the website Canibais e Reis has just released a fantastic resource for anyone who's interested in the relationship between nutrition, lifestyle and health. It's an Excel spreadsheet that integrates information from several international sources, including:
  • UN Food and Agriculture Organization Statistical Yearbook
  • FAOSTAT food consumption database
  • British Heart Foundation Health Statistics database
  • World Health Organization Global Health Atlas
This database provides a wealth of information on 86 different countries, and even includes a macro feature that automatically plots variables. This is an empowering resource for those of us who like to do our own research and come to our own conclusions, and I thank Ricardo for his hard work.
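
If you'd rather explore the data outside of Excel, something like the Python sketch below works. The file name, sheet name, and column names ("Country", "FatIntake_g_day", "CHD_Mortality") are hypothetical placeholders, so check the actual workbook for the real labels before running it.

    # Hedged sketch: scatter-plot two country-level variables from the workbook
    # using pandas and matplotlib. The file, sheet, and column names below are
    # placeholders, not the spreadsheet's real labels.

    import pandas as pd
    import matplotlib.pyplot as plt

    df = pd.read_excel("nutrition_lifestyle_health.xls", sheet_name="Data")

    plt.scatter(df["FatIntake_g_day"], df["CHD_Mortality"])
    plt.xlabel("Fat intake (g/day)")
    plt.ylabel("CHD mortality (per 100,000)")
    plt.title("Country-level fat intake vs. CHD mortality")

    # Label each point with its country for easier reading.
    for _, row in df.iterrows():
        plt.annotate(row["Country"],
                     (row["FatIntake_g_day"], row["CHD_Mortality"]),
                     fontsize=7)

    plt.tight_layout()
    plt.show()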

You can read more about the database and download it here.

It's Time to Let Go of The Glycemic Index

Thursday, March 19, 2009
The glycemic index (GI) is a measure of how much an individual food elevates blood sugar when it's eaten. To measure it, investigators feed a person a food that contains a fixed amount of carbohydrate, and measure their blood glucose response over time. Then they determine the area under the glucose curve and compare it to the curve produced by a rapidly digested reference food such as white bread or pure glucose.

Each food must contain the same total amount of carbohydrate, so you might have to eat a big plate of carrots to compare with a slice of bread. You end up with a number that reflects the food's ability to elevate glucose when eaten in isolation. It typically depends on how quickly the carbohydrate is absorbed, with higher numbers usually resulting from faster absorption.
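
As a concrete illustration, here's a minimal Python sketch of the calculation just described: compute the incremental area under the post-meal glucose curve (above the fasting baseline) for the test food and for a reference food containing the same amount of carbohydrate, then express the ratio as a percentage. The curves below are invented, and the handling of dips below baseline is simplified relative to the standard method.

    # Simplified glycemic index calculation: trapezoidal area above the fasting
    # baseline for a test food, divided by the same area for a reference food
    # (glucose or white bread) with equal carbohydrate. Curves are invented.

    def incremental_auc(times_min, glucose_mg_dl):
        """Trapezoidal area above the time-zero baseline; dips below baseline
        are ignored, a simplification of the standard incremental-AUC method."""
        baseline = glucose_mg_dl[0]
        auc = 0.0
        for i in range(1, len(times_min)):
            dt = times_min[i] - times_min[i - 1]
            rise_prev = max(glucose_mg_dl[i - 1] - baseline, 0)
            rise_curr = max(glucose_mg_dl[i] - baseline, 0)
            auc += 0.5 * (rise_prev + rise_curr) * dt
        return auc

    times = [0, 15, 30, 45, 60, 90, 120]            # minutes after eating
    reference = [85, 130, 160, 150, 130, 105, 90]   # e.g. 50 g pure glucose
    test_food = [85, 105, 125, 120, 110, 95, 88]    # e.g. 50 g carbohydrate from a test food

    gi = 100 * incremental_auc(times, test_food) / incremental_auc(times, reference)
    print("Glycemic index of test food: %.0f" % gi)   # ~52 with these made-up curves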

The GI is a standby of modern nutritional advice. It's easy to believe in because processed foods tend to have a higher glycemic index than minimally processed foods, high blood sugar is bad, and chronically high insulin is bad. But many people have criticized the concept, and rightly so.

Blood sugar responses to carbohydrate-containing foods vary greatly from person to person. For example, I can eat a medium potato and a big slice of white bread (roughly 60 g carbohydrate) with nothing else and only see a modest spike in my blood sugar. I barely break 100 mg/dL and I'm back at fasting glucose levels within an hour and a half. You can see a graph of this experiment here. That's what happens when you have a well-functioning pancreas and insulin-sensitive tissues. Your body shunts glucose into the tissues almost as rapidly as it enters the bloodstream. Someone with impaired glucose tolerance might have gone up to 170 mg/dL for two and a half hours on the same meal.

The other factor is that foods aren't eaten in isolation. Fat, protein, acidity and other factors slow carbohydrate absorption in the context of a normal meal, to the point where the differences in GI between individual foods become much less pronounced.

It's time to put my money where my mouth is. Researchers have conducted a number of controlled trials comparing low-GI diets to high-GI diets. I've done an informal literature review to see what the overall findings are. I'm only interested in long-term studies-- 10 weeks or longer-- and I've excluded studies using subjects with metabolic disorders such as diabetes.

The question I'm asking with this review is, what are the health effects of a low-glycemic index diet on a healthy normal-weight or overweight person? I found a total of seven studies on PubMed in which investigators varied GI while keeping total carbohydrate about the same, for 10 weeks or longer. I'll present them out of chronological order because they flow better that way.

Study #1. Investigators put overweight women on a 12-week diet of either high-GI or low-GI foods with an equal amount of total carbohydrate. Both were unrestricted in calories. Body composition and total food intake were the same on both diets. The reason became apparent when they measured the subjects' glucose and insulin response to the high- and low-GI meals, and found that they were the same!

Study #2. Investigators divided 129 overweight young adults into four different diet groups for 12 weeks. Diet #1: high GI, high carbohydrate (60%). Diet #2: low GI, high carbohydrate. Diet #3: high GI, high-protein (28%). Diet #4: low GI, high protein. The high-protein diets were also a bit higher in fat. Although the differences were small and mostly not statistically significant, participants on diet #3 improved the most overall in my opinion. They lost the most weight, and had the greatest decrease in fasting insulin and calculated insulin resistance. Diet #2 came out modestly ahead of diet #1 on fat loss and fasting insulin.

Study #3. At 18 months, this is by far the longest trial. Investigators assigned 203 healthy Brazilian women to either a low-GI or high-GI energy-restricted diet. The difference in GI between the two diets was very large; the GI of the high-GI diet was double that of the low-GI diet. Weight loss was a meager 1/3 pound greater in the low-GI group, a difference that was not statistically significant at 18 months. Insulin resistance and fasting insulin decreased in the high-GI group but increased in the low-GI group, although those changes were also not statistically significant.

Study #4. The FUNGENUT study. In this 12-week intervention, investigators divided 47 subjects with the metabolic syndrome into two diet groups. One was a high-glycemic, high-wheat group; the other was a low-glycemic, high-rye group. After 12 weeks, there was an improvement in the insulinogenic index (a marker of early insulin secretion in response to carbohydrate) in the rye group but not the wheat group. Glucose tolerance was essentially the same in both groups.

What makes this study unique is they went on to look at changes in gene expression in subcutaneous fat tissue before and after the diets. They found a decrease in the expression of stress and inflammation-related genes in the rye group, and an increase in stress and inflammation genes in the wheat group. They interpreted this as being the result of the different GIs of the two diets.

I have a different interpretation. I believe wheat is a uniquely unhealthy food that promotes inflammation and general metabolic havoc over a long period of time. This probably relates at least in part to its gluten content, which is double that of rye. Dr. William Davis has had great success with his cardiac patients by counseling them to eliminate wheat. He agrees, based on his clinical experience, that wheat has uniquely damaging effects on the metabolism that other sources of starch do not have.

Study #5. This is the only study I've seen that has found a tangible benefit for glycemic index modification. Investigators divided 18 subjects with elevated cardiovascular disease risk markers into two diets differing in their GI, for 12 weeks. The low-glycemic group lost 4 kg (statistically significant), while the high-glycemic group lost 1.5 kg (not statistically significant). In addition, the low-GI group ended up with lower 24-hour blood glucose measurements. This study was a bit strange because the high-GI group started off 14 kg heavier than the low-GI group, and the way the data are reported is difficult to understand. Perhaps these limitations, along with the study's incongruence with other controlled trials, are what inspired the authors to describe it as a pilot study.

Study #6. 45 overweight females were divided between high-GI and low-GI diets for 10 weeks. The low-GI group lost a small amount more fat than the high-GI group, but the difference wasn't significant. The low-GI group also had a 10% drop in LDL cholesterol.

Study #7. This was the second-longest trial, at 4 months. 34 subjects with impaired glucose tolerance were divided into three diet groups. Diet #1: high-carbohydrate (60%), high-GI. Diet #2: high-carbohydrate, low-GI. Diet #3: "low-carbohydrate" (49%), "high-fat" (monounsaturated from olive and canola oil). The diet #1 group lost the most weight, followed by diet #2, while diet #3 gained weight. The differences were small but statistically significant. The insulin and triglyceride response to a test meal improved in diet group #1 but not #2. The insulin response also improved in group #3. The high-GI group came out looking pretty good. 

[Update 10/2011-- please see this post for a recent example of a 6 month controlled trial including 720 participants that tested the effect of glycemic index modification on body fatness and health markers-- it is consistent with the conclusion below]

Overall, these studies do not support the idea that lowering the glycemic index of carbohydrate foods is useful for weight loss, insulin or glucose control, or anything else besides complicating your life. 

Further reading:

The Fructose Index is the New Glycemic Index

The Newest Health Care Reform Arithmetic--Unbelievable!

"A coalition representing 30 health care organizations on Monday asked lawmakers in the House and Senate to suspend pay-as-you-go rules when drafting and passing health care overhaul legislation, saying much of the savings introduced by such a plan would be realized beyond the rules' 10-year budget window."That paragraph from last week's Kaiser Daily Health Policy Report caught my eye.I don't

Physician Payment Reform--Time for Hard Choices

Wednesday, March 18, 2009
I recently authored a guest editorial in the February 15th edition of Family Practice News--"The Leading Independent Newspaper for Family Physicians." Many years ago, Congress established the Sustainable Growth Rate formula (SGR) to control physician spending in Medicare. The concept is simple: if Medicare physician costs grow at a pace beyond affordability, next year's payments get cut to

The Intensifying Collapse of the Health Care System, Why It's Different This Time, and What We Need to Think About Along the Way

Tuesday, March 17, 2009
The Intensifying Collapse of the Health Care System, Why It's Different This Time, and What We Need to Think About Along the Way, by Brian Klepper and David C. Kibbe. More than at any time in recent memory, powerful forces are buffeting the health care sector. We are in the midst of profound upheaval, driven by market and policy responses to the industry's long-term excesses. We can already see

Paleopathology at the Origins of Agriculture

Sunday, March 15, 2009
In April of 1982, archaeologists from around the globe converged on Plattsburgh, New York for a research symposium. Their goal:
...[to use] data from human skeletal analysis and paleopathology [the study of ancient diseases] to measure the impact on human health of the Neolithic Revolution and antecedent changes in prehistoric hunter-gatherer food economies. The symposium developed out of our perception that many widely debated theories about the origins of agriculture had testable but untested implications concerning human health and nutrition and our belief that recent advances in techniques of skeletal analysis, and the recent explosive increase in data available in this field, permitted valid tests of many of these propositions.
In other words, they got together to see what happened to human health as populations adopted agriculture. They were kind enough to publish the data presented at the symposium in the book Paleopathology at the Origins of Agriculture, edited by the erudite Drs. Mark Nathan Cohen and George J. Armelagos. It appears to be out of print, but luckily I have access to an excellent university library.

There are some major limitations to studying human health by looking at bones. The most obvious is that any soft tissue pathology will have been erased by time. Nevertheless, you can learn a lot from a skeleton. Here are the main health indicators discussed in the book:
  • Mortality. Archaeologists are able to judge a person's approximate age at death, and if the number of skeletons is large enough, they can paint a rough picture of the life expectancy and infant mortality of a population.
  • General growth. Total height, bone thickness, dental crowding, and pelvic and skull shape are all indicators of relative nutrition and health. This is particularly true in a genetically stable population. Pelvic depth is sensitive to nutrition and determines the size of the birth canal in women.
  • Episodic stress. Bones and teeth carry markers of temporary "stress", most often due to starvation or malnutrition. Enamel hypoplasia, horizontal bands of thinned enamel on the teeth, is probably the most reliable marker. Harris lines, bands of increased density in long bones that may be caused by temporary growth arrest, are another type.
  • Porotic hyperostosis and cribra orbitalia. These are both skull deformities that are caused by iron deficiency anemia, and are rather creepy to look at. They're typically caused by malnutrition, but can also result from parasites.
  • Periosteal reactions. These are bone lesions resulting from infections.
  • Physical trauma, such as fractures.
  • Degenerative bone conditions, such as arthritis.
  • Isotopes and trace elements. These can sometimes yield information about the nutritional status, diet composition and diet quality of populations.
  • Dental pathology. My favorite! This category includes cavities, periodontal disease, missing teeth, abscesses, tooth wear, and excessive dental plaque.
The book presents data from 19 regions of the globe, representing Africa, Asia, the Middle East, Europe, and South America, with a particular focus on North America. I'll kick things off with a fairly representative description of health in the upper Paleolithic in the Eastern Mediterranean. The term "Paleolithic" refers to the period from the invention of stone tools by hominids 2.5 million years ago to the invention of agriculture roughly 10,000 years ago. The upper Paleolithic lasted from about 40,000 to 10,000 years ago. From page 59:
In Upper Paleolithic times nutritional health was excellent. The evidence consists of extremely tall stature from plentiful calories and protein (and some microevolutionary selection?); maximum skull base height from plentiful protein, vitamin D, and sunlight in early childhood; and very good teeth and large pelvic depth from adequate protein and vitamins in later childhood and adolescence...
Adult longevity, at 35 years for males and 30 years for females, implies fair to good general health...
There is no clear evidence for any endemic disease.
The level of skeletal (including cranial and pelvic) development Paleolithic groups exhibited has remained unmatched throughout the history of agriculture. There may be exceptions but the trend is clear. Cranial capacity was 11% higher in the upper Paleolithic. You can see the pelvic data in this table taken from Paleopathology at the Origins of Agriculture.

There's so much information in this book, the best I can do is quote pieces of the editor's summary and add a few remarks of my own. One of the most interesting things I learned from the book is that the diet of many hunter-gatherer groups changed at the end of the upper Paleolithic, foreshadowing the shift to agriculture. From pages 566-568:
During the upper Paleolithic stage, subsistence seems focused on relatively easily available foods of high nutritional value, such as large herd animals and migratory fish. Some plant foods seem to have been eaten, but they appear not to have been quantitatively important in the diet. Storage of foods appears early in many sequences, even during the Paleolithic, apparently to save seasonal surpluses for consumption during seasons of low productivity.

As hunting and gathering economies evolve during the Mesolithic [period of transition between hunting/gathering and agriculture], subsistence is expanded by exploitation of increasing numbers of species and by increasingly heavy exploitation of the more abundant and productive plant species. The inclusion of significant amounts of plant food in prehistoric diets seems to correlate with increased use of food processing tools, apparently to improve their taste and digestibility. As [Dr. Mark Nathan] Cohen suggests, there is an increasing focus through time on a few starchy plants of high productivity and storability. This process of subsistence intensification occurs even in regions where native agriculture never developed. In California, for example, as hunting-gathering populations grew, subsistence changed from an early pattern of reliance on game and varied plant resources to one with increasing emphasis on collection of a few species of starchy seeds and nuts.

...As [Dr. Cohen] predicts, evolutionary change in prehistoric subsistence has moved in the direction of higher carrying capacity foods, not toward foods of higher-quality nutrition or greater reliability. Early nonagricultural diets appear to have been high in minerals, protein, vitamins, and trace nutrients, but relatively low in starch. In the development toward agriculture there is a growing emphasis on starchy, highly caloric food of high productivity and storability, changes that are not favorable to nutritional quality but that would have acted to increase carrying capacity, as Cohen's theory suggests.
Why am I getting the feeling that these archaeologists have a better grasp of human nutrition than the average medical doctor or nutritionist? They have the Price-esque understanding that comes from comparing the diets and multi-generational health of diverse human populations.

Another interesting thing I learned from the book is that Mesolithic populations, groups that were halfway between farming and hunting-gathering, were generally as healthy as hunter-gatherers:
...it seems clear that seasonal and periodic physiological stress regularly affected most prehistoric hunting-gathering populations, as evidenced by the presence of enamel hypoplasias and Harris lines. What also seems clear is that severe and chronic stress, with high frequency of hypoplasias, infectious disease lesions, pathologies related to iron-deficiency anemia, and high mortality rates, is not characteristic of these early populations. There is no evidence of frequent, severe malnutrition, so the diet must have been adequate in calories and other nutrients most of the time. During the Mesolithic, the proportion of starch in the diet rose, to judge from the increased occurrence of certain dental diseases [with exceptions to be noted later], but not enough to create an impoverished diet... There is a possible slight tendency for Paleolithic people to be healthier and taller than Mesolithic people, but there is no apparent trend toward increasing physiological stress during the mesolithic.
Cultures that adopted intensive agriculture typically showed a marked decline in health indicators. This is particularly true of dental health, which usually became quite poor.
Stress, however, does not seem to have become common and widespread until after the development of high degrees of sedentism, population density, and reliance on intensive agriculture. At this stage in all regions the incidence of physiological stress increases greatly, and average mortality rates increase appreciably. Most of these agricultural populations have high frequencies of porotic hyperostosis and cribra orbitalia, and there is a substantial increase in the number and severity of enamel hypoplasias and pathologies associated with infectious disease. Stature in many populations appears to have been considerably lower than would be expected if genetically-determined maxima had been reached, which suggests that the growth arrests documented by pathologies were causing stunting... Incidence of carbohydrate-related tooth disease increases, apparently because subsistence by this time is characterized by a heavy emphasis on a few starchy food crops.
Infectious disease increased upon agricultural intensification:
Most [studies] conclude that infection was a more common and more serious problem for farmers than for their hunting and gathering forebears; and most suggest that this resulted from some combination of increasing sedentism, larger population aggregates, and the well-established synergism between infection and malnutrition.
There are some apparent exceptions to the trend of declining health with the adoption of intensive agriculture. In my observation, they fall into two general categories. In the first, health improves upon the transition to agriculture because the hunter-gatherer population was unhealthy to begin with. This is due to living in a marginal environment or eating a diet with a high proportion of wild plant seeds. In the second category, the culture adopted rice. Rice is associated with less of a decline in health, and in some cases an increase in overall health, than other grains such as wheat and corn. In chapter 21 of the book Ancient Health: Bioarchaeological Interpretations of the Human Past, Drs. Michelle T Douglas and Michael Pietrusewsky state that "rice appears to be less cariogenic [cavity-promoting] than other grains such as maize [corn]."

One pathology that seems to have decreased with the adoption of agriculture is arthritis. The authors speculate that it may have more to do with strenuous activity than other aspects of the lifestyle such as diet. Another interpretation is that the hunter-gatherers appeared to have a higher arthritis rate because of their longer lifespans:
The arthritis data are also complicated by the fact that the hunter-gatherers discussed commonly displayed higher average ages at death than did the farming populations from the same region. The hunter-gatherers would therefore be expected to display more arthritis as a function of age even if their workloads were comparable [to farmers].
In any case, it appears arthritis is normal for human beings and not a modern degenerative disease.

And the final word:
Taken as a whole, these indicators fairly clearly suggest an overall decline in the quality-- and probably in the length-- of human life associated with the adoption of agriculture.

Are the MK-4 and MK-7 Forms of Vitamin K2 Equivalent?

Wednesday, March 11, 2009
The vitamin K found in food can be divided into two categories: phylloquinone (K1) and menaquinone (K2). K1 is concentrated in leafy greens and other green vegetables. K2 can be further subdivided into menaquinone-4 through -14. The number represents the length of the side chain attached to the naphthoquinone ring. Menaquinone-"X" can be abbreviated MK-"X". MK-4 is the type synthesized by animals for their own use from K1 (and from MK-7 in rats). MK-5 through MK-14 are synthesized by bacteria. MK-7 in particular is made in large amounts by the bacterium Bacillus subtilis, which ferments the infamous Japanese condiment natto. It's also sold as a supplement. Animals concentrate MK-4 in a number of organs, with smaller amounts of K1. Certain organs such as the brain, pancreas and salivary gland show an overwhelming preference for MK-4 over K1 in rodents and humans. The liver is a notable exception; in some animals, including humans, it concentrates longer menaquinones to a greater extent than MK-4 if they're present in the diet.

As far as I can tell, MK-4 is capable of performing all the functions of vitamin K. MK-4 can even activate blood clotting factors, which is a role traditionally ascribed to vitamin K1. Babies are often born clotting deficient, which is why we give newborns vitamin K1 injections in the U.S. to prevent hemorrhaging. In Japan, they give children MK-4 to prevent hemorrhage, an intervention that is very effective. Could that have to do with the fact that Japan has half the infant mortality rate of the U.S.?

Certain cultures would have had a predominance of MK-4 over other forms of vitamin K in the diet, which supports the idea that MK-4 can stand nearly alone. These cultures include heavy consumers of dairy like the Masai. Humans go through one of their most critical growth phases-- infancy-- with most of their vitamin K coming from MK-4. Colostrum, the first milk to come out, is particularly rich in MK-4.

Vitamin K is required to activate certain types of proteins, called Gla proteins. Gla stands for gamma-carboxyglutamic acid, a modified amino acid that's synthesized using vitamin K (by a reaction called gamma-carboxylation). Gla proteins are important: the class includes matrix Gla protein (MGP), osteocalcin and the blood clotting factors, which keep arteries clear, bones strong and blood clotting correctly.

I've said before that vitamin K's function is to carboxylate Gla proteins. In fact, that's a gross oversimplification. Research on vitamin K2 is turning up new functions all the time. One of the more exciting discoveries is that it acts like a hormone, activating a nuclear receptor called the steroid and xenobiotic receptor (SXR) and thereby influencing the expression of a number of genes. This puts it in the same category as vitamins A and D. It also acts as an antioxidant, a cofactor for sphingolipid synthesis in the brain, and an activator of protein kinase A signaling. These are all functions that have been studied in the context of MK-4, and for most of them, no one knows whether MK-7 has equivalent effects.

I'm always on the lookout for studies that can shed light on the question of whether MK-4 and MK-7 are equivalent. MK-7 is able to activate clotting factors and osteocalcin, so it can clearly function as a cofactor for gamma-carboxylation in some contexts. Osteocalcin is a Gla protein that's important for bone health. MK-7 supposedly stays in the blood longer than MK-4 in humans, which is one of the things MK-7 supplement manufacturers like to mention, but those findings come from work conducted by MK-7 supplement vendors and the results have not been published. Interestingly, MK-4 and MK-7 have the exact same plasma half-life in rats, so I think the human experiment should be repeated. In any case, a longer plasma half-life is not evidence for the superiority of one form over another, in my opinion.

Today, I found another difference between MK-4 and MK-7. I was reading a paper about SXR-independent effects of vitamin K2 on gene expression. The investigators found that MK-4 strongly activates transcription of two specific genes in osteoblast cells. Osteoblasts are cells that create bone tissue. The genes are GDF15 and STC2 and they're involved in bone and cartilage formation. They tested K1 and MK-7, and in contrast to MK-4, they did not activate transcription of the genes in the slightest. This shows that MK-4 has effects on gene expression in bone tissue that MK-7 doesn't have.

I tend to think there's a reason why animals synthesize MK-4 rather than other forms of vitamin K2. Vitamin K2 MK-4 seems to be able to perform all the functions of vitamin K, including activating Gla proteins, participating in sphingomyelin synthesis, binding SXR, and activating transcription through protein kinase A. That's what you would expect for an animal that had evolved to use its own form of K2. Investigators haven't tested whether MK-7 is capable of performing all these functions, but apparently there's at least one it cannot perform.

I'd bet my bottom dollar there are other important functions of MK-4 that have not yet been identified, and functions whose full importance has not yet been appreciated. There's no way to know whether MK-7 can fully stand in for MK-4 as long as we don't know all of MK-4's functions. I also think it's worth mentioning that MK-4 is the only form of vitamin K2 that's been shown to reduce fracture risk in clinical trials.

That being said, MK-7 may still have a place in a healthy diet. Just because it can't do everything MK-4 can, doesn't mean it has no role. It may be able to fill in for MK-4 in some functions, or reduce the dietary need for MK-4. But no one really knows at this point. Hunter-gatherers would have had a source of longer menaquinones, including MK-7, from livers. So it's possible that we're adapted to a modest MK-7 intake on top of MK-4.

Margarine and Phytosterolemia

Monday, March 9, 2009
Margarine is one of my favorite foods. To rip on. It's just so easy!

The body has a number of ways of keeping bad things out while taking good things in. One of the things it likes to keep out is plant sterols and stanols (phytosterols), cholesterol-like molecules found in plants. The human body even has two transport proteins dedicated to pumping phytosterols back into the gut as they try to diffuse across the intestinal lining: the sterolins. These transporters actively block phytosterols from passing into the body, but allow cholesterol to enter. Still, a little bit gets through, proportional to the amount in the diet.

As a matter of fact, the body tries to keep most things out except for the essential nutrients and a few other useful molecules. Phytosterols, plant "antioxidants" like polyphenols, and just about anything else that isn't body building material gets actively excluded from circulation or rapidly broken down by the liver. And almost none of it gets past the blood-brain barrier, which protects one of our most delicate organs. It's not surprising once you understand that many of these substances are bioactive: they have drug-like effects that interfere with enzyme activity and signaling pathways. For example, the soy isoflavone genistein abnormally activates estrogen receptors. Your body does not like to hand over the steering wheel to plant chemicals, so it actively defends itself.

A number of trials have shown that large amounts of phytosterols in the diet lower total cholesterol and LDL. This has led to the (still untested) belief that phytosterols lower heart attack risk. The main problem with this belief is that statins are the only cholesterol- and LDL-lowering therapy that also lowers mortality (in some specific groups of people), and the survival benefit of statins is probably not even mediated by their cholesterol-lowering effects! See this article by Anthony Colpo for further reading.

Lowering total cholesterol and LDL through diet and drugs other than statins does not reduce mortality in controlled trials. Decades of controlled diet trials showed that replacing saturated fat with polyunsaturated vegetable oil lowers cholesterol, lowers LDL, but doesn't touch total mortality or death from cardiovascular disease. Soy contains a lot of phytosterols, which is one of the reasons it's heavily promoted as a health food.

All right, let's put on our entrepreneur hats. We know phytosterols lower cholesterol. We know soy is being promoted as a healthier alternative to meat. We know butter is being demonized as a source of artery-clogging saturated fat. I have an idea. Let's make a margarine that contains a massive dose of phytosterols and market it as heart-healthy. We'll call it Benecol, and we'll have doctors recommend it to cardiac patients.

Here are the ingredients:
Liquid Canola Oil, Water, Partially Hydrogenated Soybean Oil, Plant Stanol Esters, Salt, Emulsifiers (Vegetable Mono- and Diglycerides, Soy Lecithin), Hydrogenated Soybean Oil, Potassium Sorbate, Citric Acid and Calcium Disodium EDTA to Preserve Freshness, Artificial Flavor, DL-alpha-Tocopheryl Acetate, Vitamin A Palmitate, Colored with Beta Carotene.
Are you kidding me? Partially hydrogenated soybean oil for cardiac patients? A nice big dose of omega-6 linoleic acid? A mega-dose of "heart-healthy" plant stanols? This stuff is like a molotov cocktail for your coronary arteries!

And I haven't even gotten to the best part yet. There's a little disorder called phytosterolemia that proponents of phytosterols conveniently ignore. These patients have a mutation in one of their sterolin genes that allows phytosterols (including stanols) to pass into their circulation more easily. They end up with 10-25 times more phytosterols in their circulation than a normal individual. What kind of health benefits do these people see? Premature atherosclerosis, an early death from heart attacks, abnormal accumulation of sterols and stanols in the tendons, and liver damage.

Despite the snappy-looking tub, margarine is just another industrial food-like substance that will help you get underground in a hurry. In the U.S., manufacturers can put the statement "no trans fat" on a product's label, and "0 g trans fat" on the nutrition label, if it contains less than 0.5 grams of trans fat per serving. A serving of Benecol is 14 grams. That means it could be up to 3.5 percent trans fat and still labeled "no trans fat". That's a crime. This stuff is being recommended to cardiac patients.

When deciding whether or not a food is healthy, the precautionary principle is in order. Margarine is a food that has not withstood the test of time. Show me a single healthy culture on this planet that eats margarine regularly. Cow juice may not be as flashy as the latest designer food, but it has sustained healthy cultures for generations. The U.S. used to belong to those ranks, when coronary heart disease was a twinkle in a cardiologist's eye. The likelihood of a new industrially processed food being healthy is about the same as a blindfolded monkey making a basket from half court. It's not impossible, but I wouldn't stake my life on it.

Latest Study on Vitamin K and Coronary Heart Disease

Saturday, March 7, 2009
A Dutch group led by Dr. Yvonne T. van der Schouw recently published a paper examining the link between vitamin K intake and heart attack (thanks Robert). They followed 16,057 women ages 49-70 years for an average of 8.1 years, collecting data on their diet and incidence of heart attack.

They found no relationship between K1 intake and heart attack incidence. K1 is the form found in leafy greens and other plant foods. They found that each 10 microgram increase in daily vitamin K2 consumption was associated with a 9% lower incidence of heart attack. Participants consumed an average of 29 micrograms of K2 per day, with a range of 0.9 to 128 micrograms. That means that participants with the highest intake had a much lower incidence of heart attack, on average. Vitamin K2 comes from animal foods (especially organs and pastured dairy) and fermented foods such as cheese, sauerkraut, miso and natto. Vitamin K is fat-soluble, so low-fat animal foods contain less of it. Animal foods contain the MK-4 subtype, while fermentation produces longer menaquinones, MK-5 through MK-14.
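
For a rough sense of what that spread implies, here's a back-of-the-envelope Python calculation. It assumes the reported 9% reduction per 10 micrograms acts multiplicatively (a hazard ratio of about 0.91 per step), which is the usual way such associations are modeled; the intake figures are the ones reported in the paper, and the result is illustrative only.

    # Back-of-the-envelope: compound a ~0.91 hazard ratio per 10 micrograms/day
    # of K2 across the study's reported intake range (0.9 to 128 micrograms/day).
    # Assumes the log-linear association holds over the whole range.

    hr_per_10_ug = 0.91
    low_intake, high_intake = 0.9, 128.0   # micrograms/day

    steps = (high_intake - low_intake) / 10.0
    implied_hr = hr_per_10_ug ** steps

    print("10-microgram steps across the range: %.1f" % steps)                    # ~12.7
    print("Implied hazard ratio, highest vs. lowest intake: %.2f" % implied_hr)   # ~0.30

Under that assumption, the highest-intake participants would have roughly a 70% lower incidence than the lowest-intake participants, which is what I mean by "very much reduced."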

There's quite a bit of evidence to support the idea that vitamin K2 inhibits and possibly reverses arterial calcification, which is possibly the best overall measure of heart attack risk. It began with the observations of Dr. Weston Price, who noticed an inverse relationship between the K2 MK-4 content of butter and deaths from coronary heart disease and pneumonia in several regions of the U.S. You can find those graphs in Nutrition and Physical Degeneration.

The 25% of participants eating the most vitamin K2 (and with the lowest heart attack risk) also had the highest saturated fat, cholesterol, protein and calcium intake. They were much less likely to have elevated cholesterol, but were more likely to be diabetic.

Here's where the paper gets strange. They analyzed the different K2 subtypes individually (MK-4 through MK-9). MK-7 and MK-6 had the strongest association with reduced heart attack risk per microgram consumed, while MK-4 had no significant relationship. MK-8 and MK-9 had a weak but significant protective relationship.

There are a few things that make me skeptical about this result. First of all, the studies showing prevention/reversal of arterial calcification in rats were done with MK-4. MK-4 inhibits vascular calcification in rats whereas I don't believe the longer menaquinones have been tested. Furthermore, they attribute a protective effect to MK-7 in this study, but the average daily intake was only 0.4 micrograms! You could get that amount of K2 if a Japanese person who had eaten natto last week sneezed on your food. I can't imagine that amount of MK-7 is biologically significant. That, among other things, makes me skeptical of what they're really observing.

I'm not convinced of their ability to parse the effect into the different K2 subtypes. They mentioned in the methods section that their diet survey wasn't very accurate at estimating the individual K2 subtypes. Combine that with the fact that the K2 content of foods varies quite a bit by animal husbandry practice and type of cheese, and you have a lot of variability in your data. Add to that the well-recognized variability inherent in these food questionnaires, and you have even more variability.

I'm open to the idea that longer menaquinones (K2 MK-5 and longer, including MK-7) play a role in preventing cardiovascular disease, but I don't find the evidence sufficient yet. MK-4 is the form of K2 that's made by animals, for animals. Mammals secrete it into their milk, and other animals, all the way down to invertebrates, deposit it in their eggs. I think we can assume they make MK-4, and not the longer menaquinones, for a reason.

MK-4 is able to play all the roles of vitamin K in the body, including activating blood clotting factors, a role traditionally assigned to vitamin K1. This has to be the case, because K2 MK-4 is the only significant source of vitamin K in an infant's diet before weaning. No one knows whether the longer menaquinones can perform all the functions of MK-4; it hasn't been tested comprehensively, and I don't know how you could ever be sure. MK-7 can perform at least some of these functions, such as activating osteocalcin and clotting factors.

I do think it's worth noting that the livers of certain animals contain longer menaquinones, including MK-7. So it is possible that we're adapted to eating some of the longer menaquinones. Many cultures also have a tradition of fermented food (probably a relatively recent addition to the human diet), which could further increase the intake of longer menaquinones. The true "optimum", if there is one, may be to eat a combination of forms of K2, including MK-4 and the longer forms. But babies and healthy traditional cultures such as the Masai seem to do quite well on a diet heavily weighted toward MK-4, so the longer forms probably aren't strictly necessary. If you haven't already seen it, check out my post on 20th century butter consumption and coronary heart disease risk in the United States.

Well, if you've made it this far, you're a hero (or a nerd)! Now for some humor. From the paper:
The concept of proposing beneficial effects to vitamin K2 seems to have different basis as for vitamin K1. Vitamin K1 has been associated with a heart-healthy dietary pattern in the earlier work in the USA and this attenuated their associations with CHD. Vitamin K2 has different sources and relate to different dietary patterns than vitamin K1. This suggests that the risk reduction with vitamin K2 is not driven by dietary patterns, but through biological effects.
If that reads like gobbledygook, it's because it is. They're trying to explain away the fact that participants in the highest K2 intake group ate the most saturated fat and cholesterol (since K2 came mostly from cheese, milk and meat), yet had the lowest heart attack incidence and the lowest serum cholesterol. Look at them squirm! Could this "paradox" be the reason the paper was published in an obscure journal? Here's more:
Thus, although our findings may have important practical implications on CVD prevention, it is important to mention that in order to increase the intake of vitamin K2, increasing the portion vitamin K2 rich foods in daily life might not be a good idea. Vitamin K2 might be, for instance more relevant in the form of a supplement or in low-fat dairy. More research into this is necessary.
Translation: "People who ate the most cheese, milk and meat had the lowest heart attack rate, but be careful not to eat those things because they might give you a heart attack. Get your K2 from low-fat dairy (barely contains any) and supplements (same way your ancestors got it). Gimme more money."

What Can Evolution Teach us About the Human Diet?

Wednesday, March 4, 2009
Vegetarians deserve our respect. They're usually thoughtful, conscientious people who make sacrifices for environmental and ethical reasons. I was vegetarian for a while myself, until I decided I could find ethical meat.

Vegetarianism and especially veganism can get pretty ideological sometimes. People who have strong beliefs like to think that their belief system is best for all aspects of their lives and the world, not just bits and pieces. Many vegetarians believe their way of eating is healthier than omnivory or carnivory. It's easy to believe, since mainstream nutrition research has a distinctly pro-vegetarian slant. One of the classic arguments for vegetarianism goes something like this: our closest living relatives, chimpanzees and bonobos, are mostly vegetarian, therefore that's the diet to which we're adapted as well. Here's the problem with that argument:

Where are chimps (Pan troglodytes) on this chart? They aren't on it, for two related reasons: they aren't in the genus Homo, and they diverged from us about 5 million years ago. Homo erectus diverged from our lineage about 1.5 million years ago. I don't know if you've ever seen a Homo erectus skull, but 1.5 million years is clearly enough time to do some evolving. Homo erectus hunted and ate animals as a significant portion of its diet.

If you look at the chart above, Homo rhodesiensis (typically considered a variant of Homo heidelbergensis) is our closest ancestor, and our point of divergence with neanderthals (Homo neanderthalensis). Some archaeologists believe H. heidelbergensis was the same species as modern Homo sapiens. I haven't been able to find any direct evidence of the diet of H. heidelbergensis from bone isotope ratios, but the indirect evidence indicates that they were capable hunters who probably got a large proportion of their calories from meat. In Europe, they hunted now-extinct megafauna such as woolly rhinos. These things make modern cows look like chicken nuggets, and you can bet their fat was highly saturated.

H. heidelbergensis was a skilled hunter and very athletic. They were top predators in their ecosystems, judging by the fact that they took their time with carcasses, butchering them thoroughly and extracting marrow from bones. No predator or scavenger was capable of driving them away from a kill.

Our closest recent relative was Homo neanderthalensis, the neanderthal. They died out around 30,000 years ago. There have been several good studies on the isotope ratios of neanderthal bones, all indicating that neanderthals were basically carnivores. They relied on both land and marine animals, depending on what was available. Needless to say, neanderthals are much more closely related to humans than chimpanzees are, having diverged from us less than 500,000 years ago. That's less than one-tenth the time separating humans and chimpanzees.

I don't think this necessarily means humans are built to be carnivores, but it certainly blows away the argument that we're built to be vegetarians. It also argues against the idea that we're poorly adapted to eating animal fat. Historical human hunter-gatherers had very diverse diets, but on average were meat-heavy omnivores. This fits well with the apparent diet of our ancestor H. heidelbergensis, except that we've killed most of the megafauna so modern hunter-gatherers have to eat frogs, bugs and seeds.

The Obama Health Care Budget: Cost Cuts Pretty Small

In conversations with members of the press the last couple of days, it is apparent the scope of President Obama's proposed federal health care cuts is not well understood in context. Is the President really making hard choices to bring America's health care spending under control? You decide. Here are some facts from the Obama budget message spreadsheets (pages 117-127): In 2012, three years into the

"Five Recommendations for ONC Head Who Understands Health IT Innovation"

The team of David Kibbe and Brian Klepper are at it again with some advice on who best understands the health IT challenge in America: Five Recommendations for ONC Head Who Understands Health IT Innovation, by DAVID KIBBE and BRIAN KLEPPER. Now that the legislative language of the HITECH Act -- the $20 billion health IT allocation within the economic stimulus package -- has been set, it's time to

A Detailed Analysis of the Obama Health Care Reform Budget

Tuesday, March 3, 2009
Speaking about the imperative to bring America's entitlement spending under control last December, Barack Obama said, "What we have done is kicked this can down the road. We are now at the end of the road and are not in a position to kick it any further. We have to signal seriousness in this by making sure some of the hard decisions are made under my watch, not someone else's." Right-on! But in his

Statistics

Monday, March 2, 2009
Ricardo just sent me a link to the British Heart Foundation statistics website. It's a goldmine. They have data on just about every aspect of health and lifestyle in the U.K. I find it very empowering to have access to this kind of information on the internet.

I've just started sifting through it, but something caught my eye. The U.K. is experiencing an obesity epidemic similar to the U.S.:
Here's where it gets interesting. This should look familiar:

Hmm, those trends look remarkably similar. Just like in the U.S., the British are exercising more and getting fatter with each passing year. In fact, maybe exercise causes obesity. Let's see if there's any correlation between the two. I'm going to plot obesity on the X-axis and exercise on the Y-axis. The data points only overlap in three years: 1998, 2003 and 2006. Let's take a look:
By golly, we've proven that exercise causes obesity! Clearly, the more people exercise, the fatter they get. The R-value is a measure of how closely the points fall on the best-fit line, and 0.82 isn't bad for this type of data. If only we could get all British citizens to become couch potatoes, obesity would be a thing of the past! OK, I'm kidding. The rise in obesity is obviously caused by something else; I'm illustrating the point that correlation does not equal causation. Even if an association conforms to our preconceived notions of how the world works, that does not justify saying one factor causes another without testing the hypothesis in a controlled experiment.
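For the curious, here's a minimal sketch (Python 3.10+, which has correlation and linear_regression in the standard statistics module) of what that R-value measures. The three data points are hypothetical stand-ins for the 1998/2003/2006 overlap, not the actual British Heart Foundation figures.

# Minimal sketch of computing R and a best-fit line from three data points.
# The numbers are hypothetical stand-ins, NOT the British Heart Foundation data.
from statistics import correlation, linear_regression

obesity_pct  = [20.0, 23.0, 24.5]   # hypothetical % obese (X axis), 1998/2003/2006
exercise_pct = [32.0, 35.0, 37.5]   # hypothetical % meeting activity guidelines (Y axis)

r = correlation(obesity_pct, exercise_pct)
slope, intercept = linear_regression(obesity_pct, exercise_pct)

print(f"R = {r:.2f}")                                            # close to 1
print(f"best fit: exercise = {slope:.2f} * obesity + {intercept:.2f}")

A near-perfect R falls out simply because both series rose over the same three years; it says nothing about whether either one drives the other.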

A Commission on Entitlement Reform--A Good Idea

The Kaisernetwork reported the following today: On Sunday, White House Office of Management and Budget Director Peter Orszag, during an appearance on ABC's "This Week," said that Obama might establish a commission on entitlement reform, or broader health care reform, to take some of the authority over the development of legislation from Congress. Under such a commission, the Obama administration and