Sunday, June 27, 2010

Exercise and blood glucose levels: Insulin and glucose responses to exercise

The notion that exercise reduces blood glucose levels is widespread. That notion is largely incorrect. Exercise appears to have a positive effect on insulin sensitivity in the long term, but also increases blood glucose levels in the short term. That is, exercise, while it is happening, leads to an increase in circulating blood glucose. In normoglycemic individuals, that increase is fairly small compared to the increase caused by consumption of carbohydrate-rich foods, particularly foods rich in refined carbohydrates and sugars.

The figure below, from the excellent book by Wilmore and colleagues (2007), shows the variation of blood insulin and glucose in response to an endurance exercise session. The exercise session’s intensity was at 65 to 70 percent of the individuals’ maximal capacity (i.e., their VO2 max). The session lasted 180 minutes, or 3 hours. The full reference to the book by Wilmore and colleagues is at the end of this post.

As you can see, blood insulin levels decreased markedly in response to the exercise bout, in an exponential decay fashion. Blood glucose increased quickly, from about 5.1 mmol/l (91.8 mg/dl) to 5.4 mmol/l (97.2 mg/dl), before dropping again. Note that blood glucose levels remained somewhat elevated throughout the exercise session. Still, the elevation was fairly small in the participants, who were all normoglycemic. A couple of bagels would easily induce a rise to 160 mg/dl in about 45 minutes in those same individuals, with a much larger “area under the curve” glucose response than exercise produces.
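The mmol/l and mg/dl figures above are linked by a fixed conversion factor; as a quick sanity check, the arithmetic can be sketched in a few lines of Python (the factor of roughly 18 follows from glucose's ~180 g/mol molar mass and the litre-to-decilitre unit change):

```python
# Convert blood glucose between mmol/L and mg/dL.
# The conversion factor is commonly rounded to 18.0:
# 180 g/mol molar mass, divided by 10 for the L -> dL change.

GLUCOSE_FACTOR = 18.0  # mg/dL per mmol/L

def mmol_to_mgdl(mmol_per_l: float) -> float:
    """Blood glucose in mmol/L converted to mg/dL."""
    return mmol_per_l * GLUCOSE_FACTOR

def mgdl_to_mmol(mg_per_dl: float) -> float:
    """Blood glucose in mg/dL converted to mmol/L."""
    return mg_per_dl / GLUCOSE_FACTOR

print(mmol_to_mgdl(5.1))  # ~91.8, the pre-exercise level above
print(mmol_to_mgdl(5.4))  # ~97.2, the peak during exercise
```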

So what is going on here? Shouldn’t glucose levels go down, since muscle is using glucose for energy?

No, because the human body is much more “concerned” with keeping blood glucose levels high enough to support those cells that absolutely need glucose, such as brain and red blood cells. During exercise, the brain will derive part of its energy from ketones, but will still need glucose to function properly. In fact, that need is critical for survival, and may be seen as a bit of an evolutionary flaw. Hypoglycemia, if maintained for too long, will lead to seizures, coma, and death.

Muscle tissue will increase its uptake of free fatty acids and ketones during exercise, to spare glucose for the brain. And muscle tissue will also consume glucose, in part for glycogenesis; that is, for making muscle glycogen, which is being depleted by exercise. In this sense, we can say that muscle tissue is becoming somewhat insulin resistant, because it is using more free fatty acids and ketones for energy, and thus less glucose. Another way of looking at this, however, which is favored by Wilmore and colleagues (2007), is that muscle tissue is becoming more insulin sensitive, because it is still taking up glucose, even though insulin levels are dropping.

Truth be told, the discussion in the paragraph above is mostly academic, because muscle tissue can take up glucose without insulin. Insulin is a hormone that allows the pancreas, its secreting organ, to communicate with two main organs – the liver and body fat. (Yes, body fat can be seen as an “organ”, since it has a number of endocrine functions.) Insulin signals to the liver that it is time to take up blood glucose and either make glycogen (to be stored in the liver) or fat with it (secreting that fat in VLDL particles). Insulin signals to body fat that it is time to take up blood glucose and fat (e.g., packaged in chylomicrons) and make more body fat with it. Low insulin levels, during exercise, will do the opposite, leading to low glucose uptake by the liver and an increase in body fat catabolism.

Resistance exercise (e.g., weight training) induces much higher glucose levels than endurance exercise; and this happens even when one has fasted for 20 hours before the exercise session. The reason is that resistance exercise leads to the conversion of muscle glycogen into energy, releasing lactate in the process. Lactate is in turn used by muscle tissues as a source of energy, helping spare glycogen. It is also used by the liver for production of glucose through gluconeogenesis, which significantly elevates blood glucose levels. That hepatic glucose is then used by muscle tissues to replenish their depleted glycogen stores. This is known as the Cori cycle.

Exercise seems to lead, in the long term, to insulin sensitivity; but through a fairly complex and longitudinal process that involves the interaction of many hormones. One of the mechanisms may be an overall reduction in insulin levels, leading to increased insulin sensitivity as a compensatory adaptation. In the short term, particularly while it is being conducted, exercise nearly always increases blood glucose levels. Even in the first few months after the beginning of an exercise program, blood glucose levels may increase. If a person who was on a low carbohydrate diet started a 3-month exercise program, it is quite possible that the person’s average blood glucose would go up a bit. If low carbohydrate dieting began together with the exercise program, then average blood glucose might drop significantly, because of the acute effect of this type of dieting on average blood glucose.

Still, exercise is health-promoting. The combination of the long- and short-term effects of exercise appears to lead to an overall slowing down of the progression of insulin resistance with age. This is a good thing.


Wilmore, J.H., Costill, D.L., & Kenney, W.L. (2007). Physiology of sport and exercise. Champaign, IL: Human Kinetics.

Thursday, June 24, 2010

Interview with Jimmy Moore

About two months ago, I did an interview with Jimmy Moore of the Livin' la Vida Low Carb internet empire. I hardly remember what we talked about, but I think it went well. I enjoyed Jimmy's pleasant and open-minded attitude. Head over to Jimmy's website and listen to the interview here.

I do recall making at least one mistake. When discussing heart attacks, I said "atrial fibrillation" when I meant "ventricular fibrillation".

Wednesday, June 23, 2010

Compensatory adaptation as a unifying concept: Understanding how we respond to diet and lifestyle changes

Trying to understand each of the body's responses to each diet and lifestyle change individually is certainly a losing battle. It is a bit like the various attempts to classify organisms that occurred prior to solid knowledge about common descent. Darwin’s theory of evolution is a theory of common descent that makes the classification of organisms a much easier and more logical task.

Compensatory adaptation (CA) is a broad theoretical framework that hopefully can help us better understand responses to diet and lifestyle changes. CA is a very broad idea, and it has applications at many levels. I have discussed CA in the context of human behavior in general (Kock, 2002), and human behavior toward communication technologies (Kock, 2001; 2005; 2007). Full references and links are at the end of this post.

CA is all about time-dependent adaptation in response to stimuli facing an organism. The stimuli may be in the form of obstacles. From a general human behavior perspective, CA seems to be at the source of many success stories. A few are discussed in the Kock (2002) book; the cases of Helen Keller and Stephen Hawking are among them.

People who have to face serious obstacles sometimes develop remarkable adaptations that make them rather unique individuals. Hawking developed remarkable mental visualization abilities, which seem to be related to some of his most important cosmological discoveries. Keller could recognize an approaching person based on floor vibrations, even though she was blind and deaf. Both achieved remarkable professional success, perhaps not as much in spite but because of their disabilities.

From a diet and lifestyle perspective, CA allows us to make one key prediction. The prediction is that compensatory body responses to diet and lifestyle changes will occur, and they will be aimed at maximizing reproductive success, but with a twist – it’s reproductive success in our evolutionary past! We are stuck with those adaptations, even though we live in modern environments that differ in many respects from the environments where our ancestors lived.

Note that what CA generally tries to maximize is reproductive success, not survival success. From an evolutionary perspective, if an organism generates 30 offspring in a lifetime of 2 years, that organism is more successful in terms of spreading its genes than another that generates 5 offspring in a lifetime of 200 years. This is true as long as the offspring survive to reproductive maturity, which is why extended survival is selected for in some species.

We live longer than chimpanzees in part because our ancestors were “good fathers and mothers”, taking care of their children, who were vulnerable. If our ancestors were not as caring or their children not as vulnerable, maybe this blog would have posts on how to control blood glucose levels to live beyond the ripe old age of 50!

The CA prediction related to responses aimed at maximizing reproductive success is a straightforward enough prediction. The difficult part is to understand how CA works in specific contexts (e.g., Paleolithic dieting, low carbohydrate dieting, calorie restriction), and what we can do to take advantage (or work around) CA mechanisms. For that we need a good understanding of evolution, some common sense, and also good empirical research.

One thing we can say with some degree of certainty is that CA leads to short-term and long-term responses, and that those are likely to be different from one another. The reason is that a particular diet and lifestyle change affected the reproductive success of our Paleolithic ancestors in different ways, depending on whether it was a short-term or long-term change. The same is true for CA responses at different stages of one’s life, such as adolescence and middle age; they are also different.

This is the main reason why many diets that work very well in the beginning (e.g., first months) frequently cease to work as well after a while (e.g., a year).

Also, CA leads to psychological responses, which is one of the key reasons why most diets fail. Without a change in mindset, more often than not one tends to return to old habits. Hunger is not only a physiological response; it is also a psychological response, and the psychological part can be a lot stronger than the physiological one.

It is because of CA that a one-month moderately severe calorie restriction period (e.g., 30% below basal metabolic rate) will lead to significant body fat loss, as the body produces hormonal responses to several stimuli (e.g., glycogen depletion) in a compensatory way, but still “assuming” that liberal amounts of food will soon be available. Do that for one year and the body will respond differently, “assuming” that food scarcity is no longer short-term and thus that it requires different, and possibly more drastic, responses.

Among other things, prolonged severe calorie restriction will lead to a significant decrease in metabolism, loss of libido, loss of morale, and physical as well as mental fatigue. It will make the body hold on to its fat reserves a lot more greedily, and induce a number of psychological responses to force us to devour anything in sight. In several people it will induce psychosis. The results of prolonged starvation experiments, such as the Biosphere 2 experiments, are very instructive in this respect.

It is because of CA that resistance exercise leads to muscle gain. Muscle gain is actually a body’s response to reasonable levels of anaerobic exercise. The exercise itself leads to muscle damage, and short-term muscle loss. The gain comes after the exercise, in the following hours and days (and with proper nutrition), as the body tries to repair the muscle damage. Here the body “assumes” that the level of exertion that caused it will continue in the near future.

If you increase the effort (by increasing resistance or repetitions, within a certain range) at each workout session, the body will be constantly adapting, up to a limit. If there is no increase, adaptation will stop; it will even regress if exercise stops altogether. Do too much resistance training (e.g., multiple workout sessions every day), and the body will react differently. Among other things, it will create deterrents in the form of pain (through inflammation), physical and mental fatigue, and even psychological aversion to resistance exercise.

CA processes have a powerful effect on one’s body, and even on one’s mind!


Kock, N. (2001). Compensatory Adaptation to a Lean Medium: An Action Research Investigation of Electronic Communication in Process Improvement Groups. IEEE Transactions on Professional Communication, 44(4), 267-285.

Kock, N. (2002). Compensatory Adaptation: Understanding How Obstacles Can Lead to Success. Infinity Publishing, Haverford, PA. (Additional link.)

Kock, N. (2005). Compensatory adaptation to media obstacles: An experimental study of process redesign dyads. Information Resources Management Journal, 18(2), 41-67.

Kock, N. (2007). Media Naturalness and Compensatory Encoding: The Burden of Electronic Media Obstacles is on Senders. Decision Support Systems, 44(1), 175-187.

Tuesday, June 22, 2010

In Search of Traditional Asian Diets

It's been difficult for me to find good information on Asian diets prior to modernization. Traditional Chinese, Taiwanese and Japanese diets are sometimes portrayed as consisting mostly of white rice, with vegetables and a bit of meat and soy, but I find that implausible. Rice doesn't grow everywhere, and removing all the bran was prohibitively labor-intensive before the introduction of modern machine milling. One hundred years ago, bran was partially removed by beating or grinding in a mortar and pestle, as it still is in parts of rural Asia today. Only the wealthy could afford true white rice.

Given the difficulty of growing rice in most places, and of hand-milling it, the modern widespread consumption of white rice in Asia must be a 20th century phenomenon, originating in the last 20-100 years depending on location. Therefore, white rice consumption does not predate the emergence of the "diseases of civilization" in Asia.

In the book Western Diseases: Their Emergence and Prevention, there are several accounts of traditional Asian diets I find interesting.

Taiwan in 1980

The staple constituent of the diet is polished white rice. Formerly in the poorer areas along the sea coast the staple diet was sweet potato, with small amounts of white rice added. Formerly in the mountains sweet potato, millet and taro were the staple foods. During the last 15 years, with the general economic development of the whole island, white polished rice has largely replaced other foods. There is almost universal disinclination to eat brown (unpolished) rice, because white rice is more palatable, it bears kudos, cooking is easier and quicker, and it can be stored for a much longer period.

Traditionally, coronary heart disease and high blood pressure were rare, but the prevalence is now increasing rapidly. Stroke is common. Diabetes was rare but is increasing gradually.

Mainland China

China is a diverse country, and the food culture varies by region.

Snapper (1965)… quoted an analysis by Guy and Yeh of Peiping (Peking) diets in 1938. There was a whole cereal/legume/vegetable diet for poorer people and a milled-cereal/meat/vegetable diet for the richer people.

Symptoms of vitamin A, C and D deficiency were common in the poor, although coronary heart disease and high blood pressure were rare. Diabetes occurred at a higher rate than in most traditionally-living populations.


On the Japanese island of Okinawa, the traditional staple is the sweet potato, with a smaller amount of rice eaten as well. Seafood, vegetables, pork and soy are also on the menu. In Akira Kurosawa’s movie Seven Samurai, set in 16th century mainland Japan, peasants ate home-processed millet and barley, while the wealthy ate white rice. Although a movie may not be the best source of information, I suspect it has some historical basis.

White Rice: a Traditional Asian Staple?

It depends on your perspective. How far back do you have to go before you can call a food traditional? Many people's grandparents ate white rice, but I doubt their great great grandparents ate it frequently. White rice may have been a staple for the wealthy for hundreds of years in some places. But for most of Asia, in the last few thousand years, it was probably a rare treat. The diet most likely resembled that of many non-industrial African cultures: an assortment of traditionally prepared grains, root vegetables, legumes, vegetables and a little meat.

Please add any additional information you may have about traditional Asian diets to the comments section.

Monday, June 21, 2010

What about some offal? Boiled tripes in tomato sauce

Tripe dishes are made with the stomach of various ruminants. The most common type of tripe is beef tripe from cattle. Like many predators, our Paleolithic ancestors probably ate plenty of offal, likely including tripe. They certainly did not eat only muscle meat. It would have been a big waste to eat only muscle meat, particularly because animal organs and other non-muscle parts are very rich in vitamins and minerals.

The taste for tripe is an acquired one. Many national cuisines have traditional tripe dishes, including the French, Chinese, Portuguese, and Mexican cuisines – to name only a few. The tripe dish shown in the photo below was prepared following a simple recipe. Click on the photo to enlarge it.

Here is the recipe:

- Cut up about 2 lbs of tripe into rectangular strips. I suggest rectangles of about 5 by 1 inches.
- Boil the tripe strips over low heat for 5 hours.
- Drain the boiled tripe strips, and place them in a frying or sauce pan. You may use the same pan you used for boiling.
- Add a small amount of tomato sauce, enough to give the tripe strips color, but not to completely immerse them in the sauce. Add seasoning to taste. I suggest some salt, parsley, garlic powder, chili powder, black pepper, and cayenne pepper.
- Cook the tripe strips in tomato sauce for about 15 minutes.

Cooked tripe has a strong, characteristic smell, which will fill your kitchen as the tripe boils for 5 hours. Not many people are able to eat many tripe strips at once, so perhaps this should not be the main dish of a dinner with friends. I personally can only eat about 5 strips at a time. I know folks who can eat a whole pan full of tripe strips, like the one shown in the photo in this post, but such folks are rare.

In terms of nutrition, 100 g of tripe prepared in this way will have approximately 12 g of protein, 4 g of fat, 157 mg of cholesterol, and 2 g of carbohydrates. You will also be getting a reasonable amount of vitamin B12, zinc, and selenium.

Thursday, June 17, 2010

Pretty faces are average faces: Genetic diversity and health

Many people think that the prettiest faces are those with very unique features. Generally that is not true. Pretty faces are average faces. And that is not only because they are symmetrical, even though symmetry is an attractive facial trait. Average faces are very attractive, which is counterintuitive but makes sense in light of evolution and genetics.

The faces in the figure below (click to enlarge) are from a presentation I gave at the University of Houston in 2008. The PowerPoint slides file for the presentation is available here. The photos were taken from a German web site that summarizes a lot of very interesting research on facial attractiveness.

The face on the right is a composite of the two faces on the left. It simulates what would happen if you were to morph the features of the two faces on the left into the face on the right. That is, the face on the right is the result of an “averaging” of the two faces on the left.
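The pixel-wise "averaging" behind composite faces can be sketched minimally, assuming two same-sized grayscale images stored as 2-D lists of intensities (0-255); real morphing software also aligns facial landmarks before averaging, a step omitted here:

```python
# Toy version of face compositing: with two aligned, same-sized
# grayscale images, the composite is simply the pixel-wise mean.
# (Landmark alignment, which real morphing tools perform first,
# is omitted from this sketch.)

def average_images(img_a, img_b):
    """Return the pixel-wise average of two equal-sized images."""
    return [
        [(a + b) // 2 for a, b in zip(row_a, row_b)]
        for row_a, row_b in zip(img_a, img_b)
    ]

# Two toy 2x2 "faces" (hypothetical intensity values):
face1 = [[100, 120], [140, 160]]
face2 = [[200, 100], [60, 160]]
print(average_images(face1, face2))  # [[150, 110], [100, 160]]
```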

If you show these photos to a group of people, as I did during my presentation in Houston, most people in the group will say that the face on the right is the prettiest of the three. This happens even though most people will also say that each of the three faces is pretty, when shown each face separately from the others.

Why are average faces more beautiful?

The reason may be that we have brain algorithms that make us associate a sense of “beauty” with features that suggest an enhanced resistance to disease. This is an adaptation to the environments our ancestors faced in our evolutionary past, when disease would often lead to observable distortions of facial and body traits. Average faces are the result of increased genetic mixing, which leads to increased resistance to disease.

This interpretation is a variation of Langlois and Roggman’s “averageness hypothesis”, published in a widely cited 1990 article that appeared in the journal Psychological Science.

By the way, many people think that the main survival threats ancestral humans faced were large predators. I guess it is exciting to think that way; our warrior ancestors survived due to their ability to fight off predators! The reality is that, in our ancestral past, as today, the biggest killer of all by far was disease. The small organisms, the ones our ancestors couldn’t see, were the most deadly.

People from different populations, particularly those that have been subjected to different diseases, frequently carry genetic mutations that protect them from those diseases. Those are often carried as dominant alleles (i.e., variations of a gene). When two people with diverse genetic protections have children, the children inherit the protective mutations of both parents. The more genetic mixing, the more likely it is that multiple protective genetic mutations will be carried. The more genetic mixing, the higher is the "averageness" score of the face.

The opposite may happen when people who share many genes (e.g., cousins) have children; the term for this is inbreeding. Since alleles that code for diseases are often carried in recessive form, a child of closely related parents has a higher chance of inheriting two recessive disease-promoting alleles. In that case, the child will be homozygous recessive for the disease, which dramatically increases the chances of developing it.
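The recessive-allele reasoning can be made concrete with a toy Punnett-square calculation; assuming both parents carry one recessive disease allele (genotype Aa), a quarter of the equally likely allele combinations are homozygous recessive:

```python
# Punnett-square sketch: each child inherits one allele from each
# parent, with each of the parent's two alleles equally likely.
from itertools import product

def offspring_genotypes(parent1, parent2):
    """All equally likely offspring genotypes (sorted so Aa == aA)."""
    return [''.join(sorted(a + b)) for a, b in product(parent1, parent2)]

kids = offspring_genotypes("Aa", "Aa")   # two carrier parents
print(kids)                              # ['AA', 'Aa', 'Aa', 'aa']
print(kids.count("aa") / len(kids))      # 0.25 -> homozygous recessive
```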

In a nutshell: gene mixing = health; inbreeding = disease.

Finally, if you have some time, make sure to take a look at this page on the Virtual Miss Germany!

Wednesday, June 16, 2010

Low Micronutrient Intake may Contribute to Obesity

[2013 update: I'm skeptical of the idea that micronutrient insufficiency/deficiency promotes obesity.  Although the trial discussed below suggested it might be a factor, it has not been a general finding that micronutrient supplementation causes fat loss, and the result needs to be repeated to be believable in my opinion.  Also, conditions of frank micronutrient deficiency are not usually associated with fat gain]

Lower Micronutrient Status in the Obese

Investigators have noted repeatedly that obese people have a lower blood concentration of a number of nutrients, including vitamin A, vitamin D, vitamin K, several B vitamins, zinc and iron (1). Although there is evidence that some of these may influence fat mass in animals, the evidence for a cause-and-effect relationship in humans is generally slim. There is quite a bit of indirect evidence that vitamin D status influences the risk of obesity (2), although a large, well-controlled study found that high-dose vitamin D3 supplementation does not cause fat loss in overweight and obese volunteers over the course of a year (3). It may still have a preventive effect, or require a longer timescale, but that remains to be determined.

Hot off the Presses

A new study in the journal Obesity, by Y. Li and colleagues, showed that compared to a placebo, a low-dose multivitamin caused obese volunteers to lose 7 lb (3.2 kg) of fat mass in 6 months, mostly from the abdominal region (4). The supplement also reduced LDL by 27%, increased HDL by a whopping 40% and increased resting energy expenditure. Here's what the supplement contained:

Vitamin A (containing natural mixed beta-carotene) 5000 IU
Vitamin D 400 IU
Vitamin E 30 IU
Thiamin 1.5 mg
Riboflavin 1.7 mg
Vitamin B6 2 mg
Vitamin C 60 mg
Vitamin B12 6 mcg
Vitamin K1 25 mcg
Biotin 30 mcg
Folic acid 400 mcg
Nicotinamide 20 mg
Pantothenic acid 10 mg
Calcium 162 mg
Phosphorus 125 mg
Chloride 36.3 mg
Magnesium 100 mg
Iron 18 mg
Copper 2 mg
Zinc 15 mg
Manganese 2.5 mg
Iodine 150 mcg
Chromium 25 mcg
Molybdenum 25 mcg
Selenium 25 mcg
Nickel 5 mcg
Stannum 10 mcg
Silicon 10 mcg
Vanadium 10 mcg

Although the result needs to be repeated, if we take it at face value, it has some important implications:
  • The nutrient density of a diet may influence obesity risk, as I speculated in my recent audio interview and related posts (5, 6, 7, 8, 9).
  • Many nutrients act together to create health, and multiple insufficiencies may contribute to disease. This may be why single nutrient supplementation trials usually don't find much.
  • Another possibility is that obesity can result from a number of different nutrient insufficiencies, and the cause is different in different people. This study may have seen a large effect because it corrected many different insufficiencies.
  • This result, once again, kills the simplistic notion that body fat is determined exclusively by voluntary food consumption and exercise behaviors (sometimes called the "calories in, calories out" idea, or "gluttony and sloth"). In this case, a multivitamin was able to increase resting energy expenditure and cause fat loss without any voluntary changes in food intake or exercise, suggesting metabolic effects and a possible downward shift of the body fat "setpoint" due to improved nutrient status.
Practical Implications

Does this mean we should all take multivitamins to stay or become thin? No. There is no multivitamin that can match the completeness and balance of a nutrient-dense, whole food, omnivorous diet. Beef liver, leafy greens and sunlight are nature's vitamin pills. Avoiding refined foods instantly doubles the micronutrient content of the typical diet. Properly preparing whole grains by soaking and fermentation is equivalent to taking a multi-mineral along with conventionally prepared grains, as absorption of key minerals is increased by 50-300% (10). Or you can eat root vegetables instead of grains, and enjoy their naturally high mineral availability. Or both.

Tuesday, June 15, 2010

Soccer as play and exercise: Resistance and endurance training at the same time

Many sports combine three key elements that make them excellent fitness choices: play, resistance exercise, and endurance exercise; all at the same time. Soccer is one of those sports. Its popularity is growing, even in the US! The 2010 FIFA World Cup, currently under way in South Africa, is a testament to that. It helps that the US team qualified and did well in its first game against England.

Pelé is almost 70 years old in the photo below, from Wikipedia. He is widely regarded as the greatest soccer player of all time. But not by Argentineans, who will tell you that Pelé is probably the second greatest soccer player of all time, after Maradona.

Even though Brazil is not a monarchy, Pelé is known there simply as “The King”. How serious are Brazilians about this? Well, consider this. Fernando Henrique Cardoso was one of the most popular presidents of Brazil. He was very smart; he appointed Pelé to his cabinet. But when Cardoso had a disagreement with Pelé, he was broadly chastised in Brazil for disrespecting “The King”, and was forced to publicly apologize or risk ruining his political career!

Arguably soccer is a very good choice of play activity to be used in combination with resistance exercise. When used alone it is likely to lead to much more lower- than upper-body muscle development. Unlike before the 1970s, most soccer players today use whole body resistance exercise as part of their training. Still, you often see very developed leg muscles and relatively slim upper bodies.

What leads to leg muscle gain are the sprints. Interestingly, it is the eccentric part of the sprints that adds the most muscle, by causing the most muscle damage. That is, it is not the acceleration, but the deceleration phase that leads to the largest gains in leg muscle.

This eccentric phase effect is true for virtually all types of anaerobic exercise, and is a well-known fact among bodybuilders and exercise physiologists (see, e.g., Wilmore et al., 2007; full reference at the end of the post). For example, it is not the lifting but the lowering of the bar in the chest press that leads to the most muscle gain.

Like many sports practiced at high levels of competition, professional soccer can lead to serious injuries. So can non-professional, but highly competitive play. Common areas of injury are the ankles and the knees. See Mandelbaum & Putukian (1999) for a discussion of possible types of health problems associated with soccer; it focuses on females, but is broad enough to serve as a general reference. The full reference and link to the article are given below.


Mandelbaum, B.R., & Putukian, M. (1999). Medical concerns and specificities in female soccer players. Science & Sports, 14(5), 254-260.

Wilmore, J.H., Costill, D.L., & Kenney, W.L. (2007). Physiology of sport and exercise. Champaign, IL: Human Kinetics.

Monday, June 14, 2010

New Layout

I thought I'd spruce the place up a bit! Let me know what you think in the comments.

Friday, June 11, 2010

Fructose in fruits may be good for you, especially if you are low in glycogen

Excessive dietary fructose has been shown to cause an unhealthy elevation in serum triglycerides. This and other related factors are hypothesized to have a causative effect on the onset of the metabolic syndrome. Since fructose is found in fruits (see table below, from Wikipedia; click to enlarge), there has been some concern that eating fruit may cause the metabolic syndrome.

Vegetables also have fructose. Sweet onions, for example, have more free fructose than peaches, on a gram-adjusted basis. Sweet potatoes have more sucrose than grapes (but much less overall sugar), and sucrose is a disaccharide derived from glucose and fructose. Sucrose is broken down to fructose and glucose in the human digestive tract.

Dr. Robert Lustig has given a presentation indicting fructose as the main cause of the metabolic syndrome, obesity, and related diseases. Yet, even he pointed out that the fructose in fruits is pretty harmless. This is backed up by empirical research.

The problem is over-consumption of fructose in sodas, juices, table sugar, and other industrial foods with added sugar. Table sugar is a concentrated form of sucrose. In these foods the fructose content is unnaturally high; and it comes in an easily digestible form, without any fiber or health-promoting micronutrients (vitamins and minerals).

Dr. Lustig’s presentation is available from this post by Alan Aragon. At the time of this writing, there were over 450 comments in response to Aragon’s post. If you read the comments you will notice that they are somewhat argumentative, as if Lustig and Aragon were in deep disagreement with one another. The reality is that they agree on a number of issues, including that the fructose found in fruits is generally healthy.

Fruits are among the very few natural plant foods that have evolved to be eaten by animals, to facilitate the dispersion of the plants’ seeds. Generally and metaphorically speaking, plants do not “want” animals to eat their leaves, seeds, or roots. But they “want” animals to eat their fruits. They do not “want” one single animal to eat all of their fruits, which would compromise seed dispersion and is probably why fruits are not as addictive as doughnuts.

From an evolutionary standpoint, the idea that fruits can be unhealthy is somewhat counterintuitive. Given that fruits are made to be eaten, and that dead animals do not eat, it is reasonable to expect that fruits must be good for something in animals, at least for one important health-related process. If so, what is it?

Well, it turns out that fructose, combined with glucose, is a better fuel for glycogen replenishment than glucose alone, in the liver and possibly in muscle, at least according to a study by Parniak and Kalant (1988). One limitation of this study is that it was conducted with isolated rat liver tissue, which restricts the generalization of the findings to humans; on the other hand, the in vitro design helped the researchers unveil some interesting effects. The full reference and a link to the full-text version are at the end of this post.

The Parniak and Kalant (1988) study also suggests that glycogen synthesis based on fructose takes precedence over triglyceride formation. Glycogen synthesis occurs when glycogen reserves are depleted. The liver of an adult human stores about 100 g of glycogen, and muscles store about 500 g. An intense 30-minute weight training session may use up about 63 g of glycogen, not much but enough to cause some of the responses associated with glycogen depletion, such as an acute increase in adrenaline and growth hormone secretion.
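A quick back-of-the-envelope sketch of these figures (all numbers are the post's rough approximations, not measurements):

```python
# Rough glycogen arithmetic using the approximate figures from the post.
liver_glycogen_g = 100    # typical adult liver store
muscle_glycogen_g = 500   # typical adult muscle store
session_use_g = 63        # intense 30-minute weight training session

total_g = liver_glycogen_g + muscle_glycogen_g
fraction_used = session_use_g / total_g

# A single session taps only about a tenth of total stores, yet this is
# enough to trigger glycogen-depletion responses such as an acute rise
# in adrenaline and growth hormone secretion.
print(f"{session_use_g} g of {total_g} g used ({fraction_used * 100:.1f}%)")
```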

Liver glycogen is replenished in a few hours. Muscle glycogen takes days. Glycogen synthesis is discussed at some length in this excellent book by Jack H. Wilmore, David L. Costill, and W. Larry Kenney. That discussion generally assumes no blood sugar metabolism impairment (e.g., diabetes), as does this post.

If one’s liver glycogen tank is close to empty, eating a couple of apples will have little to no effect on body fat formation. This will be so even though two apples have close to 30 g of carbohydrates, more than 20 g of which being from sugars. The liver will grab everything for itself, to replenish its 100 g glycogen tank.

In the Parniak and Kalant (1988) study, when glucose and fructose were administered simultaneously, glycogen synthesis based on glucose was increased by more than 200 percent. Glycogen synthesis based on fructose was increased by about 50 percent. In fruits, fructose and glucose come together. Again, this was an in vitro study, with liver cells obtained after glycogen depletion (the rats were fasting).

What leads to glycogen depletion in humans? Exercise does, both aerobic and anaerobic. So does intermittent fasting.

What happens when we consume excessive fructose from sodas, juices, and table sugar? The extra fructose, not used for glycogen replenishment, is converted into fat by the liver. That fat is packaged in the form of triglycerides, which are then quickly secreted by the liver as small VLDL particles. The VLDL particles deliver their content to muscle and body fat tissue, contributing to body fat accumulation. After delivering their cargo, small VLDL particles eventually become small-dense LDL particles; the ones that can potentially cause atherosclerosis.


Parniak, M.A. and Kalant, N. (1988). Enhancement of glycogen concentrations in primary cultures of rat hepatocytes exposed to glucose and fructose. Biochemical Journal, 251(3), 795–802.

Thursday, June 10, 2010

Nitrate: a Protective Factor in Leafy Greens

Cancer Link and Food Sources

Nitrate (NO3) is a molecule that has received a lot of bad press over the years. It is thought to promote digestive cancers, in part due to its ability to form carcinogens when used as a preservative for processed meat. Because of this (1), nitrate was viewed with suspicion and a number of countries imposed strict limits on its use as a food additive.

But what if I told you that by far the greatest source of nitrate in the modern diet isn't processed meat, but vegetables, particularly leafy greens (2)? And that the evidence linking nitrate exposure itself to cancer has largely failed to materialize? For example, one study found no difference in the incidence of gastric cancer between nitrate fertilizer plant workers and the general population (3). Most other studies in animals and humans have not supported the hypothesis that nitrate itself is carcinogenic (4, 5, 6); rather, they suggest that nitrate is only carcinogenic in the context of processed meat, due to the formation of carcinogenic nitrosamines. This, combined with recent findings on nitrate biology, has changed the way we think about this molecule in recent years.

A New Example of Human Symbiosis

In 2003, Dr. K. Cosby and colleagues showed that nitrite (NO2; not the same as nitrate) dilates blood vessels in humans when infused into the blood (7). Investigators subsequently uncovered an amazing new example of human-bacteria symbiosis: dietary nitrate (NO3) is absorbed from the gut into the bloodstream and picked up by the salivary glands. It's then secreted into saliva, where oral bacteria use it as an energy source, converting it to nitrite (NO2). After swallowing, the nitrite is reabsorbed into the bloodstream (8). Humans and oral bacteria may have co-evolved to take advantage of this process. Antibacterial mouthwash prevents it.

Nitrate Protects the Cardiovascular System

In 2008, Dr. Andrew J. Webb and colleagues showed that nitrate in the form of 1/2 liter of beet juice (equivalent in volume to about 1.5 soda cans) substantially lowers blood pressure in healthy volunteers for over 24 hours. It also preserved blood vessel performance after brief oxygen deprivation, and reduced the tendency of the blood to clot (9). These are all changes that one would expect to protect against cardiovascular disease. Another group showed that in monkeys, the ability of nitrite to lower blood pressure did not diminish after two weeks, showing that the animals did not develop a tolerance to it on this timescale (10).

Subsequent studies showed that dietary nitrite reduces blood vessel dysfunction and inflammation (CRP) in cholesterol-fed mice (11). Low doses of nitrite also dramatically reduce tissue death in the hearts of mice exposed to conditions mimicking a heart attack, as well as protecting other tissues against oxygen deprivation damage (12). The doses used in this study were the equivalent of a human eating a large serving (100 g; roughly 1/4 lb) of lettuce or spinach.


Nitrite is thought to protect the cardiovascular system by serving as a precursor for nitric oxide (NO), one of the most potent anti-inflammatory and blood vessel-dilating compounds in the body (13). A decrease in blood vessel nitric oxide is probably one of the mechanisms of diet-induced atherosclerosis and increased clotting tendency, and it is likely an early consequence of eating a poor diet (14).

The Long View

Leafy greens were one of the "protective foods" emphasized by the nutrition giant Sir Edward Mellanby (15), along with eggs and high-quality full-fat dairy. There are many reasons to believe greens are an excellent contribution to the human diet, and what researchers have recently learned about nitrate biology certainly reinforces that notion. Leafy greens may be particularly useful for the prevention and reversal of cardiovascular disease, but are likely to have positive effects on other organ systems both in health and disease. It's ironic that a molecule suspected to be the harmful factor in processed meats is turning out to be one of the major protective factors in vegetables.

Wednesday, June 9, 2010

Cortisol, stress, excessive gluconeogenesis, and visceral fat accumulation

Cortisol is a hormone that plays several very important roles in the human body. Many of these are health-promoting, under the right circumstances. Others can be disease-promoting, especially if cortisol levels are chronically elevated.

Among the disease-promoting effects of chronically elevated blood cortisol levels is excessive gluconeogenesis, which causes high blood glucose levels even while a person is fasting. It also causes muscle wasting, as muscle tissue is broken down to elevate blood glucose levels.

Cortisol also seems to transfer body fat from subcutaneous to visceral areas. Presumably cortisol promotes visceral fat accumulation to facilitate the mobilization of that fat in stressful “fight-or-flight” situations. Visceral fat is much easier to mobilize than subcutaneous fat, because visceral fat deposits are located in areas where vascularization is higher, and are closer to the portal vein.

The problem is that modern humans often experience stress without the violent muscle contractions of a “fight-or-flight” response that would have normally occurred among our hominid ancestors. Arguably those muscle contractions would have normally been in the anaerobic range (like a weight training set) and be fueled by both glycogen and fat. Recovery from those anaerobic "workouts" would induce aerobic metabolic responses, for which the main fuel would be fat.

Coates and Herbert (2008) studied hormonal responses of a group of London traders. Among other interesting results, they found that a trader’s blood cortisol level rises with the volatility of the market. The figure below (click to enlarge) shows the variation in cortisol levels against a measure of market volatility.

On a day of high volatility cortisol levels can be significantly higher than those on a day with little volatility. The correlation between cortisol levels and market volatility in this study was a very high 0.93. This is almost a perfectly linear association. Market volatility is associated with traders’ stress levels; stress that is experienced without heavy physical exertion.

Cortisol levels go up a lot with stress. And modern humans live in hyper-stressful environments. Unfortunately stress in modern urban environments is often experienced while sitting down. In the majority of cases stress is experienced without any vigorous physical activity in response to it.

As Geoffrey Miller pointed out in his superb book, The Mating Mind, the lives of our Paleolithic ancestors would probably look rather boring to a modern human. But that is the context in which our endocrine responses evolved.

Our insatiable appetite for over stimulation may be seen as a disease. A modern disease. A disease of civilization.

Well, it is no wonder that heavy physical activity is NOT a major trigger of death by sudden cardiac arrest. Bottled-up modern human stress likely is.

We need to learn how to make stress management techniques work for us.

Visiting New Zealand at least once and watching this YouTube video clip often to remind you of the experience does not hurt either! Note the “honesty box” at around 50 seconds into the clip.


Coates, J.M., & Herbert, J. (2008). Endogenous steroids and financial risk taking on a London trading floor. Proceedings of the National Academy of Sciences of the U.S.A., 105(16), 6167–6172.

Elliott, W.H., & Elliott, D.C. (2009). Biochemistry and molecular biology. 4th Edition. New York, NY: Oxford University Press.

Monday, June 7, 2010

Niacin turbocharges the growth hormone response to anaerobic exercise: A delayed effect

Niacin is also known as vitamin B3, or nicotinic acid. It is an essential vitamin whose deficiency leads to pellagra. In large doses of 1 to 3 g per day it has several effects on blood lipids, including an increase in HDL cholesterol and a marked decrease in fasting triglycerides. Niacin is also a powerful antioxidant.

Among niacin’s other effects, when taken in large doses of 1 to 3 g per day, is an acute elevation in growth hormone secretion. This is a delayed effect, frequently occurring 3 to 5 hours after taking niacin. This effect is independent of exercise.

It is important to note that large doses of 1 to 3 g of niacin are completely unnatural, and cannot be achieved by eating foods rich in niacin. For example, one would have to eat a toxic amount of beef liver (e.g., 15 lbs) to get even close to 1 g of niacin. Beef liver is one of the richest natural sources of niacin.

Unless we find out something completely unexpected about the diet of our Paleolithic ancestors in the future, we can safely assume that they never benefited from the niacin effects discussed in this post.

With that caveat, let us look at yet another study on niacin and its effect on growth hormone. Stokes and colleagues (2008) conducted a study suggesting that, in addition to the above mentioned beneficial effects of niacin, there is another exercise-induced effect: niacin “turbocharges” the growth hormone response to anaerobic exercise. The full reference to the study is at the end of this post. Figure 3, shown below, illustrates the effect and its magnitude. Click on it to enlarge.

The closed diamond symbols represent the treatment group. In it, participants ingested a total of 2 g of niacin in three doses: 1 g ingested at 0 min, 0.5 g at 120 min, and 0.5 g at 240 min. The control group ingested no niacin, and is represented by the open square symbols. (The researchers did not use a placebo in the control group; they justified this decision by noting that the niacin flush nullified the benefits of using a placebo.) The arrows indicate points at which all-out 30-second cycle ergometer sprints occurred.

Ignore the lines showing the serum growth hormone levels in between 120 and 300 min; they were not measured within that period.

As you can see, the peak growth hormone response to the first sprint was almost two times higher in the niacin group. In the second sprint, at 300 min, the rise in growth hormone was about 5 times higher in the niacin group.

We know that growth hormone secretion may rise 300 percent with exercise alone, without niacin. According to this study, this effect may be “turbocharged” to a 600 percent rise when the exercise occurs within 300 min (5 h) of taking niacin, and possibly to a 1,500 percent rise shortly after the 300-min mark.
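To keep these percentages straight, here is a minimal sketch of the arithmetic; the baseline value is hypothetical, and only the percent rises come from the discussion above:

```python
# Percent-rise arithmetic for the growth hormone numbers discussed above.
# 'baseline' is a hypothetical resting level; a rise of N percent means
# the new level is baseline * (1 + N/100).
baseline = 1.0

exercise_only = baseline * (1 + 300 / 100)      # 300% rise -> 4x baseline
niacin_within_5h = baseline * (1 + 600 / 100)   # ~600% rise -> 7x baseline
niacin_after_5h = baseline * (1 + 1500 / 100)   # ~1,500% rise -> 16x baseline

print(exercise_only, niacin_within_5h, niacin_after_5h)
```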

That is, not only does niacin boost growth hormone secretion anytime after it is taken, but one still gets the major niacin-induced increase in growth hormone at around 300 min after taking it (which is about the same whether you exercise or not). The secretion level at this point is, by the way, higher than the highest level typically reached during deep sleep.

Let me emphasize that the peak growth hormone level achieved in the second sprint is about the same as you would get without exercise, namely a bit more than 20 micrograms per liter, as long as you took niacin (see Quabbe's articles at the end of this post).

Still, if you time your exercise session to about 300 min after taking niacin you may have some extra benefits, because getting that peak growth hormone secretion at the time you are exercising may help boost some of the benefits of exercise.

For example, the excess growth hormone secretion may reduce muscle catabolism and increase muscle anabolism at the same time, leading to an increase in muscle gain. However, there is evidence that growth hormone-induced muscle gain occurs only when testosterone levels are elevated. This explains why young women, whose growth hormone levels are usually higher than those of young men, still do not put on much muscle in response to exercise.


Stokes, K.A., Tyler, C., & Gilbert, K.L. (2008). The growth hormone response to repeated bouts of sprint exercise with and without suppression of lipolysis in men. Journal of Applied Physiology, 104(3), 724-728.

Saturday, June 5, 2010

Fermented Grain Recipes from Around the World

In my last two posts on grains, I described how traditional food processing methods make grains more nutritious and digestible (1, 2). I promised to briefly describe a few recipes from around the world, then got distracted by other things. Here they are.

Africa: Ogi

Grain fermentation is widespread in Africa and is probably nearly as old as agriculture on the continent. The nutritional importance of fermentation is suggested by the amount of time and effort that many African cultures put into it, when they could save themselves a lot of trouble by simply soaking and cooking their grains.

Ogi is a common West African porridge that's eaten as a staple food by people of all ages. It's even used as a weaning food. It's made in essentially the same manner from corn, sorghum or millet.

Whole grain is soaked in water for one to three days. It's then wet milled, mixed with water and sieved to remove a portion of the bran. Extra bran is fed to animals, while the white, starchy sediment is fermented for two to three days. This is then cooked into a thin or thick porridge and eaten.

Mexico: Pozol

At first glance, some people may think I left the 'e' off the word 'pozole', a traditional Mexican stew. However, pozol is an entirely different beast, an ancient food almost totally unknown in the US, but which fueled the Mayan empire and remains a staple food in Southeastern Mexico.

To make pozol, first the corn must be 'nixtamalized': whole kernels are boiled in a large volume of water with calcium hydroxide (10% w/v). This is a processing step in most traditional Mesoamerican corn recipes, as it allows a person to avoid pellagra (niacin deficiency)! The loosened bran is removed from the kernels by hand.

The kernels are then ground into dough, formed into balls and placed into banana leaves to ferment for one to 14 days. Following fermentation, pozol is diluted in water and consumed raw.

Europe: Sourdough Bread

Sourdough bread is Europe's quintessential fermented grain food. Before purified yeast strains came into widespread use in the 20th century, all bread would have been some form of sourdough.

Although in my opinion wheat is problematic for many people, sourdough fermentation renders it more nutritious and better tolerated by those with gluten/wheat sensitivity. In an interesting series of studies, Dr. Marco Gobbetti's group, among others, has shown that fermentation partially degrades gluten, explaining the ability of fermentation to decrease the adverse effects of gluten in those who are sensitive to it (3). They even showed that people with celiac disease can safely eat wheat bread that has been long-fermented with selected bacteria and yeasts under laboratory conditions (4). Rye contains about half the gluten of bread wheat, and is generally nutritionally superior to wheat, so sourdough rye is a better choice in my opinion.

To make sourdough bread, first the dry grains are ground into flour. Next, the flour is sifted through a screen to remove a portion of the bran. The earliest bread eaters probably didn't do this, although there is evidence of the wealthy eating sifted flour in societies as old as ancient Egypt and ancient Rome. I don't know what the optimum amount of bran to include in flour is, but it's not zero. I would be inclined to keep at least half of it, recognizing that the bran is disproportionately rich in nutrients.

Next, a portion of flour is mixed with water and a "sourdough starter", until it has a runny consistency. The starter is a diverse culture of bacteria and yeast that is carefully maintained by the bread maker. This culture acidifies the batter and produces carbon dioxide gas. The mixture is allowed to ferment for 8-12 hours. Finally, flour and salt are added to the batter and formed into dough balls. These are allowed to ferment and rise for a few hours, then baked.

My Experience

I've tried making ogi (millet) and pozol, and I have to admit that neither attempt was successful. Pozol in particular may depend on local populations of bacteria and yeast, as the grains' microorganisms are killed during processing. However, I do eat fermented grains regularly in the form of homemade brown rice 'uthappam' and sourdough buckwheat 'crepes'. The buckwheat crepes are tasty and easy to make. I'll post a recipe at some point.

The first two recipes are from the FAO publication Fermented Cereals: a Global Perspective (5).

Friday, June 4, 2010

Growth hormone secretion drops with age, but not exactly in the way you would expect

Many people assume that growth hormone secretion drops with age in a somewhat linear fashion, as implied by this diagram. This assumption probably stems from attempts to model growth hormone variations with linear regression algorithms. This assumption is wrong.

Actual plots of growth hormone secretion patterns, with age on the horizontal axes, tell a different story. See, for example, the graphs below (click to enlarge); they match the graphs one sees in empirical academic papers, and are particularly good at highlighting some interesting patterns of variation.

On the left side, bar charts show secretion patterns grouped by age ranges during a 24 h period (at the top), during wake time (at the middle), and during sleep (at the bottom). On the right side is the actual data used to build the bar charts. As you can see from the graphs on the right side, the drop in growth hormone secretion follows a pattern that looks a lot more like an exponential decay than a linear pattern.

The drop is very steep from 15 to 40 years of age, after which it shows some fluctuations, going up and down. Interestingly, people in their 50s and 60s, at least in this dataset, have on average higher growth hormone levels than people in their 40s. Of course this may be due to sample bias, but the graphs suggest that there is a major drop in growth hormone secretion, on average, around age 45.
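A minimal sketch of what "exponential decay" versus "linear" means here; the model parameters are purely hypothetical and not fitted to the figure's data:

```python
import math

# Hypothetical exponential-decay model of growth hormone secretion vs. age.
# Parameters are illustrative only.
def gh_exponential(age, peak=10.0, k=0.05, baseline=1.0):
    """Steep drop early in life, flattening out later."""
    return baseline + peak * math.exp(-k * (age - 15))

def gh_linear(age, peak=10.0, slope=0.15):
    """A linear model, for contrast: the same drop in every interval."""
    return max(0.0, peak - slope * (age - 15))

# The exponential model loses most of its height between 15 and 40,
# then changes slowly afterwards, matching the pattern described above.
drop_15_40 = gh_exponential(15) - gh_exponential(40)
drop_40_65 = gh_exponential(40) - gh_exponential(65)

# The linear model, by contrast, drops by the same amount in each 25-year span.
lin_drop_15_40 = gh_linear(15) - gh_linear(40)
lin_drop_40_65 = gh_linear(40) - gh_linear(65)

print(drop_15_40 > drop_40_65)  # True: most of the exponential drop happens early
```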

As you can see, there is a lot of individual variation in growth hormone levels. If you look carefully at the graph in the top-right corner, you will see a 50-year-old who has a higher 24 h growth hormone secretion than many folks in the 15-30 age range. This pattern of individual variation is common for the vast majority of traits, and often the distribution of traits follows a normal, or bell-shaped, distribution. The bell-shaped distribution becomes clear when the traits are plotted based on frequency.

Growth hormone is secreted in pulses. In case you are wondering, growth hormone secretion in young women is higher than in young men. See the graphs below (click to enlarge), from this excellent article on growth hormone by Cummings and Merriam.

Yet, women do not put on a lot of muscle mass in response to weight training, regardless of the age at which they do weight training. This means that growth hormone, by itself, does not lead to significant gains in muscle mass. Androgenic hormones, like testosterone, play a key moderator role here. Muscle mass gain is the result of a number of things, including the combined action of various hormones. To complicate things further, not only do these hormones act together in an additive fashion, but they also influence each other.

Another reasonable conclusion from the data above on growth hormone secretion in young women and men is that growth hormone must indeed have major health-promoting effects, as most of the empirical data suggests. The reason is that young (or pre-menopausal) women have always been the evolutionary bottleneck of any population of ancestral hominids: high survival rates among young women mattered much more for a population's chances of survival than high survival rates among men in general.

Higher survival rates among young ancestral women may have been enabled by higher levels of growth hormone, among other things. The onset of the metabolic syndrome, which in modern humans frequently occurs around age 45, may also be strongly influenced by falling growth hormone levels.

How can growth hormone secretion be increased after age 45? One obvious option is vigorous exercise, particularly resistance exercise.

Wednesday, June 2, 2010

Cortisol response to stress is much more elevated with ingestion of glucose than with protein or fat

Cortisol is a hormone that does a number of different things; a jack of all trades among hormones, so to speak. It tells the liver to produce glucose, preventing hypoglycemia. It also tells the liver to synthesize glycogen, which is in some ways the opposite of producing glucose. It tells the stomach to secrete gastric acid. It is an anti-diuretic hormone. It suppresses the immune system, which is why it is frequently used to reduce inflammation, and treat allergies and various autoimmune diseases. It jump-starts an increase in free fatty acids in circulation, thus helping provide an important source of energy for endurance exercise.

Cortisol, together with epinephrine (a.k.a. adrenaline), even contributes to the creation of surprise-induced memories. It is because of this action of cortisol that Americans reading this post, especially those who lived on the East Coast in 2001, remember vividly where they were, what they were doing, and who they were with, when they first heard about the September 11, 2001 attacks. I was living in Philadelphia at the time, and I remember those details very vividly, even though the attacks happened almost 10 years ago. That is one of the fascinating things that cortisol does; it instantaneously turns short-term contextual memories temporally associated with a surprise event (i.e., a few minutes before and after the event) into long-term memories.

Similarly to insulin, you don’t want cortisol levels to be more elevated than they should naturally be. Natural levels being those experienced by our hominid ancestors on a regular basis. You need cortisol, but you don’t need too much of it. Many tissues in the body become resistant to hormones that are more elevated than they should be, like insulin and leptin, and this is also true for cortisol. It is a bit like people constantly shouting in your ears; after a while you cover your ears, or they get damaged, so people have to shout louder. If you frequently have acute elevations of cortisol levels, they may become chronically elevated due to cortisol resistance.

Chronically elevated cortisol levels are associated with the metabolic syndrome, the hallmark of the degenerative diseases of civilization.

Stress causes elevated cortisol levels. And those levels are significantly elevated if you consume foods that lead to a high blood glucose response after a meal. That is what an interesting experimental study by Gonzalez-Bono and colleagues (2002) suggests. The full reference and link to the study are at the end of this post. They used glucose, but we can reasonably conclude based on glucose metabolism research that foods rich in refined carbohydrates and sugars would have a very similar effect. If we think about the typical American breakfast, possibly an even stronger effect.

In order to do their study they needed to put the participants under stress. To cause stress the researchers did what many college professors have their students do at the end of the semester, which is also something that trial lawyers and preachers are good at, and something that most people hate doing. You guessed it. The researchers had their subjects do, essentially, some public speaking. The experimental task they used was a variation of the “Trier Social Stress Test” (TSST). The researchers asked the participants to conduct a 5-minute speech task and a 5-minute mental arithmetic task in front of an audience.

The participants were 37 healthy men who fasted for at least 8 h prior to the study. They were randomly assigned to one of four groups. The glucose group consumed 75 g of glucose dissolved in water. The fat group consumed 200 g of avocado. The protein group drank 83 g of protein dissolved in water. The fourth group, the water group, drank plain water.

From a real world perspective, the fat and protein groups, unlike the glucose group, were arguably overloaded with their respective nutrients. Many people would not normally consume that much fat or protein in one single meal. This makes the results even more interesting, because it seems that fat and protein lead to virtually the same response as water, regardless of the amount ingested. The table below shows the cortisol responses for all groups.

As you can see, the cortisol response for the glucose group is a lot more elevated. How much more elevated? In the inner square at the top-left part of the figure you have the areas under the curve (AUC), which are essentially the estimates of the integrals of the cortisol curves for each of the groups. Usually AUC is a key measure when one looks at the potential negative impact of the elevated levels of a substance in the blood. Note that the cortisol AUC for the glucose group is much larger, about two times larger, than the cortisol AUCs for the other groups.
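The "area under the curve" idea can be sketched with the standard trapezoidal rule; the cortisol values below are made up for illustration, the real ones are in the study's figure:

```python
# Trapezoidal area under the curve (AUC), the measure used to compare
# the groups' cortisol responses. Values below are hypothetical.
def auc_trapezoid(times, values):
    """Approximate the integral of values over times with trapezoids."""
    area = 0.0
    for i in range(1, len(times)):
        area += (times[i] - times[i - 1]) * (values[i] + values[i - 1]) / 2
    return area

# Hypothetical cortisol samples at 0, 15, 30, 45, and 60 min:
times = [0, 15, 30, 45, 60]
glucose_group = [10, 18, 22, 16, 12]
water_group = [10, 12, 11, 10, 10]

# A larger AUC means a larger total cortisol exposure over the hour.
print(auc_trapezoid(times, glucose_group) > auc_trapezoid(times, water_group))  # True
```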

When one has a morning car commute, what is going to happen? Typically cortisol levels will be elevated, unless the commute is uneventful and done completely on “automatic pilot”, which is not very common, as people cut each other off, make irritating mistakes, etc.

What if, before that commute, one eats a “solid” breakfast with plenty of “healthy” sugary cereal covered with honey, a glass of “healthy” low-fat milk (of course, because fat “raises bad cholesterol”), and maybe three pancakes covered with syrup?

Cortisol levels will be much more elevated.

Doing this often, maybe after several years a person will become eligible for death by sudden cardiac arrest while doing some light activity.


Gonzalez-Bono, E., Rohleder, N., Hellhammer, D.H., Salvador, A., & Kirschbaum, C. (2002). Glucose but Not Protein or Fat Load Amplifies the Cortisol Response to Psychosocial Stress. Hormones and Behavior, 41(3), 328–333.