Preserving Mental Function: Low Tech vs. High Tech
A number of years ago I was invited to lunch by a friend of one of my colleagues. When we met, the individual explained the reason he wanted to talk to me. He was in the process of changing careers and wanted to become involved with a cutting-edge medical technology. Which developing technology, he asked, did I believe would have the greatest impact on people’s health in the future? My answer did not please him. “What will have the greatest impact on the health of the average individual,” I replied, “is not high-tech, but low-tech.” I explained that a return to the basics of drinking pure water, eating real food, performing physical activities regularly, decreasing exposure to toxins, and supplementing the body’s nutritional needs will have a far greater impact upon the health of individuals than all advanced technologies combined. The gentleman frowned and had little to say during the remainder of the time we spent together. It was quite clear that he felt the conversation had been a colossal waste of time.
My opinion has not changed. On the contrary, the more I learn about the capabilities of the human body, the more convinced I become that those who pursue a low-tech path to optimum health will fare far better than people who expect advancing technology to provide answers to the many health challenges facing our nation and the world today.
The ongoing political debate over government funding of embryonic stem cell research is a prime example of looking for high-tech answers to health challenges. It is difficult to imagine a more advanced technological achievement than harvesting an egg from an ovary in the pelvis of a living human being, fertilizing that egg with sperm in a laboratory environment, nurturing the fertilized egg so that it grows into an embryo, and taking cells from the embryo and implanting them into another person to restore lost function. Degenerative neurological conditions such as Alzheimer’s disease and Parkinson’s disease are high on the list of diseases embryonic stem cell researchers hope to treat when the technology is perfected. While stem cell research, using not only embryonic but also post-natal and adult stem cells, has received a great deal of attention, advances in our understanding of how low-tech approaches to health can significantly impact neurological conditions are rarely reported in the popular media. Publishers and broadcasters are apparently as unimpressed by the concept of taking a low-tech approach to optimum health as the man who was seeking a high-tech career.
Prior to the 1990s it was taught that the body was incapable of producing new nerve cells. I was taught that all of the brain cells a person would ever have were present at birth and that if they were lost for any reason, they could never be replaced. There was also little, if any, understanding of the brain’s rewiring capabilities. The possibility that the brain could reassign lost functions to other areas had not yet been considered. It is now known that new neurons do form in the brain. Interestingly, the area in which they are most likely to appear is the hippocampus, a structure that is critical to memory formation. The recognition that the adult brain is able to form new neurons has led to research into the factors that encourage such growth. That research is showing that the same factors that support the growth and development of new cells are critical for maintaining existing cells as well.
In the March 2005 issue I wrote of the importance now being ascribed to neurotrophic (nerve-nurturing) agents in the prevention and treatment of depression. Neurotrophic agents are also moving to the forefront of research into ways to prevent and potentially reverse degenerative diseases such as Alzheimer’s disease and Parkinson’s disease. One of these chemicals is called brain-derived neurotrophic factor (BDNF). BDNF is manufactured by neurons. The highest concentration of BDNF is found in the hippocampus, a major memory center of the brain. BDNF can be transferred from nerve to nerve, and it can also be manufactured in the spinal cord. A great deal of interest in BDNF has been generated by the discovery that the chemical supports the survival of cells in those areas of the brain known to degenerate in Alzheimer’s disease. The finding that BDNF protein levels are decreased in the brains of people with Alzheimer’s disease has given added impetus to the research efforts.
It is now believed that the importance of BDNF goes far beyond its ability to help nerves survive and rebound from injury. Evidence suggests that memory is accomplished by storing information in synapses, the points at which nerve cells connect to each other. BDNF has been shown to play a pivotal role in the ability of synapses to encode memory. When BDNF levels fall, as occurs in Alzheimer’s disease, the ability of nerve endings in the hippocampus and elsewhere to encode new information into the memory bank is diminished. Low BDNF levels also make the cells more vulnerable to injury and death.
Genetic abnormalities related to BDNF function have been identified. These abnormalities have been shown to adversely affect cognitive function in adults between the ages of 25 and 45 who show no evidence of dementia. Three genetic variants of BDNF have been found to be associated with an increased risk of developing Alzheimer’s disease. BDNF deficiencies have also been shown to be present in epilepsy, depression, and obsessive-compulsive disorder, and research is suggesting a link to bipolar disorder.
If the amount of BDNF in the brain were determined solely by one’s genetic code, variations in its level of activity would provide an interesting explanation for how certain diseases develop, but that knowledge would be of little practical value. BDNF research is not only demonstrating the relationship of the substance to various disease states, however; it is also showing that BDNF levels may be significantly increased through the use of several low-tech strategies.
One of the most effective ways to increase BDNF activity is to perform physical activities regularly. Animal studies have confirmed that exercise increases the level of BDNF in the hippocampus and cortex of the brain, areas in which learning takes place and memory is stored. This is consistent with the finding that physically active adults have a lower risk of encountering cognitive challenges or of developing depression, Alzheimer’s disease, or other forms of dementia than their sedentary counterparts. The act of learning itself stimulates BDNF production, particularly in the hippocampus. As in the case of exercise, this is consistent with the finding that people who dedicate themselves to life-long learning are less likely to experience depression or dementia as they age. Activities as simple as doing a crossword puzzle or completing a word-search puzzle have been shown to decrease the risk of developing Alzheimer’s disease when performed on a regular basis.
Evidence is also accumulating that dietary factors play a role in supporting levels of BDNF and in preventing loss of neurons and cognitive ability. Animal studies have demonstrated that restricting calories increases the number of neurons in the hippocampus and increases BDNF activity. Other studies are revealing that diets high in saturated fat and sucrose (sugar), typical of the standard American diet, lower levels of BDNF and bring about a corresponding decline in cognitive ability. On the other hand, increasing the amount of omega-3 fats, such as those found in oil from cold-water fish, has been found to reduce the deposition of beta-amyloid in the brain by 70 percent. Amyloid deposits are one of the pathologic landmarks of Alzheimer’s disease. Curcumin, found in the spice turmeric, has a similar effect on reducing amyloid deposition in the brain. Garlic, berries, and nuts are also being shown to decrease the number of amyloid deposits in the brain. Almonds may be particularly beneficial, as animal studies have shown that a diet high in almonds decreases amyloid deposition and increases cognitive performance.
The pharmaceutical industry is paying attention. Studies have been published showing that drugs that increase serotonin levels also boost synaptic BDNF activity. Research is now underway to find drugs that imitate the action of serotonin in the nerve synapse. One of these, PRX-03140, recently completed phase 1 trials in human subjects and is expected to proceed to phase 2 trials later this year.
As I was researching the relationship between diet and BDNF I was struck by the number of articles recommending against the consumption of soy foods. All of the authors quoted a single study published by L. R. White and associates in 2000. The rush to condemn the eating of soy on the basis of the White study is yet another example of what I have come to call the Billy Joel principle. During an interview a reporter complimented singer/songwriter Billy Joel on being an extraordinary musician. “No,” corrected Mr. Joel, “I’m a competent musician. In this business that makes me extraordinary!” It seems as though competent reporters in the medical world are as uncommon as competent musicians in the popular music industry. Unfortunately, their articles tend to be quoted as fact just as noise is hailed as music in today’s society.
Since so many people are being advised to eliminate soy from their diets, it is worthwhile exploring how the White group reached their conclusion that the consumption of soy causes a loss of brain function. In 1965 a group of Hawaiian men were asked to participate in a research project on heart disease, stroke, and cancer. As part of the original study the men were interviewed on two occasions, once at the start of their participation (1965–1967) and again six years later (1971–1974). At each interview they were asked to check off, from a list of 26 foods, those they had eaten during the previous week. The foods listed during the second interview did not correlate well with the foods listed on the first questionnaire, but White and his associates were undeterred. Between 1991 and 1993 they identified survivors of the study, who were then 71 to 93 years of age. They tested the men’s performance on cognitive tests and looked for brain atrophy on imaging studies or at autopsy.
Men who reported eating tofu on at least two occasions during one week in the 1960s and one week in the 1970s were found to have, on average, lower scores on the cognitive tests and evidence of greater brain atrophy than men who reported having eaten fewer than two meals containing tofu during one week in the 1960s and no tofu during one week in the 1970s. White’s conclusion that “high tofu consumption in mid-life is associated with indicators of cognitive impairment and brain atrophy in late life” has been widely accepted as evidence that soy food is risky and should be excluded from the diet.
Is the conclusion scientifically sound, or is there more to the story? Is there reason to suspect that the two servings of tofu reported were not the cause of the decline in mental function? There is reason indeed. To put the data upon which this study was based into perspective, if a man happened to have gone to a Chinese restaurant, ordered a dish containing tofu, and taken a “to go” box home to have leftovers for lunch the next day, he was defined as a “high tofu” consumer for the purposes of the study. No attempt was made to determine whether the foods reportedly eaten during two weeks of the men’s lives represented their dietary habits over the course of their lifetime, nor was there any effort to quantify total lifetime soy consumption in any way.
Important questions went unasked. What was the source of the tofu? (A different study found that tofu in Hawaii contained significantly higher amounts of aluminum than tofu in Japan. Aluminum is an element commonly found in the characteristic lesions of Alzheimer’s disease.) How was it prepared? What seasonings were used? What condiments were applied? With what other foods was it eaten? (I am unaware of anyone who eats free-standing slices of tofu. It is generally added to stir-fry dishes or used to add protein in various recipes. Because food associations are important and may lead to a much different conclusion than a limited food survey would suggest, dietary research questionnaires today include in excess of 100 common foods, not the 26 listed in the Hawaiian atherosclerosis study.)
The men in the “high tofu” group were older than the men in the “low tofu” group. As cognitive function declines and brain atrophy increases with age, one would predict on that basis alone that the “high tofu” group would fare more poorly. The men in the “high tofu” group also had a lower level of education and had performed less complex occupations during their working years. Educational level and the performance of complex mental activities are known to protect against cognitive decline over time. The men in the “high tofu” group had also spent more of their lives in Japan, suggesting that there may well have been additional lifestyle differences between the two groups.
The White group, while acknowledging the effects of age, educational level, and work status, concluded that those factors were not enough to explain the differences on their own and labeled soy an independent risk factor for the loss of mental capacity. If there were no other studies from which to draw a conclusion, the widespread endorsement of their findings might be understandable. The evidence of studies in which the characteristics of the participants were carefully matched and the actual soy intake monitored supports a far different conclusion, however.
One can begin with the observation that the incidence of dementia and Alzheimer’s disease in East Asia, where the consumption of soy is high, is far lower than in the United States and Europe, where soy consumption is generally low. Men of Japanese ancestry living in Hawaii, who tend to eat less soy, develop dementia and Alzheimer’s disease more frequently than those living in Japan. Japanese women living in Western countries who continue their traditional soy-rich diet have higher cognitive function than those who do not.
In 2001 a study looking at the effect of soy consumption on mental function was published in the journal Psychopharmacology. Unlike the White study, which did not monitor soy consumption over time and did not match the characteristics of the participants, the 2001 study did both. The participants were divided into two groups that were matched according to age, IQ, educational background, and the presence of anxiety or depression. Each group contained both men and women. The diets of both groups were carefully supervised; one group received foods that provided 100 mg. of soy isoflavones (plant chemicals) daily, and the other a diet limited to 0.5 mg. of soy isoflavones daily. The diets were consumed for 10 weeks. At the end of that time the high-soy group showed significant improvement on tests of short-term memory and the performance of a complex task.
A 2002 report on the results of supplementing a group of women between the ages of 50 and 65 with 60 mg. of soy isoflavones over a 3-month period found that the women showed significant improvements in short-term memory and in the performance of complex mental tasks. A control group that had been matched according to age, IQ, and education did not demonstrate the improvements. The SOPHIA (soy and post-menopausal health in aging – researchers love acronyms) study reported similar results. In the SOPHIA study, well-matched women between the ages of 55 and 74 were randomly given 110 mg. of soy isoflavones or a placebo daily for six months. Those receiving the soy showed improvement, with the greatest benefit seen in the older women.
I am not saying that you must rush to the supermarket and stock up on soy foods to preserve your mental function. What I am saying is that the available evidence does not support the recommendation that soy be excluded from a balanced diet. To the contrary, current evidence is heavily weighted in favor of including soy in the diet.
High-tech solutions to degenerative conditions like Alzheimer’s disease are, at best, years away from general application. Trials to date have been fraught with problems, including the growth of tumors from implanted stem cells. Drugs such as PRX-03140 are showing promise, but I have yet to find a drug that is free of adverse effects. What the side effects of PRX-03140 and similar drugs will be remains to be seen. It is safe to say that the costs of stem cell transplants and new drugs will keep them out of the reach of the masses for years to come.
Low-tech solutions exist today. They are free of adverse effects, and nearly everyone can afford to implement them. They are remarkably similar to the basic habits needed to support overall health. People who have increased their mental ability with exercise have done so by beginning with an easy 15-minute stroll that was increased to a brisk 45-minute walk as physical conditioning improved. The walk was performed three days a week, and improvements in cognitive ability were seen within six months.
Cutting back on total calories, saturated fat, and refined sugar will provide the average adult with benefits far beyond lowering the risk of developing dementia. The same can be said of the use of omega-3 supplements. There is no risk, and there is potential for significant gain, in including garlic, berries, and almonds in one’s diet. Curcumin is available in capsule form for those who do not like recipes that include turmeric, such as curry. Supplements such as N,N-dimethylglycine and 5-HTP, which support optimum serotonin levels, are readily available and should be considered, especially when depression is a part of the picture. When it comes to the prevention and management of dementia, the guiding principle should be the same as for any other health challenge. For most people the answer is low-tech, not high-tech.