Iron Is the New Cholesterol


Cheerios are the best-selling breakfast cereal in America. The multi-grain version contains 18 milligrams of iron per serving, according to the label. Like almost any refined food made with wheat flour, it is fortified with iron. As it happens, there’s not a ton of oversight in the fortification process. One study measured the actual iron content of 29 breakfast cereals, and found that 21 contained 120 percent or more of the label value, and 8 contained 150 percent or more.1 One contained nearly 200 percent of the label value.

If your bowl of cereal actually contains 120 percent of the iron advertised, that’s about 22 mg. A safe assumption is that people tend to consume at least two serving sizes at a time.1 That gets us to 44 mg. The recommended daily allowance of iron is 8 mg for men and 18 mg for pre-menopausal women. The tolerable upper intake—which is the maximum daily intake thought to be safe by the National Institutes of Health—is 45 mg for adults.
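
The arithmetic is worth checking. Here is a minimal sketch in Python, using the label value, the measured excess, and the two-serving bowl described above (the script is illustrative; nothing in it comes from the study itself):

```python
LABEL_IRON_MG = 18      # multi-grain Cheerios label value, per serving
ACTUAL_FRACTION = 1.2   # 120 percent of the label value
SERVINGS_PER_BOWL = 2   # assumed typical bowl, per the text
UPPER_LIMIT_MG = 45     # NIH tolerable upper intake for adults

bowl_mg = LABEL_IRON_MG * ACTUAL_FRACTION * SERVINGS_PER_BOWL
share = bowl_mg / UPPER_LIMIT_MG
print(f"One bowl: {bowl_mg:.0f} mg of iron, {share:.0%} of the upper limit")
# One bowl: 43 mg of iron, 96% of the upper limit
```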


It is entirely feasible, in other words, for an average eater to come awfully close to the maximum daily iron intake regarded as safe with a single bowl of what is supposed to be a pretty healthy whole-grain breakfast option.

And that’s just breakfast.

At the same time that our iron consumption has grown to the borders of safety, we are beginning to understand that elevated iron levels are associated with everything from cancer to heart disease. Christina Ellervik, a research scientist at Boston Children’s Hospital who studies the connection between iron and diabetes, puts it this way: “Where we are with iron now is like where we were with cholesterol 40 years ago.”

The story of energy metabolism—the basic engine of life at the cellular level—is one of electrons flowing much like water flows from mountains to the sea. Our cells can make use of this flow by regulating how these electrons travel, and by harvesting energy from them as they do so. The whole set-up is really not so unlike a hydroelectric dam.

The sea toward which these electrons flow is oxygen, and for most of life on earth, iron is the river. (Octopuses are strange outliers here—they use copper instead of iron, which makes their blood greenish-blue rather than red). Oxygen is hungry for electrons, making it an ideal destination. The proteins that facilitate the delivery contain tiny cores of iron, which manage the handling of the electrons as they are shuttled toward oxygen.

This is why iron and oxygen are both essential for life. There is a dark side to this cellular idyll, though.

Oxygen and iron are essential for the production of energy, but may also conspire to destroy the delicate order of our cells.

Normal energy metabolism in cells produces low levels of toxic byproducts. One of these byproducts is a derivative of oxygen called superoxide. Luckily, cells contain several enzymes that clean up most of this leaked superoxide almost immediately. They do so by converting it into another intermediary called hydrogen peroxide, which you might have in your medicine cabinet for treating nicks and scrapes. The hydrogen peroxide is then detoxified into water and oxygen.

Things can go awry if either superoxide or hydrogen peroxide happen to meet some iron on the way to detoxification. What then happens is a set of chemical reactions (described by Haber-Weiss chemistry and Fenton chemistry) that produce a potent and reactive oxygen derivative known as the hydroxyl radical. This radical—also called a free radical—wreaks havoc on biological molecules everywhere. As the chemists Barry Halliwell and John Gutteridge—who wrote the book on iron biochemistry—put it, “the reactivity of the hydroxyl radicals is so great that, if they are formed in living systems, they will react immediately with whatever biological molecule is in their vicinity, producing secondary radicals of variable reactivity.”2
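
For readers who want the chemistry spelled out, the two reactions are simple to write down. Ferrous iron converts hydrogen peroxide into the hydroxyl radical (the Fenton reaction), and superoxide then regenerates the ferrous iron, so a trace of the metal can keep the cycle turning (the iron-catalyzed Haber–Weiss cycle):

Fe²⁺ + H₂O₂ → Fe³⁺ + OH⁻ + •OH (Fenton reaction)
O₂•⁻ + Fe³⁺ → O₂ + Fe²⁺
Net: O₂•⁻ + H₂O₂ → O₂ + OH⁻ + •OH (Haber–Weiss reaction)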

Such is the Faustian bargain that has been struck by life on this planet. Oxygen and iron are essential for the production of energy, but may also conspire to destroy the delicate order of our cells. As the neuroscientist J.R. Connor has said, “life was designed to exist at the very interface between iron sufficiency and deficiency.”3

Hemoglobin, ferritin, and transferrin

At the end of the 20th century, the metabolism of iron in the human body was still a bit of a mystery. Scientists knew of only two ways that the body could excrete iron—bleeding, and the routine sloughing of skin and gastrointestinal cells. But these processes amount to only a few milligrams per day. That meant that the body must have some way to tightly regulate iron absorption from the diet. In 2000 a major breakthrough was announced—a protein was found that functioned as the master regulator for iron. The system, as so many biological systems are, is perfectly elegant. When iron levels are sufficient, the protein, called hepcidin, is secreted into the blood by the liver. It then signals to gastrointestinal cells to decrease their absorption of iron, and for other cells around the body to sequester their iron into ferritin, a protein that stores iron. When iron levels are low, blood levels of hepcidin fall, and intestinal cells begin absorbing iron again. Hepcidin has since become recognized as the principal governor of iron homeostasis in the human body.
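
The control logic is simple enough to sketch in code. Here is a toy model in Python of the feedback loop just described; the state labels are invented for illustration and are not drawn from the literature:

```python
def hepcidin_feedback(iron_status):
    """Toy model of the hepcidin feedback loop (illustrative only).

    When iron is sufficient, the liver secretes hepcidin, gut
    absorption falls, and cells tuck iron away into ferritin.
    When iron is low, hepcidin falls and absorption resumes.
    """
    if iron_status == "sufficient":
        return {"hepcidin": "secreted", "gut_absorption": "decreased",
                "ferritin_storage": "increased"}
    return {"hepcidin": "suppressed", "gut_absorption": "increased",
            "ferritin_storage": "released"}

print(hepcidin_feedback("sufficient"))
print(hepcidin_feedback("low"))
```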

But if hepcidin so masterfully regulates absorption of iron from the diet to match the body’s needs, is it possible for anyone to absorb too much iron?

In 1996, a team of scientists announced that they had discovered the gene responsible for hereditary hemochromatosis, a disorder causing the body to absorb too much iron. They called it HFE. Subsequent work revealed that the product of the HFE gene was instrumental in regulating hepcidin. People with a heritable mutation in this gene effectively have a gross handicap in the entire regulatory apparatus that hepcidin coordinates.

This, then, leaves open the possibility that some of us could in fact take in more iron than the body is able to handle. But how common are these mutations? Common enough to matter for even a minority of people reading these words?


Surprisingly, the answer is yes. The prevalence of hereditary hemochromatosis, in which two defective copies of the HFE gene are present and there are clinical signs of iron overload, is actually pretty high—as many as 1 in 200 in the United States. And perhaps 1 in 40 may have two defective HFE genes without overt hemochromatosis.4 That’s more than 8 million Americans who could have a significant short-circuit in their ability to regulate iron absorption and metabolism.

What if you have only one defective HFE gene, and one perfectly normal gene? This is called heterozygosity. We would expect to find more people in this situation than the homozygotes, or those with two bad copies of the gene. And in fact we do. Current estimates suggest that more than 30 percent of the U.S. population could be heterozygotes with one dysfunctional HFE gene.4 That’s pretty close to 100 million people.

Does this matter? Or is one good gene enough? There isn’t much research, but so far the evidence suggests that some heterozygotes do have impaired iron metabolism. Studies have shown that HFE heterozygotes seem to have modest elevations of ferritin as well as of transferrin saturation—the fraction of transferrin, the protein that chaperones iron through the blood, that is actually loaded with iron—both of which would indicate elevated iron levels.5,6 And a study published in 2001 concluded that HFE heterozygotes may have up to a fourfold increased risk of developing iron overload.4

A host of research articles have supported an association between iron and cancer.

Perhaps more concerning is that these heterozygotes have also been shown to be at increased risk for several chronic diseases, like heart disease and stroke. One study found that heterozygotes who smoked had a 3.5 times greater risk of cardiovascular disease than controls, while another found that heterozygosity alone significantly increased the risk of heart attack and stroke.7,8 A third study found that heterozygosity increased the risk of cardiomyopathy, which can lead to heart failure, nearly sixfold.9

The connection between excessive iron and cardiovascular disease may extend beyond HFE heterozygotes. A recent meta-analysis identified 55 studies of this connection that were rigorous enough to meet their inclusion criteria. Out of 55 studies, 27 supported a positive relationship between iron and cardiovascular disease (more iron equals more disease), 20 found no significant relationship, and 8 found a negative relationship (more iron equals less disease).10

A few highlights: a Scandinavian study compared men who suffered a heart attack to men who didn’t, and found that elevated ferritin levels conferred a two- to threefold increase in heart attack risk. Another found that having a high ferritin level made a heart attack five times more likely than having a normal level. A larger study of 2,000 Finnish men found that an elevated ferritin level increased the risk of heart attack twofold, and that every 1 percent increase in ferritin level conferred a further 4 percent increase in that risk. The only other risk factor found to be stronger than ferritin in this study was smoking.

Ferritin isn’t a perfect marker of iron status, though, because it can also be affected by anything that causes inflammation. To address this problem a team of Canadian researchers directly compared blood iron levels to heart attack risk, and found that higher levels conferred a twofold increased risk in men and a fivefold increased risk in women.

If cardiovascular disease is one point in iron’s web of disease, diabetes may be another. The first hint of a relationship between iron and diabetes came in the late 1980s, when researchers discovered that patients receiving regular blood transfusions (which contain quite a bit of iron) were at significantly increased risk of diabetes. Disturbed glucose metabolism had long been observed in hemochromatosis, but there was no way to know whether it was due to the accumulation of iron itself, or to the underlying genetic defect. The new link between frequent transfusions and diabetes was indirect evidence that the iron itself may be the cause.

The next step was to mine existing data for associations between markers of iron status and diabetes. The first study to do so came out of Finland in 1997: Among 1,000 randomly selected Scandinavian men, ferritin emerged as a strong predictor of dysfunctional glucose metabolism, second only to body mass index as a risk factor.11 In 1999, researchers found that an elevated ferritin level increased the odds of having diabetes fivefold in men and nearly fourfold in women—similar in magnitude to the association between obesity and diabetes.12 Five years later, another study found that elevated ferritin roughly doubled the risk for metabolic syndrome, a condition that often leads to diabetes, hypertension, liver disease, and cardiovascular disease.13

Christina Ellervik’s first contribution to the field came in 2011, with a study investigating the association between increased transferrin saturation—a measure of how much iron is loaded onto the transferrin protein, which moves iron through the blood—and diabetes risk.14 Ellervik found that within a sample of nearly 35,000 Danes, transferrin saturation greater than 50 percent conferred a two- to threefold increased risk of diabetes. She also identified an increase in mortality rates with transferrin saturation greater than 50 percent.
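
Transferrin saturation, for reference, is a simple ratio of two routine blood tests: serum iron divided by total iron-binding capacity, times 100. A serum iron of 180 µg/dL against an iron-binding capacity of 300 µg/dL, for instance, works out to a saturation of 60 percent, past the 50 percent threshold Ellervik used. (The example numbers are illustrative, not taken from her study.)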

In 2015, she led another study that found that, among a sample of 6,000 people, those whose ferritin levels were in the highest 20 percent had 4 times greater odds of diabetes than those with ferritin levels in the lowest 20 percent.15 Blood glucose and blood insulin levels rose with higher ferritin levels, while insulin sensitivity fell.

“It’s incredible that there is so much promising literature, and nobody—nobody—is doing the clinical trials.”

There’s a problem here, though. All of these studies show associations. They show that two things tend to happen together. But they don’t tell us anything about causality. To learn something about causality, you need an intervention. In the case of iron, you’d need to lower the iron and then watch what happens. Fortunately, there’s a very easy and very safe intervention to lower iron levels that’s performed millions of times every year—phlebotomy, also known as blood donation.

One of the first studies to use phlebotomy to examine the relationship between iron and diabetes was published in 1998.16 The authors found that among both healthy and diabetic subjects, phlebotomy improved insulin sensitivity and glucose metabolism. A 2005 study found that regular blood donors exhibited lower iron stores and significantly greater insulin sensitivity than non-donors.17 In 2012, researchers phlebotomized pre-diabetic volunteers until their ferritin levels dropped significantly, and found a marked subsequent improvement in their insulin sensitivity.18 In that same year, a different group of scientists studied the effect of phlebotomy on several elements of metabolic syndrome, including glucose metabolism. They found that a single phlebotomy session was associated with improvement in blood pressure, fasting glucose, hemoglobin A1C (a marker for average glucose levels), and blood cholesterol six weeks later.19

Many caveats apply to this evidence—the line between correlation and causation remains unclear, some of the studies used relatively small sample sizes, and phlebotomy may cause other changes in addition to lowering iron. But taken together, the data lends weight to the idea that iron plays a significant role in the tortuous pathophysiology of diabetes.

As more published data began to suggest a relationship between iron, cardiovascular disease, and diabetes, researchers started casting broader nets.

Next up was cancer.

It had been known since the late 1950s that injecting large doses of iron into lab animals could cause malignant tumors, but it wasn’t until the 1980s that scientists began looking for associations between iron and cancer in humans. In 1985, Ernest Graf and John Eaton proposed that differences in colon cancer rates among countries could be accounted for by the variation in the fiber content of local diets, which can in turn affect iron absorption.20

The following year, Richard Stevens found that elevated ferritin was associated with triple the risk of death from cancer among a group of 20,000 Taiwanese men.21 Two years later Stevens showed that American men who developed cancer had higher transferrin saturation and serum iron than men who didn’t.22 In 1990, a large study of Swedish blood donors found that they were 20 percent less likely to get cancer than non-donor controls.23 Four years later, a group of Finnish researchers found that elevated transferrin saturation among 40,000 Scandinavians conferred a threefold increased risk for colorectal cancer, and a 1.5-fold increased risk for lung cancer.24

A host of research articles have been published since Graf and Eaton’s first paper, and most have supported an association between iron and cancer—particularly colorectal cancer. In 2001, a review of 33 publications investigating the link between iron and colorectal cancer found that more than 75 percent of them supported the relationship.25 A 2004 study found an increased risk of death from cancer with rising serum iron and transferrin saturation. People with the highest levels were twice as likely to die from cancer as those with the lowest levels.26 And in 2008, another study confirmed that Swedish blood donors had about a 30 percent decrease in cancer risk.27


There are a few other lines of evidence that support the association between iron and cancer. People with an HFE mutation have an increased risk of developing colon and blood cancers.28 Conversely, people diagnosed with breast, blood, and colorectal cancers are more than twice as likely to be HFE heterozygotes than are healthy controls.29

There are also a handful of interventional trials investigating the relationship between iron and cancer. The first was published in 2007 by a group of Japanese scientists who had previously found that iron reduction via phlebotomy essentially normalized markers of liver injury in patients with hepatitis C. Hepatocellular carcinoma (HCC) is a feared consequence of hepatitis C and cirrhosis, and they hypothesized that phlebotomy might also reduce the risk of developing this cancer. The results were remarkable—at five years only 5.7 percent of patients in the phlebotomy group had developed HCC compared to 17.5 percent of controls. At 10 years the results were even more striking, with 8.6 percent of phlebotomized patients developing HCC compared to an astonishing 39 percent of controls.30

The second study to investigate the effects of phlebotomy on cancer risk was published the following year by Leo Zacharski, a colorful emeritus professor at Dartmouth. In a multi-center, randomized study originally designed to look at the effects of phlebotomy on vascular disease, patients allocated to the iron-reduction group were about 35 percent less likely to develop cancer after 4.5 years than controls. And among all patients who did develop cancer, those in the phlebotomy group were about 60 percent less likely to have died from it at the end of the follow-up period.31

The brain is a hungry organ. Though only 2 to 3 percent of body mass, it burns 20 percent of the body’s total oxygen requirement. With a metabolism that hot, it’s inevitable that the brain will also produce more free radicals as it churns through all that oxygen. Surprisingly, it’s been shown that the brain appears to have less antioxidant capacity than other tissues in the body, which could make it more susceptible to oxidative stress.32 The balance between normal cellular energy metabolism and damage from reactive oxygen species may be even more delicate in the brain than elsewhere in the body. This, in turn, points to a sensitivity to iron.
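
To put that in perspective: 20 percent of the body’s oxygen consumed by 2 to 3 percent of its mass works out to roughly eight times the whole-body average, gram for gram of tissue.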

It’s been known since the 1920s that neurodegenerative disease—illnesses like Alzheimer’s and Parkinson’s—is associated with increased iron deposition in the brain. In 1924, a towering Parisian neurologist named Jean Lhermitte was among the first to show that certain regions of the brain become congested with abnormal amounts of iron in advanced Parkinson’s disease.33 Thirty years later, in 1953, a physician named Louis Goodman demonstrated that the brains of patients with Alzheimer’s disease had markedly abnormal levels of iron deposited in the same regions as the famed plaques and tangles that define the illness.34 Goodman’s work was largely forgotten for several decades, until a 1992 paper resurrected and confirmed his findings and kindled new interest. Two years later an exciting new technology called MRI was deployed to probe the association between iron and disease in living patients, confirming earlier autopsy findings that Alzheimer brains demonstrated significant aberrations in tissue iron.35

Zacharski is convinced that iron overload is a huge common fulcrum underlying much of the chronic metabolic disease that is sweeping Western countries.

By the mid-1990s, there was compelling evidence that Alzheimer’s and Parkinson’s disease involved some dysregulation of iron metabolism in the brain, but no one knew whether the relationship was cause or consequence of the disease process. Hints began trickling in at around the same time the MRI findings were being published. A 1993 paper reported that iron promoted aggregation of amyloid-β, the major constituent of Alzheimer’s plaques.36 In 1997, researchers found that the aberrant iron associated with Alzheimer’s plaques was highly reactive and able to freely generate toxic oxygen radicals.37 By 2010, it had been shown that oxidative damage was one of the earliest detectable changes associated with Alzheimer’s, and that reactive iron was present in the earliest stages of the disease.38,39 And in 2015, a seven-year longitudinal study showed that cerebrospinal fluid ferritin levels were a strong predictor of cognitive decline and development of Alzheimer’s dementia.40

Perhaps most surprising was the discovery in 1999 that the precursor to amyloid-β was under direct control of cellular iron levels—the more iron around, the more amyloid was produced.41 This raised the tantalizing possibility that amyloid plaques might actually represent an adaptive response rather than a cause, an idea indirectly supported by the spectacular failure of essentially all efforts to treat the disease by directly targeting the amyloid protein.

Together, these findings suggest that abnormal iron metabolism in the brain could be a causative factor in Alzheimer’s and other neurodegenerative diseases. If that’s true, then we might expect that people who are genetically predisposed to aberrant iron metabolism would be at higher risk of dementing diseases than others. And so they are.

In the early 2000s, it was discovered that patients with familial Alzheimer’s were more likely than healthy controls to carry an HFE mutation.42 Another study found that these genotypes were associated with earlier onset of the disease compared to controls, and that the effect was even more powerful in people who carried an HFE mutation as well as an ApoE4 allele, the primary genetic risk factor for Alzheimer’s disease.43 A 2004 study showed that the co-occurrence of an HFE mutation with a known variant in the transferrin gene conferred a fivefold increased risk of Alzheimer’s.44 Two years later a team of Portuguese scientists found that the HFE variants were associated with increased risk of Parkinson’s as well.45

What about interventional trials? For neurodegenerative disease, there has been exactly one. In 1991, a team of Canadian scientists published the results of a two-year randomized trial of the iron chelator desferrioxamine in 48 patients with Alzheimer’s disease.46 Chelators are a class of medication that bind metal cations like iron, sequester them, and facilitate their excretion from the body. Patients were randomly allocated to receive desferrioxamine, placebo, or no treatment. The results were impressive—at two years, iron reduction had cut the rate of cognitive decline in half.

The study was published in The Lancet, one of the world’s most prestigious medical journals, but seems to have been forgotten in the nearly three decades since it appeared. Not a single interventional study testing the role of iron in Alzheimer’s disease has followed.

If so many studies seem to show a consistent association between iron levels and chronic disease, why isn’t more work being done to clarify the risk?

“It’s incredible that there is so much promising literature, and nobody—nobody—is doing the clinical trials,” Dartmouth’s Zacharski said to me. “If people would just take up the gauntlet and do well-designed, insightful studies of the iron hypothesis, we would have a much firmer understanding of this. Just imagine if it turns out to be verified!”

His perspective on why more trials haven’t been done is fascinating, and parallels much of what other experts in the field said. “Sexiness,” believe it or not, came up in multiple conversations—molecular biology and targeted pharmaceuticals are hot (and lucrative), and iron is definitively not. “Maybe it’s not sexy enough, too passé, too old school,” said one researcher I spoke to. Zacharski echoed this in our conversation, and pointed out that many modern trials are funded by the pharmaceutical industry, which is keen to develop the next billion-dollar drug. Government agencies like the NIH can step in to fill gaps left by the for-profit research industry, but publicly funded scientists are subject to the same sexiness bias as everyone else. As one senior university scientist told me, “NIH goes for fashion.”

Zacharski is convinced that iron overload is a huge common fulcrum underlying much of the chronic metabolic disease that is sweeping Western countries. He thinks that even subtly elevated iron levels can result in free radical formation, which then contributes to chronic inflammation. And chronic inflammation, we know, is strongly linked to everything from heart disease to diabetes, cancer to Alzheimer’s.

“If this doesn’t deserve randomized trials,” he told me, “then I don’t know what does.”

Until those randomized trials arrive—I’ll see you at the blood bank.

Clayton Dalton is an emergency medicine resident at Massachusetts General Hospital in Boston. He has published stories and essays with NPR, Aeon, and The Los Angeles Review.

Lead image: Liliya Kandrashevich / Shutterstock

References

1. Whittaker, P., Tufaro, P.R., & Rader, J.I. Iron and folate in fortified cereals. Journal of the American College of Nutrition 20, 247-254 (2001).

2. Halliwell, B. & Gutteridge, J.M. Oxygen toxicity, oxygen radicals, transition metals and disease. Biochemical Journal 219, 1-14 (1984).

3. Connor, J.R. & Ghio, A.J. The impact of host iron homeostasis on disease. Preface. Biochimica et Biophysica Acta 1790, 581-582 (2009).

4. Hanson, E.H., Imperatore, G., & Burke, W. HFE gene and hereditary hemochromatosis: a HuGE review. Human Genome Epidemiology. American Journal of Epidemiology 154, 193-206 (2001).

5. Beutler, E., Felitti, V.J., Koziol, J.A., Ho, N.J., & Gelbart, T. Penetrance of 845G→A (C282Y) HFE hereditary haemochromatosis mutation in the USA. The Lancet 359, 211-218 (2002).

6. Rossi, E., et al. Effect of hemochromatosis genotype and lifestyle factors on iron and red cell indices in a community population. Clinical Chemistry 47, 202-208 (2001).

7. Roest, M., et al. Heterozygosity for a hereditary hemochromatosis gene is associated with cardiovascular death in women. Circulation 100, 1268-1273 (1999).

8. Tuomainen, T.P., et al. Increased risk of acute myocardial infarction in carriers of the hemochromatosis gene Cys282Tyr mutation: A prospective cohort study in men in eastern Finland. Circulation 100, 1274-1279 (1999).

9. Pereira, A.C., et al. Hemochromatosis gene variants in patients with cardiomyopathy. American Journal of Cardiology 88, 388-391 (2001).

10. Muñoz-Bravo, C., Gutiérrez-Bedmar, M., Gómez-Aracena, J., García-Rodríguez, A., & Navajas, J.F. Iron: protector or risk factor for cardiovascular disease? Still controversial. Nutrients 5, 2384-2404 (2013).

11. Tuomainen, T.P., et al. Body iron stores are associated with serum insulin and blood glucose concentrations. Population study in 1,013 eastern Finnish men. Diabetes Care 20, 426-428 (1997).

12. Ford, E.S. & Cogswell, M.E. Diabetes and serum ferritin concentration among U.S. adults. Diabetes Care 22, 1978-1983 (1999).

13. Jehn, M., Clark, J.M., & Guallar, E. Serum ferritin and risk of the metabolic syndrome in U.S. adults. Diabetes Care 27, 2422-2428 (2004).

14. Ellervik, C., et al. Elevated transferrin saturation and risk of diabetes: three population-based studies. Diabetes Care 34, 2256-2258 (2011).

15. Bonfils, L., et al. Fasting serum levels of ferritin are associated with impaired pancreatic beta cell function and decreased insulin sensitivity: a population-based study. Diabetologia 58, 523-533 (2015).

16. Facchini, F.S. Effect of phlebotomy on plasma glucose and insulin concentrations. Diabetes Care 21, 2190 (1998).

17. Fernández-Real, J.M., López-Bermejo, A., & Ricart, W. Iron stores, blood donation, and insulin sensitivity and secretion. Clinical Chemistry 51, 1201-1205 (2005).

18. Gabrielsen, J.S., et al. Adipocyte iron regulates adiponectin and insulin sensitivity. Journal of Clinical Investigation 122, 3529-3540 (2012).

19. Houschyar, K.S., et al. Effects of phlebotomy-induced reduction of body iron stores on metabolic syndrome: results from a randomized clinical trial. BMC Medicine 10:54 (2012).

20. Graf, E. & Eaton, J.W. Dietary suppression of colonic cancer. Fiber or phytate? Cancer 56, 717-718 (1985).

21. Stevens, R.G., Beasley, R.P., & Blumberg, B.S. Iron-binding proteins and risk of cancer in Taiwan. Journal of the National Cancer Institute 76, 605-610 (1986).

22. Stevens, R.G., Jones, D.Y., Micozzi, M.S., & Taylor, P.R. Body iron stores and the risk of cancer. New England Journal of Medicine 319, 1047-1052 (1988).

23. Merk, K., et al. The incidence of cancer among blood donors. International Journal of Epidemiology 19, 505-509 (1990).

24. Knekt, P., et al. Body iron stores and risk of cancer. International Journal of Cancer 56, 379-382 (1994).

25. Nelson, R.L. Iron and colorectal cancer risk: human studies. Nutrition Reviews 59, 140-148 (2001).

26. Wu, T., Sempos, C.T., Freudenheim, J.L., Muti, P., & Smit, E. Serum iron, copper and zinc concentrations and risk of cancer mortality in US adults. Annals of Epidemiology 14, 195-201 (2004).

27. Edgren, G., et al. Donation frequency, iron loss, and risk of cancer among blood donors. Journal of the National Cancer Institute 100, 572-579 (2008).

28. Nelson, R.L., Davis, F.G., Persky, V., & Becker, E. Risk of neoplastic and other diseases among people with heterozygosity for hereditary hemochromatosis. Cancer 76, 875-879 (1995).

29. Weinberg, E.D. & Miklossy, J. Iron withholding: a defense against disease. Journal of Alzheimer’s Disease 13, 451-463 (2008).

30. Kato, J., et al. Long-term phlebotomy with low-iron diet therapy lowers risk of development of hepatocellular carcinoma from chronic hepatitis C. Journal of Gastroenterology 42, 830-836 (2007).

31. Zacharski, L.R., et al. Decreased cancer risk after iron reduction in patients with peripheral arterial disease: results from a randomized trial. Journal of the National Cancer Institute 100, 996-1002 (2008).

32. Lee, H.G., et al. Amyloid-beta in Alzheimer disease: the null versus the alternate hypotheses. Journal of Pharmacology and Experimental Therapeutics 321, 823-829 (2007).

33. Lhermitte, J., Kraus, W.M., & McAlpine, D. On the occurrence of abnormal deposits of iron in the brain in Parkinsonism with special reference to its localisation. Journal of Neurology and Psychopathology 5, 195-208 (1924).

34. Goodman, L. Alzheimer’s disease; a clinico-pathologic analysis of twenty-three cases with a theory on pathogenesis. The Journal of Nervous and Mental Disease 118, 97-130 (1953).

35. Bartzokis, G., et al. In vivo evaluation of brain iron in Alzheimer’s disease and normal subjects using MRI. Biological Psychiatry 35, 480-487 (1994).

36. Mantyh, P.W., et al. Aluminum, iron, and zinc ions promote aggregation of physiological concentrations of beta-amyloid peptide. Journal of Neurochemistry 61, 1171-1174 (1993).

37. Smith, M.A., Harris, P.L., Sayre, L.M., & Perry, G. Iron accumulation in Alzheimer disease is a source of redox-generated free radicals. Proceedings of the National Academy of Sciences 94, 9866-9868 (1997).

38. Nunomura, A., et al. Oxidative damage is the earliest event in Alzheimer disease. Journal of Neuropathology and Experimental Neurology 60, 759-767 (2001).

39. Smith, M.A., et al. Increased iron and free radical generation in preclinical Alzheimer disease and mild cognitive impairment. Journal of Alzheimer’s Disease 19, 363-372 (2010).

40. Ayton, S., Faux, N.G., & Bush, A.I. Ferritin levels in the cerebrospinal fluid predict Alzheimer’s disease outcomes and are regulated by APOE. Nature Communications 6:6760 (2015).

41. Rogers, J.T., et al. Translation of the Alzheimer amyloid precursor protein mRNA is up-regulated by interleukin-1 through 5’-untranslated region sequences. Journal of Biological Chemistry 274, 6421-6431 (1999).

42. Moalem, S., et al. Are hereditary hemochromatosis mutations involved in Alzheimer disease? American Journal of Medical Genetics 93, 58-66 (2000).

43. Combarros, O., et al. Interaction of the H63D mutation in the hemochromatosis gene with the apolipoprotein E epsilon 4 allele modulates age at onset of Alzheimer’s disease. Dementia and Geriatric Cognitive Disorders 15, 151-154 (2003).

44. Robson, K.J., et al. Synergy between the C2 allele of transferrin and the C282Y allele of the haemochromatosis gene (HFE) as risk factors for developing Alzheimer’s disease. Journal of Medical Genetics 41, 261-265 (2004).

45. Pulliam, J.F., et al. Association of HFE mutations with neurodegeneration and oxidative stress in Alzheimer’s disease and correlation with APOE. American Journal of Medical Genetics Part B 119B, 48-53 (2003).

46. Crapper-McLachlan, D.R., et al. Intramuscular desferrioxamine in patients with Alzheimer’s disease. The Lancet 337, 1304-1308 (1991).


Is Intermittent Fasting Really Worth It?


After all, 16 hours is a long time to go without eating. Here’s everything you need to know about the popular weight-loss regimen—including whether it actually works.

Chris Pratt! Hugh Jackman! Halle Berry! Kourtney Kardashian! What these celebrities have in common, other than a gratuitous exclamation point after their names, is a professed fondness for intermittent fasting, the diet craze turning the fitness world on its sweaty, well-toned head. For help determining whether you, too, should incorporate this into your 2019 resolution-related plans, we asked a few experts to explain what it is, why people love it, and whether it’s really worth the pain of forgoing on-demand snacks for the rest of the winter.



What is intermittent fasting, exactly?

Intermittent fasting, unlike many other diets, is famously flexible in that you choose the days and hours during which you think it’s best to fast. The two most common methods are the 16:8 strategy—where you eat whatever you want (within reason) for eight hours a day and then fast for the other 16—and the 5:2 method, where you eat normally five days a week and then keep your food intake to roughly 500-600 calories for the other two days. It’s kind of a simplified-calories math problem that’s supposed to prevent the yo-yo effect of weight loss and weight gain.
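
The weekly math is easy to sketch. Assuming a 2,000-calorie baseline (an illustrative figure, not one from the article), the 5:2 method cuts weekly intake by about a fifth:

```python
BASELINE_KCAL = 2000   # assumed typical daily intake
FAST_DAY_KCAL = 550    # midpoint of the 500-600 range above

normal_week = 7 * BASELINE_KCAL
five_two_week = 5 * BASELINE_KCAL + 2 * FAST_DAY_KCAL
cut = 1 - five_two_week / normal_week
print(f"{five_two_week} vs {normal_week} kcal per week: a {cut:.0%} reduction")
# 11100 vs 14000 kcal per week: a 21% reduction
```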

“There are different ways to do this diet, but the bottom line is that no matter which you choose, you’re taking in less energy, and because of that, you’re going to start using your own body stores for energy,” says Lisa Sasson, a clinical professor of nutrition at NYU. “If you don’t, you’re not going to lose weight.”

Why might I want to try it?

A recent study completed by the German Cancer Research Center concluded that intermittent fasting indeed “helps lose weight and promotes health,” and noted that the regimen proved especially adept at getting rid of fat in the liver. A USC study also found that the diet reduced participants’ risk of cancer, diabetes, heart disease, and other age-related diseases. While researchers involved cautioned that more testing is necessary, the results are at least encouraging.

Most people who swear by intermittent fasting will tell you it helps not only with losing weight but also with reducing “belly fat.” This is not a conclusion with scientific backing, but it is the sort of thing to which every six-pack enthusiast aspires.


Why might I not want to try it?

“There’s really no conclusive evidence that there’s any benefit,” Sasson says. The German Cancer Research Center study qualified its findings by noting that the positive results weren’t noticeably better than those experienced by subjects who adopted a conventional calorie-reduction diet. In other words, it works, but not notably better than the alternative. (Sasson also offered a helpful list of individuals who should not give intermittent fasting a try: pregnant women and anyone with diabetes, cancer, or an eating disorder.)

The best long-term diets, no matter what their rules entail, are the ones that are least difficult to maintain—and again, in this regard, intermittent fasting isn’t inherently superior to anything else. “Are you making changes in your behavior? Have you learned positive habits so that when you go back to not fasting, you’re going to be a healthier eater?” Sasson asks. “I know people who fast because they think, Okay, I’m going to be really bad and overdrink or overeat, and then two days a week I’m going to have a clean life, and that’s just not how it works.”

Also, for many people, a full 16 hours of fasting just isn’t realistic, says Cynthia Sass, a New York City– and L.A.-based performance nutritionist. She recommends 12 hours of overnight fasting at most and believes the 16-hour gap is especially tough on those who exercise early in the morning or late at night. “If fasting makes you feel miserable and results in intense cravings and rebound overeating, it’s not the right path for you,” she says.

So—should I try it?

As long as you’re aware that it isn’t nutritional magic, Sasson isn’t against intermittent fasting altogether. “I’ve worked with patients who need positive reinforcement to see that their weight went down to feel better, and they feel in control for the first time,” she says. “That self-efficacy, that feeling that they could do it—for some, that might be important.”

Of the two most popular methods, Sasson leans toward the 5:2 schedule as slightly more manageable, since you’re only reducing your intake twice a week. But again, that’s contingent on you being a responsible dieter on your days of lowered caloric intake, which requires an immense amount of discipline—especially when it comes to remembering to drink water. “You can go a long time without food, but only a few days without adequate hydration,” she warns.

If these extended periods without delicious food sound too painful to handle, rest assured: The best available evidence indicates that a regular ol’ diet is at least as safe and healthy and efficacious as intermittent fasting. Besides, sooner or later, a shiny new fad is bound to come along for the A-listers to fawn over, she says: “There’s going to be a new darling of the month before you know it.”

Scientists Find Fluoride Causes Hypothyroidism Leading To Depression, Weight Gain, and Worse


The tables are finally starting to turn in the public perception of water fluoridation, following the release of at least two reputable studies over the past three years documenting adverse health effects caused by the chemical.

Researchers from the University of Kent, a public research university based in the United Kingdom, conducted the latest, and arguably groundbreaking, study on the health effects potentially caused by adding fluoride to the public’s water.

After studying data obtained from nearly every medical practice in England, scientists found that fluoride may be increasing the risk for hypothyroidism, or an underactive thyroid, a condition in which the thyroid gland fails to produce enough hormones, resulting in symptoms such as fatigue, obesity and depression.

Published in the Journal of Epidemiology and Community Health, the study included the largest population ever analyzed in relation to the adverse health effects caused by water fluoridation.

Recent UK study includes the “largest population ever studied in regard to adverse effects of elevated fluoride exposure”

After collecting data from 99 percent of England’s 8,020 general medical practices, researchers found that the locations with fluoridated water were 30 percent more likely to have high levels of hypothyroidism, compared to areas with low, natural levels of the chemical in the water.

This means that up to 15,000 people could be suffering from depression, weight gain, fatigue and aching muscles, all of which could theoretically be prevented if fluoride were removed from the water, according to The Telegraph.

“Overall, there were 9 percent more cases of underactive thyroid in fluoridated places,” reports Newsweek, which also notes that 10 percent of England’s water is fluoridated compared with nearly 70 percent of America’s.

The paper also compared the fluoridated city of Birmingham with the city of Manchester, which refrains from fluoridating, and found that doctors’ offices in Birmingham were nearly twice as likely to report high levels of hypothyroidism.

The new report has some experts questioning their stance on water fluoridation.

“The study is an important one because it is large enough to detect differences of potential significance to the health of the population,” said Trevor Sheldon, a medical researcher and dean of the Hull York Medical School who has published numerous studies in this field.

Sheldon, who in the past supported fluoride, admits that the “case for general water fluoridation” is no longer clear.

New fluoride study contradicts last year’s report by Public Health England that states fluoride is “safe and effective” for improving dental health

Released in March of last year, Public Health England’s report states that “there is no evidence of harm to health in fluoridated areas,” and no differences were found between fluoridated and non-fluoridated areas in regard to rates of hip fractures, osteosarcoma (a form of bone cancer), cancers overall, Down’s syndrome births and all other recorded causes of death.

New research, however, suggests that the spike in the number of cases of hypothyroidism in areas such as the West Midlands and the North East of England is “concerning for people living in those areas.”

“The difference between the West Midlands, which fluoridates, and Manchester, which doesn’t, was particularly striking. There were nearly double the number of cases in the West Midlands,” said the study’s lead author, Stephen Peckham.

Women 15 times more likely to develop underactive thyroid

“Underactive thyroid is a particularly nasty thing to have and it can lead to other long term health problems. I do think councils need to think again about putting fluoride in the water. There are far safer ways to improve dental health.”

Hypothyroidism is particularly a cause for concern for women, as they’re 15 times more likely than men to develop the condition. Previous studies suggest that fluoride inhibits the thyroid’s ability to use iodine, an essential mineral for a healthy thyroid.

Sources:
http://www.newsweek.com
http://jech.bmj.com
http://www.telegraph.co.uk
https://www.gov.uk

Inhaled Isopropyl Alcohol Superior to Oral Ondansetron as an Antiemetic


Astonishing results from a small, well-designed study may have far-reaching implications.

Though ondansetron is viewed by many as the first-line agent for nausea in the emergency department (ED), there is evidence it doesn’t work in noncancer patients (NEJM JW Emerg Med Aug 2014 and Ann Emerg Med 2014; 64:526). An alternative agent, inhaled isopropyl alcohol, has shown promise (NEJM JW Emerg Med Feb 2016 and Ann Emerg Med 2016; 68:1).

In the current trial, 120 adult ED patients with nausea or vomiting who did not require intravenous access were randomized to inhaled isopropyl alcohol plus 4 mg oral ondansetron; inhaled isopropyl alcohol plus oral placebo; or inhaled saline plus 4 mg oral ondansetron. Isopropyl alcohol was provided in the form of a standard alcohol swab. Patients received a single dose of the oral intervention but could sniff alcohol or saline swabs repeatedly. Nausea was measured on a 100-mm visual analog scale at baseline and 30 minutes.

Mean nausea scores decreased by 30 mm in the alcohol/ondansetron group, 32 mm in the alcohol/placebo group, and 9 mm in the saline/ondansetron group. Rescue antiemetic therapy was given to 28%, 25%, and 45% of each group, respectively. Differences between alcohol and saline groups were statistically significant. Patients in the inhaled alcohol groups also had better nausea control at the time of discharge and reported higher satisfaction with nausea treatment. No adverse events occurred. The mechanism of action is currently unknown.

Comment

It is uncommon for us to assign a rating of “Practice Changing” to a small, single-center study, but these results are truly remarkable and are consistent with prior research. For patients not obviously requiring IV therapy, we should treat nausea with repeated inhalations from an isopropyl alcohol swab instead of administering any other drug. And, although this study provides no direct evidence of benefit to patients who do require IV therapy, there would seem to be little downside to trying this simple and safe intervention in that group, too.

Obesity, Weight Gain Linked to Fibrosis Progression in NAFLD


Obesity and weight gain are independently associated with an increased risk for fibrosis progression in patients with nonalcoholic fatty liver disease (NAFLD), a large cohort study has found. Weight loss was negatively associated with fibrosis progression.

“This association remained significant after adjustment for confounders including baseline BMI [body mass index], indicating that weight change per se is an independent risk factor for fibrosis progression. Higher BMI at baseline was also positively associated with APRI [aspartate aminotransferase to platelet ratio index] progression,” the researchers write.

The study, by Yejin Kim, MHS, Center for Cohort Studies, Total Healthcare Center, Kangbuk Samsung Hospital, Sungkyunkwan University School of Medicine, Seoul, South Korea, and colleagues, was published online July 15 in Clinical Gastroenterology and Hepatology.

The researchers analyzed data from 40,700 adults who received comprehensive annual or biennial examinations as part of the Kangbuk Samsung Health Study. They included participants with fatty liver, as evidenced on abdominal ultrasonography, who had undergone a health examination between 2002 and 2016 and who had had two or more follow-up visits through the end of 2016. Patients were followed for a median of 6 years.

The authors explain that they used APRI score to assess fibrosis progression because it is noninvasive and the formula includes neither BMI nor age. The study’s primary endpoint was the development of intermediate to high probability of advanced fibrosis, as assessed by APRI.
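
APRI itself is a one-line calculation from two routine lab values. Here is a minimal sketch using the standard published formula; the cutoffs in the comments are commonly cited ones, since the summary above does not give the study’s exact thresholds:

```python
def apri(ast, ast_uln, platelets):
    """AST-to-platelet ratio index.

    ast:       aspartate aminotransferase, IU/L
    ast_uln:   the lab's upper limit of normal for AST (often 40 IU/L)
    platelets: platelet count, 10^9/L
    """
    return (ast / ast_uln) / platelets * 100

score = apri(ast=80, ast_uln=40, platelets=150)
print(f"APRI = {score:.2f}")
# APRI = 1.33; cutoffs near 0.5 and 1.5 are commonly used for
# intermediate and high probability of significant fibrosis
```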

The researchers first adjusted for age and sex, and later adjusted for center, year of screening examination, smoking status, alcohol consumption, regular exercise, education level, BMI, diabetes history, cardiovascular disease history, and hypertension history. A second model also adjusted for homeostatic model assessment of insulin resistance (HOMA-IR) and high-sensitivity C-reactive protein (hsCRP).

There were 275,421.5 person-years of follow-up, during which 5454 patients with low APRI progressed to intermediate or high APRI.

When stratified into quintiles by weight change, those with the greatest weight loss (-43.4 to -2.3 kg) had a significantly reduced risk for progression (hazard ratio [HR], 0.68; 95% confidence interval [CI], 0.62 – 0.74), compared with those whose weight was stable. Similarly, patients with small degrees of weight loss (-2.2 to -0.6 kg) had a reduced risk for progression (HR, 0.86; 95% CI, 0.78 – 0.94).

By contrast, any weight gain appeared to increase progression risk. Specifically, those who gained a smaller amount of weight (0.7 to 2.1 kg) showed a 17% risk increase (HR, 1.17; 95% CI, 1.07 – 1.28), and those who gained more (2.2 to 26.5 kg) had a 71% increased risk (HR, 1.71; 95% CI, 1.58 – 1.85).

These associations were not mediated by inflammation or insulin resistance after adjustment for HOMA-IR and hsCRP.

Compared with those whose baseline BMI was from 18.5 to 22.9 kg/m2, the HRs for APRI progression were as follows: 1.67 (95% CI, 0.74 – 3.73) for BMI of <18.5 kg/m2; 1.13 (95% CI, 1.02 – 1.26) for BMI of 23 – 24.9 kg/m2; 1.41 (95% CI, 1.28 – 1.55) for BMI of 25 – 29.9 kg/m2; and 2.09 (95% CI, 1.86 – 2.36) for BMI of ≥ 30. All values were determined after adjusting for age, sex, health center, year of screening examination, smoking status, alcohol consumption, exercise, education, diabetes history, cardiovascular disease history, and hypertension history. These associations remained significant after adjustments for HOMA-IR and hsCRP.

“When the impact of weight change on APRI worsening was compared with that of other metabolic factors, increasing quintiles of weight change, triglyceride, uric acid, and HOMA-IR and decreasing quintiles of high-density lipoprotein cholesterol were associated with increased risk of APRI worsening in a dose-response manner (all P for trend <.001), with weight change showing the greatest magnitude of association among the metabolic factors evaluated,” the authors explain.

The associations of both weight change and BMI with APRI progression were still seen in patients with NAFLD who had no history of diabetes or cardiovascular disease.

“Although the mechanisms underlying the association between excessive adiposity or fat gain and the fibrosis progression are not yet fully understood, insulin resistance and inflammation are thought to be involved,” the researchers write. “However, after adjustment for HOMA-IR and hsCRP, the association between obesity, weight gain, and fibrosis progression remained significant. Multiple other factors, including oxidative stress and lipotoxicity, have also been implicated in fibrosis progression.”

Study limitations include the use of ultrasonography to diagnose NAFLD. Although liver biopsy is considered the gold standard, it is not feasible in a large, low-risk population, and abdominal ultrasound is acceptable, the authors say. Also, although APRI “has demonstrated a reasonable utility as a noninvasive method for the prediction of histologically confirmed advanced fibrosis…there are no currently available longitudinal data to support the use of worsening noninvasive fibrosis markers as an indicator of histologic progression of fibrosis stage over time.”

The study was conducted among fairly healthy young and middle-aged Koreans and may not be generalizable to those of other ages, or to groups in which comorbidities are more prevalent, or to other racial or ethnic groups.

“In this large cohort study of young and middle-aged adults with NAFLD, obesity and weight gain were significantly and independently associated with an increased risk of developing fibrosis progression. Strategies for maintaining a healthy weight and preventing weight gain may help reduce fibrosis progression and its associated consequences in individuals with NAFLD,” the researchers conclude.

Multiple Drugs Advance for Fatty Liver Disease


Although numerous drugs for nonalcoholic steatohepatitis (NASH) have shown positive results in phase 2 clinical trials, the cure might lie in combinations of drugs with different mechanisms, experts say.

In fact, curing NASH might turn out to be as challenging as curing type 2 diabetes, said Sidney Barritt IV, MD, from the University of North Carolina at Chapel Hill.

Unlike hepatitis C, which can now be treated with blockbuster antiviral drugs that have proven remarkably effective, NASH has no effective drug treatment at all, which makes it a far more complicated target.

With the obesity epidemic, NASH is increasingly common, and results from phase 2 trials attracted throngs of conference-goers with questions here at The Liver Meeting 2018.

Some of the results look encouraging, Barritt told Medscape Medical News. “I think they’re clinically significant.”

Phase 2 results have been positive for MGL-3196 (Madrigal Pharmaceuticals), GS-9674 (Gilead Sciences), NGM282 (NGM Bio), arachidyl amido cholanoic acid (Aramchol, Galmed Pharmaceuticals), tropifexor (Novartis), and VK2809 (Viking Therapeutics).

All the drugs reduced liver fat measured with MRI-derived proton density fat fraction (PDFF). The drugs also improved various other measures of the disease, such as NASH Activity Score, fibrosis, and alanine aminotransferase.

These NASH agents add to the four already in phase 3 trials: obeticholic acid (Ocaliva, Intercept Pharmaceuticals), elafibranor (Genfit), selonsertib (Gilead), and cenicriviroc (Tobira Therapeutics).

But no clear winner has emerged from these studies. It’s hard to know how well the biomarkers measured in trials will protect patients from sickness and death, Barritt explained. NASH destroys the liver gradually; most of its victims die from the heart disease or cancer that results from this damage, which takes decades.


“The real test is going to be real-world efficacy,” he said. “Are the drugs going to have the impact that we expect them to have based on the clinical trial data?”

The development of NASH is mostly related to lifestyle factors, such as overeating and lack of exercise, so there is no obvious target for a drug as there is with a virus. As a result, drug makers have focused on various aspects of inflammation, fat accumulation, and scar formation.

Like obeticholic acid, GS-9674 and tropifexor are farnesoid X receptor (FXR) agonists, which help regulate bile acids, carbohydrate and lipid metabolism, and insulin sensitivity. They also play a role in growth and regeneration after liver injury.

MGL-3196 and VK2809 are thyroid hormone-receptor beta agonists designed to mediate the effects of the thyroid hormone on the liver, on low-density-lipoprotein cholesterol, on triglycerides, on fatty liver, and on insulin sensitivity.

Arachidyl amido cholanoic acid inhibits stearoyl CoA desaturase. It has a “dual mode of action on liver fibrosis, downregulation of steatosis, and a direct effect on hepatic stellate cells, the human collagen-producing cells,” according to Galmed Pharmaceuticals.

The potential for all these approaches was evident in the phase 2 results presented. But the most effective treatments might be a combination of drugs that act on different pathways, said Keyur Patel, BM, from Duke University in Durham, North Carolina, who is a GS-9674 investigator.

In a separate phase 2 trial now underway, Gilead is testing the combination of GS-9674 plus selonsertib, a small-molecule inhibitor of apoptosis signal-regulating kinase 1 (ASK1), plus GS-0976, an acetyl-CoA carboxylase inhibitor, Patel told Medscape Medical News.

A Strong Placebo Effect

Combining the drugs makes sense because the drugs now in phase 3 trials have not shown the potential to cure NASH on their own, according to Jerry Colca, PhD, chief scientific officer of Cirius Therapeutics in Kalamazoo, Michigan. “They have shown minimal effects in phase 2b,” he said.

One of the challenges that researchers have is the strong placebo effect, Colca told Medscape Medical News. Patients in placebo groups typically diet and exercise, which addresses the underlying cause of NASH, and drugs don’t always show much improvement over that.

Cirius is currently conducting a phase 2b study of MSDC-0602K, an insulin sensitizer “designed to selectively modulate the mitochondrial pyruvate carrier (MPC), which at the cellular level mediates the effects of overnutrition,” the company reports.

The effect of MSDC-0602K on NASH might be broader than that of competing drugs because it acts further upstream, Colca noted.

In the phase 2 studies presented, the drugs appear to be well tolerated, although some adverse events, such as pruritus and diarrhea, were reported.

Many of the questions about these drugs might not be addressed until they are already on the market.

“What we don’t know from these trials is what the expected duration of therapy will be,” Barritt said. “Are they drugs for 1 to 3 years to reset the clock while the patient addresses diet and exercise? Or are they going to be lifetime medications?”

The Impact of Bariatric Surgery on Cancer Incidence


What is the impact of bariatric surgery on cancer incidence? To answer this question, the authors of a study published in the British Journal of Surgery[1] compared cancer frequency in 8794 obese patients in England (average age, 42 years) who underwent various types of obesity surgery with that in an equal number of nonoperated patients.

During a median follow-up period of 55 months, the risk for hormone-related cancers was significantly reduced in the operated group compared with the nonoperated group (odds ratio [OR], 0.23; 95% confidence interval [CI], 0.18-0.30). In contrast, gastric bypass (but not banding or sleeve gastrectomy) was associated with an increased risk for colorectal cancer (OR, 2.63; 95% CI, 1.17-5.95).

Gastric Bypass Increased the Risk for Colorectal Cancer

This report provides valuable information about cancer risk following bariatric surgery, one of the most common procedures performed by general surgeons. The risk reduction for hormonally dependent cancers was seen in both males (prostate) and females (breast, endometrium), and the benefit became more pronounced with increasing duration of follow-up. On the basis of a total of 16 patients with colorectal cancer, gastric bypass resulted in a more than twofold increased risk.

If confirmed in larger studies with longer follow-up, the findings in this report suggest that the age at which colorectal cancer screening begins should be lowered for patients who have undergone bariatric surgery. One weakness of the study is that the dataset listed obesity only as a comorbidity and did not include the measurements needed to calculate body mass index. Nevertheless, the results seem biologically plausible and are consistent with those from other research reports.

Type 2 Diabetes Could Be a Cause of Erectile Dysfunction


Type 2 diabetes may be a causal factor in the development of erectile dysfunction (ED), with insulin resistance a likely mediating pathway, results of a large-scale genomic analysis suggest. The data also uncovered a genetic locus linked to ED.

Jonas Bovijn, MD, DPhil, Big Data Institute at the University of Oxford, United Kingdom, and colleagues gathered data on more than 220,000 men across three cohorts, of whom more than 6000 had ED.

The researchers initially showed that a region on chromosome 6 is linked to the development of ED. The location suggested that the condition is associated with dysregulation of the hypothalamus.

Next, they performed a Mendelian randomization analysis, which uses gene variants known to be associated with a risk factor (here, cardiometabolic traits) to test whether that risk factor causes an outcome (in this case, ED). Because these variants are fixed at conception, the approach is far less prone to confounding and reverse causation than conventional observational analyses.

The research, published online December 20 in the American Journal of Human Genetics, showed that a genetic predisposition to type 2 diabetes increased the risk for ED. The risk was driven primarily by susceptibility to insulin resistance.

Bovijn said in a release: “We know that there is observational evidence linking erectile dysfunction and type 2 diabetes, but until now there has not been definitive evidence to show that predisposition to type 2 diabetes causes erectile dysfunction.”

“Further research is needed to explore the extent to which drugs used in the treatment of type 2 diabetes might be repurposed for the treatment of ED,” the team notes.

Co–senior author Anna Murray, PhD, University of Exeter Medical School, United Kingdom, said in the release that “until now little has been known” about the cause of ED.

Previous studies have suggested there is a genetic basis for ED. The new study goes further by demonstrating that a genetic predisposition to type 2 diabetes is linked to ED, according to Murray.

“That may mean that if people can reduce their risk of diabetes through healthier lifestyles, they may also avoid developing erectile dysfunction,” she said.

Michael Holmes, MD, PhD, of the Nuffield Department of Population Health at the University of Oxford, who was one of the senior authors, agreed.

“Our finding is important, as diabetes is preventable, and indeed one can now achieve ‘remission’ from diabetes with weight loss, as illustrated in recent clinical trials.

“This goes beyond finding a genetic link to erectile dysfunction to a message that is of widespread relevance to the general public, especially considering the burgeoning prevalence of diabetes,” Holmes said.

Large Studies Key

Although the prevalence of ED is known to increase with age, rising to 20% to 40% among men aged 60 to 69 years, the genetic architecture of the condition remains poorly understood. This is at least in part due to a lack of well-powered studies.

The researchers therefore conducted a genome-wide association study (GWAS) using data on 199,362 individuals from the UK Biobank cohort and 16,787 people from the Estonian Genome Center of the University of Tartu (EGCUT) cohort, both of which are population based.

In addition, they included information on 7666 participants in the hospital-recruited Partners HealthCare Biobank (PHB) cohort.

The prevalence of ED, which was determined on the basis of self- or physician-reported ED, the use of oral ED medication, or a history of ED surgical intervention, was 1.53% in the UK Biobank, 7.04% in EGCUT, and 25.35% in PHB.

The researchers believe that the difference in prevalence rates between the cohorts may relate to the older average age for men in PHB, at 65 years, vs 59 years in the UK Biobank and 42 in EGCUT. In addition, the prevalence in the UK Biobank cohort may have been affected by a “healthy volunteer” selection bias and a lack of primary care data.

GWAS on the UK Biobank data indicated that there was a single genome-wide significant locus at 6q16.3 between the MCHR2 and SIM1 genes, with rs57989773 the lead variant.

Pooled meta-analysis of the combined cohorts indicated that rs57989773 was associated with ED at an odds ratio of 1.20 per C-allele (P = 5.71 × 10⁻¹⁴).

Synthesizing previous research in both human and rodent models on SIM1, which is highly expressed in the hypothalamus, the team found that rs57989773 is associated with syncope, orthostatic hypotension, and urinary incontinence.

Moreover, the common risk variant for ED at 6q16.3 is linked to blood pressure and adiposity, as well as male sexual behavior in mice.

The researchers, therefore, suggest that a potential mechanism for the effect of the MCHR2-SIM1 locus on ED could be the hypothalamic dysregulation of SIM1.

The team also performed Mendelian randomization analyses to examine the potential causal role of cardiometabolic traits in ED risk.

Factors included type 2 diabetes, insulin resistance, systolic blood pressure (SBP), low-density lipoprotein (LDL) cholesterol levels, smoking heaviness, alcohol consumption, body mass index, coronary heart disease, and educational attainment.

The analysis revealed that type 2 diabetes was causally implicated in ED, with the risk for ED increased 1.11-fold with each 1-log higher genetic risk for type 2 diabetes (P = 3.5 × 10⁻⁴).
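
To make the Mendelian randomization logic concrete, here is a minimal sketch of its simplest estimator, the Wald ratio, which divides a variant’s association with the outcome by its association with the exposure. The coefficients below are hypothetical, picked only so the output echoes the reported 1.11-fold figure; the actual study pooled many variants using more sophisticated MR methods.

```python
import math

# Hypothetical per-variant associations on the log-odds scale, as would
# come from two separate GWAS; these are not the study's actual values.
beta_variant_t2d = 0.30  # variant -> type 2 diabetes (the exposure)
beta_variant_ed = 0.03   # same variant -> erectile dysfunction (the outcome)

# Wald ratio: causal effect of the exposure on the outcome, expressed
# per 1-log-odds ("1-log") higher genetic risk of type 2 diabetes.
wald_ratio = beta_variant_ed / beta_variant_t2d

print(f"OR for ED per 1-log higher T2D risk: {math.exp(wald_ratio):.2f}")
# -> OR for ED per 1-log higher T2D risk: 1.11
```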

Insulin resistance was found to be a likely mediating pathway for the relationship, with an odds ratio for ED of 1.36 per 1 SD genetic increase in insulin resistance (P = .042).

SBP also had a causal effect on ED risk, at an odds ratio of 2.34 per 1 SD increase in SBP (P = .007).

LDL cholesterol was found to have a minor impact on the risk for ED, at an odds ratio of 1.07 per 1 SD increase in levels (P = .113). There was no association between ED and either smoking heaviness or alcohol use.

Source: Medscape.com

Cirrhosis a ‘Silent Epidemic’ in Young Adults, Women


Rates of cirrhosis are increasing, particularly among young adults and women, and an epidemic of non-alcoholic fatty liver disease (NAFLD) is one possible reason, say researchers from Canada.

Traditionally, cirrhosis has been considered a disease of older men, but the face of cirrhosis is changing, Dr. Jennifer Flemming from Queen’s University, in Kingston, Canada, told Reuters Health by phone.

“This is likely either related to alcohol or non-alcohol-related fatty liver disease,” she explained. Non-alcoholic fatty liver disease has been on the rise over the past two decades.

“Alcohol use patterns in young individuals and women have also changed over the past several decades such that women are drinking pretty much the same amount as men and women are predisposed to alcohol-related liver disease at much lower levels of alcohol than are men. My thought is that women are kind of catching up to the same risk factors that men have had, in addition to now having this epidemic of non-alcohol-related fatty liver disease,” said Dr. Flemming.

She and her colleagues did a retrospective population-based study looking at cirrhosis incidence by age group. They identified nearly 166,000 people in Ontario with cirrhosis from 1997 to 2016.

New cases of cirrhosis nearly doubled in the province during the study period, from 6,318 new cases diagnosed in 1997 (3,979 males/2,339 females) to 12,047 in 2016 (7,061 males/4,986 females).

The risk of cirrhosis is 116% higher for millennials who were born in 1990 than for baby boomers born in 1951, the researchers report in The Lancet Gastroenterology & Hepatology, online December 17. For women, the risk is even higher. A woman born in 1990 was 160% more likely to be diagnosed with cirrhosis than a woman born in 1951.

Strategies to increase awareness of this “silent epidemic in young adults and women are needed,” the researchers note in their paper.

“Future studies able to define the cause and natural history of cirrhosis in these groups are essential to develop strategies that could reverse these trends for future generations,” they conclude.

Funding for the study was provided by the Southeastern Ontario Academic Medical Association and the American Association for the Study of Liver Disease (AASLD). Dr. Flemming has received grants from both organizations.

Obesity to Blame for Almost 1 in 20 Cancer Cases Globally


Excess body weight is responsible for about 4 percent of all cancer cases worldwide and an even larger proportion of malignancies diagnosed in developing countries, a new study suggests.

As of 2012, excess body weight accounted for approximately 544,300 cancers diagnosed annually around the world, researchers report in CA: A Cancer Journal for Clinicians, December 12. While excess weight was behind just 1 percent of cancer cases in low-income countries, it contributed 7 to 8 percent of cancers diagnosed in some high-income Western countries and in Middle Eastern and North African nations.

“Not many people know about excess body weight and its link to cancer,” said lead study author Hyuna Sung of the American Cancer Society in Atlanta.

“Trying to achieve healthy weight and maintaining it is important and may reduce the risk of cancer,” Sung said by email.

But the proportion of people who are overweight and obese has been increasing worldwide since the 1970s, the researchers note. As of 2016, 40 percent of adults and 18 percent of school-age children were overweight or obese, for a total of almost 2 billion adults and 340 million kids worldwide.

While the proportion of people with excess body weight has increased rapidly in most countries and across all population groups, the surge has been most pronounced in some low- and middle-income countries that have adopted a Western lifestyle with too little exercise and too many unhealthy foods, the study team writes.

“The simultaneous rise in excess body weight in almost all countries is thought to be driven largely by changes in the global food system, which promotes energy-dense, nutrient-poor foods, alongside reduced opportunities for physical activity,” Sung said.

Overweight and obesity have been definitively linked to an increased risk of 13 cancers, affecting the breast, colon and rectum, uterus, esophagus, gallbladder, kidney, liver, ovary, pancreas, stomach, thyroid, brain and spinal cord, and blood cells.

More recently, some research has also tied excess weight to risk for prostate tumors as well as cancers of the mouth and throat.

National wealth is the most apparent systematic driver of population obesity, the study authors note.

The economic transition to a wealthier economy brings with it an environment that precipitates obesity; each $10,000 increase in average per capita national income is associated with a 0.4-point increase in body mass index among adults, the study authors note.

However, obesity is uncommon in some high-income Asia-Pacific countries, likely a result of healthier diets featuring lean fish and vegetables, lower overall calorie intake, and active transportation and walking as part of daily life, the authors point out.

Still, the report offers fresh evidence of the need for policies that promote healthy eating and exercise habits as a way to battle obesity and reduce the global burden of cancer, the authors argue.

Dietary interventions might include eliminating trans-fats through the development of legislation to ban their use in the food chain; reducing sugar consumption through effective taxation on sugar-sweetened beverages; implementing subsidies to increase the intake of fruits and vegetables; and limiting portion and package size to reduce energy intake and the risk of excess body weight.

Activity interventions might include encouraging urban planning that promotes high-density housing with sidewalks, accessible public transportation and widespread availability of open spaces, parks and places to walk and cycle.

“Based on cancer alone, this report makes the case for allotting significant resources to addressing the global obesity epidemic, and those efforts have to address multiple factors that are creating ‘obesogenic’ societies,” said Dr. Graham A. Colditz of Washington University School of Medicine in St. Louis.

“The actions of individuals are important when it comes to weight – eating a healthy diet and exercising regularly, for example,” Colditz, who wasn’t involved in the report, said by email. “But unless those actions are supported by policies, infrastructure, schools, and employers, they’re less likely to take hold and be broadly successful over time.”

Source: Medscape
