Vitamin pills are a waste of money, offer no health benefits and could be harmful – study


Evidence from the study suggested that ‘supplementing the diet of well-nourished adults…has no clear benefit and might even be harmful’

Vitamin pills are a waste of money, usually offer no health benefits and could even be harmful, a group of leading scientists has said.

A study of nearly 500,000 people, carried out by academics from the University of Warwick and the Johns Hopkins School of Medicine in Baltimore, USA, has delivered a damning verdict on the claims made by the vitamin supplement industry.

Evidence from the study suggested that “supplementing the diet of well-nourished adults…has no clear benefit and might even be harmful”, despite one in three Britons taking vitamins or mineral pills.

According to The Times, scientists involved in the study, which was published in the Annals of Internal Medicine, concluded that companies selling supplements were fuelling false health anxieties in order to sell unnecessary cures. The industry in the UK is thought to be worth more than £650 million annually.

Researchers declared "case closed" on vitamin and mineral pills, basing their conclusion on the study of half a million people along with three separate research papers.

One of the research papers involved the retrospective study of 24 previous trials. In total 450,000 people were involved in the trials and the paper concluded that there was no beneficial effect on mortality from taking vitamins.

Another examined 6,000 elderly men and found no improvement on cognitive decline after 12 years of taking supplements, while a third saw no advantage of supplements among 1,700 men and women with heart problems over an average study of five years.

The experts said most supplements should be avoided as their use is not justified, writing: “These vitamins should not be used for chronic disease prevention. Enough is enough.”

The scientists argued that the average Western diet is sufficient to provide the necessary vitamins the body needs.

Edgar Miller, of the Johns Hopkins School of Medicine, said: “There are some that advocate we have many nutritional deficiencies in our diet. The truth is though we are in general overfed, our diet is completely adequate.”

He added: “These companies are marketing products to us based on perceptions of deficiencies. They make us think our diet is unhealthy, and that they can help us make up for these deficiencies and stop chronic illnesses.

“The group that needs these is very small. It’s not the general population.”

Dr Miller continued: “There’s something for everything: preventing joint pains, stopping heart disease. If you’re going to spend your money on something every month, is this really the best option?”

The NHS advised recently that, other than folic acid for women trying to conceive or in early pregnancy, and vitamin D for the elderly and for children under five, supplementary vitamins are surplus to the nutrients already gained through diet, The Times said.

The Health Food Manufacturers’ Association said vitamin supplements provided people with “nutritional insurance”.

In July 2011 the Advertising Standards Authority criticised Vitabiotics Ltd for an advert headlined "Advanced Nutrients For The Brain".

They ruled that the implied claim that "recent research had shown that B vitamins could help maintain brain function and performance" was not substantiated and was "misleading".

Hopes Raised for Early Pancreatic Cancer Detection.


Scientists at the Johns Hopkins University School of Medicine say a simple blood test based on detection of tiny epigenetic alterations may reveal the earliest signs of pancreatic cancer. The findings of their research, if confirmed, they add, could be an important step in reducing mortality from the cancer, which has an overall five-year survival rate of less than 5% and has seen few improvements in survival over the last three decades.

“While far from perfect, we think we have found an early detection marker for pancreatic cancer that may allow us to locate and attack the disease at a much earlier stage than we usually do,” explains Nita Ahuja, M.D., an associate professor of surgery, oncology and urology at the Johns Hopkins University School of Medicine and leader of the study (“Novel Methylation Biomarker Panel for the Early Detection of Pancreatic Cancer”) described online this month in the journal Clinical Cancer Research.

For their study, Dr. Ahuja and her colleagues focused on two genes.

“We used a nanoparticle-enabled MOB (Methylation On Beads) technology to detect early-stage pancreatic cancers by analyzing DNA methylation in patient serum,” wrote the investigators in their journal article. “We identified two novel genes, BNC1 (92%) and ADAMTS1 (68%), that showed a high frequency of methylation in pancreas cancers (n=143), up to 100% in PanIN-3 and 97% in Stage I invasive cancers.”

Together, BNC1 and ADAMTS1 were detectable in 81% of blood samples from 42 people with early-stage pancreatic cancer, but not in patients without the disease or in patients with a history of pancreatitis, a risk factor for pancreatic cancer.

Dr. Ahuja’s team found that, in pancreatic cancer cells, it appears that chemical alterations to BNC1 and ADAMTS1, i.e., epigenetic modifications that alter the way the genes function without changing the underlying DNA sequence, silence the genes and prevent them from making their protein product, the role of which is not well understood. These alterations are caused by the addition of a methyl group to the DNA.

Using MOB, the researchers were able to single out in the blood even the smallest strands of DNA of those two genes with their added methyl groups. The technique uses nanoparticle magnets to latch on to the few molecules being shed by the tumors, which are enough to signal the presence of pancreatic cancer in the body, the researchers found.

Dr. Ahuja says the practical value of any blood test for cancer markers depends critically on its sensitivity (meaning the proportion of tumors it detects) and its specificity (meaning how many of the positive results are false alarms). The specificity of this new pair of markers is 85%, meaning 15% would be false alarms. She hopes further research will help refine the test, possibly by adding another gene or two, in order to go over 90% in both sensitivity and specificity.
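The sensitivity and specificity figures above can be made concrete with a small worked example. This is an illustrative sketch only; the counts are back-calculated from the percentages reported here (34 of 42 cancers detected, 85% specificity), not taken from the study's own data tables.

```python
# Illustrative sketch: how sensitivity and specificity are computed
# from test-outcome counts. Counts are back-calculated from the
# article's percentages, not from the study's published tables.

def sensitivity(true_pos, false_neg):
    """Proportion of actual cancers the test detects."""
    return true_pos / (true_pos + false_neg)

def specificity(true_neg, false_pos):
    """Proportion of cancer-free samples correctly cleared."""
    return true_neg / (true_neg + false_pos)

# 34 of 42 early-stage cancers flagged -> ~81% sensitivity;
# the hypothetical 85-of-100 split mirrors the reported 85% specificity.
print(round(sensitivity(34, 8), 2))   # 0.81
print(round(specificity(85, 15), 2))  # 0.85
```

The trade-off Dr. Ahuja describes is visible here: raising specificity (fewer false alarms) typically costs sensitivity, which is why the team hopes adding another marker gene will push both above 90%.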

Football-shaped particles bolster the body’s defense against cancer


Researchers at Johns Hopkins have succeeded in making flattened, football-shaped artificial particles that impersonate immune cells. These football-shaped particles seem to be better than the typical basketball-shaped particles at teaching immune cells to recognize and destroy cancer cells in mice.

“The shape of the particle really seems to matter because the stretched, ellipsoidal particles we made performed much better than spherical ones in activating the immune system and reducing the animals’ tumors,” according to Jordan Green, Ph.D., assistant professor of biomedical engineering at the Johns Hopkins University School of Medicine and a collaborator on this work. A summary of the team’s results was published online in the journal Biomaterials on Oct. 5.

According to Green, one of the greatest challenges in the field of cancer medicine is tracking down and killing cancer cells once they have metastasized and escaped from a tumor mass. One strategy has been to create tiny artificial capsules that stealthily carry toxic drugs throughout the body so that they can reach the escaped tumor cells. “Unfortunately, traditional chemotherapy drugs do not know healthy cells from tumor cells, but immune system cells recognize this difference. We wanted to enhance the natural ability of T-cells to find and attack tumor cells,” says Jonathan Schneck, M.D., Ph.D., professor of pathology, medicine and oncology.

In their experiments, Schneck and Green’s interdisciplinary team exploited the well-known immune system interaction between antigen-presenting cells (APC) and T-cells. APCs “swallow” invaders and then display on their surfaces chewed-up protein pieces from the invaders along with molecular “danger signals.” When circulating T-cells interact with APCs, they learn that those proteins come from an enemy, so that if the T-cells see those proteins again, they divide rapidly to create an army that attacks and kills the invaders.

According to Schneck, to enhance this natural process, several laboratories, including his own, have made various types of “artificial APCs”—tiny inanimate spheres “decorated” with pieces of tumor proteins and danger signals. These are then often used in immunotherapy techniques in which immune cells are collected from a cancer patient and mixed with the artificial APCs. When they interact with the patient’s T-cells, the T-cells are activated, learn to recognize the tumor cell proteins and multiply over the course of several days. The immune cells can then be transferred back into the patient to seek out and kill tumor cells.

The cell-based technique has had only limited success and involves risks due to growing the cells outside the body, Green says. These downsides sparked interest in the team to improve the technique by making biodegradable artificial APCs that could be administered directly into a potential patient and that would better mimic the interactions of natural APCs with T-cells. “When immune cells in the body come in contact, they’re not doing so like two billiard balls that just touch ever so slightly,” explains Green. “Contact between two cells involves a significant overlapping surface area. We thought that if we could flatten the particles, they might mimic this interaction better than spheres and activate the T-cells more effectively.”

To flatten the particles, two M.D./Ph.D. students, Joel Sunshine and Karlo Perica, figured out how to embed a regular batch of spherical particles in a thin layer of a glue-like compound. When they heated the resulting sheet of particles, it stretched like taffy, turning the round spheres into tiny football shapes. Once cooled, the film could be dissolved to free each of the microscopic particles that could then be outfitted with the tumor proteins and danger signals. When they compared typical spherical and football-shaped particles—both coated with tumor proteins and danger signals at equivalent densities and mixed with T-cells in the laboratory—the T-cells multiplied many more times in response to the stretched particles than to spherical ones. In fact, by stretching the original spheres to varying degrees, they found that, up to a point, they could increase the multiplication of the T-cells just by lengthening the “footballs.”

When the particles were injected into mice with skin cancer, the T-cells that interacted with the elongated artificial APCs, versus spherical ones, were also more successful at killing tumor cells. Schneck says that treatment with round particles reduced tumor growth by half, while elongated particles reduced it by three-quarters. Even better, he says, over the course of a one-month trial, 25 percent of the mice with skin cancer being treated with elongated particles survived, while none of the mice in the other treatment groups did.

“This adds an entirely new dimension to studying cellular interactions and developing new artificial APCs. Now that we know that shape matters, scientists and engineers can add this parameter to their studies,” says Green. Schneck notes, “This project is a great example of how interdisciplinary science by two different groups, in this case one from biomedical engineering and another from pathology, can change our entire approach to tackling a problem. We’re now continuing our work together to tweak other characteristics of the artificial APCs so that we can optimize their ability to activate T-cells inside the body.”

Source: Johns Hopkins University School of Medicine

Epilepsy Drug Warnings May Slip Through Cracks.


One-fifth of American neurologists are unaware of serious safety risks associated with epilepsy drugs and are potentially risking the health of patients who could be treated with safer medications, a new study reveals.

The 505 neurologists who took part in the survey between March and July 2012 were asked if they knew about several epilepsy drugs’ safety risks recently identified by the U.S. Food and Drug Administration.

These risks included increased danger of suicidal thoughts or behaviors linked with some newer drugs, a high risk for birth defects and mental impairment in children of mothers taking divalproex (brand name Depakote), and the likelihood of serious hypersensitivity reactions in some Asian patients treated with carbamazepine (Tegretol).

One in five of the neurologists said they did not know about any of these risks. Neurologists who treat 200 or more epilepsy patients per year were most likely to know all the risks, according to the study, which was published online recently in the journal Epilepsy.

Although this study focused on epilepsy drugs, the findings suggest that the FDA needs to find better ways to inform doctors about newly discovered drug safety risks, said the researchers from Johns Hopkins University School of Medicine. Their results show that warnings about these risks are not getting through to doctors making important prescribing decisions.

There is no single place for neurologists to find updated drug risk information, said study leader Dr. Gregory Krauss, a professor of neurology. A few get emails from the FDA, while others get the information from neurology societies, continuing medical education courses or journal articles.

“There is poor communication from the FDA to specialists, and there’s some risk to patients because of this,” Krauss said in a Johns Hopkins news release.

“Unless it’s a major change requiring the FDA to issue a black box warning on a product, important information appears to be slipping through the cracks,” he said. “We need a more systematic and comprehensive method so that doctors receive updated safety warnings in a format that guarantees they will see and digest what they need to protect patients.”

Source: Drugs.com

Cocoa, Even With Few Flavonoids, Boosts Cognition.


Drinking cocoa, whether rich in flavonoids or not, appears to boost neurovascular coupling (NVC), the relationship between neuronal activity and blood flow in the brain.

A new study shows not only that drinking flavonoid-rich or flavonoid-poor cocoa improves NVC but also that higher NVC is associated with better cognitive performance and greater cerebral white matter structural integrity in elderly patients with vascular risk factors.

As researchers search for ways to detect dementia at the earliest possible stage, the study results could pave the way for using NVC as a biomarker for vascular function in those at high risk for dementia, said lead author Farzaneh A. Sorond, MD, PhD, Department of Neurology, Stroke Division, Brigham and Women’s Hospital, Boston, Massachusetts.

“Our study shows that NVC is modifiable and can be enhanced with cocoa consumption,” said Dr. Sorond.

Tight Correlation

The double-blind proof-of-concept study included 60 community-dwelling participants, mean age 72.9 years. About 90% of the participants were hypertensive, but with well-controlled blood pressure, and half had diabetes mellitus type 2 with reasonably good control. Three quarters were overweight or obese.

Participants were randomly assigned to 2 cups a day of cocoa rich in flavonoids (609 mg per serving) or cocoa with little flavonoids (13 mg per serving). Diets were adjusted to incorporate the cocoa, each cup of which contained 100 calories. Participants were also asked to abstain from eating chocolate.

Researchers measured cerebral blood flow in these participants using transcranial Doppler ultrasonography. Among other things, they documented changes in the middle cerebral artery and blood flow velocity at rest and in response to cognitive tasks (NVC).
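For readers unfamiliar with how such Doppler measurements translate into an NVC figure, here is a hedged sketch: NVC is commonly expressed as the percent change in mean blood-flow velocity from rest to a cognitive task, and the 5% cutoff below matches the impaired-coupling threshold the article cites. The velocity values themselves are invented for illustration.

```python
# Hedged sketch (not the study's analysis code): NVC expressed as the
# percent change in mean blood-flow velocity from rest to task, with
# the <5% impaired-coupling threshold mentioned in the article.

def nvc_percent_change(rest_velocity_cm_s, task_velocity_cm_s):
    """Percent increase in flow velocity during a cognitive task."""
    return 100.0 * (task_velocity_cm_s - rest_velocity_cm_s) / rest_velocity_cm_s

def coupling_status(nvc_pct, threshold=5.0):
    """Classify coupling as intact (>= threshold) or impaired."""
    return "intact" if nvc_pct >= threshold else "impaired"

# Invented example velocities: 55 cm/s at rest, 59.4 cm/s during a task.
pct = nvc_percent_change(55.0, 59.4)
print(round(pct, 1), coupling_status(pct))  # 8.0 intact
```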

The study showed that NVC was tightly correlated with cognition; scores on the Trail Making Test B, a test of executive function, were significantly better in those with intact NVC (89 seconds vs 167 seconds; P = .002). Participants with intact NVC also had significantly better performance on the 2-Back Task, a test for both attention and memory (82% vs 75%; P = .02).

“The higher you increase your blood flow during a cognitive task, the better your cognitive performance,” commented Dr. Sorond, adding that this is something that has never been shown before.

NVC was also correlated with cerebral white matter structural integrity. Higher NVC was associated with overall less white matter macro- and micro-structural damage. In general, those with intact NVC had a greater volume of normal white matter and a smaller volume of white matter hyperintensities (WMH), higher fractional anisotropy, and lower mean diffusivity in both the normal white matter and WMH.

Therapeutic Target

These results suggest that NVC could be an important therapeutic target. But before NVC can be considered a biomarker, it has to be shown to be changeable, and the clinical importance of the modification must be shown.

To that end, the study authors opted to use cocoa. They could have chosen many other potential modifiers but chose cocoa because the literature has shown the beneficial effects of cocoa on brain health and also because it’s something that many people enjoy, said Dr. Sorond.

The study found that blood pressure, blood flow, and change in NVC were not significantly different between the 2 cocoa groups. In the combined cocoa groups, 30-day blood pressures were not significantly different from baseline (P > .5).

In contrast, response to cocoa differed significantly depending on NVC status. Cocoa had a significant effect on NVC in those with impaired (<5%) coupling at baseline. Of those with impaired NVC, 89% responded to 30 days of cocoa consumption and increased NVC compared with only 36% of those with intact NVC (P = .0002). In those with impaired baseline coupling, cocoa consumption was associated with an 8.3% (P < .0001) increase in NVC at 30 days.

The effect of cocoa consumption on Trail B scores was also significantly dependent on NVC status.

The authors were surprised at the lack of effect of flavonoids because previous research had indicated a dose-response with respect to cognitive performance. It could be something other than flavonoids in the cocoa, possibly caffeine, that improves NVC, or it could be that the 13 mg in the low-flavonoid cocoa group was enough to have an effect.

“I think there are effects of flavonol on brain blood flow no matter how low it is,” said Dr. Sorond, adding that perhaps only a tiny amount is needed to activate an enzyme or some other trigger.

It’s important to identify the component or mechanism, whatever it is, because just telling patients to drink cocoa could be risky, said Dr. Sorond. “Patients with diabetes or hypertension really don’t need the extra sugar, extra calories, and extra fat that come with it.”

Dr. Sorond thinks NVC could be measured in high-risk patients seen in the clinic. “I think this could be an easy, in-clinic quick test of vascular brain function that pertains to cognitive performance.”

The ideal next step would be to carry out a larger study in patients with mild cognitive impairment that includes more detailed cognitive profiles and more control groups. “We need a cocoa arm; we need a caffeine arm; we need maybe other arms, to make sure that we understand this, and maybe look at some of the metabolites in the blood as a result of cocoa consumption that correlates with these things,” said Dr. Sorond.

Remarkable First Step

In an accompanying editorial, Paul B. Rosenberg, MD, associate professor of psychiatry and behavioral sciences, Johns Hopkins School of Medicine, Baltimore, Maryland, and Can Ozan Tan, PhD, Harvard Medical School, Boston, write that in many ways, the study represents a “remarkable first step.”

For one thing, it demonstrates the practical utility of a simple, inexpensive, and noninvasive technique for measuring NVC that has several advantages over functional MRI and other means of measuring brain blood flow during cognitive tasks.

In demonstrating a link between NVC and cerebral white matter structural integrity, the study provides an important validation for the association between vascular and cognitive function, according to Dr. Rosenberg.

The study demonstrates that NVC “hangs together” as a measure of vascular function, which could be used in studies targeting vascular interventions, said Dr. Rosenberg in an interview with Medscape Medical News. In this way, he added, the study is “promising for the development of new treatments for vascular dementia.”

The study suggests that the vascular effects of cocoa are not due to its flavonol content, noted Dr. Rosenberg. “It could be a placebo effect.”

Dr. Rosenberg pointed out several strengths of the study, including its relatively large size for a pilot study and its “well-chosen” measures.

Among its weaknesses are that it’s not a placebo-controlled study and the hypothesis that flavonoid-rich cocoa would work better than flavonoid-poor cocoa didn’t pan out. The study may also not have been long enough, said Dr. Rosenberg. “It’s nice to see a drug work for 30 days, but you really need a longer study.”

The study didn’t include patients with mild cognitive impairment who are at risk of developing dementia, which Dr. Rosenberg sees as another weakness. “It’s one thing to show an effect in cognitively healthy older people; it’s a very different thing to show an effect in people who have a brain disease,” he said.

The Alzheimer’s Association also sees weaknesses in the study. Not only is it a very small and very preliminary study, but it was also not well designed as a test of an intervention or therapy because it didn’t include a control group for comparison with the group that drank cocoa, said Maria Carrillo, PhD, Alzheimer’s Association vice president of medical and scientific relations.

Further, said Dr. Carrillo, it didn’t appear that other factors that could possibly affect brain blood flow and/or cognition were controlled for, tracked, or accounted for in the study.

“There is no information on what else the 18 people with impaired cerebral blood flow did during the trial that might have improved their cerebral blood flow or cognitive performance: exercise, for example. A well-designed intervention trial anticipates, tracks, and accounts for these possible confounding factors to help ensure the credibility of the findings.”

Source: Neurology

Cherries May Prevent Gout Flares.


Patients with gout were less likely to report acute attacks after 2 days of eating cherries or imbibing cherry extract than during periods after no cherry intake, according to data reported in Arthritis & Rheumatism by Yuqing Zhang, DSci, and colleagues from Boston University School of Medicine in Massachusetts.

Dr. Zhang, who is professor of medicine and epidemiology at Boston University School of Medicine, told Medscape Medical News that cherry intake during a 2-day period was associated with a 35% lower risk for gout attacks and that cherry extract intake was associated with a 45% lower risk.

Risk for gout attacks was reduced by 75% when cherry intake was combined with allopurinol use. Dr. Zhang said, “We found that if subjects took allopurinol alone, it reduced the risk of gout attack by 53%; if subjects took cherry alone, it reduced the risk by 32%; if they took both, the risk of gout attack was reduced by 75%.”

These associations were discovered in a case-crossover study of 633 individuals with physician-diagnosed gout who were prospectively recruited and followed online for 1 year. When a participant reported a gout attack, the researchers asked about the onset date of the gout attack, symptoms and signs, medications, and potential risk factors (including daily intake of cherries and cherry extract) during the 2 days before the attack. Patients served as their own controls, so the same information was assessed for 2-day control periods not associated with gout attacks. A cherry serving was defined as one-half cup or 10 to 12 cherries.
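To make the case-crossover logic concrete, here is a minimal sketch of how an exposure odds ratio falls out of such self-matched data: only periods where cherry intake occurred in the hazard window but not the control window (or vice versa) are informative. The discordant-period counts below are invented for illustration; only the resulting ~35% risk reduction echoes the article.

```python
# Hedged sketch of a matched-pair case-crossover odds ratio.
# The counts below are invented for illustration, NOT from the study.

def case_crossover_or(exposed_hazard_only, exposed_control_only):
    """Odds ratio from discordant period pairs: hazard windows with
    exposure but no matched-control exposure, vs. the reverse.
    Concordant pairs (exposed in both or neither) carry no information."""
    return exposed_hazard_only / exposed_control_only

# e.g. 26 attacks preceded by cherry intake when the matched control
# window had none, vs 40 the other way around -> OR = 0.65,
# roughly the 35% lower risk reported for cherry intake.
print(case_crossover_or(26, 40))  # 0.65
```

An odds ratio below 1 here means the exposure appears protective, which is how the 35% and 45% reductions quoted above should be read.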

Participants had a mean age of 54 years; 88% were white and 78% were male. Of patients with some form of cherry intake, 35% ate fresh cherries, 2% ingested cherry extract, and 5% consumed both fresh cherry fruit and cherry extract. Researchers documented 1247 gout attacks during the 1-year follow-up period, with 92% occurring in the joint at the base of the big toe.

Factors associated with increased serum uric acid levels, such as increased alcohol consumption and purine intake, or use of diuretics, were associated with increased risk for recurrent gout attacks.

“Our findings indicate that consuming cherries or cherry extract lowers the risk of gout attack,” Dr. Zhang said in a press release. “The gout flare risk continued to decrease with increasing cherry consumption, up to three servings over two days.” Further cherry intake was not associated with additional benefit.

“However, the protective effect of cherry intake persisted after taking into account patients’ sex; body mass (obesity); purine intake; and use of alcohol, diuretics, and antigout medications,” according to the release.

The authors speculate that cherries may decrease serum uric acid levels by increasing glomerular filtration or reducing tubular reabsorption. They also note that cherries and cherry extract contain high levels of anthocyanins, which possess anti-inflammatory properties.

Dr. Zhang told Medscape Medical News, “While our study findings are promising, randomized clinical trials should be conducted to confirm whether cherry products could provide a nonpharmacological preventive option against gout attacks. Until then we would not advocate on the basis of the current findings that individuals who suffer from gout abandon standard therapies and opt for cherry extract products as an alternative.”

In an accompanying editorial, Allan Gelber, MD, from Johns Hopkins University School of Medicine in Baltimore, Maryland, and Daniel Solomon, MD, from Brigham and Women’s Hospital and Harvard Medical School in Boston, write that the findings are promising but reiterate the need for randomized clinical trials to confirm that consumption of cherry products could prevent gout attacks.

Dr. Gelber told Medscape Medical News, “For the patient who asks his/her doctor ‘Doc, what can I do, myself, to decrease my chance of developing another gout attack, above and beyond the medications you have prescribed for me?’ our response would include that one of the options is dietary modification. Previously, physician recommendations included advocating for moderation in alcohol consumption, weight reduction, and decreasing high-purine foods from the diet…but now there are new data supporting a beneficial role in eating cherries to reduce one’s risk for recurrent gout attacks.”

Dr. Gelber noted that the most definitive support for the recommendation to eat cherries as a strategy to reduce gout risk would come from a randomized clinical trial. “Just as with new medications that come down the pipeline, dietary interventions ought also be subject to the rigor of a clinical trial. Such a study could be undertaken. There is logistical challenge to undertaking such a trial since cherry fruit is broadly available. But, in a controlled setting, such a trial would be feasible,” he said.

Source: Medscape.com

The Hidden “Cancer-Trigger” You Probably Swallow Every Day.


It’s estimated that half of all hospital beds in the world are occupied by people who have become sick from drinking contaminated water. In fact, over 1 billion people (about one-sixth of the world’s population) do not have access to safe drinking water, and millions in developing countries die each year from water-related diseases.

In third-world countries, sunlight exposure is often used to help make water safer, but this natural disinfection process can take anywhere from six to 48 hours (depending on cloud cover and so on).

Now researchers from the Johns Hopkins Bloomberg School of Public Health and the Johns Hopkins School of Medicine have found a simple twist to make this disinfection method even more powerful, not to mention much faster …

Lime Juice and Sunlight Can Help Make Water Safer

When researchers added lime juice or lime slurry to water that had been contaminated with various types of bacteria and viruses, then exposed it to sunlight, levels of both E. coli and MS2 bacteriophage virus were significantly lower than with solar disinfection alone. Kellogg Schwab, PhD, MS, senior author of the study, said:

“The preliminary results of this study show solar disinfection of water combined with citrus could be effective at greatly reducing E. coli levels in just 30 minutes, a treatment time on par with boiling and other household water treatment methods. In addition, the 30 milliliters of juice per 2 liters of water amounts to about one-half Persian lime per bottle, a quantity that will likely not be prohibitively expensive or create an unpleasant flavor.”
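The dose quoted above, 30 milliliters of juice per 2 liters of water, scales linearly with volume. A trivial helper, offered only as an illustration of the arithmetic:

```python
# Illustrative helper (assumption: linear scaling of the reported dose
# of 30 mL lime juice per 2 L of water; not a treatment recommendation).

JUICE_ML_PER_LITER = 30.0 / 2.0  # 15 mL of lime juice per liter

def lime_juice_needed_ml(water_liters):
    """Lime juice (mL) for a given volume of water, at the study's ratio."""
    return JUICE_ML_PER_LITER * water_liters

print(lime_juice_needed_ml(10))  # 150.0 mL for a 10 L container
```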

Noroviruses in the drinking water were not significantly reduced using the lime juice/sunlight technique, so unfortunately it is not a perfect solution. However, limes are readily available in most tropical countries, as is steady sunlight, so this finding could still have an extremely beneficial impact in countries that don’t have ready access to clean drinking water.

You may be surprised to learn, however, that your drinking water may still be contaminated even if you live in the developed world. Further, many of the “modern” disinfection processes used in the United States and other developed countries create their own set of issues …

Have You Heard of Disinfection Byproducts?

Part of the allure of natural disinfection processes like exposure to sunlight and lime juice is that they have no harmful side effects – unlike the chlorination process used by most U.S. municipalities.

If you receive municipal water, the main chemical used to disinfect the tap water in your house is chlorine. While your local government is quick to assure you that there is little danger from drinking chlorinated water, that simply is not the case, because this process produces chlorine disinfection byproducts (DBPs) at levels that are both dangerous and alarming.

There is actually no safe level for many contaminants found in drinking water, including heavy metals, pesticides, herbicides, hormones and DBPs, but they persist nonetheless, in varying quantities.

The government is much more concerned with providing water that doesn’t kill you by causing diarrhea (the way contaminated water does in many third-world countries), and it does a good job at that, although some microorganisms, such as the cysts and parasites Cryptosporidium and Giardia, survive the chlorination process and can lead to isolated outbreaks of disease, and even death in those with compromised immune systems.

If you have not heard of DBPs before, pay close attention: it turns out that DBPs, not chlorine, are responsible for nearly all the toxic effects of chlorinated water. Chlorine by itself is relatively harmless; it is the DBPs it produces that cause nearly all of the problems.

As it turns out, DBPs are over 10,000 times more toxic than chlorine, and out of all the other toxins and contaminants present in your water, such as fluoride and miscellaneous pharmaceutical drugs, DBPs may be the absolute worst of the bunch.

The most common disinfection byproducts formed when chlorine is used are:

  • Trihalomethanes (THMs)
  • Haloacetic acids (HAAs)

The U.S. Environmental Protection Agency (EPA) takes the dangers of THMs — which are measured in parts per billion (ppb) — very seriously and regulates these compounds. The maximum annual average of THMs in your local water supply cannot exceed 80 ppb, and the maximum annual average of HAAs permitted by EPA regulations is 60 ppb.
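For readers who receive an annual water-quality report, the EPA limits above can be expressed as a simple comparison. This is just an illustrative sketch, not part of the original article; the reading format is an assumption, though "TTHM" (total trihalomethanes) and "HAA5" (the regulated haloacetic acids) are the terms you will typically see on a report:

```python
# EPA maximum annual averages cited above, in parts per billion (ppb):
# 80 ppb for total trihalomethanes, 60 ppb for the regulated haloacetic acids.
EPA_LIMITS_PPB = {"TTHM": 80, "HAA5": 60}

def check_report(readings_ppb):
    """Compare annual-average readings (ppb) against the EPA limits."""
    results = {}
    for contaminant, limit in EPA_LIMITS_PPB.items():
        level = readings_ppb.get(contaminant)
        if level is None:
            results[contaminant] = "not reported"
        elif level > limit:
            results[contaminant] = f"EXCEEDS limit ({level} > {limit} ppb)"
        else:
            results[contaminant] = f"within limit ({level} <= {limit} ppb)"
    return results

# Hypothetical readings: TTHM over the limit, HAA5 within it.
print(check_report({"TTHM": 92, "HAA5": 45}))
```

Keep in mind that these are regulatory maxima, not safety guarantees; as noted above, lower is always better.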

However, even though these levels are allowed, ideally the target would be zero. These limits have been regularly adjusted downward over the years as science has gained a deeper appreciation of the true toxicity of these compounds. So what makes DBPs so toxic?

Disinfection Byproducts May Cause Cancer, Reproductive Problems and More

THMs are Group B carcinogens, meaning they have been shown to cause cancer in laboratory animals. DBPs have also been linked to reproductive problems in both animals and humans, and human studies suggest that lifetime consumption of chlorine-treated water can more than double the risk of bladder and rectal cancers in certain individuals.

One such study found that men who smoked and drank chlorinated tap water for more than 40 years faced double the risk of bladder cancer compared with smoking men who drank non-chlorinated water.4 A second study found that rates for rectal cancers for both sexes escalated with duration of consumption of chlorinated water.5 Individuals on low-fiber diets who also drank chlorinated water for over 40 years more than doubled their risk for rectal cancer, compared with lifetime drinkers of non-chlorinated water.

As the vast majority of the U.S. population continues to receive and consume disinfected or chlorinated drinking water, we can assume that Americans are consuming disinfection byproducts every single day, and the number of related cancer cases could be substantial. And, you’re exposed not only when you drink chlorinated water but also, and even more significantly, when you shower or bathe, as well as when you breathe in the chemicals from the air.

The chlorine that enters your lungs is in the form of chloroform, a carcinogen, and chlorite, a byproduct of chlorine dioxide. These forms of chlorine hit your bloodstream instantly, before your organs of detoxification have a chance to remove them. The DBPs that enter your body through your skin during showering or bathing also go directly into your bloodstream, and warm or hot water maximizes absorption through your skin. So unless you regularly take one-minute-long cold showers, your body soaks up these airborne toxins like a sponge every second you spend in a shower.

If you are like me and obtain your water from a private well, then DBPs are a non-issue, as they are only produced when chlorine is added, and it’s highly unusual for private well water systems to be chlorinated. However, well water has its own set of potential hazards.

Is Well Water Safe?

Unless you are getting your water from a well located 800 feet or more below the ground surface, chances are your well water has been contaminated by some, if not many, of the toxic substances that have been dumped into the soil over past decades. Common toxins dumped into soil by the millions of pounds every year include:

  • Herbicides
  • Pesticides
  • Estrogen-mimicking hormones
  • Drug residues
  • Heavy metals

Many private wells in the United States have been affected by these types of chemical or heavy metal runoff from the surrounding soil, to say nothing of the microorganisms that can live in well water. No matter how clean or pure your natural ground water looks, appearance tells you nothing about bacterial contamination or toxic pollution: many of the offenders in well water are simply too small to be seen with the naked eye.

So if your home uses well water, you really need to test it to see what unwanted contaminants you’re piping into your house, and then filter it accordingly. And if you get municipal water, you should have that tested too: Sen. Frank Lautenberg, D-N.J., told ABC News that there are more than 140 chemicals in U.S. drinking water supplies that are not regulated by the U.S. Environmental Protection Agency (EPA).6 These include gasoline, pesticides, rocket fuel, prescription drugs and more. Furthermore, more than 20 percent of U.S. water treatment systems have violated key provisions of the Safe Drinking Water Act over the last six years.

You Can Get Chlorine (and Other Toxins) Out of Your Drinking Water

Most people in the United States are not going to take the time to expose their drinking water and bathwater to sunlight, then add lime juice, to help make it more pure – and this wouldn’t do anything to eliminate the chlorine or fluoride it contains anyway.

Fortunately, there are other options at your disposal.

If you can only afford one filter, many experts consider a shower filter the most important water-filtration product to buy, even more important than filtering your tap water. This is because the damage you incur through your skin and lungs far surpasses the damage done by drinking water (which at least gives your body a fighting chance to eliminate the toxins through its organs of elimination).

An even better solution to the problem of harsh chemicals and toxins in your home’s water supply is to install a whole house water filtration system. This protects not only your body but your appliances as well. There’s just one water line coming into your house; putting a filter on it is the easiest and simplest strategy you can implement to take control of your health by ensuring the water, and subsequently the air, in your house is as clean as possible.

Remember, if you are getting your water from a municipal source, your indoor air quality, especially in winter when your windows are closed, is likely atrocious. This is due to chlorine and other toxins evaporating from all your toilet bowls, showers, baths, dishwashers and washing machines.

My advice for whole house filtration systems is as follows: Find a system that uses at least 60 pounds of filter media and can produce eight or more gallons a minute. When you are running two different showers, the dishwasher and the kitchen sink at the same time, you’ll find out why these minimum levels are so important. This recommendation covers a home or apartment of up to 3,200 sq. ft., or in other words a residence with about three and a half bathrooms. For anything larger, you will probably need two whole house water filtration systems.
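The rule of thumb above, one system per roughly 3,200 sq. ft. (about three and a half bathrooms), with minimums of 60 pounds of filter media and 8 gallons per minute, can be sketched as a quick calculation. This is only an illustration of the sizing guideline as stated, not a substitute for a plumber's assessment:

```python
import math

# Sizing guideline from the article: one whole house system covers
# up to ~3,200 sq. ft.; each system should use >= 60 lb of filter
# media and deliver >= 8 gallons per minute.
COVERAGE_SQFT = 3200
MIN_MEDIA_LB = 60
MIN_FLOW_GPM = 8

def systems_needed(square_feet):
    """How many whole house systems the coverage rule suggests."""
    return max(1, math.ceil(square_feet / COVERAGE_SQFT))

def meets_minimums(media_lb, flow_gpm):
    """Check a candidate filter against the stated minimums."""
    return media_lb >= MIN_MEDIA_LB and flow_gpm >= MIN_FLOW_GPM

print(systems_needed(2800))      # a 2,800 sq. ft. home: 1 system
print(systems_needed(5000))      # a 5,000 sq. ft. home: 2 systems
print(meets_minimums(75, 10))    # 75 lb media, 10 gpm: passes
```

The flow minimum matters most under simultaneous demand, which is why the article stresses the two-showers-plus-dishwasher scenario.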

You also need to look for a whole house water filter that has three separate stages of contamination removal:

  • Stage one removes sediment
  • Stage two removes chlorine and heavy metals
  • Stage three should be a heavy-duty carbon filter for removing hormones, drug residues, chemicals, pesticides, and herbicides

You want to look for granular carbon in the carbon filter, not a solid block of carbon. The granular carbon allows for better water flow, which translates to more water pressure and better filtering properties as well.

You also want to look for NSF certification, which ensures your water filter meets national standards; NSF certification is only granted when a product is proven to remove everything it claims to. It’s also good to make sure all particles larger than 0.8 microns are being filtered out of the water. A lower number is actually better, but 0.8 microns is the standard I recommend because it covers most bacteria and parasitic cysts; smaller contaminants such as viruses and dissolved chemicals must be handled by the carbon stages.

Your body requires a constant daily supply of water to fuel all the various waste filtration systems nature has designed to keep you healthy and free of toxins. Your blood, your kidneys, and your liver all require a source of good clean water to detoxify your body from the toxic exposures you come into contact with every day.

When you give your body water that is filled with byproducts from chlorination, or with volatile organic compounds, or water contaminated by pesticides or hormones, you are asking your body to work twice as hard at detoxification, because it must first detoxify the water you are drinking before that water can be used to fuel your organs of detoxification. Clearly, one of the most efficient ways to help your body both avoid and eliminate toxins, and reach optimal health, is to provide it with the cleanest, purest water you can find.

Source: Dr. Mercola