Air pollution may be making us less intelligent

Long-term exposure to air pollution was linked to cognitive decline in elderly people.

Not only is air pollution bad for our lungs and heart, it turns out it could actually be making us less intelligent, too. A recent study found that in elderly people living in China, long-term exposure to air pollution may hinder cognitive performance (things like our ability to pay attention, to recall past knowledge and generate new information) in verbal and maths tests. As people age, the link between air pollution and their mental decline becomes stronger. The study also found men and less educated people were especially at risk, though the reason why is currently unknown.

We already have compelling evidence that air pollution – especially the tiniest, invisible particulates in pollution – damages the brain in both humans and animals. Traffic pollution is associated with dementia, delinquent behaviour in adolescents, and stunted brain development in children who attend highly polluted schools.

Mice exposed to urban air pollution for four months, for example, showed reduced brain function and inflammatory responses in major brain regions. In other words, the brain tissue changed in response to the harmful stimuli produced by the pollution.

We don’t yet know which aspects of the air pollution particulate “cocktail” (such as the size, number or composition of particles) contribute most to reported brain deterioration. However, there’s evidence that nanoscale pollution particles might be one cause.

These particles are around 2,000 times smaller than the diameter of a human hair, and can be moved around the body via the bloodstream after being inhaled. They may even reach the brain directly through the olfactory nerves that give the brain information about smell. This would let the particles bypass the blood-brain barrier, which normally protects the brain from harmful things circulating in the bloodstream.
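For a rough sense of scale, the ratio quoted above can be turned into an absolute size. The hair diameter used here (about 70 micrometres) is an assumed, commonly cited figure, not from the article, which gives only the ratio:

```python
# Assumed typical human hair diameter (illustrative; the article gives only the ratio).
hair_diameter_um = 70            # micrometres
ratio = 2000                     # "around 2,000 times smaller", per the article

particle_nm = hair_diameter_um * 1000 / ratio   # convert um -> nm, then divide
print(f"approximate particle size: {particle_nm:.0f} nm")  # → 35 nm
```

That puts these particles in the tens-of-nanometres range, consistent with the "nanoscale" label used above.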

Postmortem brain samples from people exposed to high levels of air pollution while living in Mexico City and Manchester, UK, displayed the typical signs of Alzheimer’s disease. These included clumps of abnormal protein fragments (plaques) between nerve cells, inflammation, and an abundance of metal-rich nanoparticles (including iron, copper, nickel, platinum, and cobalt) in the brain.

Automobiles are a major cause of the world’s air pollution.

The metal-rich nanoparticles found in these brain samples are similar to those found everywhere in urban air pollution, which form from burning oil and other fuel, and wear in engines and brakes. These toxic nanoparticles are often associated with other hazardous compounds, including polyaromatic hydrocarbons that occur naturally in fossil fuels, and can cause kidney and liver damage, and cancer.

Repeatedly inhaling nanoparticles found in air pollution may have a number of negative effects on the brain, including chronic inflammation of the brain’s nerve cells. When we inhale air pollution, it may activate the brain’s immune cells, the microglia. Breathing air pollution may constantly activate the killing response in immune cells, which can allow dangerous molecules, known as reactive oxygen species, to form more often. High levels of these molecules could cause cell damage and cell death.

The presence of iron found in air pollution may speed up this process. Iron-rich (magnetite) nanoparticles are directly associated with plaques in the brain. Magnetite nanoparticles can also increase the toxicity of the abnormal proteins found at the centre of the plaques. Postmortem analysis of brains from Alzheimer’s and Parkinson’s disease patients shows that microglial activation is common in these neurodegenerative diseases.

The latest study of the link between air pollution and declining intelligence, alongside the evidence we already have for the link between air pollution and dementia, makes the case for cutting down air pollution even more compelling. A combination of changes to vehicle technology, regulation and policy could provide a practical way to reduce the health burden of air pollution globally.

However, there are some things we can do to protect ourselves. Driving less and walking or cycling more can reduce pollution. If you have to use a car, driving smoothly without fierce acceleration or braking, and avoiding travel during rush hours, can reduce emissions. Keeping windows closed and recirculating air in the car might help to reduce pollution exposure during traffic jams as well.

Reducing vehicle use by walking or cycling instead could have a major impact on air pollution levels.

But young children are among the most vulnerable because their brains are still developing. Many schools are located close to major roads, so substantially reducing air pollution is necessary. Planting specific tree species that are good at capturing particulates along roads or around schools could help.

Indoor pollution can also cause health problems, so ventilation is needed while cooking. Open fires (both indoors and outdoors) are a significant source of particulate pollution, with woodburning stoves producing a large percentage of outdoor air pollution in the winter. Using dry, well-seasoned wood and an efficient ecodesign-rated stove is essential if you don’t want to pollute the atmosphere around your home. If you live in a naturally ventilated house next to a busy road, using living spaces at the back of the house or upstairs will reduce your daily pollution exposure.

Finally, what’s good for your heart is good for your brain. Keeping your brain active and stimulated, eating a good diet rich in antioxidants, and keeping fit and active can all build up resilience. But as we don’t yet know exactly the mechanisms by which pollution causes damage to our brains – and how, if possible, their effects might be reversed – the best way we can protect ourselves is to reduce or avoid pollution exposure as much as possible.

Laptops Are Great. But Not During a Lecture or a Meeting.

Step into any college lecture hall and you are likely to find a sea of students typing away at open, glowing laptops as the professor speaks. But you won’t see that when I’m teaching.

Though I make a few exceptions, I generally ban electronics, including laptops, in my classes and research seminars.

That may seem extreme. After all, with laptops, students can, in some ways, absorb more from lectures than they can with just paper and pen. They can download course readings, look up unfamiliar concepts on the fly and create an accurate, well-organized record of the lecture material. All of that is good.

But a growing body of evidence shows that overall, college students learn less when they use computers or tablets during lectures. They also tend to earn worse grades. The research is unequivocal: Laptops distract from learning, both for users and for those around them. It’s not much of a leap to expect that electronics also undermine learning in high school classrooms or that they hurt productivity in meetings in all kinds of workplaces.

Measuring the effect of laptops on learning is tough. One problem is that students don’t all use laptops the same way. It might be that dedicated students, who tend to earn high grades, use them more frequently in classes. It might be that the most distracted students turn to their laptops whenever they are bored. In any case, a simple comparison of performance may confuse the effect of laptops with the characteristics of the students who choose to use them. Researchers call this “selection bias.”

Researchers can solve that problem by randomly assigning some students to use laptops. With that approach, the students who use laptops are comparable in all other ways to those who don’t.

In a series of experiments at Princeton University and the University of California, Los Angeles, students were randomly assigned either laptops or pen and paper for note-taking at a lecture. Those who had used laptops had substantially worse understanding of the lecture, as measured by a standardized test, than those who did not.

The researchers hypothesized that, because students can type faster than they can write, the lecturer’s words flowed right to the students’ typing fingers without stopping in their brains for substantive processing. Students writing by hand had to process and condense the spoken material simply to enable their pens to keep up with the lecture. Indeed, the notes of the laptop users more closely resembled transcripts than lecture summaries. The handwritten versions were more succinct but included the salient issues discussed in the lecture.

Even so, it may seem heavy-handed to ban electronics in the classroom. Most college students are legal adults who can serve in the armed forces, vote and own property. Why shouldn’t they decide themselves whether to use a laptop?

The strongest argument against allowing that choice is that one student’s use of a laptop harms the learning of students around them. In a series of lab experiments, researchers at York University and McMaster University in Canada tested the effect of laptops on students who weren’t using them. Some students were told to perform small tasks on their laptops unrelated to the lecture, like looking up movie times. As expected, these students retained less of the lecture material. But what is really interesting is that the learning of students seated near the laptop users was also negatively affected.

The economic term for such a spillover is a “negative externality,” which occurs when one person’s consumption harms the well-being of others. The classic negative externality is pollution: A factory burning coal or a car using gasoline can harm the air and environment for those around it. A laptop can sometimes be a form of visual pollution: Those nearby see its screen, and their attention is pulled toward its enticements, which often include not just note-taking but Facebook, Twitter, email and news.

These experiments go only so far. They may not capture positive effects of laptops in real classrooms over the course of a semester, when students use their typed notes for review and grades are at stake. But another study did just that.

At the United States Military Academy, a team of professors studied laptop use in an introductory economics class. The course was taught in small sections, which the researchers randomly assigned to one of three conditions: electronics allowed, electronics banned and tablets allowed but only if laid flat on desks, where professors could monitor their use. By the end of the semester, students in the classrooms with laptops or tablets had performed substantially worse than those in the sections where electronics were banned.

You might question whether the experience of military cadets learning economics is relevant to students in other settings — say, community college students learning Shakespeare. But we’d expect the negative effects of laptops to be, if anything, smaller at West Point, where all courses are taught in small sections, than at institutions with many large lectures. Further, cadets have very strong incentives to perform well and avoid distractions, since class rank has a major impact on their job status after graduation.

The best way to settle this question is probably to study laptop use in more colleges. But until then, I find the evidence sufficiently compelling that I’ve made my decision: I ban electronics in my own classes.

I do make one major exception. Students with learning disabilities may use electronics in order to participate in class. An unavoidable side effect is that any student seen using electronics is thereby identified as having a learning disability. That is a loss of privacy for those students, which also occurs when they are given more time to complete a test. Those negatives must be weighed against the learning losses of other students when laptops are used in class.

Students may object that a laptop ban prevents them from storing notes on their computers. But smartphones can snap pictures of handwritten pages and convert them to an electronic format. Even better, outside class, students can read their own handwritten notes and type them, if they like, a process that enhances learning.

The best evidence available now suggests that students should avoid laptops during lectures and just pick up their pens. It’s not a leap to think that the same holds for middle and high school classrooms, as well as for workplace meetings.

If the world builds every coal plant that’s planned, climate change goals are doomed, scientists say

The coal power plant in Jaenschwalde, Germany, on Saturday.

The much-heralded demise of the coal industry may be overstated, a new scientific analysis asserts — finding that if all planned plants were constructed, the world would have little chance of meeting its climate change goals.

The new study, by Ottmar Edenhofer of the Mercator Research Institute on Global Commons and Climate Change in Berlin, and three colleagues, finds that nations including Turkey, Vietnam and Indonesia could increase their emissions from coal dramatically between now and 2030, based on current plans.

In combination with already existing infrastructure, these planned or in-construction plants, if run for a standard plant lifetime, could burn up much of the remaining carbon budget for holding Earth’s temperature increase below two degrees Celsius, or 3.6 degrees Fahrenheit, the research concludes.

“The main message is that when we continue with the existing coal fired power plants, and build the new ones, we are closing the door to the 2 degree target,” Edenhofer said. The research was published in Environmental Research Letters, with co-authors from the Potsdam Institute for Climate Impact Research and the Technical University of Berlin.

The study is based on a concept of “lock-in” or “committed” emissions: Once a coal plant is completed and put into service, the thinking goes, it’s likely to operate for a long time to justify the cost of the investment.

And based on an analysis of global coal plans, the research finds that five countries — India, China, Turkey, Vietnam and Indonesia — are home to “nearly three quarters (73 percent) of the global coal-fired capacity that is currently under construction or planned.” Vietnam, if plans are carried forward, could see 948 percent growth in coal emissions by 2030, the research asserts.

But even as this is happening, the study notes that we have only about 700 billion tons of carbon dioxide that can still be emitted, after the year 2016, to preserve good odds of holding the temperature increase below two degrees Celsius. Existing coal plants and other infrastructure are capable of consuming 500 billion tons on their own, assuming we use them until they are worn out.

New coal plants that are underway or planned, meanwhile, could consume another 150 billion tons, the research finds, during their lifetimes. That pretty much accounts for the two degrees Celsius budget right there.
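The budget arithmetic in the two paragraphs above can be checked directly with the study’s round numbers:

```python
# Round figures quoted from the study, in billions of tons (Gt) of CO2.
remaining_budget = 700   # emittable after 2016 for good odds of staying under 2°C
existing_plants = 500    # committed emissions from infrastructure already running
planned_plants = 150     # committed emissions from coal plants underway or planned

committed = existing_plants + planned_plants
print(f"{committed} Gt committed of a {remaining_budget} Gt budget "
      f"({committed / remaining_budget:.0%})")
# → 650 Gt committed of a 700 Gt budget (93%)
```

On these figures, existing and planned coal infrastructure alone would lock in roughly 93 percent of the remaining two-degree budget, leaving almost nothing for every other emissions source.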

Here’s a figure from the study showing these calculations:

And if we want to hold warming to just 1.5 degrees Celsius … well. The numbers suggest there’s not much chance of that happening, even with the existing infrastructure alone.

“There is a real risk that the inertia of fossil fuel infrastructure will drag us past where we want to be,” said Steven Davis, an energy researcher at the University of California at Irvine, in a comment on the study.

“However, the retirement of power plants is ultimately an economic decision, and one that gets easier as non-fossil energy sources get cheaper,” Davis added. “And of all coal plants proposed in recent years, only about half have historically been built, and that fraction has been trending down in key places like China and India.”

That highlights the key challenge with the study — it’s one thing to say that coal plants are planned, and quite another to say that they will be completed and used throughout their full potential lifetimes.

The research is based on a database by CoalSwarm, a project of the Earth Island Institute, which carefully tracks coal plants in varying stages of completion across the globe, in collaboration with Greenpeace and the Sierra Club. The group makes it clear that it opposes the new coal plants that it is tracking.

Christine Shearer, a researcher with CoalSwarm, said it’s important to bear in mind that not all coal plants are actually completed. “Since we started doing this work, since 2010, only about a third of proposed coal plants ever begin construction or are commissioned,” she said.

“Stranding or divesting from coal power plants is an outcome that is in reach also,” added Peter Erickson, an expert with the Stockholm Environment Institute, who reviewed the study but was not involved with it. “Coal financing and political and social norms around coal are also rapidly changing.”

Edenhofer, the study author, countered that the current building plans are important information.

“This does not mean we are doomed, but these announcements are announcements which should be taken into account very seriously,” he said. “These are not just paper plants, these are real plants.”

It’s important to consider the implications of the new study in terms of equity, experts say. It’s not fair to say that developing nations should cancel coal plans when major industrialized countries, like Germany and the United States, continue to burn large amounts of coal.

“Higher-income countries have by far the greatest coal power — U.S., Germany, China, Russia, Japan — and so the phaseout in these countries is even more important in the big picture and, also, more equitable,” Erickson said.

Edenhofer doesn’t disagree. He cited Germany as one example of a developed nation set to miss climate targets because it is burning a lot of coal.

“I can see a fairness issue,” Edenhofer said. “But when we want to be fair, this means that we have to support the developing countries in building up a new infrastructure.”

The ultimate conclusion, though, is that operating and future coal plants alike are in tension with the Paris climate agreement and widely accepted climate goals.

“If we don’t stop building coal plants now, we will have four unpalatable options,” said Cameron Hepburn, a researcher at the University of Oxford, in a comment sent by email. “We either (1) shut down coal plants early, (2) retrofit expensive carbon capture technologies, (3) suck even more CO2 out of the atmosphere, potentially at high cost, or (4) burn through the 2 degree C target.”

NIH Officially Lifts Ban on Research Studying Germs with Pandemic Potential


The National Institutes of Health (NIH) has lifted a three-year freeze on federal funding for research projects pertaining to germs that can cause pandemics. The Department of Health and Human Services (HHS) released a new framework dictating how research that could create newer and deadlier germs with pandemic potential is funded.

“We have a responsibility to ensure that research with infectious agents is conducted responsibly, and that we consider the potential biosafety and biosecurity risks associated with such research,” wrote NIH director Francis S. Collins in a statement published on the organization’s website. “I am confident that the thoughtful review process laid out by the HHS P3CO Framework will help to facilitate the safe, secure, and responsible conduct of this type of research in a manner that maximizes the benefits to public health.”

Pandemics are disease epidemics that occur worldwide and affect a large number of people, like the Spanish Flu in 1918 that killed nearly 50 million people. Typically, scientists manipulate existing pathogens – making them deadlier or easier to pass on – to better understand them and develop countermeasures against those that may threaten public health.

But the funding ban was put in place after a string of incidents involving avian flu and anthrax that raised concerns about the consequences of an accident occurring in a lab. Any research involving influenza, severe acute respiratory syndrome (SARS), or Middle East Respiratory Syndrome (MERS) viruses was blocked.

The issue has become a point of contention among members of the scientific community. While some argue that this work is an essential component of preparing for future pandemics, others maintain that the risks are too great.

“The public and regulators are looking for science-based advice, but, in this case, there is still considerable disagreement within the scientific community,” explained Daniel Rozell, a research assistant professor in the department of technology and society at Stony Brook University, in an email correspondence with Futurism.

“Furthermore, there is some unavoidable bias in the advice. Some of the virologists most acquainted with the specifics of the research have careers that depend on its continuance,” he said. “While they may have the best of intentions, there is still a tendency to underestimate familiar risks and to be partial towards one’s own efforts.”


When funding was paused in 2014, the NIH Office of Science Policy was tasked with carrying out a “comprehensive, sound, and credible” risk-benefit analysis to inform how the situation should be handled. Even this analysis proved contentious. However, risk assessments don’t just serve to determine whether or not the research can be carried out safely – they can establish best practices for doing so.

“A risk-benefit assessment is still a useful exercise because it can be used for risk exploration,” said Rozell. “When researchers are cognizant of the most likely hazards arising from a line of research, they can take steps to redesign the research to achieve the same outcome without the potential for unintended consequences.”

Research into pandemic pathogens could play a vital role in ensuring that we can respond appropriately to an outbreak – but it’s crucial that such research is carried out in such a way that it doesn’t end up causing the very situation it’s meant to address.

Brain-damaging vaccines, pesticides and medicines generate nearly $800 billion a year in medical revenues.

The current estimated annual cost for nine of the most common neurological disorders in the U.S. was a hefty $789 billion, a recent paper revealed. According to the paper, these conditions include Alzheimer’s disease and other forms of dementia, traumatic brain injury and Parkinson’s disease, as well as epilepsy, multiple sclerosis, and spinal cord injury.


Researchers also projected that health care costs associated with brain damage will continue to increase, as the number of elderly patients is expected to double between 2011 and 2050. Data showed that medical costs related to dementia and stroke alone were estimated to be more than $600 billion by 2030.

“The findings of this report are a wake-up call for the nation, as we are facing an already incredible financial burden that is going to rapidly worsen in the coming years. Although society continues to reap the benefits of the dramatic research investments in heart disease and cancer over the last few decades, similar levels of investment are required to fund neuroscience research focused on curing devastating neurological diseases such as stroke and Alzheimer’s, both to help our patients and also to avoid costs so large they could destabilize the entire health care system and the national economy,” said lead author Dr. Clifton Gooch.

Drug-resistant “nightmare bacteria” are quickly spreading through US hospitals

Researchers have found evidence that drug-resistant superbugs, which have been labelled “nightmare bacteria”, are spreading faster and more stealthily inside US hospitals than previously thought.

In the US, the bacteria, known as carbapenem-resistant Enterobacteriaceae (CRE), infect roughly 9,300 people per year, and kill around 600. And now researchers think they might spread from person to person asymptomatically – which could explain why doctors are often unable to detect them.

“While the typical focus has been on treating sick patients with CRE-related infections, our new findings suggest that CRE is spreading beyond the obvious cases of disease,” said William Hanage from the Harvard T. H. Chan School of Public Health.

“We need to look harder for this unobserved transmission within our communities and healthcare facilities if we want to stamp it out.”

CRE are a class of drug-resistant bacteria that are even able to withstand carbapenems – last-resort drugs that are administered after all other antibiotics fail.

Enterobacteriaceae are a large family of bacteria that includes bugs such as Salmonella, E. coli, and Shigella – all of which are common causes of food poisoning and stomach bugs.

When they’re not drug-resistant, these bacteria can easily be treated with antibiotics, but antibiotic resistance has spread increasingly within the family.

The bacteria are known to thrive in hospitals and long-term care facilities, where they evolve and pass genes back and forth over time, eventually becoming deadly CRE superbugs that drugs cannot treat, and earning the researchers’ title of “nightmare bacteria”.


An official report last week showed that a US woman has already died from one superbug – an antibiotic-resistant strain of pneumonia (not a type of CRE), which was resistant to all available antibiotics in the US.

Now, Hanage and his colleagues have discovered that CRE superbugs, at least, might be spreading at a much faster rate than expected, and are starting to evade our normal ‘surveillance’ methods by spreading asymptomatically.

“You know the phrase ‘Shutting the stable door after the horse has bolted?’ The horse has not only bolted, the horse has had a lot of ponies, and they’re eating all our carrots,” Hanage told Helen Branswell at Stat News.

To figure out how rapidly CRE was diversifying and spreading, the team analysed over 250 samples from hospitalised patients in three different Boston-based facilities and one in California.

When finished, they found that CRE populations were way more diverse than previously thought, meaning that drug-resistant genes had spread more rapidly and easily between the strains than expected.

The team called it a “riot of diversity”.

Sometimes the species they found didn’t even carry the genes known to suppress carbapenems, but were still able to survive them, suggesting that they’ve found new ways to avoid these antibiotics that we don’t even know about yet.

“There are many different ways in which they can be resistant,” Hanage told Stat News.

To make things worse, the team wasn’t able to see a clear pattern of transmission for these CRE strains – the resistance seemed to be spreading even without any obvious cases of illness or infection.

“The best way to stop CRE making people sick is to prevent transmission in the first place,” Hanage said.

“If it is right that we are missing a lot of transmission, then only focusing on cases of disease is like playing Whack-a-Mole; we can be sure the bacteria will pop up again somewhere else.”

The team hypothesises that these transmissions might be happening from person to person asymptomatically, though they will need to carry out further studies to verify this is the case.

How Monsanto Promotes Worldwide Infertility

Monsanto has a long and infamous history of manufacturing and bringing to market such chemicals as DDT, Agent Orange, aspartame, Roundup and dioxin1 — chemical compounds from which society continues to feel the effects.

In an effort to distance the current corporation from past deeds, Monsanto refers to the company prior to 2002 as “the former Monsanto” in their news releases.2 However, nothing has really changed aside from their PR machine.

While Monsanto has branched into genetic engineering (GE) of plants, the sale of patented GE seeds simply feeds the need for the company’s pesticides. Monsanto is STILL primarily a purveyor of toxins, not life.

Monsanto began forging a unique and financially advantageous relationship with the U.S. government starting with the company’s involvement in the Manhattan Project that produced the first nuclear weapons during World War II. During the Vietnam War they were the leading producer of Agent Orange.

The specialization in the production and distribution of toxic chemicals continues today.

Their influence over government runs so deep that, despite the fact that 64 other countries have been labeling genetically engineered (GE) foods for years, the U.S. now has the distinction of being the first country to un-label GE foods at the urging of a company producing mass amounts of GE seeds.

Monsanto and Polychlorinated Biphenyls (PCBs)

In the latter part of the 1920s, Monsanto was the largest producer of PCBs. This chemical was used in lubricant for electric motors, hydraulic fluids and to insulate electrical equipment.3 Old fluorescent light fixtures and electrical appliances with PCB capacitors may still contain the chemical.

During the years PCB was manufactured and used, there were no controls placed on disposal. Since PCBs don’t break down under many conditions, they are now widely distributed through the environment and have made the journey up the food chain.4

Between the inception and distribution of the product and its subsequent ban in the late 1970s, an estimated 1.5 billion pounds were distributed in products around the world.5

Monsanto was the primary manufacturer of PCBs in the U.S. under the trade name Aroclor. Health problems associated with exposure to the chemical were noted as early as 1933 when 23 of 24 workers at the production plant developed pustules, loss of energy and appetite, loss of libido and other skin disturbances.6

According to Monsanto’s public timeline, it was in 1966 that “Monsanto and others began to study PCB persistence in the environment.”7 However, seven years earlier, Monsanto’s assistant director of their Medical Department wrote:

“… [S]ufficient exposure, whether by inhalation of vapors or skin contact, can result in chloracne which I think we must assume could be an indication of a more systemic injury if the exposure were allowed to continue.”8

In 1967, Shell Oil called to inform Monsanto of press reports from Sweden, noting that PCBs were accumulating in mammals further up the food chain. Shell asked for PCB samples to perform their own analytical studies.9

With full knowledge of the devastation expected to the environment and humanity, it wasn’t until a decade later, in 1977, that Monsanto reportedly halted PCB production.10

PCBs Are Probable Human Carcinogens

The International Agency for Research on Cancer (IARC), the U.S. Environmental Protection Agency (EPA), the National Toxicology Program, and the National Institute for Occupational Safety and Health (NIOSH) have identified PCBs as either probable, potential or reasonably likely to cause cancer in humans.11

If it seems like these agencies are couching their words, they are. Human studies have noted increased rates of liver cancer, gall bladder cancer, melanomas, gastrointestinal cancer, biliary tract cancer, brain cancer and breast cancer when individuals had higher levels of PCB chemicals in their blood and tissue.12

However, the EPA will not classify a chemical as a known human carcinogen unless there is conclusive proof. While this proof is evident in animal studies, you can’t feed these chemicals to humans and record the results. Thus PCBs are a “probable” carcinogen in humans. Other health effects from PCBs include:

  • Babies born with neurological and motor control delays including lower IQ, poor short-term memory and poor performance on standardized behavioral assessment tests
  • Disrupted sex hormones including shortened menstrual cycles, reduced sperm count and premature puberty
  • Imbalanced thyroid hormone affecting growth, intellectual and behavioral development
  • Immune effects, including children with more ear infections and chickenpox

Once PCBs are absorbed in the body, they deposit in fat tissue. They are not broken down or excreted, so the concentration of PCBs builds over time and moves up the food chain: smaller fish are eaten by larger ones, which eventually land on your dinner table.

Chemical Poisoning Begins Before Birth

A recent study at the University of California demonstrated that PCBs are found in the blood of pregnant women.13 Before birth, the umbilical cord delivers approximately 300 quarts of blood to your baby every day.

Not long ago, researchers believed the placenta would shield your developing baby from most pollutants and chemicals. Now we know it does not.

The umbilical cord is a lifeline between mother and child, sustaining life and propelling growth. However, in recent research cord blood contained between 200 and 280 different chemicals; 180 were known carcinogens and 217 were toxic to the baby’s developing nervous system.14

The deposits of chemicals in your body or the body of your developing baby are called your “body burden” of chemicals and pollution.

A steady stream of chemicals from the environment during a critical time of organ and system development has a significant impact on the health of your child, both in infancy and as the child grows to adulthood.

Tracey Woodruff, Ph.D., director of the University of California San Francisco Program on Reproductive Health and the Environment, was quoted in a press release, saying:

“It was surprising and concerning to find so many chemicals in pregnant women without fully knowing the implications for pregnancy. Several of these chemicals in pregnant women were at the same concentrations that have been associated with negative effects in children from other studies.

In addition, exposure to multiple chemicals that can increase the risk of the same adverse health outcome can have a greater impact than exposure to just one chemical.”

Butyl Benzyl Phthalate — Another Monsanto Product

Butyl benzyl phthalate (BBP), also manufactured by Monsanto, was recently implicated in cell fat storage.15 This specific phthalate was found in human fluids and had an effect on the accumulation of fat inside cells.

BBP is used in the manufacture of vinyl tile, as a plasticizer in PVC pipe, carpets, conveyer belts and weather stripping in your home and office.

Like other phthalates used in the production of plastics, BBP is not bound to the product and can be released into your environment. It may be absorbed by crops and move up the food chain.16 The biggest source of exposure is food.

Drive-through hamburgers and take-out pizzas may be increasing your intake of phthalates. The danger is not in the food itself but in the products used to handle it. The study analyzed data from nearly 9,000 individuals, finding the one-third who had eaten at a fast food restaurant had higher levels of two different phthalates.17

Potentially, BBP may adversely affect your reproductive function. However, at lower doses it also has an effect on your kidneys, liver and pancreas.18 Increased risks of respiratory disorders and multiple myelomas have also been reported in people who have exposure to products manufactured with BBP.19 An increasing waistline from BBP exposure may also reduce your fertility.

Low Sperm Count and Infertility Affecting Animals and Humans

A recently published 26-year study of fertility in dogs shows distinct similarities to infertility trends in humans. In this study, researchers evaluated the ejaculate of nearly 2,000 dogs and found a drop in sperm motility of 2.4 percent per year over the 26-year period.20

Additionally, both the semen and the testicles of neutered dogs contained PCBs and phthalates, which other studies have implicated in reduced fertility. Phthalates have been linked to decreases in both sperm motility and sperm quality,21 affecting both fertility and the health of your children.22

Researchers used dogs in this study as they live in the same environment as their owners, and often eat some of the same food. This correlation between sperm function and concentration, and environment and food in dogs and humans is significant.

In those 26 years there was also a rise in cryptorchidism (a condition where the testicles don’t descend into the scrotum) in male pups born to stud dogs who experienced a decline in sperm quality and motility.23 Cryptorchidism occurs at a rate of 1 in 20 full-term male human infants and 1 in 3 preterm babies.24

Problems with infertility are also affecting marine animals at the top of the food chain. In the eastern Atlantic, the last pod of orcas is doomed to extinction. High levels of PCBs have been found in the fat of over 1,000 dolphins and orcas in the past 20 years. The pollution is now taking a toll on the animals’ fertility: this pod of orcas has not reproduced in the 19 years it has been under study.25

Orcas were living in the North Sea until the 1960s. At that time PCB pollution peaked in the area and the Orca whales disappeared. The same happened in the Mediterranean Sea, where the whales flourished until the 1980s. This pod off the coast of the U.K. is the last living pod in that area.

Monsanto’s Argument in PCB Lawsuits

Although Monsanto denies culpability and knowledge of the danger behind PCBs, internal documentation featured in this video shows the company did, in fact, know of the danger while manufacturing and distributing the product. Monsanto is currently embroiled in lawsuits across eight cities, and the argument is over who owns the rain. The cities are suing Monsanto in federal court, saying PCBs manufactured by Monsanto have polluted the San Francisco Bay.26

Monsanto attorney Robert Howard argues that because the city does not own the water rights, it does not have the right to sue. And because the PCBs have not damaged city property, such as corroding pipes, Howard claims it is a state problem. Scott Fiske, attorney for three of the cities, countered that managing storm water is a fundamental regulatory function of a city.27

While Fiske claims he can prove Monsanto knew the product was hazardous as early as 1969, Howard maintains the company should not be liable for the use of the chemicals it produced.

In 2001, Monsanto attorneys in the Owens v. Monsanto case acknowledged only one health threat from exposure to PCBs, chloracne, and argued that because the entire planet has been contaminated, the company bears no liability.28 The attorney for Monsanto was quoted in the Chemical Industry Archives, saying:

“The truth is that PCBs are everywhere. They are in meat, they are in everyone in the courtroom, they are everywhere and they have been for a long time, along with a host of other substances.” 29

The cities currently engaged in lawsuits against Monsanto for damage to the environment and waterways include Berkeley, Oakland, San Jose, Portland, Spokane, Seattle, Long Beach and San Diego. All eight cities attempted to combine their cases against the agrochemical giant but were unsuccessful when a judge found the issues were different enough to warrant separate cases.30

Monsanto’s Deep Pockets

Monsanto petitioned the Federal Court to dismiss Portland’s lawsuit, warning it would countersue and add years to the process. Monsanto would also likely expand the scope of the case to include companies that used the product and released the PCBs.31 Meanwhile, three plaintiffs in St. Louis received better news in May 2016, when a jury awarded them a total of $46.5 million, finding Monsanto negligent in the production of PCBs.32

This suit claimed Monsanto sold PCBs even after it learned of their dangers, bringing to court internal documents dated 1955, which stated: “We know Aroclors [PCBs] are toxic but the actual limit has not been precisely defined.”33 To date, a win over Monsanto like this one has been rare. Williams Kherkher, the law firm representing the plaintiffs, explained in EcoWatch:34

“The only reason why this victory is rare is because no one has had the money to fight Monsanto.”

Kherkher and other firms pooled their resources in this case and expect wins in upcoming lawsuits. The firm has accumulated the names of approximately 1,000 plaintiffs with claims against Monsanto and PCBs.

Find Out the Glyphosate Levels in Your Body

Glyphosate is the active ingredient in Roundup, and is the most widely used weed-killing chemical on farms, lawns, schoolyards and other public spaces. It’s also extensively applied to many crops before harvest. The World Health Organization (WHO) performed its own independent analysis in March 2015, and determined glyphosate is a probable carcinogen.

The Health Research Institute (HRI) in Iowa has developed a glyphosate test kit that will allow you to learn your personal glyphosate levels. I’ve recently gained access to a limited number of kits that I’m now able to offer at cost, so no profit will be made on their sales. Ordering also allows you to participate in a worldwide study on environmental exposure to glyphosate.

The Dangers of Processed Junk Foods to Your Body

Story at-a-glance

  • An astonishing 60 percent of the food Americans eat is ULTRA-processed, and these foods account for 90 percent of the added sugar consumption in the U.S.
  • About 2 percent of the calories in processed foods come from added sugars. By definition, unprocessed or minimally processed foods contain none. Ultra-processed foods get 21 percent of their calories from added sugars
  • The 2015 U.S. dietary guidelines recommend limiting sugar intake to a max of 10 percent of daily calories. Cutting down on ultra-processed foods is a simple way to reduce your added sugar intake

About 90 percent of the money Americans spend on food goes to buy processed food.1,2,3 What’s worse, new research shows that, astonishingly, more than half—nearly 60 percent, in fact—of the food Americans eat is ULTRA-processed.4,5,6,7,8,9,10,11

Basically, more than half of what the average American eats on any given day is convenience food that could be bought at the local gas station.

Moreover, those ultra-processed foods account for 90 percent of the added sugar consumption in the U.S. The study used data from a nationally representative food survey, which found that:

  • On average, 57.9 percent of the calories people eat come from ultra-processed foods
  • 29.6 percent of calories come from unprocessed or minimally processed foods (such as meats, eggs, milk and pasta)
  • Processed but not ultra-processed foods (such as canned or preserved foods, cured meats and cheeses) account for 9.4 percent of calories
  • 2.9 percent of calories come from “processed culinary ingredients” such as vegetable oil, table salt and sugar
  • Less than 1 percent of daily calories comes from vegetables

Excessive Sugar Consumption Drives Disease Statistics

The dangers of eating too much added sugar have been well-established, and have even become officially recognized. For the first time ever, the 2015-2020 U.S. dietary guidelines12 now recommend limiting your sugar intake to a maximum of 10 percent of your daily calories.13

Decreasing sugar consumption is indeed at the top of the list if you’re overweight, insulin resistant, or struggle with any chronic disease. Research14 has shown that as much as 40 percent of American healthcare expenditures are for diseases directly related to the overconsumption of sugar.

More than $1 trillion each year is spent on treating sugar and junk food-related diseases, which runs the gamut from obesity and diabetes, to heart disease and cancer.15

According to a report16 on the global cancer burden, published in 2014, obesity is responsible for an estimated 500,000 cancer cases worldwide each year. A more recent British report estimates obesity may result in an additional 670,000 cancer cases in the U.K. alone over the next 20 years.

For over half a century, nutritional guidelines have focused on cutting saturated fats and cholesterol, and we now know that this was a very serious mistake.

As fats were removed from processed fare, the sugar content increased (to make the food palatable), and sugar is the real culprit of virtually all diseases previously blamed on dietary fats.

What is Ultra-Processed Food?

Anything that isn’t directly from the vine, bush, tree, or from the earth is considered processed. Bread and pasta, for example, are processed goods. Ditto for anything canned or frozen.

Depending on the amount of adulteration the food goes through, processing may be considered minimal or significant. “Ultra-processed” foods are at the far end of the significantly altered spectrum.

Examples of ultra-processed foods include breakfast cereals, pizza, soda, chips and other salty/sweet/savory snacks, packaged baked goods, microwaveable frozen meals, instant soups and sauces, and much more. In the featured study, ultra-processed foods were defined as:

  • Food products containing several ingredients that are not traditionally used in cooking
  • Besides salt, sugar, oils and fats, they can include artificial flavors, colors, sweeteners, and other additives “used to imitate sensorial qualities of unprocessed or minimally processed foods”
  • These ingredients may also be added “to disguise undesirable qualities of the final product”
  • They typically contain preservatives and chemicals that give them an unnaturally long shelf-life

Ultra-Processed Foods Contain FAR More Sugar Than Processed Foods

The difference between processed foods and ultra-processed foods in terms of sugar content is quite dramatic.

The researchers found that about 2 percent of the calories in processed foods came from added sugars. By definition, unprocessed or minimally processed contained none. Ultra-processed foods, on the other hand, got 21 percent of their calories from added sugars.

Not surprisingly, the authors of the featured study concluded that: “Decreasing the consumption of ultra-processed foods could be an effective way of reducing the excessive intake of added sugars in the USA.”

On a positive note, the researchers also found that there were significant differences in how much ultra-processed foods people ate.

One in 5 people (about 60 million Americans) actually got more than 70 percent of their calories from real food (i.e. unprocessed or minimally processed), and only 30 percent from ultra-processed fare.

As noted by Time Magazine:17 “7.5 percent of the people with the lowest processed food consumption actually met the federal dietary recommendations of eating no more than 10 percent of daily calories from sugar.

So if people avoid processed foods, it’s possible to reach recommended nutritional requirements.”

So there is a ray of hope. In my view, eating a diet consisting of 90 percent real food and only 10 percent or less processed foods is a doable goal for most that could make a significant difference in your weight and overall health.

I realize this is a challenge for many, but I know it is doable. Unless I’m traveling, my diet is very close to 100 percent real food, much of it grown on my property. One just needs to make the commitment and place a high priority on it.

Carb-Rich Foods are As Risky As Cigarettes

In related news, research suggests refined non-vegetable fiber carbs such as potatoes, bagels and breakfast cereal are as risky as smoking, increasing your risk for lung cancer by as much as 49 percent.

Your risk is particularly high if you’ve never smoked. Among smokers, eating a high glycemic diet was associated with a 31 percent increased risk for lung cancer. As reported by UPI:18

“A high glycemic index, a measure of the effect of carbohydrates on blood sugar levels, was linked to a greater chance for developing lung cancer, researchers at the University of Texas MD Anderson Cancer Center found…

While increased levels of carbohydrates can increase the risk, the researchers said the quality of carbohydrates, rather than the quantity, has the strongest effect.

Foods such as white bread and puffed rice cereal are highly refined, which is why the researchers suggest swapping them out for whole-wheat or pumpernickel breads and pasta.

“The results from this study suggest that, besides maintaining healthy lifestyles, such as avoiding tobacco, limiting alcohol consumption and being physically active, reducing the consumption of foods and beverages with high glycemic index may serve as a means to lower the risk of lung cancer,” Dr. Xifeng Wu, a professor of epidemiology at the University of Texas, said…”

High glycemic foods, i.e. refined carbs high in sugar, promote insulin resistance and obesity, and this isn’t the first time a connection has been made between a high-sugar diet and/or obesity and cancer.

In fact, cancer specialists who discussed the cancer trend at the 2015 American Society of Clinical Oncology conference in Chicago warned that obesity will likely overtake smoking to claim the lead spot as the principal cause of 10 different types of cancer within the next decade.19 Obesity is also associated with worsened prognosis after a cancer diagnosis, raises your risk of dying from the cancer treatment, and raises your risk of additional malignancies and comorbidities.20

Half of All Americans are Pre-Diabetic or Diabetic

Other recent research suggests that nearly half of all adults living in California now have diabetes or prediabetes, and most are not even aware of it. (For a list of pre-diabetes and diabetes rates by county, see the original news story.21) According to Harold Goldstein, executive director of the California Center for Public Health Advocacy which commissioned the report:22 “This study is a wake-up call that says it’s time to make diabetes prevention a top state priority.”

As reported by the Marin Independent Journal:23

“Nationally, diabetes rates have tripled over the past 30 years. In California, the rate has increased by 35 percent since 2001…Some health experts say one way to address the diabetes epidemic is to impose a tax on sugary beverages. Berkeley became the first city in the country to pass a soda tax in 2014, but similar efforts have repeatedly failed in the Legislature…

[H]owever, two legislators — Democratic Assemblymen Jim Wood, of Healdsburg, and Richard Bloom, of Santa Monica – [have] proposed a “health impact fee” of 2 cents per ounce on sugar-sweetened sodas and other drinks. And last month, a Field Poll about childhood obesity-prevention policies showed more than 7 in 10 of voters polled believe there’s a close link between a child regularly drinking sugary beverages and diabetes.”

In 2008, pre-diabetes and diabetes affected 1 in 4 Americans. Then, research24,25 published last year which looked at data up to 2012, found that HALF of all Americans are now either pre-diabetic or diabetic. In all, 12 to 14 percent have full-blown diabetes, and another 38 percent are pre-diabetic. So California is not unusual in that sense. Moreover, as in California, African-Americans, Hispanics, and Asian-Americans are nearly twice as likely to have diabetes as Caucasians.

Why Diabetes is Such a Dangerous Disease

Diabetes has become so common that many don’t even bat an eyelash anymore, but this is a serious mistake. Aside from the potentially deadly side effects of diabetes drugs, which I’ve covered in previous articles, the health complications that diabetes fosters are many, including but not limited to the following:

  • High blood pressure, heart disease and stroke – 75 percent of diabetics have high blood pressure (130/80 mm Hg or higher). Death from heart disease and risk for stroke are 2 to 4 times higher among people with diabetes
  • Amputations – In 2004, 71,000 lower limb amputations due to diabetes were performed in the U.S.
  • Blindness – Diabetes is the leading cause of new cases of blindness among adults aged 20 to 74 years
  • Dental disease – Almost one-third of people with diabetes have severe periodontal disease
  • Kidney disease – Diabetes is the leading cause of kidney failure. In 2005, more than 45,700 people began treatment for end-stage kidney disease in the U.S. and Puerto Rico, and another 178,700 were living on chronic dialysis
  • Pregnancy complications – Poorly controlled diabetes before conception and during the first trimester of pregnancy among women with type 1 diabetes can cause major birth defects in 5 to 10 percent of pregnancies, and spontaneous abortions in 15 to 20 percent of pregnancies
  • Nervous system disease – About 60 to 70 percent of people with diabetes have mild to severe forms of nervous system damage, such as impaired sensation or pain in the hands or feet, poor digestion, carpal tunnel syndrome and erectile dysfunction
  • Cancer – People with prediabetes have a 15 percent higher risk of cancer, especially cancers of the liver, stomach, pancreas, breast and endometrium. Women with diabetes have a 50 percent greater risk of developing colorectal cancer than women without diabetes.26 People with the highest insulin levels at the time of a cancer diagnosis also have significantly increased risks of cancer recurrence, as well as a greater risk of being diagnosed with a particularly aggressive form of cancer27

What’s the Key to Resolving Insulin Resistance and Diabetes?

The answer can be summarized in three words: Eat real food. Intermittent fasting, or the more accurate term, Time Restricted Feeding (TRF), can also be helpful. When you fast, your liver burns off the available liver fat, and by temporarily depleting your liver fat stores you restore metabolic stability to your liver and improve hepatic insulin sensitivity.

Exercise is also an important component. Studies have shown that exercise increases insulin sensitivity whether you lose weight or not,28 and that being physically active as little as 2.5 hours a week can be beneficial.29 When it comes to diet, though, the long-term and most sustainable answer is simply to cut way down on ultra-processed foods and to think of “diet” in terms of unprocessed whole foods that you cook from scratch.

Truly, a major part of the problem is that so few people take the time to cook their own meals anymore. But relying on a “gas station diet” of ultra-processed foods is a recipe for insulin resistance, obesity, and related diseases that will ultimately cost you a fortune in medical bills and shorten your lifespan.

When you consider the ultimate, long-term price tag of all this convenience food, the time you invest in cooking will pay tremendous dividends. Remember if you want to be healthy, you or someone you trust needs to spend some serious time in the kitchen preparing your own food.

If you’re insulin/leptin resistant, have diabetes, high blood pressure, heart disease, or are overweight, you’d be wise to limit your total fructose intake to 15 grams per day until your insulin/leptin resistance has resolved. For others, limit your daily fructose consumption to 25 grams or less. This can be pretty difficult unless you eat real food, and the reason for this is because ultra-processed foods are eight times higher in sugar than minimally processed or unprocessed foods.

Replace Refined Carbs With Healthy Fats and Moderate Amounts of Protein

Since you’re cutting a lot of energy (carbs) from your diet when you eliminate processed sugars and grains, you need to replace them with something better, including:

    • As much high-quality healthy fat as you want. Your body needs saturated and monounsaturated fats in appropriate quantities to stay healthy, as they provide many beneficial effects, contrary to what you have probably been told.30 It is good to target about 90 percent of your fat calories from them. If you’re insulin resistant, you may need upwards of 50 to 85 percent of your daily calories in the form of healthy fats.

Good sources include coconut and coconut oil, avocados, butter, nuts, and animal fats. Remember—fats are high in calories but small in volume, so when you look at your plate, vegetables should be the largest portion by far, as they are not calorie dense.

  • Moderate amounts of high quality protein found in organically-raised, grass-fed or pastured meats and dairy products, fish, legumes, and nuts. Aim for one-half gram of protein per pound of lean body mass, which places most people in a range of 40-70 grams of protein per day. Use the chart below to help you.
  • Red meat, pork, poultry and seafood average 6-9 grams of protein per ounce. An ideal amount for most people would be a 3-ounce serving of meat or seafood (not 9- or 12-ounce steaks!), which will provide about 18-27 grams of protein
  • Eggs contain about 6-8 grams of protein per egg, so an omelet made from two eggs would give you about 12-16 grams of protein. If you add cheese, you need to calculate that protein in as well (check the label of your cheese)
  • Seeds and nuts contain on average 4-8 grams of protein per quarter cup
  • Cooked beans average about 7-8 grams per half cup
  • Cooked grains average 5-7 grams per cup
  • Most vegetables contain about 1-2 grams of protein per ounce
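To make the serving math concrete, here is a minimal sketch of the protein arithmetic described above; the body weight, body-fat percentage and day’s menu are hypothetical illustrations, not recommendations:

```python
# Protein arithmetic from the guideline above:
# about one-half gram of protein per pound of LEAN body mass.

def protein_target_grams(weight_lb, body_fat_pct):
    """Estimate a daily protein target in grams from total weight and body-fat percent."""
    lean_mass_lb = weight_lb * (1 - body_fat_pct / 100)
    return 0.5 * lean_mass_lb

# Hypothetical person: 170 lb at 25 percent body fat -> 63.75 g/day target.
target = protein_target_grams(170, 25)

# Tally a hypothetical day using midpoints of the chart's ranges:
# meat ~7.5 g/oz, eggs ~7 g each, seeds/nuts ~6 g per quarter cup.
eaten = 3 * 7.5 + 2 * 7 + 1 * 6  # 3 oz meat + 2-egg omelet + 1/4 cup nuts

print(f"target {target:.1f} g, eaten {eaten:.1f} g")
```

Running the numbers this way shows why most people land in the 40-70 gram range: even a modest day of servings covers well over half of a typical target.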

Fermented Foods and Fiber Help Prevent Diabetes

Optimizing your gut health is also important. Multiple studies have shown that obese people have different intestinal bacteria than lean people. Recent research31,32 also suggests your microbiome can influence your risk of diabetes. Fortunately, optimizing your gut flora is relatively easy.

You can reseed your body with good bacteria by regularly eating fermented foods (like fermented vegetables, especially fermented with starter culture that has strains that produce vitamin K2, natto, raw organic cheese and miso) or by taking a high-quality probiotic supplement.

Recent research33 also shows that increasing your fiber intake can help prevent diabetes. In this study, those who had the highest intake of fiber (more than 26 grams a day) had an 18 percent lower risk of developing type 2 diabetes than those with the lowest intake (less than 19 grams a day).

One way that a high-fiber diet may be protective against obesity and diabetes has to do with your intestinal bacteria’s ability to ferment fibers. When you eat foods high in fermentable fibers, such as cabbage, beans, and other vegetables, the bacteria in your intestines ferments them into butyrate and propionate, which are short-chain fatty acids (SCFAs) involved in sugar production.

Just be sure to get most of your fiber in the form of vegetables, not grains, and focus on eating more vegetables, nuts, and seeds. The following whole foods, for example, contain high levels of soluble and insoluble fiber.

  • Chia seeds
  • Berries
  • Vegetables such as broccoli and Brussels sprouts
  • Root vegetables and tubers, including onions and sweet potatoes
  • Almonds
  • Psyllium seed husk, flax and chia seeds
  • Green beans
  • Cauliflower
  • Beans

Earth is overdue an ‘extinction-level’ event, Nasa scientist warns.


Earth is due an “extinction-level” event we’re currently powerless to defend ourselves against.

This is according to Nasa scientist Dr Joseph Nuth, of the Goddard Space Flight Center, speaking on Monday at a meeting in San Francisco.

Large asteroids and comets of the variety that could lead to extinctions have tended to hit the earth around 50 to 60 million years apart.

The last major comet extinction is thought to have been around 65 million years ago, wiping out the dinosaurs towards the end of the Cretaceous period.

Most large life on Earth is thought to have been wiped out, while small mammals, able to scavenge and burrow, thrived; larger animals starved after the extinction event.

Nuth cited a close encounter in 2014, when a comet passed perilously close to Mars, spotted only 22 months prior to the fly-by.

Had a similar comet been spotted coming towards Earth, 22 months would not have been sufficient time to prepare a defence.

He has suggested that Nasa build a defence system to be deployed should a similar situation be expected for Earth.

The issue came up on BBC Radio 5 live, where Cathy Plesko, a research scientist at the Los Alamos National Laboratory in New Mexico, discussed the possibility of a spacecraft firing nuclear weapons at incoming asteroids.

Where Science Ends and the GMO Debate Really Begins

Opponents and proponents of genetically modified food have invoked science in their arguments, but science has no definitive answer.

Evaluating the risks and benefits of genetically modified organisms (GMOs) cannot depend on science alone, at least for now.

For the past two years, the National Academies of Sciences, Engineering, and Medicine (NAS) worked on a report that was to become the most exhaustive analysis of the science on GMOs in agriculture.

The 400-page report, released earlier this year, covers everything from safety and regulation to policy and socio-economic issues. It is likely the best shot science has had thus far to clear the air on the issue of GM food. But will the report substantially impact the debate on GMOs?

“Not really,” said Jack Heinemann, professor of genetics at the University of Canterbury in New Zealand. “It will inform lots of discussions, but mainly so far, I see it being selectively quoted to support pre-existing positions.”

Heinemann has been labeled anti-GMO, despite being a genetic engineer.

Henry Miller, on the other hand, has been said to back the GMO industry. He’s a former Food and Drug Administration GMO drugs reviewer, now with the Hoover Institution think tank.

Heinemann and Miller agree on the impact of the NAS report.

“[The] impact will likely be minimal,” Miller said via email. “The report is hardly definitive in any way, and because of the extensive ‘on the one hand, on the other hand’ equivocation, various aspects of it will be used by different people and organizations to support their own positions.”

Both experts have a point. At least one trade association and one environmental group used the report to fortify positions they apparently held before.

A woman from New York Public Interest Research Group speaks to a passerby about the potential dangers of GMOs in front of a Whole Foods Market in New York’s Lower East Side on June 3, 2014. (Jonathan Zhou/Epoch Times)

The American Seed Trade Association released a statement saying the report’s findings “reinforce what we’ve known all along: GE crops are safe.” GE, or genetically engineered, is another term for organisms that have been altered on a genetic level.

Meanwhile, the Environmental Working Group stated the report took “a major policy step in calling on the food and agriculture industries to increase transparency regarding GMO foods.”

Two Sides

The issue of genetic modification in food has been mired in controversy since GM products hit the market in the early 1990s. Two camps formed, with environmentally oriented groups opposing the practice and the GMO industry promoting it.

Indeed, both camps have done such a good job of discrediting their opponents that it seems there’s hardly any source of information left that hasn’t been labeled pro- or anti-GMO.

A day before the NAS report came out, a consumer advocacy nonprofit (itself labeled anti-GMO) released a report questioning the credibility of NAS.

The nonprofit, Food & Water Watch, listed GMO industry ties for 11 of the 20 members of the committee that authored the NAS report. Weeks later, Miller singled out another member of the committee for a “long history of anti-genetic engineering activism.”

Lost in the scrimmage lies the science on GMOs, called on to aid both sides, yet fully satisfying neither.

For example, GMWatch, an environmental organization labeled anti-GMO, accused the NAS report of “sandwich” composition, meaning it includes information critical of GM crops in the middle of the report, while keeping its opening statement and conclusion positive toward GMOs.

Meanwhile, Miller argued that the report failed to address “current excessive, unscientific regulation” of the GM crop industry.

Yet the report is adamant about avoiding clear-cut answers on broad topics, asserting that “sweeping statements about GE crops are problematic because issues related to them are multidimensional.”

While both proponents and opponents of GMOs may lambast such statements as weak and vague, they may simply reflect the fundamental difference between science and advocacy.

Advocacy Beyond Science

A research biologist takes tissue samples from genetically modified corn plants inside a climate chamber housed in Monsanto headquarters in St. Louis in 2009. (Brent Stirton/Getty Images)

It’s important “for scientists to emphasize that uncertainty is central to science, and advocacy is disruptive of it,” wrote Stephen Benner, a biochemist who, among other things, helps NASA look for life on other planets, in a blog post titled “The Dangers of Advocacy in Science.” His observations were not about the science of GMOs in particular, but apply to science in general.

“When a scientist becomes an advocate, he loses for himself the power to use scientific discipline to discern reality,” he wrote.

The GMO debate mostly stems from values and beliefs, rather than science. And that’s unlikely to change.

The NAS report states that “there are limits to what can be known about the health effects of any food, whether non-GE or GE,” and, furthermore, that parts of the argument reach beyond food safety to cultural and social values, which elude scientific judgment completely.

“Very little of what we’re talking about is science,” said Heinemann.

Instead of science, we’re talking about technology and its integration in society, Heinemann said. He explained the difference: science doesn’t necessarily have to result in a product—something practical and marketable—but technology does. “The science is only one small part of it,” he said.

It is one thing when scientists’ genetic research stays in the laboratory, but it’s another when those discoveries are developed into products that industries then market to the public for profit.

An example of advocacy versus science is seen in the history of the tobacco industry. It took science decades to substantiate health claims against smoking.

While many detrimental health effects of smoking can be reversed by quitting, if GMOs turn out to have long-term negative impacts, they may not be so easily reversed.

No ‘Off Switch’

Since the inception of GMOs, one of the main arguments against them is their potential irreversibility.

It has been documented that GM crops spread into the wild, breeding and passing down their modified genes. “The extent of the escape is unprecedented,” Cynthia Sagers, an ecologist at the University of Arkansas, told Nature in 2010.

Yet the NAS report concluded that research on GM plants’ spread into the wild has, so far, shown no problems for the environment. The report’s conclusion on GM food safety followed the same pattern.

People protest against agribusiness giant Monsanto in Los Angeles on May 25, 2013. (Robyn Beck/AFP/Getty Images)

The report’s authors stated that they “could not find persuasive evidence of adverse health effects directly attributable to consumption of GE foods.”

“This is not the same as saying that there is no evidence of potential health effects,” Heinemann noted, but to him, the NAS conclusion was “reassuring.”

However, the report acknowledged that there are no long-term studies on human consumption of GM food.

And even if scientists conduct long-term studies, the report notes that “isolating the effects of diet” on humans from all the other factors that can impact health is challenging. Also, tests on whether GMOs cause allergies “could miss some allergens,” the report states. The best science we have on GMOs remains open to identifying impacts we have not yet seen.


GMO proponents have long argued that mere potential risks are not enough to stop technological progress that may bring revolutionary discoveries (for example, the promise of crops impervious to drought, pests, and anything else that could thwart their growth, theoretically ending world hunger).

Critics, on the other hand, argue that most of the promised breakthroughs have not materialized and the possible progress is not worth the risks of irreversibly interfering with nature—and causing potential long-term effects on humans that cannot yet be discerned.

What Is an Acceptable Risk?

The report acknowledges it is not necessarily scientists who determine the level of risk a given population is willing to accept.

“What is acceptable is inherently a value-laden concept” that, in part, depends on “societal judgments,” it states.

Seed chipping machines work to identify best-grade corn seeds to use for the next stage at Monsanto headquarters in St. Louis in 2009. (Brent Stirton/Getty Images)

Decisions to enforce labeling laws on GMOs, for example, are not entirely about the scientific studies showing effects one way or another, but about people evaluating the potential risks of GM over non-GM foods. GMO labeling is mandatory in the European Union and many other countries; the NAS report states that this is not based on science, but rather on the “right to know” rooted in the values of human rights.

Sixty-six percent of Americans favored labeling GM food products in a December 2014 Associated Press-GfK poll. Only 7 percent opposed the idea.

The first mandatory GMO labeling law in the United States—in effect in Vermont since July 1 and now superseded by a new federal bill—stated GM food should be labeled in the state for “multiple health, personal, religious, and environmental reasons.”

On the other hand, Miller said that values and beliefs have nothing to do with it. He blamed opposition to GMOs on fear of the unknown, ignorance, and “black marketing” by the organic industry.

Yet in general, lack of knowledge is rarely why people consider things more (or less) risky, according to Lennart Sjöberg, a professor at the Center for Risk Research at Stockholm School of Economics.

“People are not that misinformed about all risks,” he wrote in a 1999 paper. He found that perception of risk did not vary much according to how much or how little knowledge a person possessed. Even if everybody were an expert, the conflict would persist, due to the fundamentally uncertain nature of empirical science.

“There are always at least some uncertainties in an empirical risk estimate,” Sjöberg wrote.

People may push the risk bar up or down for various reasons, like peer pressure, vested interests, political views, or how much control they feel they have.

“A good example is alcohol,” Sjöberg wrote. Because people feel they can control how much they drink, the risks that come with it seem smaller to them.

Consumers, however, have little to no control over GMOs.

“Ever since GMOs entered the market 20 years ago, we’ve been kept in the dark about whether foods we feed our families contain GMOs,” states the website of Just Label It, a GM food labeling campaign.

No matter what scientists may say, consumers still feel entitled to a choice between GM and non-GM food.

A sign supporting Proposition 37, which called for mandatory labeling of genetically engineered foods, in Glendale, Calif., on Oct. 19, 2012. (Robyn Beck/AFP/Getty Images)

Common Ground

Although arguments outside science wield a powerful influence on the GMO debate, that doesn’t mean scientists have no say. Assessing risks is a collaborative effort between experts and the public.

Paul Slovic, a psychology professor at the University of Oregon, has been studying risk perception for decades. He has said that the public’s understanding of risk is “much richer than that of the experts, and reflects legitimate concerns that are typically omitted from expert risk assessments.”

Experts can sometimes become accustomed to the risks through long experience and also may feel a greater degree of control over the risks than the general public, noted Sjöberg.

“There is wisdom as well as error in public attitudes and perceptions,” Slovic wrote. “Each side, expert and public, has something valid to contribute. Each side must respect the insights and intelligence of the other.”