Enhanced Cardiac Testing in a Dual Anti-HER2 Regimen: What Have We Learned?

Concerns that cancer treatment caused cardiac dysfunction came to the forefront after the discovery of the anthracyclines in the 1960s. Damage to the heart subsequent to the administration of anthracyclines was dramatic, devastating, and sometimes fatal. While oncologists had tackled the adverse events of preanthracycline chemotherapy, the new phenomenon of anthracycline-associated cardiotoxicity created new challenges [1]. Through collaboration with cardiologists, those treating patients with cancer came to learn about this new mechanism of cardiac injury, and cardio-oncology was born.

Anthracycline toxicity was related to the cumulative dose, and an extensive database of endomyocardial biopsies demonstrated that myocyte death followed exposure to this class of agents when administered above a threshold dose that varied considerably among patients. Higher cumulative doses resulted in higher biopsy specimen grades, indicating increased cellular injury and a greater loss of myocytes [2].

Many questions had yet to be answered: Why did patients seem to do well from the cardiac standpoint after treatment, only to develop heart failure years or even decades after treatment? What were the risk factors that made heart failure more likely, and could cardiotoxicity be prevented or mitigated? We learned that the myocyte injury and cell death associated with anthracyclines took place in the days and weeks following administration. The damaged cells either recovered or underwent apoptosis, and the patient then went on with life, albeit with a heart that was not normal. When compensation was no longer complete, we saw the effects on various noninvasive parameters, the most enduring of which was a reduction in the left ventricular ejection fraction. Although countless coexisting conditions were identified as risk factors, they were ultimately summed up as any factor that had previously damaged the heart or any factor that made the heart more susceptible to future injury [3]. Finally, in addition to reducing the overall exposure to anthracyclines by dose reduction, chemical protection with dexrazoxane [4], lowering the peak plasma level by prolonged infusion schedules [5], and, albeit less convincingly, molecular or delivery system modification [6] all demonstrated significant but varying degrees of cardioprotection.

More recently it was found that biomarkers, especially troponins, were a good substitute for cardiac biopsy specimens in assessing actual myocyte destruction, and that reducing myocardial stress through β-adrenergic blockade and afterload reduction protected the heart. These agents appear to do so both during the phase of actual damage as well as during the period of posttreatment remodeling [7].

Newer targeted therapies, including monoclonal antibodies and tyrosine-kinase inhibitors, were not expected to cause cardiotoxicity, but when the pivotal trial that dosed trastuzumab along with doxorubicin suggested levels of cardiotoxicity far beyond those expected from the anthracycline alone, interest in the effects of cancer treatment on the heart exploded [8]. The rate of cardiac adverse events, initially reported to be 27%, has been cited extensively, although, fortunately, this level of cardiotoxicity has not been replicated. The subsequent series of clinical trials was conducted both to establish the efficacy of trastuzumab in the adjuvant setting and to determine the incremental extent of cardiotoxicity when trastuzumab was administered sequentially rather than concomitantly with an anthracycline.

When these trials were initiated, it had not yet been recognized that the cardiac dysfunction seen with trastuzumab was quite different, both mechanistically and with regard to long-term implications, from the cardiotoxicity associated with anthracyclines and their inherent myocyte damage. Extensive monitoring was therefore incorporated into these trials. For better or worse, the exhaustive monitoring schemes used in these early trials were carried over into clinical practice, resulting in monitoring guidelines based more on clinical trial design than on the robust clinical insight that the trials provided. We continue to hear that cardiac monitoring of left ventricular ejection fraction should be undertaken every 3 months, despite the now widely recognized differences among potentially cardiotoxic agents. Agents that destroy myocytes (type I agents) may warrant intensive monitoring, especially for high-risk patients, but, as pointed out in the study by Yu et al. [9], this is probably not nearly as important for agents for which myocyte apoptosis is not part of their primary toxicity spectrum [10].

What was not initially explained was why the incidence of cardiotoxicity was so high in the pivotal trial or why patients who experienced cardiac events with trastuzumab usually, but not always, recovered. Why was it possible to continue trastuzumab (along with several other agents associated with cardiac events) for long periods of time—in some instances for longer than 10 years—with stable cardiac parameters in most of the patients so treated?

Explaining what was happening when trastuzumab and anthracyclines were given together involved a bit of detective work. Initially it was shown that trastuzumab interfered with cell repair [11]. That finding fit with an earlier observation: when we repeated cardiac biopsies in patients previously treated with anthracyclines, in whom biopsy specimen evidence of toxicity was anticipated, no abnormalities were observed if an interval of 4–6 months had transpired since the last anthracycline administration. The assumption, albeit unproven, was that damaged myocytes either recovered and appeared normal in biopsy specimens, or had undergone apoptosis and were replaced in the cardiac matrix. Further evidence was provided by looking at the extent of cardiac events in the various trials—high in the pivotal trial when the agents were given together, much less when there was a 3-week interval between the last anthracycline dose and trastuzumab, and almost no events in the Herceptin Adjuvant (HERA) trial, in which the interval between anthracycline and trastuzumab was 89 days [12]. Could it be that what we were seeing was predominantly an augmentation of anthracycline toxicity induced by trastuzumab’s inhibition of cell repair? This might also explain why so many patients have tolerated the drug for long periods of time after completion of anthracyclines.

While the oncologic efficacy of trastuzumab was clear from the start, the idea of adding a second anti-HER2 agent was intriguing, and this was the basis for the CLEOPATRA trial [13]. If the observed cardiotoxicity of trastuzumab was truly a secondary phenomenon because of its effect on cell repair, and if pertuzumab was not inherently cardiotoxic, then very little toxicity would be expected. Indeed, CLEOPATRA demonstrated the cardiac safety of the combined regimen [14], and Yu et al. have now confirmed this safety [9].

In CLEOPATRA, a left ventricular ejection fraction of 50% or greater was required for study entry. Left ventricular ejection fraction (LVEF) assessments were carried out every 9 weeks, and either basic two-dimensional echocardiography or multi-gated acquisition scans were permitted. The trial was conducted at 204 centers in 25 countries; 402 patients were included in the cohort that received pertuzumab 840 mg as a loading dose followed by 420 mg every 3 weeks plus trastuzumab 8 mg/kg as a loading dose followed by 6 mg/kg every 3 weeks, plus docetaxel 75 mg/m2 every 3 weeks. LVEF declines of 10% or more to a value of less than 50% were reported in 3.8% of patients in that group. In the report by Yu et al. [9], the regimen is similar but for the use of paclitaxel 80 mg/m2 rather than docetaxel. Yu et al. reported 2 patients (3%) who experienced asymptomatic declines in LVEF.
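The cardiac-event threshold described above is a simple compound criterion. Interpreting the 10% decline as percentage points, as is conventional in these trials, it can be sketched as follows; the helper function is my own illustration, not code from either study:

```python
# Sketch of the CLEOPATRA-style cardiac event criterion: an LVEF decline of
# at least 10 percentage points from baseline, to a value below 50%.
# (Hypothetical helper; neither trial publishes code.)
def lvef_event(baseline: float, follow_up: float) -> bool:
    return (baseline - follow_up) >= 10 and follow_up < 50

print(lvef_event(60, 48))  # True: 12-point drop, ending below 50%
print(lvef_event(55, 48))  # False: ends below 50%, but only a 7-point drop
print(lvef_event(65, 53))  # False: 12-point drop, but still at or above 50%
```

Note that both conditions must hold: a large decline that stays at or above 50%, or a sub-50% value reached by only a small decline, would not count as an event under this definition.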

Contractile dysfunction is primarily assessed by declines in the LVEF, and that was the parameter used to assess cardiotoxicity in CLEOPATRA. LVEF assessments are not perfect. The basic ultrasound examination demonstrates interobserver variation; in addition, a host of physiological changes affect the final LVEF value. Drug-related declines are evident only after substantial impairment of function and inability to fully compensate for the loss of contractile elements. As was emphasized by Yu et al., newer techniques have demonstrated greater precision, accuracy, and sensitivity [9]. We may now be able to identify those patients who are at especially high risk and offer mitigating strategies earlier in an effort to spare myocytes. Fortunately, this strategy, as emphasized by Yu et al., is gaining momentum. On the other hand, earlier recognition of subclinical toxicity has a potential danger, also clearly recognized by Yu et al., that might result in treatment decisions compromising optimal oncologic efficacy.

Safety of the combination of pertuzumab and trastuzumab was established in CLEOPATRA, yet CLEOPATRA used conventional LVEF determinations to define events and did not include troponin determinations or strain imaging. In their report, Yu et al. confirm cardiac safety even when these more-sensitive cardiac testing procedures are used [9]. As the authors point out, a trial incorporating cardiac risk and central review showing enhanced recognition of functional impairment would be hugely important to help establish the optimal level of surveillance and add perspective to these new modalities. Because the level of toxicity was not sufficiently high to detect meaningful differences in the Yu et al. study, this may not have been the optimal patient population in which to detect the incremental benefit of more sensitive approaches.

It is time to recognize that a modification in the recommended monitoring schedule of some regimens may be justified. Even as guidelines are evolving, an evidence-based monitoring schedule rather than one based on the schedules of prior clinical trials that ultimately showed considerable cardiac safety is clearly needed. Cardiologists and oncologists must incorporate new safety data into guidelines lest potentially wasteful, expensive, and probably unnecessary monitoring for regimens with demonstrated cardiac safety be perpetuated. We applaud the work of Yu et al. for using cardiac testing with enhanced sensitivity as well as for having confirmed the cardiac safety of the dual anti-HER2 regimen, thereby moving us forward as we strive to achieve the mutual goals that define cardio-oncology.

A New Option for Treating Ebola

A human-derived monoclonal antibody effectively treated macaques when given up to 5 days after Ebola infection.

The recent West African Ebola virus outbreak highlighted the need for effective therapies against this pathogen. Research has shown that a cocktail of three mouse-derived monoclonal antibodies (mAbs), ZMapp, could be effective in treating nonhuman primates infected with Ebola (NEJM JW Infect Dis Oct 2014 and Nature 2014; 514:47), work that led researchers to explore the potential efficacy of human-derived mAbs.

The investigators found that a survivor of a 1995 Ebola virus outbreak retained potent virus-neutralizing antibody 11 years after infection. The researchers created immortalized cell lines from this individual’s peripheral blood mononuclear cells and tested mAbs derived from the cell lines in vitro against Ebola virus. Four of these mAbs showed both high-level binding and inhibition activity against Ebola virus. Further testing of the two most neutralizing antibodies showed that both also mediated antibody-dependent cell-mediated cytotoxicity toward Ebola virus–infected cells. Treating three rhesus macaques with a combination of the two mAbs 24 hours after Ebola virus challenge protected all three animals. Monotherapy with the more-active mAb likewise protected three of three macaques from Ebola virus challenge when given either 1 or 5 days postinfection.

The use of an investigational combination monoclonal antibody preparation, ZMapp, for the treatment of two American healthcare workers who contracted Ebola virus (EBOV) disease in Liberia brought international attention to this therapeutic approach. A team of researchers, including industry members, has now published data on the efficacy of ZMapp in experimentally infected rhesus macaques.

Previous work by this research group had shown that a combination of three murine monoclonal antibodies could effectively treat rhesus macaques challenged with EBOV 48 hours earlier (NEJM JW Infect Dis Jul 11 2012). The researchers subsequently sought to improve the monoclonal antibody cocktail. To extend the half-life of the antibodies in humans, they chimerized the murine antibodies with human constant regions and produced the resulting components in Nicotiana benthamiana plants. They screened individual and combination anti-EBOV monoclonal preparations in guinea pig and nonhuman primate EBOV models. The best-performing combination, dubbed ZMapp, was then tested in rhesus macaques given a lethal dose of the Kikwit variant of EBOV.

All 18 animals treated with three doses of ZMapp (given on postinfection days 3, 6, and 9; 4, 7, and 10; or 5, 8, and 11) survived the infection and had undetectable viral loads 21 days after infection, whereas all 3 animals that received a control antibody preparation died. Most ZMapp-treated animals had developed fever at the time treatment was started; all developed laboratory abnormalities during the experiment. In additional work, ZMapp was found to bind effectively to the current West African variant of EBOV and to neutralize this variant in vitro.


In animal studies, ZMapp is the most promising preparation developed to date for EBOV disease. Data on clinical efficacy in humans are urgently needed, even though standard clinical trials cannot ethically be undertaken. Whether the rapid evolution of the current EBOV variant spreading in West Africa could render the virus resistant to neutralization by ZMapp — and whether treatment with ZMapp or similar monoclonal antibodies would protect recipients from reinfection — remain unclear.

Cardiovascular and Renal Outcomes of Renin-Angiotensin System Blockade in Adult Patients with Diabetes Mellitus: A Systematic Review with Network Meta-Analyses.

BACKGROUND: Medications aimed at inhibiting the renin-angiotensin system (RAS) have been used extensively for preventing cardiovascular and renal complications in patients with diabetes, but data that compare their clinical effectiveness are limited. We aimed to compare the effects of classes of RAS blockers on cardiovascular and renal outcomes in adults with diabetes.

METHODS AND FINDINGS: Eligible trials were identified by electronic searches in PubMed/MEDLINE and the Cochrane Database of Systematic Reviews (1 January 2004 to 17 July 2014). Interventions of interest were angiotensin-converting enzyme (ACE) inhibitors, angiotensin receptor blockers (ARBs), and direct renin (DR) inhibitors. The primary endpoints were cardiovascular mortality, myocardial infarction, and stroke (singly and as a composite endpoint, major cardiovascular outcome), and end-stage renal disease (ESRD), doubling of serum creatinine, and all-cause mortality (singly and as a composite endpoint, progression of renal disease). Secondary endpoints were angina pectoris and hospitalization for heart failure. In all, 71 trials (103,120 participants), with a total of 14 different regimens, were pooled using network meta-analyses. When compared with ACE inhibitors, no other RAS blocker used as monotherapy and/or in combination was associated with a significant reduction in major cardiovascular outcomes: ARB (odds ratio [OR] 1.02; 95% credible interval [CrI] 0.90-1.18), ACE inhibitor plus ARB (0.97; 95% CrI 0.79-1.19), DR inhibitor plus ACE inhibitor (1.32; 95% CrI 0.96-1.81), and DR inhibitor plus ARB (1.00; 95% CrI 0.73-1.38). For the risk of progression of renal disease, no significant differences were detected between ACE inhibitors and each of the remaining therapies: ARB (OR 1.10; 95% CrI 0.90-1.40), ACE inhibitor plus ARB (0.97; 95% CrI 0.72-1.29), DR inhibitor plus ACE inhibitor (0.99; 95% CrI 0.65-1.57), and DR inhibitor plus ARB (1.18; 95% CrI 0.78-1.84). No significant differences were shown between ACE inhibitors and ARBs with respect to all-cause mortality, cardiovascular mortality, myocardial infarction, stroke, angina pectoris, hospitalization for heart failure, ESRD, or doubling of serum creatinine. Findings were limited by the clinical and methodological heterogeneity of the included studies.
Potential inconsistency was identified in network meta-analyses of stroke and angina pectoris, limiting the conclusiveness of findings for these single endpoints.
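The common thread in these findings is that every 95% credible interval spans 1.0, which is what makes each comparison nonsignificant against the ACE-inhibitor reference. A minimal sketch of that check, using the odds ratios quoted in the abstract (the helper function is hypothetical):

```python
# An odds ratio differs "significantly" from the ACE-inhibitor reference only
# if its 95% credible interval excludes 1.0 (hypothetical helper).
def excludes_one(lower: float, upper: float) -> bool:
    return lower > 1.0 or upper < 1.0

# Major cardiovascular outcome: OR (95% CrI) vs. ACE inhibitor, from the abstract.
comparisons = {
    "ARB": (1.02, 0.90, 1.18),
    "ACE inhibitor + ARB": (0.97, 0.79, 1.19),
    "DR inhibitor + ACE inhibitor": (1.32, 0.96, 1.81),
    "DR inhibitor + ARB": (1.00, 0.73, 1.38),
}

for name, (or_value, lo, hi) in comparisons.items():
    verdict = "significant" if excludes_one(lo, hi) else "not significant"
    print(f"{name}: OR {or_value} (95% CrI {lo}-{hi}) -> {verdict}")
```

Every row prints "not significant", mirroring the abstract's conclusion for major cardiovascular outcomes.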

CONCLUSIONS: In adults with diabetes, comparisons of different RAS blockers showed similar effects of ACE inhibitors and ARBs on major cardiovascular and renal outcomes. Compared with monotherapies, the combination of an ACE inhibitor and an ARB failed to provide significant benefits on major outcomes. Clinicians should discuss the balance between benefits, costs, and potential harms with individual patients with diabetes before starting treatment.

Effectiveness of psychological interventions for chronic pain on health care use and work absence: systematic review and meta-analysis.

Psychological interventions for chronic pain and its consequences have been shown to improve mood, disability, pain, and catastrophic thinking, but there has been no systematic review specifically of their effects on health care use or time lost from work as treatment outcomes in mixed chronic pain. We conducted a systematic review and meta-analysis to evaluate the effectiveness of psychological therapies for chronic pain (excluding headache) in adults for these outcomes. We used the searches from 2 previous systematic reviews and updated them. Eighteen randomized controlled trials were found that reported health care use (15 studies) and work loss (9 studies) as outcomes. Fourteen studies provided data for meta-analysis. There were moderate effects for psychological interventions compared with active controls, treatment as usual, and waiting-list controls in reducing health care use, with confidence in the findings. No benefits were found for medication reduction, but with less confidence in this result. Analysis of work loss showed no significant effects of psychological interventions over comparison conditions, but the use of many different metrics necessitated fragmenting the planned analyses, making summary difficult. The results are encouraging for the potential of routine psychological intervention to reduce posttreatment health care use, with associated cost savings, but it is likely that the range and complexity of problems affecting work necessitate additional intervention beyond standard group psychological intervention.

Type 1 diabetes increases risk for epilepsy

Children with type 1 diabetes have a threefold greater risk for developing epilepsy, possibly due to increased hypoglycemia, according to study results.

In a retrospective, population-based study, I-Ching Chou, of China Medical University Children’s Hospital in Taichung, Taiwan, and colleagues analyzed claims data from the Taiwan National Health Insurance Research Database. Each patient with type 1 diabetes (n = 2,568; mean age, 10 years) was matched by sex, residence area and index year to 10 patients without type 1 diabetes (n = 25,680; mean age, 11 years). Both cohorts were 46.5% boys, with approximately 60% of children living in highly urbanized areas. Researchers used Cox proportional hazard regression analysis to estimate the effects of type 1 diabetes on epilepsy risk. Confounding comorbidities included prior epilepsy, head injury, intellectual disabilities and low birth weight.

Researchers found that the incidence of epilepsy was 33.7 per 10,000 person-years among children with type 1 diabetes vs. 10.4 per 10,000 person-years in the control group. After adjustment for age, sex, urbanization level, prior epilepsy, intellectual disabilities, low birth weight and head injury, the risk for epilepsy remained higher for children with type 1 diabetes (HR = 2.84; 95% CI, 2.11-3.83).

The risk for epilepsy increased further for children with type 1 diabetes and documented hypoglycemia vs. those without hypoglycemia (HR = 16.5; 95% CI, 5.19-52.3 vs. HR = 2.67; 95% CI, 1.97-3.62). The risk for epilepsy also increased with type 1 diabetes severity (P < .0001 for trend).
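A small arithmetic check, using only the rates quoted above (person-time denominators are not reported in this summary, so only the crude ratio can be reproduced):

```python
# Crude incidence rate ratio from the reported rates (per 10,000 person-years).
rate_t1d = 33.7      # children with type 1 diabetes
rate_control = 10.4  # matched children without type 1 diabetes

crude_ratio = rate_t1d / rate_control
print(f"Crude incidence rate ratio: {crude_ratio:.2f}")  # ~3.24
```

The adjusted hazard ratio (2.84) is somewhat lower than this crude ratio, as expected once age, sex, urbanization level, and comorbidities are accounted for.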

“This result is consistent with those of previous studies in that epilepsy or seizures are observed in many autoimmune or inflammatory disorders and are linked to the primary disease or secondary to proinflammatory processes,” the researchers wrote. “Moreover, we determined that the proportion of intellectual disabilities in the type 1 diabetes cohort was significantly greater than that in the comparison cohort. Furthermore, children with an intellectual disability exhibited a significantly increased risk for epilepsy.” – by Regina Schaffer



Not Even Your Organic Wine Is Safe From Monsanto

Story at-a-glance

  • An analysis revealed glyphosate in all 10 wine samples tested; even organic and biodynamic wines contained traces of glyphosate
  • Most conventional vineyards use Roundup (active ingredient glyphosate) to kill weeds between the rows of grapes
  • The organic wines may have been contaminated because glyphosate drifted over onto the organic and biodynamic vineyards from conventional vineyards nearby

Glyphosate usage has gotten so out of control that it’s seemingly taken on a life of its own and is now showing up even in foods that haven’t been directly sprayed, namely the grapes used to make organic wine.

Glyphosate, the active ingredient in Monsanto’s Roundup herbicide, is the most used agricultural chemical in history. It’s used in a number of different herbicides (700 in all), but Roundup is by far the most widely used.

Since glyphosate was introduced in 1974, 1.8 million tons have been applied to U.S. fields, and two-thirds of that volume has been sprayed in the last 10 years.

A recent analysis showed that farmers sprayed enough glyphosate in 2014 to apply 0.8 pounds of the chemical to every acre of cultivated cropland in the U.S., and nearly half a pound of glyphosate to every acre of cropland worldwide.1

If you purchase organic foods or beverages, you should theoretically be safe from glyphosate exposure, as this chemical is not allowed in organic farming. But a new analysis revealed glyphosate has now infiltrated not only conventional wine but organic wine as well.

100 Percent of Wine Tested Contained Glyphosate

An anonymous supporter of advocacy group Moms Across America sent 10 wine samples to be tested for glyphosate. All of the samples tested positive for glyphosate — even organic wines, although their levels were significantly lower.2

The highest level detected was 18.74 parts per billion (ppb), which was found in a 2013 Cabernet Sauvignon from a conventional vineyard. This was more than 28 times higher than the lowest level detected among the other samples tested.

The lowest level, 0.659 ppb, was found in a 2013 Syrah, which was produced by a biodynamic and organic vineyard. An organic wine made from 2012 mixed red wine grapes also tested positive for glyphosate at a level of 0.913 ppb.
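The “more than 28 times” comparison can be checked directly from the two reported concentrations:

```python
# Ratio of the highest to the lowest glyphosate level reported (ppb).
highest = 18.74  # 2013 Cabernet Sauvignon, conventional vineyard
lowest = 0.659   # 2013 Syrah, biodynamic/organic vineyard

ratio = highest / lowest
print(f"Highest vs. lowest: {ratio:.1f}x")  # ~28.4x
```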

How Does Glyphosate End up in Wine?

While glyphosate isn’t sprayed directly onto grapes in vineyards (it would kill the vines), it’s often used to spray the ground on either side of the grape vines. Moms Across America reported:3

“This results in a 2- to 4-foot strip of Roundup-sprayed soil with grapevines in the middle. According to Dr. Don Huber at a talk given at the Acres USA farm conference in December of 2011, the vine stems are inevitably sprayed in this process and the Roundup is likely absorbed through the roots and bark of the vines from where it is translocated into the leaves and grapes.”

As for how the organic wines became contaminated, it’s likely that the glyphosate drifted over onto the organic and biodynamic vineyards from conventional vineyards nearby.

It’s also possible that the contamination is the result of glyphosate that’s left in the soil after a conventional farm converted to organic; the chemical may remain in the soil for more than 20 years.4

Glyphosate Detected in 14 German Beers

A study of glyphosate residues by the Munich Environmental Institute also found glyphosate in 14 best-selling German beers.5 All of the beers tested had glyphosate levels above the 0.1 microgram per liter limit allowed in drinking water.

Levels ranged from a high of 29.74 micrograms per liter found in a beer called Hasseroeder to a low of 0.46 micrograms per liter, which was found in the beer Augustiner.6 Although no tests have yet been conducted on American beer, it’s likely to be contaminated with glyphosate as well.
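Relative to the drinking-water limit quoted above, the reported range works out as follows (simple arithmetic on the figures in the text):

```python
# Reported beer glyphosate levels vs. the 0.1 ug/L drinking-water limit.
limit_ug_per_l = 0.1
levels_ug_per_l = {"Hasseroeder": 29.74, "Augustiner": 0.46}

for beer, level in levels_ug_per_l.items():
    multiple = level / limit_ug_per_l
    print(f"{beer}: {level} ug/L ({multiple:.0f}x the drinking-water limit)")
```

By this measure even the lowest-reading beer was several times the drinking-water limit, and the highest was roughly 300 times it.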

Indeed, laboratory testing commissioned by Moms Across America and Sustainable Pulse revealed that glyphosate is now showing up virtually everywhere, including in blood and urine samples, breast milk, drinking water and more.7

The beer finding could be a blow to the German beer industry in particular. The country is the biggest beer producer in Europe and has long prided itself on brewing only the purest beer.

“Das Reinheitsgebot” is Germany’s food purity law. It’s one of the world’s oldest food safety laws and limited the ingredients in beer to only water, barley and hops (yeast was later approved as well).

Now Monsanto’s chemicals are threatening this German tradition and their reputation for producing the purest beer. As reported by The Local:8

“’In contrast to our colleagues abroad, German brewers don’t use artificial flavours, enzymes or preservatives,’ said Hans-Georg Eils, president of the German Brewers’ Federation, at the Green Week agricultural fair in Berlin.

The keep-it-simple brews indeed suit a trend toward organic and wholesome food, agreed Frank-Juergen Methner, a beer specialist at the National Food Institute of Berlin’s Technical University.

‘In times of healthy nutrition, demand for beer which is brewed according to the Reinheitsgebot is on the rise too,’ he said.”

Many are unaware of the fact that glyphosate is patented as an antibiotic. It’s designed to kill bacteria, which is one of the primary ways it harms both soils and human health. Recent research has even concluded that Roundup (and other pesticides) promotes antibiotic resistance.

Scientist Anthony Samsel, Ph.D., was the person who dug up the patents showing glyphosate is a biocide and an antibiotic. A study in poultry found the chemical destroys beneficial gut bacteria and promotes the spread of pathogenic bacteria.9

Samsel also reported that chronic low-dose oral exposure to glyphosate causes a disruption of the balance among gut microbes, leading to an over-representation of pathogens, a chronic inflammatory state in the gut and an impaired gut barrier.

Samsel’s research also revealed that Monsanto knew in 1981 that glyphosate caused adenomas and carcinomas in rats.

Monsanto’s own research supports the International Agency for Research on Cancer (IARC) determination that glyphosate is a Class 2A “probable human carcinogen” — a determination Monsanto is now trying to get retracted. Other research has shown glyphosate may:

  • Stimulate the growth of human breast cancer cells10
  • Have endocrine-disrupting effects and affect human reproduction and fetal development11
  • Induce oxidative damage and neurotoxicity in the brain12
  • Modify the balance of sex hormones13
  • Cause birth defects14

Glyphosate May Be Even More Toxic Due to Surfactants

Most studies looking into glyphosate toxicity have only studied the “active” ingredient (glyphosate) and its breakdown product, aminomethylphosphonic acid (AMPA). But the presence of so-called inactive compounds in the herbicide may be amplifying glyphosate’s toxic effects.

A 2012 study revealed that inert ingredients such as solvents, preservatives, surfactants and other added substances are anything but “inactive.” They can, and oftentimes do, contribute to a product’s toxicity in a synergistic manner — even if they’re non-toxic in isolation.

Certain adjuvants in glyphosate-based herbicides were also found to be “active principles of human cell toxicity,” adding to the hazards inherent with glyphosate.

It’s well worth noting that, according to the researchers, this cell damage and/or cell death can occur at the residual levels found on Roundup-treated crops, as well as lawns and gardens where Roundup is applied for weed control.15 As written in the International Journal of Environmental Research and Public Health:16

“Pesticide formulations contain declared active ingredients and co-formulants presented as inert and confidential compounds. We tested the endocrine disruption of co-formulants in six glyphosate-based herbicides (GBH) … All co-formulants and formulations were comparably cytotoxic [toxic to living cells] well below the agricultural dilution of 1 percent (18 to 2000 times for co-formulants, 8 to 141 times for formulations).

… It was demonstrated for the first time that endocrine disruption by GBH could not only be due to the declared active ingredient but also to co-formulants.

These results could explain numerous in vivo results with GBHs not seen with G [glyphosate] alone; moreover, they challenge the relevance of the acceptable daily intake (ADI) value for GBHs exposures, currently calculated from toxicity tests of the declared active ingredient alone.”

How to Avoid Glyphosate in Your Food

Your best bet for minimizing health risks from herbicide and pesticide exposure is to avoid them in the first place by eating organic as much as possible and investing in a good water filtration system for your home or apartment. If you know you have been exposed to herbicides and pesticides, the lactic acid bacteria formed during the fermentation of kimchi may help your body break them down.

So including fermented foods like kimchi in your diet may also be a wise strategy to help detox the pesticides that do enter your body. One of the benefits of eating organic is that the foods will be free of genetically engineered (GE) ingredients, and this is key to avoiding exposure to toxic glyphosate. Following are some great resources to obtain wholesome organic food.

Eating locally produced organic food will not only support your family’s health, it will also protect the environment from harmful chemical pollutants and the inadvertent spread of genetically engineered seeds and chemical-resistant weeds and pests.

What You Need to Know About GMOs

Genetically modified organisms (GMOs), or genetically “engineered” (GE) foods, are live organisms whose genetic components have been artificially manipulated in a laboratory setting by creating unstable combinations of plant, animal, bacterial, and even viral genes that do not occur in nature or through traditional crossbreeding methods.

GMO proponents claim that genetic engineering is “safe and beneficial,” and that it advances the agricultural industry. They also say that GMOs help ensure the global food supply and sustainability. But is there any truth to these claims? I believe not. For years, I’ve stated the belief that GMOs pose one of the greatest threats to life on the planet. Genetic engineering is NOT the safe and beneficial technology that it is touted to be.

The FDA cleared the way for GE (Genetically Engineered) Atlantic salmon to be farmed for human consumption. Thanks to added language in the federal spending bill, the product will require special labeling, so at least consumers will have the ability to identify the GE salmon in stores. However, it’s imperative that ALL GE foods be labeled, a requirement that is currently still being denied.

The FDA is threatening the existence of our food supply. We have to start taking action now. I urge you to share this article with friends and family. If we act together, we can make a difference and put an end to the absurdity.

QR Codes Are NOT an Adequate Substitute for Package Labels

The biotech industry is trying to push the QR code as an answer for consumer concerns about GE foods. QR stands for Quick Response, and the code can be scanned and read by smartphones and other QR readers.

The code brings you to a product website that provides further details about the product. The video below shows you why this is not an ideal solution. There’s nothing forcing companies to declare GMOs on their website. On the contrary, GE foods are allowed to be promoted as “natural,” which further adds to the confusion.

These so-called “Smart Labels” hardly improve access to information. Instead, by making finding the truth time-consuming and cumbersome, food makers can be assured that most Americans will remain ignorant of the presence of GMOs in their products. Besides, everyone has a right to know what’s in their food. You shouldn’t have to own a smartphone to obtain this information.

Vermont’s mandatory labeling law is scheduled to go into effect July 1. Now, Monsanto is going with the only strategy it has left to block it — a Senate version of H.R.1599, also referred to as the DARK (Denying Americans the Right to Know) Act. Sen. Pat Roberts (R-Kan) introduced the bill, which would preempt Vermont’s GMO labeling law, and replace state mandatory labeling laws with a federal voluntary labeling plan.

Fortunately, on March 16, the Senate rejected the bill, which fell far short of the 60 votes needed to pass. This is great news, but even though the DARK Act was defeated, the fight is not over yet.

Roberts said he would still work to find another way to preempt the law, and majority leader Mitch McConnell changed his vote from YES to NO for procedural reasons. This allows him to bring up the bill again later if a compromise is created, and the creation of such a compromise is certainly already underway.

Vermont’s law is set to take effect on July 1. It’s imperative you take action now by contacting your senators. Ask them to oppose any compromise that would block or delay Vermont’s labeling law. It’s critical that we flood Senators’ phone lines — it’s now or never for GMO labeling.


Glucocorticoid-induced osteoporosis risks are undetermined

Glucocorticoid-induced osteoporosis and associated fractures are not uncommon, but avoidable, according to a speaker here at the American College of Rheumatology State of the Art Clinical Symposium.

Kenneth G. Saag, MD, MSc, said testing with dual-energy X-ray absorptiometry (DXA) scanning can be an important monitoring tool, but getting insurance reimbursement for regular testing, such as every 6 months for patients who chronically take glucocorticoids, can be difficult.

Saag said alendronate can be a useful medication in premenopausal women who receive 7.5 mg per day of prednisone equivalent to increase BMD and prevent fractures. Other treatments, such as teriparatide (Forteo, Eli Lilly) or risedronate (Actonel, Actavis; Atelvia, Allergan), also may provide protection against fractures in at-risk patients who chronically take glucocorticoids, such as women of child-bearing age who have rheumatoid arthritis (RA) or systemic lupus erythematosus (SLE).

Some women who chronically received glucocorticoids and bisphosphonates or other treatments for osteoporosis prevention had children with low birth weight, but Saag said disease activity from RA or SLE also may be confounding factors.

About 20% of patients with SLE diagnosed at a young age, mostly women, will have a fracture over a 15-year period after diagnosis, according to Saag, who added that data is sparse and more study is required to understand the risks and benefits of treatments.

An open-label study from Korea showed lower prevalence of total hip arthroplasty and fewer instances of osteonecrosis among patients who received alendronate, Saag said, but he added that more study is needed to determine the risks of osteoporosis among women and men who chronically receive glucocorticoids. – by Shirley Pulawski

You Could Soon Launch Your Own Satellites

Arizona State University has come up with a student-designed project called SunCube FemtoSat: a low-cost spacecraft smaller than a standard CubeSat, aimed at providing greater access to space for scientists and hobbyists alike.

Assistant professor Jekan Thanga and a team of students have spent the past two years developing the small 3 x 3 x 3 cm cube, weighing in at just 35 grams (1.2 oz), along with a longer (3 x 3 x 9 cm, 100 g) model that includes payload storage space. They view the FemtoSat as a starting point for scientists and students, and even hope the device could be available on Amazon one day.

Each FemtoSat has its own communication, data collection and propulsion systems and is powered by solar paneling. The modules are made of off-the-shelf parts, and the solar panels were cut from scrap, sold at a discount by manufacturers. The team says that while launching your own satellite would usually cost between US$60,000 and $70,000 per kilogram, it would cost only $1,000 to send a FemtoSat to the International Space Station, and $3,000 to send it into low Earth orbit. Leaving Earth’s gravity would cost an estimated $27,000.
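The quoted prices can be sanity-checked with some quick arithmetic. The sketch below uses only the figures stated above (the 35-gram mass and the team’s quoted flat prices); the $65,000 midpoint rate is an assumption for illustration, not an official launch tariff:

```python
# Back-of-the-envelope comparison of the quoted FemtoSat launch prices
# with the conventional per-kilogram launch rate. All figures come from
# the article; the midpoint rate is an assumption, not an official price.

femtosat_mass_kg = 0.035       # basic FemtoSat: 35 grams
conventional_rate = 65_000     # assumed midpoint of US$60,000-70,000 per kg

# What 35 g would cost at the conventional per-kilogram rate (~$2,275):
conventional_price = femtosat_mass_kg * conventional_rate

# Flat prices quoted by the team for the basic FemtoSat:
quoted = {"ISS": 1_000, "low Earth orbit": 3_000, "Earth escape": 27_000}

for destination, price in quoted.items():
    per_kg = price / femtosat_mass_kg
    print(f"{destination}: ${price:,} flat, or about ${per_kg:,.0f}/kg")
```

By this rough estimate, the $1,000 ISS price is less than half of what 35 grams would cost at the conventional per-kilogram rate.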

The FemtoSats would be packed into an orbital deployer with a “jack-in-the-box” style system that matches standard CubeSat dimensions (a roughly 10-cm cube), simplifying the process of getting the tiny satellites into orbit. NASA has sent 30 CubeSats into space over the last few years, with another 50 awaiting launch.


“With a spacecraft this size, any university can do it, any lab can do it, any hobbyist can do it,” says Thanga. “That’s part of our major goal – space for everybody.”

The team envisions four applications for the device: hands-on testing experience for students, miniaturized versions of current experiments, artificial gravity experiments, and giving users their own “GoPro in space.”

Are humans the new supercomputer? Team blurred the boundaries between man and machine

A screenshot of one of the many games that are available. In this case the task is to shoot spiders in the “Quantum-Shooter” but there are many other kinds of games. 

Philosopher René Descartes’ famous claim about what makes humans unique is beginning to sound hollow. ‘I think—therefore soon I am obsolete’ seems more appropriate. When a computer routinely beats us at chess and we can barely navigate without the help of a GPS, have we outlived our place in the world? Not quite. Welcome to the front line of research in cognitive skills, quantum computers and gaming.

Today there is an on-going battle between man and machine. While genuine machine consciousness is still years into the future, we are beginning to see computers make choices that previously demanded a human’s input. Recently, the world held its breath as Google’s algorithm AlphaGo beat a professional player in the game Go—an achievement demonstrating the explosive speed of development in machine capabilities.

But we are not beaten yet—human skills are still superior in some areas. This is one of the conclusions of a recent study by Danish physicist Jacob Sherson, published in the journal Nature.

“It may sound dramatic, but we are currently in a race with technology—and steadily being overtaken in many areas. Features that used to be uniquely human are fully captured by contemporary algorithms. Our results are here to demonstrate that there is still a difference between the abilities of a man and a machine,” explains Jacob Sherson.

At the interface between quantum physics and computer games, Sherson and his research group at Aarhus University have identified one of the abilities that still makes us unique compared with a computer’s enormous processing power: our skill in approaching problems heuristically and solving them intuitively. The discovery was made at the AU Ideas Centre CODER, where an interdisciplinary team of researchers work to transfer some human traits to the way computer algorithms work.

Quantum physics holds the promise of immense technological advances in areas ranging from computing to high-precision measurements. However, the problems that need to be solved to get there are so complex that even the most powerful supercomputers struggle with them. This is where the core idea behind CODER—combining the processing power of computers with human ingenuity—becomes clear.

Our common intuition

Like Columbus in QuantumLand, the CODER research group mapped out how the human brain is able to make decisions based on intuition and accumulated experience. This is done using the online game “Quantum Moves”. Over 10,000 people have played the game, which allows everyone to contribute to basic research in quantum physics.

“The map we created gives us insight into the strategies formed by the human brain. We behave intuitively when we need to solve an unknown problem, whereas for a computer this is incomprehensible. A computer churns through enormous amounts of information, but we can choose not to do this by basing our decision on experience or intuition. It is these intuitive insights that we discovered by analysing the Quantum Moves player solutions,” explains Jacob Sherson.

The laws of quantum physics dictate an upper speed limit for data manipulation, which in turn sets the ultimate limit to the processing power of quantum computers—the Quantum Speed Limit. Until now a computer algorithm has been used to identify this limit. It turns out that with human input researchers can find much better solutions than the algorithm.

This is how the “Mind Atlas” looks. Based on 500,000 completed games, the group has been able to visualize our ability to solve problems. Each peak on the ‘map’ represents a good idea, and the areas with the most peaks, marked by red rings, are where human intuition has hit on a solution. A computer can then learn to focus on these areas, and in that way ‘learn’ about the cognitive functions of a human. Credit: CODER/AU

“The players solve a very complex problem by creating simple strategies. Where a computer goes through all available options, players automatically search for a solution that intuitively feels right. Through our analysis we found that there are common features in the players’ solutions, providing a glimpse into the shared intuition of humanity. If we can teach computers to recognise these good solutions, calculations will be much faster. In a sense we are downloading our common intuition to the computer” says Jacob Sherson.

And it works. The group has shown that we can break the Quantum Speed Limit by combining the cerebral cortex and computer chips. This is the new powerful tool in the development of quantum computers and other quantum technologies.

We are the new supercomputer

Science is often perceived as something distant and exclusive, conducted behind closed doors. To enter you have to go through years of education, and preferably have a doctorate or two. Now a completely different reality is materialising.

In recent years, a new phenomenon has appeared—citizen science breaks down the walls of the laboratory and invites in everyone who wants to contribute. The team at Aarhus University uses games to engage people in voluntary science research. Every week people around the world spend 3 billion hours playing games. Games are entering almost all areas of our daily life and have the potential to become an invaluable resource for science.

“Who needs a supercomputer if we can access even a small fraction of this computing power? By turning science into games, anyone can do research in quantum physics. We have shown that games break down the barriers between quantum physicists and people of all backgrounds, providing phenomenal insights into state-of-the-art research. Our project combines the best of both worlds and helps challenge established paradigms in computational research,” explains Jacob Sherson.

The difference between the machine and us, figuratively speaking, is that we intuitively reach for the needle in a haystack without knowing exactly where it is. We ‘guess’ based on experience and thereby skip a whole series of bad options. For Quantum Moves, intuitive human actions have been shown to be compatible with the best computer solutions. In the future it will be exciting to explore many other problems with the aid of human intuition.

“We are at the borderline of what we as humans can understand when faced with the problems of quantum physics. With the problem underlying Quantum Moves we give the computer every chance to beat us. Yet, over and over again we see that players are more efficient than machines at solving the problem. While Hollywood blockbusters on artificial intelligence are starting to seem increasingly realistic, our results demonstrate that the comparison between man and machine still sometimes favours us. We are very far from computers with human-type cognition,” says Jacob Sherson and continues:

“Our work is first and foremost a big step towards the understanding of physical challenges. We do not know if this can be transferred to other challenging problems, but it is definitely something that we will work hard to resolve in the coming years.”