Scientific Study of Surfer Butts Reveals Drug-Resistant Bacteria in the Oceans


Surfers are known to brave bad weather, dangerously large waves, and even sharks for the perfect ride. But it seems another danger of surfing has been hiding in plain sight all along: ocean waters are full of drug-resistant bacteria — and surfers are most at risk.

The “Beach Bums” Study

In a study published this weekend in the journal Environment International, a team of researchers from the University of Exeter found that regular surfers and bodyboarders are roughly three times as likely as other beach-goers to harbor bacteria with high likelihoods of antibiotic resistance. This is because surfers typically swallow ten times more seawater during a surf session than sea swimmers.

The cheekily named Beach Bums study, carried out with the help of the UK charity Surfers Against Sewage, compared rectal swabs from 300 participants and found that 9 percent of the surfers and bodyboarders (13 of 143) harbored drug-resistant E. coli in their systems, compared to just 3 percent of non-surfers (4 of 130).
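As a quick sanity check on those figures, the raw counts can be turned into a crude, unadjusted risk ratio. The short Python sketch below is my own illustration of the arithmetic, not the study's analysis:

# Back-of-the-envelope risk ratio from the counts reported above.
surfers_colonized, surfers_total = 13, 143   # surfers and bodyboarders
controls_colonized, controls_total = 4, 130  # non-surfers

risk_surfers = surfers_colonized / surfers_total     # ~0.091 (9.1%)
risk_controls = controls_colonized / controls_total  # ~0.031 (3.1%)

print(f"risk ratio: {risk_surfers / risk_controls:.2f}")  # ~2.95

The ratio comes out just under three, which is where the "roughly three times as likely" figure above comes from.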

The World Health Organization is concerned about drug resistance.

The World Health Organization has warned that widespread drug resistance may render antibiotics useless against otherwise easily treatable bacterial infections. Just as in the age before penicillin, diseases like tuberculosis, pneumonia, blood poisoning, gonorrhea, and food- and water-borne illnesses could once again be fatal, as could routine medical procedures that carry a risk of infection, such as joint replacements and chemotherapy.

 Indeed, a 2016 report commissioned by the British government estimated that, by 2050, infections stemming from antimicrobial resistance could kill one person every three seconds.
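For a sense of scale, that rate converts to an annual toll with one line of arithmetic; the snippet below is simply that conversion, not a figure taken from the report itself:

# Convert "one death every three seconds" into deaths per year.
seconds_per_year = 365.25 * 24 * 60 * 60  # ~31.6 million seconds
print(f"{seconds_per_year / 3:,.0f} deaths per year")  # ~10,519,200

That is on the order of ten million deaths a year, the annual figure commonly quoted from the same report for 2050.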

Solutions to an impending drug resistance epidemic have largely focused on antibiotic prescribing and use, but there is an increasing focus on the role of the environment in transmitting drug-resistant bacterial strains. The Beach Bums study adds important insight into how sewage, run-off, and pollution that make their way into the oceans spread drug-resistant bacteria.

“We are not seeking to discourage people from spending time in the sea,” says Dr. Will Gaze of the University of Exeter Medical School, who supervised the research. “We now hope that our results will help policy-makers, beach managers, and water companies to make evidence-based decisions to improve water quality even further for the benefit of public health.”

Though the study’s purpose is not to alarm beachgoers — or surfers — Dr. Anne Leonard, who led the research, tells Inverse that the risk of exposure to antibiotic-resistant bacteria may actually be lower in the United Kingdom, which “has invested a great deal of money in improving water quality at beaches, and 98 percent of English beaches are compliant with the European Bathing Water Directive. The risk of exposure to and colonization by antibiotic resistant bacteria in seawater might be greater in other countries which have fewer resources to spend on treating wastewater to improve water quality.”

For surfers on this side of the pond, check out Swim Guide, a free app for iOS and Android with updated water quality information on 7,000 beaches in Canada and the U.S.

Mounting data suggest antibacterial soaps do more harm than good.


Few pros, but cons include upped risk of infection, microbiome changes, drug resistance.

Whether you’re coming home from an airport fluttering with international germs, a daycare full of sticky-fingered toddlers, or just a grimy office building, scrubbing your hands with bacteria-busting soap seems like a great idea. But the data that have washed up on the cleansers in recent years suggest that they actually do more harm than good—for you, those around you, and the environment.

Scientists report that common antibacterial compounds found in those soaps, namely triclosan and triclocarban, may increase the risk of infections, alter the gut microbiome, and spur bacteria to become resistant to prescription antibiotics. Meanwhile, proof of the soaps’ benefits is slim.

There are specific circumstances in which those antimicrobials can be useful, civil engineer Patrick McNamara of Marquette University in Milwaukee told Ars. Triclosan, for instance, may be useful to doctors scrubbing for minutes at a time before a surgery or for hospital patients who can’t necessarily scrub with soap but could soak in a chemical bath. Triclosan and triclocarban do kill off bacteria during long washes. But most people only clean their hands for a few seconds. “There’s evidence that there is no improvement with using soaps that have these chemicals relative to washing your hands under warm water for 30 seconds with soaps without these chemicals,” he said.

And the point hasn’t been lost on the US Food and Drug Administration. Though the agency ruled years ago that triclosan and other antimicrobials are safe, it’s now revisiting claims that the chemicals make soaps and other personal care products better. The FDA has asked antibacterial soap makers to send in data showing that their soaps beat out regular soaps at keeping people germ-free and healthy. The agency expects to announce this September whether the submitted data pass muster. If they don’t, the companies that make up the $5.5 billion soap market may be forced to ditch the chemicals entirely.

Sullied Soaps

In the meantime, however, researchers seem to be digging up more and more dirt on the chemicals, particularly triclosan. This antimicrobial is widely used in not just hand soaps, but body washes, shampoos, toothpastes, cosmetics, household cleaners, medical equipment, and more. And it’s just as pervasive in people as it is in homes and clinics. Triclosan easily enters bodies by ingestion (think toothpaste) or skin absorption. It’s commonly found in people’s urine, blood, breast milk, and even their snot.

A 2014 study led by microbiologist Blaise Boles of the University of Michigan in Ann Arbor tested 90 adults and found that 41 percent (37 people) had triclosan-laced boogers. Antimicrobial snot paradoxically doubles your odds of having the potentially infectious Staphylococcus aureus bacteria up your nose.

In rats, Dr. Boles and his colleagues found that triclosan exposure made it more difficult, not less, for the rodents to fend off Staph invasions. Triclosan seems to make the bacteria “stickier”—better able to adhere to proteins and surfaces. That stickiness could be why Staph is so good at hunkering down in the schnoz, setting the stage for future infections.

Other researchers have been looking at how triclosan and other antimicrobials may alter microbial communities further down from the nose—in the gut.

Microbiologist Thomas Sharpton of Oregon State University and his colleagues are currently studying triclosan’s effect on the gut microbiomes of zebrafish, a model organism for vertebrate development. Their preliminary data suggest that the antimicrobial causes swift, sweeping changes in the zebrafish gut microbiome, altering both diversity and community structure.

In another study, presented April 1 at the Endocrine Society’s 98th annual meeting in Boston, researchers report that mother rats exposed to triclocarban—an antimicrobial used most frequently in bar soaps—passed on the chemical to their pups. The study, led by public health researcher Rebekah Kennedy of the University of Tennessee, Knoxville, also found that the chemical altered the microbiomes of both the mothers and the babies.

“Our research adds to the growing body of scientific literature suggesting unintended health consequences related to non-prescription antimicrobial use and will allow pregnant and nursing mothers to make informed decisions regarding use of these antimicrobial products,” said Dr. Kennedy.

But, Dr. Sharpton cautions, we don’t know yet if such microbiome changes are lasting or if they spark health effects. “We really are in the beginning days of understanding how to interpret changes in the microbiome,” he said to Ars.

Still, previous studies have linked dampened diversity and rapid microbial changes from prescription antibiotics to health effects, such as a greater risk of intestinal infections. The results certainly warrant follow-up research, both Sharpton and Kennedy said.

Flush with chemicals

While researchers continue to work out what antimicrobials do while they’re in people’s bodies, Dr. McNamara of Marquette University focuses on what the chemicals do once people pee them out or wash them down the drain. McNamara and his colleagues have been tracking both triclosan and triclocarban in wastewater treatment plants, where both chemicals can accumulate.

In a 2014 study, McNamara’s research team found that triclosan messed with the microbial communities that break down sewage, in some cases sabotaging their ability to digest the sludge. The chemical also caused a spike in the presence of a gene called mexB in the sewage microbes. This gene codes for a pump that allows bacteria to simply kick out triclosan before it can kill them off. This pump, McNamara hypothesizes, also spits out common prescription antibiotics, such as ciprofloxacin. In experiments, bacteria with mexB were resistant to antibiotics, too.

In a January study, McNamara, his graduate student Daniel Carey, and colleagues found that triclocarban had the same effect as triclosan—it also disrupts the microbial communities that digest sewage and spurs bacteria to become resistant to drugs.

From wastewater treatment plants, these superbugs can leak out into waterways, wildlife, and potentially back to people, McNamara told Ars.

While some experts are hopeful that actions by the FDA and state regulators may nix the use of these chemicals in commercial products, McNamara thinks consumer choices may be the most powerful way to reduce use of the chemicals. People could use regular soap or ethanol-based sanitizers and have effective, less risky cleansers, he said. “There’s a way that we can still keep our hygiene without having these extra chemicals.”

Tapeworm Drug Effective at Treating MRSA


With the ever-present threat of bacterial drug resistance looming like a carrion bird waiting for a meal, scientists are continually on the hunt for new therapeutics to thwart infections like those caused by methicillin-resistant Staphylococcus aureus (MRSA). Fortuitously, scientists from Brown University have come across two drug compounds, already in use to treat human tapeworm infections, that they report in a new study show great promise in stopping MRSA infections.

The findings from this study were published recently in PLOS ONE under an article entitled “Repurposing Salicylanilide Anthelmintic Drugs to Combat Drug Resistant Staphylococcus aureus.”

The Brown researchers screened over 600 drugs for effectiveness against MRSA, using an in vivo assay that cultures live nematode worms infected with the drug-resistant bacteria. The investigators found that two compounds, niclosamide (which is on the World Health Organization’s list of essential medicines) and oxyclozanide (a closely related veterinary drug), were effective at suppressing MRSA cultures. Moreover, both drugs were observed to be as effective as the current last-line clinical treatment, vancomycin.

“Since niclosamide is FDA approved and all of the salicylanilide anthelmintic drugs are already out of patent, they are attractive candidates for drug repurposing and warrant further clinical investigation for treating staphylococcal infections,” explained Rajmohan Rajamuthiah, Ph.D., a postdoctoral scholar at the Warren Alpert Medical School of Brown University and first author of the current study.

Dr. Rajamuthiah and his colleagues found that oxyclozanide was more effective at killing MRSA, while niclosamide was more bacteriostatic—effectively suppressing, but not completely eradicating, the bacteria. However, the researchers speculate that niclosamide may still provide enough of a kick to keep MRSA at bay while the immune system gets up to speed handling the infection.

While results from the current study are very encouraging and have Dr. Rajamuthiah and his colleagues feeling optimistic, the researchers did point out a caveat that they feel warrants further analysis. Drugs such as oxyclozanide and niclosamide are rapidly cleared by the body and are less effective at diffusing out of the bloodstream and into peripheral tissues, where some MRSA infections can reside.

“The low level of systemic circulation coupled with the rapid elimination profile of niclosamide suggests the necessity for further testing of the potential of niclosamide and oxyclozanide for treating systemic infections,” wrote the scientists. “Further studies should include the evaluation of these compounds in systemic and localized infection models in rodents.”

However, the flip side of the rapid-clearance scenario is that the drugs may impart very limited toxicity to patients. In order to determine the actual effects of these drugs in mammals, the researchers have planned a series of experiments in rodents to determine the two compounds’ efficacy and overall toxicity when used to treat MRSA infections.

“The relatively mild toxicity of oxyclozanide is encouraging based on in vitro tests,” stated Dr. Rajamuthiah. “Since it has never been tested in humans and since it belongs to the same structural family as niclosamide, our findings give strong impetus to using oxyclozanide for further investigations.”

Antibiotic Resistance Will Kill 300 Million People by 2050


New report says pharma companies make more money from other drugs, so shy away from new antibiotic development

The true cost of antimicrobial resistance (AMR) will be 300 million premature deaths and up to $100 trillion (£64 trillion) lost to the global economy by 2050. This scenario is set out in a new report which looks to a future where drug resistance is not tackled between now and 2050.

The report predicts that the world’s GDP would be 0.5% smaller by 2020 and 1.4% smaller by 2030, with over 100 million premature deaths by then. The Review on Antimicrobial Resistance, chaired by Jim O’Neill, is significant in that it is a global review that seeks to quantify the financial costs.

This issue goes beyond health policy and, on a strictly macroeconomic basis, it makes sense for governments to act now, the report argues. “One of the things that has been lacking is putting some pound signs in front of this problem,” says Michael Head at the Farr Institute, University College London, UK, who sees hope in how a response to HIV came about. “The world was slow to respond [to HIV], but when the costs were calculated the world leapt into action.”

He recently totted up R&D spending on infectious diseases in the UK and found gross underinvestment in antibacterial research: £102 million out of a total of £2.6 billion. Other research shows that less than 1% of available research funds in the UK and Europe were spent on antibiotic research in 2008–2013.
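For context, those two figures imply that antibacterial work received only a small slice of the total. The one-line calculation below is mine, just making the ratio explicit:

# Antibacterial share of UK infectious-disease R&D, from the figures above.
antibacterial, total = 102e6, 2.6e9    # £102 million vs £2.6 billion
print(f"{antibacterial / total:.1%}")  # ~3.9%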

Bleak future
RAND Europe and KPMG both assessed the future impact of AMR. They looked at a subset of drug-resistant pathogens and the public health issues surrounding them: Klebsiella pneumoniae, Escherichia coli, Staphylococcus aureus, HIV, tuberculosis and malaria. The RAND Europe scenario modelled what would happen if antimicrobial drug resistance rates rose to 100% after 15 years, while infection rates held steady. The KPMG scenario looked at resistance rising to 40% from today’s levels and the number of infections doubling. Malaria resistance results in the greatest number of fatalities, while E. coli resistance accounts for almost half the total economic impact because it is so widespread and its incidence is so high.
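To make the structural difference between the two scenarios concrete, here is a toy sketch in Python. Every number in it is a hypothetical placeholder of mine (including the starting resistance level, which the article does not give), not an input used by RAND Europe or KPMG:

# Toy caricature of the two AMR scenarios described above.
# All parameter values are hypothetical placeholders, not RAND/KPMG inputs.

def rand_scenario(year):
    """Resistance ramps linearly to 100% over 15 years; infections held steady."""
    resistance = min(year / 15, 1.0)
    infections = 1.0  # relative to today's level
    return infections, resistance

def kpmg_scenario(year, horizon=35):
    """Resistance rises from an assumed 5% to 40%; infections double over the horizon."""
    progress = min(year / horizon, 1.0)
    resistance = 0.05 + (0.40 - 0.05) * progress
    infections = 1.0 + progress  # reaches double by the end of the horizon
    return infections, resistance

for year in (0, 15, 35):
    for name, scenario in (("RAND", rand_scenario), ("KPMG", kpmg_scenario)):
        infections, resistance = scenario(year)
        print(f"year {year:>2} {name}: infections x{infections:.2f}, resistance {resistance:.0%}")

The point of the contrast: the RAND scenario is an upper bound on resistance with the infection burden held fixed, while the KPMG scenario trades a lower resistance ceiling for a growing number of infections.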

“You can look at antibiotic resistance as a slow moving global train wreck, which will happen over the next 35 years,” says health law expert Kevin Outterson at Boston University, US. “If we do nothing, this report shows us the likely magnitude of the costs.”

Outterson headed up a recent Chatham House report on new business models for antibiotics that highlighted the problem of inadequate market incentives. “If I came out with a new cardiovascular drug, it could be worth tens of billions of dollars a year,” he says. “But if we had the same innovative product as an antibiotic, we would save it for the sickest and it would sell modestly in the first decade. So market uptake is extraordinarily limited for innovative antibiotics and all for excellent public health reasons.”

Incentivising action
The solution is to de-link return on investment from sales volume. “Instead of companies getting their return on R&D investment by selling volumes of product, they would be paid something by governments or health players for access to that antibiotic,” he explains. Outterson is now working on a report that will outline how this could work.

Another approach is to re-use old drugs. “Developing new antibiotics will take many years and we cannot wait,” says Ursula Theuretzbacher at the Center for Anti-Infective Agents in Vienna, Austria. “In the meantime we decided we need to improve the usage of some selected old drugs that had not been in use for many years.” An EU-funded project, AIDA, is running clinical trials on five drugs developed before the 1980s.

Theuretzbacher has been pleased by public money going into helping small companies move their innovative antibiotics towards market. In the US, companies such as Achaogen, Cempra and Trius (acquired by Cubist, itself just bought up by Merck) have made use of these schemes. Meanwhile, in Europe, there are several EU-funded projects, Wellcome Trust schemes and public–private partnerships such as the Innovative Medicines Initiative and its New Drugs for Bad Bugs programme.

Richard Smith, health systems economist at the London School of Hygiene & Tropical Medicine, UK, was a member of the RAND team and adviser to KPMG. He says the report’s headline figures are not an exaggeration and are more likely an underestimate. “It takes into account effects on labour productivity and labour workforce issues, but we don’t know what the public reaction will be: from previous pandemics and outbreaks we know behavioural effects can be much worse on an economy than the impact of the disease,” he says. The report concluded that they “most likely underestimate the true costs of AMR” due to a lack of reliable data.

“When we understand a threat, governments respond with energy and with money,” Outterson says. The US recently agreed to put over $5 billion into fighting Ebola. “The threat posed by bacterial resistance is even greater than that of Ebola,” he adds. “If this report accurately predicts the world we live in in 2050, then we will have failed on a monumental scale to preserve a global public good.”

Coating on Aspirin Might Reduce Its Cardioprotective Effects.


Enteric coating can affect aspirin‘s inhibition of platelet aggregation, according to a study in Circulation.

Researchers used three assays — platelet aggregation, serum thromboxane formation, and urinary excretion of a thromboxane metabolite — to test response to an oral dose of 325-mg immediate-release or enteric-coated aspirin in 400 healthy volunteers (median age, 26). The study was partly funded by Bayer HealthCare.

No participant showed resistance to the immediate-release formulation. Up to 49% showed resistance to enteric-coated aspirin, but most were not resistant upon retesting.

The authors conclude that “we failed to find a single case of true drug resistance” and that their findings show “inconsistent platelet inhibition” after ingestion of enteric-coated aspirin.

Source: Circulation

 

Tuberculosis, Drug Resistance, and the History of Modern Medicine.


Tuberculosis is a treatable airborne infectious disease that kills almost 2 million people every year. Multidrug-resistant (MDR) tuberculosis — by convention, a disease caused by strains of Mycobacterium tuberculosis that are resistant to isoniazid and rifampin, the backbone of first-line antituberculosis treatment — afflicts an estimated 500,000 new patients annually. Resistance to antituberculosis agents has been studied since the 1940s; blueprints for containing MDR tuberculosis were laid out in the clinical literature and in practice, in several settings, more than 20 years ago.1,2 Yet today, barely 0.5% of persons with newly diagnosed MDR tuberculosis worldwide receive treatment that is considered the standard of care in the United States.3 Those who have not received appropriate treatment continue to fuel a global pandemic that now includes strains resistant to most — and by some accounts all — classes of drugs tested.4,5 Despite the enormity of the threat, investments to contain the epidemic and to cure infected patients have been halting and meager when compared, for example, with those made to address the acquired immunodeficiency syndrome (AIDS) pandemic. In this essay we seek to elucidate the reasons for the anemic response to drug-resistant tuberculosis by examining the recent history of tuberculosis policy.

Research in Tuberculosis — Midwife of Modern Biomedicine

On the evening of March 24, 1882, when Robert Koch completed his presentation on the infectious cause of tuberculosis, silence enveloped the crowded room at the Berlin Physiological Society.6 A means of combating tuberculosis — a disease that in the 19th century caused, by some accounts, about 25% of all deaths in Massachusetts and New York and claimed the lives of one fourth of Europe’s population — was now within reach.7 Koch summarized the importance of his findings, for which he received the 1905 Nobel Prize, in a manuscript published in the Berliner Klinische Wochenschrift shortly after his announcement: “In the future the fight against this terrible plague of mankind will deal no longer with an undetermined something, but with a tangible parasite, whose living conditions are for the most part known and can be investigated further.”8

But therapy lagged. It was not until 60 years later, in 1943, that the first effective antituberculosis agent, streptomycin, was isolated in the laboratory of Selman Waksman at Rutgers University (see timeline, available with the full text of this article at NEJM.org). In November 1944, a patient with tuberculosis received streptomycin and was declared cured of the disease.6 Other cases of successful treatment soon followed.9,10 The British Medical Research Council conducted the first large-scale clinical trial of streptomycin in 1948.11 This study, said to be the world’s first published drug trial that involved the randomization of participants, set the methodologic standard for modern randomized, controlled trials. Although many patients were cured, a substantial proportion had a relapse; mycobacterial isolates cultured from the latter patients showed resistance to streptomycin.12 That same year, two new antituberculosis agents, thiacetazone and para-aminosalicylic acid, came on the market. When either of these agents was administered with streptomycin, cure rates rose and acquired antibiotic resistance declined.13 In 1951, isonicotinic acid hydrazide (isoniazid) was tested at Sea View Hospital in New York; it dramatically improved clinical outcomes and was soon introduced for wider use.14 Isoniazid was followed by the development of pyrazinamide (1952), cycloserine (1952), ethionamide (1956), rifampin (1957), and ethambutol (1962).

With its high level of efficacy and ease of administration, rifampin revolutionized the treatment of tuberculosis.15-17 But the advent of every new drug led to the selection of mutations conferring resistance to it. Resistance to rifampin was observed soon after it was first administered.18 Laboratory data from trials revealed the rapid onset of isoniazid resistance among patients receiving monotherapy and the suppression of resistance when isoniazid was given in combination with streptomycin or para-aminosalicylic acid.19 These observations led to the use of multidrug treatment regimens — a strategy widely used today to treat a variety of infectious diseases and cancers. Ultimately, through a series of multicountry clinical trials led by the British Medical Research Council, a four-drug regimen was recommended for use in patients with newly diagnosed tuberculosis. The backbone of such empirical regimens was the combination of isoniazid and rifampin, the most effective and reasonably well-tolerated oral agents, given for 6 to 8 months. Thus, short-course chemotherapy was born.19

Drug resistance, however, has remained a challenge. The early hypothesis that resistance always conferred a loss of bacterial fitness, and hence led to lower case fatality rates and decreased transmission of such strains, had been disproved by the 1950s.19 The first national drug-resistance survey in the world, which involved 974 clinical isolates cultured from newly diagnosed cases of tuberculosis in Britain (1955–1956), showed strains that were resistant to streptomycin (2.5%), para-aminosalicylic acid (2.6%), and isoniazid (1.3%).20 Similarly, data from the United States showed that isoniazid resistance increased from 6.3% (between 1961 and 1964) to 9.7% (between 1965 and 1968) among patients with newly diagnosed tuberculosis.21 Between 1970 and 1990, there were numerous outbreaks of drug-resistant tuberculosis involving strains resistant to two or more drugs.17,22,23 As early as 1970, an outbreak in New York City of highly virulent tuberculosis that was resistant to multiple drugs proved to be a grim reminder that resistance did not necessarily reduce a microbe’s fitness: the index patient died; 23 of 28 close contacts had evidence of new infection, and active, drug-resistant disease developed in 6 of these 23 contacts, 5 of whom were children.21

Tuberculosis, whether caused by drug-susceptible or drug-resistant strains, rarely made even medical headlines, in part because its importance as a cause of death continued to decline in areas in which headlines are written. In such settings, where many of the social determinants of tuberculosis — extreme poverty, severe malnutrition, and overcrowded living conditions — became the exception rather than the norm, some public health experts declared that “virtual elimination of the disease as a public health problem” was in sight.24 In the United States, federal funding for tuberculosis research was cut; consequently, drug discovery, development of diagnostics, and vaccine research ground almost to a halt.17

The Great Divergence in Tuberculosis Policy

Optimism that tuberculosis would soon be eliminated was not restricted to wealthy countries. At the 1978 International Conference on Primary Health Care in Alma-Ata (now called Almaty), Kazakhstan, delegates from around the world endorsed the goal of “health for all by the year 2000.” The eradication of smallpox had been announced the previous year, and the future of international public health looked promising to many who were gathered there.

But it was not to be. By the mid-20th century, tuberculosis outcomes had diverged along the fault lines of the global economy: while tuberculosis became rare in countries where income was high, epidemics of the disease raged on in low-income settings. In 1982, the Mexican government defaulted on many of its loan payments, triggering a debt crisis in many countries with weak economies. Increasing numbers of international health donors and policymakers, slow to contribute resources toward the ambitious Alma-Ata agenda, embraced the idea of selective primary health care: discrete, targeted, and inexpensive interventions.25,26 Bilateral assistance withered, and poor countries became increasingly reliant on loans from international financial institutions such as the World Bank, which based its health agenda on the principles of “cost-effectiveness” and “affordable health for all” — the latter concept a nod to the Alma-Ata Declaration.27

Selective primary health care offered clear targets, measurable outcomes, and a high return on health investments, all of which appealed to donors worried about investing in countries that were on the brink of default.28,29 But several leading causes of disability and death, including tuberculosis, were deemed too costly and complex to address in resource-poor settings and were largely excluded from the emerging, constricted agenda for effective health investments. “Leprosy and tuberculosis require years of drug therapy and even longer follow-up periods to ensure cure,” wrote two of the architects of selective primary health care in 1979. “Instead of attempting immediate, large-scale treatment programs for these infections, the most efficient approach may be to invest in research and development of less costly and more efficacious means of prevention and therapy.”25

But tuberculosis, which persisted in settings of poverty, could not be hidden away for long. In 1993, the World Bank began to use disability-adjusted life-years — a means of measuring the “cost-effectiveness” of a given health intervention that took into account morbidity, mortality, and age — to determine which health interventions to support.30 As a result of this new economic calculus, short-course chemotherapy for tuberculosis was declared a highly “cost-effective” intervention and gained momentum.31 Seizing the opportunity, the World Health Organization (WHO) shaped and promoted the DOTS (directly observed therapy, short-course) strategy, an approach that conformed to the selective primary health care agenda: simple, algorithmic, and requiring no expensive inputs. According to this strategy, the diagnosis was to be made with the use of smear microscopy alone — in spite of this technique’s insensitivity and its inability to detect drug resistance — and the treatment approach was to be based on the empirical use of first-line antituberculosis agents only.32 Facility-based infection control was not part of the DOTS strategy. Despite these exclusions, DOTS was an important development in global tuberculosis policy. Increasingly, poor countries began implementing the DOTS approach; many lives were saved and many new cases averted. However, for children with tuberculosis, people with both tuberculosis and advanced disease from the human immunodeficiency virus (HIV), and the increasing proportion of patients infected with strains of tuberculosis that were already drug-resistant, the DOTS strategy provided limited options for prompt diagnosis and cure.

The Emergence of MDR Tuberculosis Globally

These shifts in tuberculosis policy — linked to the reconceptualization of this leading infectious killer of young adults and children from a disease deemed to be costly and difficult to treat to a disease deemed to be “cost-effective” to treat and slated for eradication — convey precisely what is meant by the “social construction of disease.”33 M. tuberculosis did not conform to the regnant disease-control strategy, and resistant strains continued to emerge and to be transmitted because empirical treatment with first-line antituberculosis drugs was ineffective for those sick with strains resistant to these drugs. HIV infection fanned epidemics of tuberculosis. In the late 1980s and early 1990s, outbreaks of MDR tuberculosis were again reported in the United States.17 Genetic analysis of drug-resistant strains showed that airborne transmission of undetected and untreated strains played a major role in these outbreaks, disabusing practitioners of the notion that resistance stemmed solely from “sporadic pill taking.”17,34 Public health officials developed a national action plan to combat drug-resistant tuberculosis and to increase funding for relevant research.17,35-37 The experience in New York City offered a blueprint that was quite different from the DOTS strategy; it consisted of diagnosis with the use of mycobacterial culture and fast-track drug-susceptibility testing, access to second-line antituberculosis medications, proper infection control, and delivery of medications under direct observation.1

Outbreaks of MDR tuberculosis in the United States were a harbinger of the coming global pandemic. By the early-to-mid-1990s, MDR tuberculosis had been found wherever the diagnostic capacity existed to reveal it. But in contrast to the U.S. strategy, the WHO — the principal standard-setting body for many countries — continued to advocate the use of sputum-smear microscopy and first-line antituberculosis treatment alone for combating epidemics in resource-poor settings. Some international policymakers thought that treating MDR tuberculosis would be too expensive and complex — claims similar to those made about treating drug-susceptible tuberculosis before this approach was found to be “cost-effective” — and would distract attention from the newly branded (and often successful) DOTS strategy.38 Contemporaneous experience in the United States and in several countries in the former Soviet Union suggested, however, that short-course chemotherapy was ineffective against strains shown to be resistant to precisely those drugs on which such therapy was based.1,17,39,40

The Limits of Short-Course Chemotherapy

The failure of short-course chemotherapy against MDR tuberculosis, though unsurprising clinically, was difficult politically. In Peru, for example, a campaign to promote the DOTS strategy had been so successful in making short-course chemotherapy available that the country’s leaders elevated it as a point of national pride. Peru emerged as a crucible for debates about the treatment and management of MDR tuberculosis in poor countries.2 In 1995, an outbreak in a shantytown in the northern reaches of Lima was identified.41 Many patients were infected with strains found to have broad-spectrum resistance to first-line drugs. Nongovernmental organizations worked with the Peruvian Health Ministry to apply the standard-of-care treatment used in New York City and elsewhere in the United States. The strategy was modified to provide community-based care, with good results.42 After arguing that the DOTS strategy alone could rein in the mutant bacteria, the WHO and other international public health authorities advised the Peruvian government to adopt a low-cost, standardized regimen for the treatment of MDR tuberculosis rather than protocols based on the results of drug-susceptibility testing. In the absence of tailored therapy, many hundreds of deaths occurred among some of Lima’s poorest people.43 As expected, amplification of drug resistance was documented.44,45

By the end of the 1990s, facing mounting evidence that MDR tuberculosis could be treated effectively in resource-poor settings,46,47 a multi-institutional mechanism — the Green Light Committee — was created to encourage and learn from pilot projects for treating MDR tuberculosis.2,17,48 This coincided with a grant from the Bill and Melinda Gates Foundation to scale up treatment of MDR tuberculosis in Peru and elsewhere and to change global policy.

Tuberculosis Policy and Global Health Equity

Drug resistance is well established as an inevitable outcome of antibiotic use; the fault lines of the MDR tuberculosis pandemic are largely man-made. The contours of global efforts against tuberculosis have always been mediated by both biologic and social determinants, and the reasons for the divergence in the rates of tuberculosis and drug resistance between rich and poor countries are biosocial.49 As case rates dropped in wealthy countries, funding for research and implementation programs dried up, even though tuberculosis remained the world’s leading infectious killer of young adults throughout the 20th century. Tuberculosis “control” in the 1990s was defined by the legacy of selective primary health care: targeted, “cost-effective” interventions packaged together, in the case of tuberculosis, as the DOTS strategy. Such protocols helped standardize tuberculosis treatment around the world — a process that was sorely needed — but they hamstrung practitioners wishing to address diagnostic and therapeutic complexities that could not be addressed by the use of sputum-smear microscopy and short-course chemotherapy or other one-size-fits-all approaches. These complexities, which now range from pan-resistant tuberculosis to undiagnosed pediatric disease, account for more than a trivial fraction of the 9 million new cases of tuberculosis and the almost 2 million deaths from this disease that occur around the globe each year.

The history of divergent policies for combating drug-resistant tuberculosis shows that decades of clinical research and effective programs in high-income settings did not lead to the deployment of similar approaches in settings of poverty. Achieving that goal demands a commitment to equity and to health care delivery.50 The U.S. response to the outbreaks of MDR tuberculosis in New York City and elsewhere was bold and comprehensive; it was designed to halt the epidemic.1,17 A similar response has not yet been attempted in low- and middle-income countries. Instead, selective primary health care and “cost-effectiveness” have shaped an anemic response to the ongoing global pandemic.

New diagnostics and therapeutics are urgently needed; most of the methods used currently were developed decades ago. Today, we have rapid nucleic acid–based tests for drug-resistant tuberculosis, sound models for laboratory expansion and for treatment delivery, and several drug candidates in the pipeline. To tackle tuberculosis, we also need an equity plan that takes seriously the biosocial complexity of a lethal airborne infection that has stalked us for centuries. The global AIDS effort of the past decade has shown how much can be accomplished in global health when effective diagnosis and care are matched with funding and political will. Stinting on investments or on bold action against tuberculosis — in all its forms — will ensure that it remains a leading killer of people living in poverty in this decade and the next.

Source Information

From the Program in Infectious Disease and Social Change, Department of Global Health and Social Medicine, Harvard Medical School; the Division of Global Health Equity, Brigham and Women’s Hospital; and Partners in Health — all in Boston.

Source: NEJM

 

 

Neighbouring cells help cancers dodge drugs.


Proteins in a tumour’s microenvironment play a part in drug resistance.

Cancers can resist destruction by drugs with the help of proteins recruited from surrounding tissues, find two studies published by Nature today. The presence of these cancer-assisting proteins in the stromal tissue that surrounds solid tumours could help to explain why targeted drug therapies rapidly lose their potency.

Targeted cancer therapies are a class of drugs tailored to a cancer’s genetic make-up. They work by identifying mutations that accelerate the growth of cancer cells and selectively blocking copies of the mutated proteins. Although such treatments avoid the side effects associated with conventional chemotherapy, their effectiveness tends to be short-lived. For example, patients treated with the recently approved drug vemurafenib initially show dramatic recovery from advanced melanoma, but in most cases the cancer returns within a few months.

Many forms of cancer are rising in prevalence: for example, in the United States, the incidence of invasive cutaneous melanoma — the deadliest form of skin cancer — increased by 50% in Caucasian women under 39 between 1980 and 2004. So there is a pressing need to work out how to extend the effects of targeted drug therapies. But, until now, researchers have focused on finding the mechanism of drug resistance within the cancerous cells themselves.

Two teams, led by Jeff Settleman of Genentech in South San Francisco, California, and Todd Golub at the Broad Institute in Cambridge, Massachusetts, expanded this search into tumours’ surrounding cellular environment.

Settleman’s team tested 41 human cancer cell lines, ranging from breast to lung to skin cancers. The researchers found that 37 of these became desensitized to a handful of targeted drugs when in the presence of proteins that are usually found in the cancer’s stroma, the supportive tissue that surrounds tumours. In the absence of these proteins, the drugs worked well1. By growing cancer cells along with cells typically found in a tumour’s immediate vicinity, Golub and his colleagues showed that these neighbouring cells are the likely source of the tumour-aiding proteins2.

Protein culprit

One of the most startling results of the teams’ experiments was the discovery that a protein called hepatocyte growth factor (HGF) boosts melanoma’s resistance to treatment with vemurafenib. Intrigued by this result, both teams looked at blood samples from people who had undergone treatment with vemurafenib, and found that the higher a patient’s HGF levels, the less likely they were to remain in remission.

Martin McMahon, a cancer biologist at the University of California, San Francisco, who was not affiliated with either study, explains that the results have immediate implications for the design of clinical trials, which he says could combine targeted drug therapy with drugs capable of knocking down the production of proteins such as HGF.

“These papers show that the influence of the cell’s microenvironment is important not only for melanoma, but also for pancreatic, lung and breast cancer,” McMahon says, adding that they are “very exciting, because they expand the focus of where we should be looking for the mechanisms of drug resistance”.

Source: Nature.

 

MicroRNA-122 sensitizes HCC cancer cells to adriamycin and vincristine through modulating expression of MDR and inducing cell cycle arrest


Hepatocellular carcinoma (HCC) is a hypervascular cancer characterized by rapid progression as well as resistance to conventional chemotherapy. It has been shown that microRNAs play critical roles in the pathogenesis of HCC. MicroRNA-122 (miR-122) is a liver-specific microRNA that is frequently downregulated in HCC. In the present study, we investigated whether restoration of miR-122 in HCC cells could render them sensitive to the chemotherapeutic agents adriamycin (ADM) or vincristine (VCR). Our data showed that overexpression of miR-122 in HCC cells, induced by an adenovirus expressing miR-122 (Ad-miR122), rendered cells sensitive to ADM or VCR. Analysis of cell cycle distribution showed that the anti-proliferative effect of miR-122 is associated with an increase in the number of cells in the G2/M phase. Moreover, treatment with Ad-miR122 plus ADM or VCR resulted in a high accumulation of HCC cells in the G2/M phase. We further demonstrated that overexpression of miR-122 could modulate the sensitivity of HCC cells to chemotherapeutic drugs by downregulating the MDR-related genes MDR-1, GST-π, and MRP, the antiapoptotic gene Bcl-w, and the cell cycle-related gene cyclin B1. Taken together, our findings demonstrate that the combination of Ad-miR122 with chemotherapeutic agents inhibits HCC cell growth by inducing G2/M arrest and that this arrest is associated, at least in part, with reduced expression of MDR-related genes and cyclin B1.

Source: Cancer Letters

 

Cancer Chemotherapy Resistance


Primary or acquired drug resistance remains a fundamental cause of therapeutic failure in cancer therapy. Post-hoc analyses of clinical trials have revealed the importance of selecting patients with the appropriate molecular phenotype for maximal therapeutic benefit, as well as the requirement to avoid exposure and potential harm for those who have drug-resistant disease, particularly with respect to targeted agents. Unravelling drug resistance mechanisms not only facilitates rational treatment strategies to overcome existing limitations in therapeutic efficacy, but also enhances biomarker discovery and the development of companion diagnostics. Advances in genomics, coupled with state-of-the-art biomarker platforms such as multi-parametric functional imaging and molecular characterisation of circulating tumour cells, are expanding the scope of clinical trials, providing unprecedented opportunities for translational objectives that inform on both treatment response and disease biology. In this review, we propose a shift towards innovative trial designs, which are prospectively set up to answer key biological hypotheses in parallel with RNA interference elucidation of drug resistance pathways in monotherapy pre-operative or ‘window of opportunity’ early phase trials. Systematic collection of paired clinical samples, taken before and after treatment and amenable to genomic analysis, is mandated in such studies. With concurrent functional RNA interference analysis of drug response pathways, the identification of robust predictive biomarkers of response and clinically relevant resistance mechanisms may become feasible. This represents a rational approach to accelerating biomarker discovery, maximising the potential for therapeutic benefit and minimising the health economic cost of ineffective therapy.