The first Australians hunted giant kangaroos, rhinoceros-sized marsupials, huge goannas and other megafauna to extinction shortly after arriving in the country more than 40,000 years ago, new research claims.


A team of scientists from six universities say they have put an end to the long-running debate about the cause of the sudden disappearance of giant vertebrates from the Australian ecosystem, and the dramatic change to the landscape that followed.

Soon after the beasts disappeared, there was a rapid shift in vegetation across Australia. The landscape, once covered by patches of rainforest separated by areas of open grassland, was soon smothered by eucalypt forest, the researchers say.

The change was caused by a decrease in consumption of plant material by large herbivores, which allowed forest to spread and also resulted in a build-up of dry fuel for bushfires.

In the past 100,000 years, many of the largest animals on Earth became extinct. The reasons remain contentious. In recent years, some scientists have argued that habitat loss through climate change or fire was the killer blow.

But the latest argument, published today in Science, refutes this.

Research leader Chris Johnson, from the University of Tasmania School of Zoology, said the team solved the mystery of the disappearance of Australia’s megafauna by using a method of tracking large herbivores through time by counting the spores of fungi in their dung.

Professor Johnson said the biggest herbivores – “rhino-sized wombat-like marsupials called Diprotodons, giant kangaroos, a goanna bigger than the living Komodo dragon, a giant goose twice the size of the emu and many others” – produced vast quantities of dung, and that there were special fungi that lived in it.

“The spores of these fungi can be preserved in sediments in swamps and lakes,” Professor Johnson said. “As those sediments accumulate over time, they create a historical record of the abundance of very large herbivores in the environment.

 

“Pollen and charcoal particles are trapped in the same sediments, so that it is possible to match up the history of abundance of large herbivores with changes in vegetation and fire. Then, radiocarbon can be used to date these things.”
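For readers unfamiliar with that last step, the sketch below shows the standard radiocarbon age calculation from the fraction of carbon-14 remaining in a sample, using the modern half-life of roughly 5,730 years; the sample value is purely illustrative and is not taken from the study.

    import math

    C14_HALF_LIFE = 5730.0  # years, modern accepted value

    def radiocarbon_age(fraction_remaining: float) -> float:
        """Age in years implied by the fraction of the original carbon-14 left."""
        return -C14_HALF_LIFE / math.log(2) * math.log(fraction_remaining)

    # Illustrative example: a charcoal fragment retaining ~0.8% of its carbon-14
    print(round(radiocarbon_age(0.008)))  # roughly 40,000 years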

Professor Johnson said the research focussed largely on a swamp called Lynch’s Crater in northeast Queensland, where the sediment record reaches back to 130,000 years ago.

“It showed that the abundance of large mammals was stable until just before 40,000 years ago, when it suddenly crashed.

“This rules out climate change as a cause of extinction, as there were several periods of climate drying before the extinction and they had no effect on abundance. And when the animals did go extinct, the climate was stable.”

Habitat change could not have been responsible for the loss of the large marsupials, because the grassy forest expanded only after the spores abruptly declined.

“But the extinctions followed very soon after the time that people arrived in the region – so it seems that people did it,” Professor Johnson said.

“Our study didn’t directly address how people caused extinction, but the most likely mechanism is hunting. Quite a bit of other circumstantial evidence suggests that.”

The results also show that the extinctions were quickly followed by massive ecological change.

Gavin Prideaux, a lecturer in vertebrate palaeontology in the School of Biological Sciences at Flinders University, said the research was an important contribution to the understanding of what happened “when 90% of our large terrestrial species disappeared, depriving us of the sight of giant short-faced kangaroos, marsupial ‘lions’, giant horned tortoises and herds of Diprotodons meandering through the outback.”

The research team had presented compelling data in a field of inquiry that “aches under the strain of opinion pieces and the tired reworking of published data … The timing of the inferred extinction coincides with early human presence in the region, but not with significant climatic change.”

This supported a mounting number of studies that have argued that climate change was not primarily responsible for extinctions in other parts of the continent, Dr Prideaux said.

John Alroy, a Future Fellow in the Department of Biological Sciences, Faculty of Science at Macquarie University, said the new research put all debate over the matter to rest. “The key new data are the spore counts, and in combination with the charcoal and rainforest pollen data they tell the whole story,” Dr Alroy said. “There is simply no reasonable way to argue with the authors’ conclusions.”

But Judith Field, a Senior Research Fellow in the School of Biological, Earth and Environmental Sciences at The University of New South Wales, said the argument was flawed for several reasons.

Chief among these was the claim by the authors that climate was stable through the period in question – “the opposite is generally accepted”, Dr Field said – and the unproven assumption that megafauna were so abundant that their disappearance would have triggered a drastic change in vegetation.

“The facts of the matter are that most megafauna were extinct nearly 100,000 years before human arrival and there is no evidence for any particular time period to be significant in terms of faunal extinctions.”

Dr Field added that there was no evidence from archaeological sites that humans hunted megafauna. She said the earliest evidence for human occupation in northeast Queensland suggested their populations were small.

 

Source: The Conversation

Could Human and Computer Viruses Merge, Leaving Both Realms Vulnerable?


Mark Gasson had caught a bad bug. Though he was not in pain, he was keenly aware of the infection raging in his left hand, knowing he could put others at risk by simply coming too close. But his virus wasn’t a risk for humans. Gasson, a cybernetics scientist at the University of Reading, was walking around with an implanted microchip he had intentionally infected with a computer virus. If he got too close to a computer, he could in principle infect that machine.

Although this possibility may sound like a foray into science fiction, information security experts believe the blurring of the boundaries between computer and biological viruses is not so far-fetched—and could have very real consequences.

As TechWorld reports, Axelle Apvrille and Guillaume Lovet of the network security company Fortinet presented a paper comparing human and computer virology and exploring some of the potential dangers at last week’s Black Hat Europe conference.

Both computer and biological viruses, they explain in their paper, can be defined as “information that codes for parasitic behavior.” In biology, a virus’s code is written in DNA or RNA and is much smaller than the code making up a computer virus. The genome of a flu virus, for example, could be described with about 23,000 bits, whereas the average computer virus would fall in a range 10 to 100 times bigger.
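To put those figures in perspective, here is a rough back-of-the-envelope comparison, taking the article's ~23,000-bit estimate at face value; the numbers are illustrative and not from the Fortinet paper.

    # Rough size comparison, using the figures quoted above.
    flu_genome_bits = 23_000                      # article's estimate for a flu virus genome

    smaller = flu_genome_bits * 10                # lower end: 10x bigger
    larger = flu_genome_bits * 100                # upper end: 100x bigger

    to_kb = lambda bits: bits / 8 / 1024          # bits -> kilobytes
    print(f"flu genome:      ~{to_kb(flu_genome_bits):.1f} KB")              # ~2.8 KB
    print(f"computer virus:  ~{to_kb(smaller):.0f}-{to_kb(larger):.0f} KB")  # ~28-281 KB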

The origins of each virus are strikingly different: A computer virus is designed, whereas a biological virus evolves under pressure from natural selection. But what would happen if these origins were switched? Could hackers code for a super-virus, or could a computer virus emerge out of the information “wilderness” and evolve over time?

Apvrille and Lovet argued that both scenarios are possible, with a few caveats to each. Scientists have already synthesized viruses such as polio and SARS for research purposes, so it’s conceivable that someone could synthesize viruses as bioweaponry. That said, Apvrille and Lovet observed that viruses are notoriously difficult to control, and it’s hard to imagine anyone could use a viral weapon without it backfiring.

As for computer viruses speciating and evolving, Apvrille and Lovet believed that with enough data the code for a single computer virus might form spontaneously. Chances are slimmer, however, that it would include the necessary details to adapt and evolve. So far this scenario has only occurred in viruses in which researchers have encoded genetic algorithms to mimic evolutionary processes.
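For readers unfamiliar with the term, the snippet below is a minimal sketch of a genetic algorithm of the kind alluded to: candidate bit strings mutate and are selected by a fitness function. All names, parameters and the target string are illustrative assumptions, not details from Apvrille and Lovet's paper.

    import random

    TARGET = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 0, 1, 0, 0, 1, 1]  # stand-in for "useful" code

    def fitness(candidate):
        # Count how many bits match the target behaviour.
        return sum(c == t for c, t in zip(candidate, TARGET))

    def mutate(candidate, rate=0.05):
        # Flip each bit with a small probability, mimicking random variation.
        return [1 - b if random.random() < rate else b for b in candidate]

    def evolve(pop_size=50, generations=200):
        population = [[random.randint(0, 1) for _ in TARGET] for _ in range(pop_size)]
        for gen in range(generations):
            population.sort(key=fitness, reverse=True)
            if fitness(population[0]) == len(TARGET):
                return gen, population[0]
            # Selection: the fitter half survives and reproduces with mutation.
            survivors = population[: pop_size // 2]
            population = survivors + [mutate(random.choice(survivors)) for _ in survivors]
        return generations, population[0]

    gen, best = evolve()
    print(f"best candidate after {gen} generations: {best}")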

But there are more immediate possibilities for computer-biology crossovers. Synthetic biology uses computers to store genetic information, and Apvrille and Lovet explained that hackers could infect these devices or the software used for DNA sequencing, thereby modifying whatever biological product is being synthesized.

For now, Mark Gasson’s example of an infected implant may spark the most concern, as it illustrates how cybernetic technologies leave humans vulnerable to unprecedented attacks. Just as a PC can download a virus after visiting a new website, cybernetic devices, such as cochlear implants or pacemakers, could be threatened when they connect to an external system. Once infected, the implant can then spread the virus to other systems.

Inevitably, as we rely more on computers, the impact of viruses grows. After all, our favorite technologies are extensions of ourselves, storing memories, expanding our knowledge and increasing our reach. As Gasson put it on his online Q&A about the study, even though the experiment had no effect on his health, it was still “surprisingly personal,” because “part of ‘me’ had been compromised.”

 

Source: Scientific American.

Big science zooms in on a new cure for baldness.


In mice and men, baldness is a scourge that cries out for a cure. Fortunately, a far-flung group of American researchers is on it — and on Wednesday reported progress on this front in the very sober journal Science Translational Medicine.

Plucking hair follicles from the pates of 22 men with male-pattern baldness, and from an army of mice, researchers detected a key difference between patches where hair was growing and patches where it was thinning or bald. In the men, a prostaglandin called PGD2 was far more plentiful in bald areas of the pate than in patches where hair continued to grow; in the mice, the same prostaglandin was in large supply during the shedding phase of their normal hair follicle cycle.

The team was led by dermatologist Luis Garza (then of the University of Pennsylvania, now at Johns Hopkins University) and by Penn dermatologist George Cotsarelis. The discovery that prostaglandins might be the catalyst that sets baldness in motion was a surprise to the researchers, who “hadn’t thought about prostaglandins in relation to hair loss,” said Cotsarelis.

From there, researchers were able to identify the receptor — the cellular landing dock — for PGD2, called GPR44. Find a way to block that receptor, or somehow thwart PGD2’s path to it, and, voila! — baldness doesn’t happen. That, say the researchers, will be their next effort: to try topical treatments that block the GPR44 receptor. They hope the same approach might help find treatments that prevent hair thinning in women.

Male pattern baldness strikes 80% of men younger than 70, causing hair growth to thin in a distinctive pattern. Currently, just two medications, minoxidil (marketed as Rogaine) and finasteride (marketed as Propecia or Proscar), are available to combat hair loss.

Source:  Los Angeles Times

Alcon gains exclusive ex-US rights for ocriplasmin, potential first pharmacological treatment for symptomatic vitreomacular adhesion.


  • Symptomatic vitreomacular adhesion (VMA) is a progressive, debilitating eye disease for which there is currently no pharmacological treatment available
  • Phase III clinical data demonstrate resolution of symptomatic VMA following a single administration of ocriplasmin[1]
  • More than 300,000 patients[2] in Europe alone could potentially benefit from this new therapy
  • Alcon to make upfront payment to ThromboGenics, additional payments based on milestones and royalties on future sales  

Alcon, the global leader in eye care and a division of Novartis, announced today that it has gained exclusive rights to commercialize ocriplasmin outside the United States for the treatment of symptomatic vitreomacular adhesion (VMA). If approved, it will be the first pharmacological treatment for patients with symptomatic VMA, including macular hole. Symptomatic VMA is a progressive, debilitating eye disease that may lead to visual distortion, loss in visual acuity and central blindness[3].

Alcon is licensing ocriplasmin from ThromboGenics, a bio-pharmaceutical company based in Belgium. The agreement grants Alcon exclusive commercial rights outside the United States. Under the terms of the agreement, Alcon will make an upfront payment of €75 million (approx. USD 100 million) to ThromboGenics with additional potential payments based on milestones, as well as royalties on sales of ocriplasmin, if approved.

“There are thousands of symptomatic vitreomacular adhesion patients who currently do not have an available treatment option. The clinical results[1] for ocriplasmin show improved visual function and that earlier intervention may limit the progression of the disease,” said Kevin Buehler, Division Head Alcon. “Ocriplasmin is a strategic fit for Alcon and is expected to further enhance our portfolio of innovative treatments for the eye.”

Ocriplasmin is currently under review with the European Medicines Agency (EMA) as the first pharmacological treatment for symptomatic VMA, including macular hole. The drug was accepted for review by the EMA in October 2011. ThromboGenics retains the rights to commercialize ocriplasmin in the United States and a decision on approval is expected from the US Food and Drug Administration (FDA) in the second half of 2012.

Symptomatic VMA primarily involves the interface between the retina’s highly sensitive macular area, which is responsible for detailed, central vision, and the posterior vitreous membrane, which separates the clear, jelly-like substance in the center of the eye, called vitreous, from the retina.

In patients with symptomatic VMA, the vitreous adheres in an abnormally strong way to the retina, which can lead to traction (‘pulling’) on the retina, causing symptoms including impaired vision. Further unresolved traction may lead to the development of macular holes and central blindness[3].

For many symptomatic VMA patients, there is no recommended treatment available. More than 300,000[2] symptomatic VMA patients in Europe alone could potentially benefit from this new treatment, if approved. The standard of care for patients advancing to late stage VMA is surgical vitrectomy.

Ocriplasmin is a recombinant, truncated form of the human protein plasmin, administered through a one-time intravitreal injection. Clinical data[1] demonstrate that ocriplasmin resolves symptomatic vitreomacular adhesion (VMA), on average within seven days, reducing the number of patients advancing to surgery.

The ocriplasmin in-licensing agreement confirms Alcon’s commitment to bringing innovative eye care treatments to patients with unmet medical needs. With the company’s extensive commercial capabilities, geographic footprint and strong relationships with retinal specialists and ophthalmologists around the globe, Alcon is well positioned to bring this innovative treatment to patients around the world.

References
1. ThromboGenics. MIVI-TRUST Phase III Clinical Data.
2. Alcon internal estimates.
3. Trese M, Kaiser P, Dugel P, Brown D, & Humayun M (2011). Symptomatic Vitreomacular Adhesion (VMA): Diagnosis, Pathologic Implications, and Management. Retina Today, July/August (Supplement).

 

Source: Novartis Release.

 

Aspirin Might Have a Role in Preventing and Treating Cancer.


Regular aspirin use is associated with lower risks for cancer, death from cancer, and cancer metastasis, according to three meta-analyses in the Lancet and Lancet Oncology.

In one, researchers examined patient data in some 50 controlled trials of aspirin versus placebo for prevention of vascular events. Aspirin recipients showed a significant reduction in cancer incidence and cancer-related deaths.

A separate analysis of five cardiovascular prevention trials found that among participants who developed cancer during the trials, aspirin recipients showed a lower rate of distant metastases, especially in colorectal cancer.

Finally, an analysis of these effects in observational versus randomized studies showed that aspirin’s benefits were consistent in both types of studies.

Commentators argue that the absence of data from the Women’s Health Study and the Physicians’ Health Study (neither of which found anticancer benefits from aspirin) makes the findings less compelling, and that definitive conclusions about the routine use of aspirin for cancer prevention cannot yet be drawn.

Source: Lancet

Guideline Issued for Managing Acute Bacterial Rhinosinusitis.


The Infectious Diseases Society of America has released a clinical practice guideline on the management of acute bacterial rhinosinusitis in children and adults.

The guideline, published in Clinical Infectious Diseases, first points out that a bacterial cause accounts for 2%–10% of acute rhinosinusitis cases. Among the recommendations:

  • Bacterial rather than viral rhinosinusitis should be diagnosed when any of the following occurs:
    • persistent symptoms lasting at least 10 days, without improvement;
    • severe symptoms or high fever and purulent nasal discharge or facial pain for 3–4 days at illness onset;
    • worsening symptoms after an initial respiratory infection, lasting 5–6 days, has started to improve.
  • Empirical therapy should be started as soon as acute bacterial rhinosinusitis is diagnosed clinically; amoxicillin-clavulanate, instead of amoxicillin alone, is recommended for both children and adults.
  • Macrolides and trimethoprim-sulfamethoxazole are not recommended as empirical therapy, because of high rates of antimicrobial resistance.

The guideline includes an algorithm for sinusitis management, with recommendations for treating patients who do not respond to initial empirical therapy.
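As a reading aid only, the three diagnostic criteria listed above can be condensed into a simple check; this is not the IDSA management algorithm itself, and the function and parameter names are illustrative assumptions.

    def suggests_bacterial_rhinosinusitis(symptom_days: int,
                                          improving: bool,
                                          severe_onset_3_to_4_days: bool,
                                          worsening_after_initial_improvement: bool) -> bool:
        """Crude encoding of the three diagnostic criteria summarized above."""
        persistent = symptom_days >= 10 and not improving
        return (persistent
                or severe_onset_3_to_4_days
                or worsening_after_initial_improvement)

    # Example: 12 days of symptoms with no improvement meets the "persistent" criterion
    print(suggests_bacterial_rhinosinusitis(12, False, False, False))  # True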

Source: IDSA guideline in Clinical Infectious Diseases

The Cost Benefit of Antimicrobial Stewardship.


An active antimicrobial stewardship program at an academic medical center was found to be highly cost-effective.

The increasing prevalence of antimicrobial-resistant microorganisms and the rising costs of medical care have led many institutions to consider developing active antimicrobial stewardship programs (ASPs). Given the up-front personnel expenses required to initiate such programs, return on investment remains a concern.

An ASP was in place at the University of Maryland Medical Center for 7 years. It was centered on an antimicrobial monitoring team that included an infectious diseases–trained clinical pharmacist and a part-time infectious diseases physician. Investigators recently analyzed the effects of this program on antimicrobial utilization costs.

Between the year before ASP implementation (fiscal year [FY] 2001) and the final year of the program (FY 2008), the institutional antimicrobial utilization costs per 1000 patient days were reduced by 45.8% — from US$44,181 to $23,933. An institutional cost savings of $2,949,705 was seen during the first 3 years of the program, with savings achieved through decreased use of antifungal and antibacterial agents and switching from intravenous to oral delivery. Defined daily doses of antimicrobial agents were also reduced significantly between FY 2004 (the first period with available data) and FY 2008.
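A quick arithmetic check of the headline figure, using the values quoted directly above:

    cost_fy2001 = 44_181   # antimicrobial cost per 1000 patient-days, year before the ASP (US$)
    cost_fy2008 = 23_933   # same measure in the program's final year (US$)

    reduction = (cost_fy2001 - cost_fy2008) / cost_fy2001
    print(f"relative reduction: {reduction:.1%}")   # 45.8%, matching the reported figure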

After the program was discontinued, antimicrobial utilization costs per 1000 patient days increased to $27,833 in FY 2009 and $31,653 in FY 2010. Institutional antimicrobial utilization costs rose by $1,000,000 in FY 2009 and by an additional $873,184 in FY 2010. No significant changes were seen in length of stay, readmissions, mortality, or diagnosis-related group case-mix index over the 10-year study period.

Comment: The potential benefits of antimicrobial stewardship are likely dependent on the personnel involved and the changing costs of antimicrobial agents, and they will be influenced by the advent of more-robust electronic medical record systems. Still, these findings offer a compelling justification for academic medical centers, in particular, to institute ASPs. The authors note that, on the basis of cost-effectiveness data, an ASP has now been reimplemented at their institution.

Source: Journal Watch Infectious Diseases

Stenting or Medical Therapy for Stable CAD: A Game Changer?


A new meta-analysis convincingly supports optimal medical therapy as an initial approach.

The benefits of percutaneous coronary intervention (PCI) for acute myocardial infarction (MI) are well established, but the procedure’s effectiveness for stable coronary artery disease (CAD) has long been questioned. Trials have consistently failed to show that PCI reduces the risk for MI or death compared with an initial strategy of optimal medical therapy (OMT). Previous meta-analyses have been limited by the “moving targets” of evolving approaches to PCI and OMT, and by the conflation of stable CAD with acute coronary syndromes. Now, investigators have conducted a meta-analysis of trials comparing contemporary medical therapy and stent-era PCI in patients with stable ischemic heart disease.

Included were eight trials involving 7229 patients (mean age, 57–64; 68%–100% men). All trials were prospective, randomized comparisons of PCI plus medical therapy versus medical therapy alone and were limited to patients with stable CAD. Stents were implanted during 72% of initial PCIs, although drug-eluting stents were used in only a small minority. During a mean weighted follow-up of 4.3 years, no significant differences were found between the PCI and OMT groups in the risks for death (8.9% and 9.1%), nonfatal MI (8.9% and 8.1%), unplanned revascularization (21.4% and 30.7%), or persistent angina (29% and 33%).

Comment: Although rates of unplanned revascularization and persistent angina trended lower in the percutaneous coronary intervention group than in the medical therapy group, the differences were not statistically significant, and the rates of death and MI were similar in both groups. The angina differences are particularly difficult to interpret because the trial definitions of this endpoint varied, intermediate (1–2-year) comparisons were not possible, and not all crossovers to PCI were accounted for. Furthermore, whether a preponderance of drug-eluting stents would have altered these results is not known. Regardless, physicians can reasonably conclude (and inform their patients) that PCI for stable CAD will not reduce their risk for death or MI, and that in most cases of persistent angina, a trial of OMT before pursuing PCI is entirely rational, if not prudent.

Source: Journal Watch Cardiology

Magnetic cloak: Physicists create device invisible to magnetic fields.


Autonomous University of Barcelona researchers, in collaboration with an experimental group from the Academy of Sciences of Slovakia, have created a cylinder which hides contents and makes them invisible to magnetic fields. The device was built using superconductor and ferromagnetic materials available on the market. The invention is published this week in the journal Science.

The cylinder is built using a high-temperature superconducting material, easily cooled with liquid nitrogen, and covered in a layer of iron, nickel and chromium. This simple and accessible formula has been used to create a true invisibility cloak.

The cylinder is invisible to magnetic fields and represents a step towards cloaking light, which is an electromagnetic wave. Never before had such a device been created with such simplicity, such exactness in the theoretical calculations, or such important results in the laboratory.

Researchers at UAB, led by Àlvar Sánchez, a lecturer in the Department of Physics, came up with the mathematical formula to design the device. Using an extraordinarily simple equation, the scientists described a cylinder that in theory is absolutely undetectable to magnetic fields from the outside, and that keeps everything in its interior completely isolated from those fields as well.

Equation in hand and with the aim of building the device, UAB researchers contacted the laboratory specialising in the precise measurement of magnetic fields at the Institute of Electrical Engineering of the Slovak Academy of Sciences in Bratislava. Only a few months later the experimental results were clear. The cylinder was completely invisible to magnetic fields, made invisible whatever content was found in its interior and fully isolated it from external fields.

The superconducting layer of the cylinder prevents the magnetic field from reaching the interior, but it distorts the external field and thus makes the device detectable. To avoid detection, the ferromagnetic outer layer of iron, nickel and chromium produces the opposite effect: it attracts the magnetic field lines and compensates for the distortion created by the superconductor, without allowing the field to reach the interior. The overall effect is a completely non-existent magnetic field inside the cylinder and absolutely no distortion of the magnetic field outside.
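To give a sense of the kind of design equation involved: for an idealized cylindrical bilayer (a perfectly diamagnetic superconducting core of radius R1 inside a ferromagnetic shell of outer radius R2, in a uniform transverse field), the shell permeability that leaves the outside field undistorted is commonly quoted as (R2² + R1²)/(R2² − R1²). The article does not give the UAB team's actual expression or dimensions, so both the formula and the numbers below should be read as an illustrative assumption.

    # Illustrative sketch only: the idealized cloaking condition for a cylindrical
    # superconductor-ferromagnet bilayer (assumption, not the published UAB design).
    def required_permeability(r_inner: float, r_outer: float) -> float:
        """Relative permeability the ferromagnetic shell would need so that the
        superconducting core plus shell leave a uniform external field undistorted."""
        return (r_outer**2 + r_inner**2) / (r_outer**2 - r_inner**2)

    # Hypothetical dimensions: a 1.25 cm superconducting core in a 1.5 cm outer shell
    print(round(required_permeability(1.25, 1.5), 2))  # ~5.55 (dimensionless)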

 

Magnetic fields are fundamental to the production of electric energy – 99% of the energy we consume is generated thanks to the magnetic fields within the turbines found in power stations – and to the design of motors for all kinds of mechanical devices, to advances in computer and mobile phone memory devices, and so on. For this reason, controlling these fields represents an important achievement in technological development. Scientists are perfectly familiar with the process of creating magnetism; cancelling it at will, however, is a scientific and technological challenge, and the device created by UAB scientists opens the way to this possibility.

The results of this research project also pave the way for possible medical applications. In the future, similar devices designed by UAB researchers could serve to shield a pacemaker or a cochlear implant in a patient who needs to undergo magnetic resonance imaging.

 

ABSTRACT 
Invisibility to electromagnetic fields has become an exciting theoretical possibility. However, the experimental realization of electromagnetic cloaks has only been achieved starting from simplified approaches (for instance, based on ray approximation, canceling only some terms of the scattering fields, or hiding a bulge in a plane instead of an object in free space). Here, we demonstrate, directly from Maxwell equations, that a specially designed cylindrical superconductor-ferromagnetic bilayer can exactly cloak uniform static magnetic fields, and we experimentally confirmed this effect in an actual setup.

Source: Universitat Autonoma de Barcelona

 

Rheumatoid Arthritis Linked to Risks for Atrial Fibrillation and Stroke.


Close monitoring of RA patients is warranted.

Rheumatoid arthritis (RA) is associated with elevated risks for myocardial infarction and cardiovascular mortality (for example, JW Women’s Health May 20 2003). To assess whether people with RA are at excess risk for atrial fibrillation (AF) and stroke, researchers analyzed data from 4.2 million people (age >15 years) in Denmark who were free of RA, AF, and stroke before 1997 (baseline).

During a median follow-up of 4.8 years, roughly 18,000 people developed RA (mean age at onset, 59), 156,000 developed AF, and 165,000 experienced stroke. The risk for AF, adjusted for age and sex, was nearly 40% higher among people with RA than in the rest of the population (8.2 vs. 6.0 events per 1000 person-years). Adjusted risk for stroke was more than 30% higher among people with RA than in the rest of the population (7.6 vs. 5.7 events per 1000 person-years).

Comment: In this population-based study, people who developed rheumatoid arthritis had significantly increased risks for atrial fibrillation and stroke. This finding is biologically plausible; for example, systemic inflammation is associated with both AF and stroke. The authors estimate that for every 12 patients followed for 10 years after RA diagnosis, 1 will develop AF. They therefore recommend closely monitoring RA patients for development of AF and adding RA as a factor in risk-stratification schemes for stroke.
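The rate ratios and the 1-in-12 estimate can be reproduced from the numbers quoted above, assuming the 1-in-12 figure refers to the absolute AF rate among RA patients:

    af_ra, af_pop = 8.2, 6.0          # AF events per 1000 person-years, RA vs. general population
    stroke_ra, stroke_pop = 7.6, 5.7  # stroke events per 1000 person-years

    print(f"AF rate ratio:     {af_ra / af_pop:.2f}")          # 1.37, i.e. "nearly 40% higher"
    print(f"stroke rate ratio: {stroke_ra / stroke_pop:.2f}")  # 1.33, i.e. "more than 30% higher"

    # ~12 RA patients followed for 10 years per incident AF case
    print(round(1000 / (af_ra * 10)))  # 12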

Source: Journal Watch General Medicine