Is Aspirin the New (Old) Immunotherapy?


Hello. I’m David Kerr, professor of cancer medicine from the University of Oxford.

For those of you who follow me on Medscape and WebMD, you know that I don’t like aspirin: I love it. I think it’s a wonderful drug. There’s a lot of work going on just now looking at its molecular pharmacology.

There’s a great recent paper published by Dr Tsuyoshi Hamada and colleagues[1] looking at the role of aspirin in relation to immune checkpoint blockade. It’s a lovely study. Using part of a retrospective sample collection, they were able to look at the impact of post-primary-treatment use of aspirin in patients with resectable colorectal cancer.

They hypothesized that patients who had tumors with low expression of programmed cell death ligand 1 (PD-L1) would be more sensitive to the beneficial effects of aspirin. They looked at just over 600 patients. [The study used] a beautiful statistical analysis that stratified [findings] and accounted for all of the other contributory factors that might be tied up with aspirin’s use: PIK3CA mutations, [CDX2 expression], and even tumor-infiltrating lymphocytes. It’s what you would expect from a research group of this quality. The analysis was done very carefully indeed.

At the end of the study, they showed that their hypothesis was correct. Patients with tumors with relatively low expression of PD-L1 (also known as CD274) did better than those patients whose tumors expressed high levels of PD-L1, for whom aspirin seemed to have no benefit at all.

This all fits in with the known link between the prostaglandin E2 pathway and immune suppression. It suggests that aspirin may be yet another potential partner drug, one that could enhance the activity of the agents blocking PD-L1, PD-1, and the wider immune checkpoint pathway that are generating such huge excitement just now.

It was a really nice, very carefully conducted study. The results were quite compelling in terms of the survival benefits accruing from postsurgical use of aspirin in patients with low levels of PD-L1 expression. It again shows the importance of the microenvironment in determining tumor behavior and outcome. This gives some potential therapeutic insight into why aspirin might be a very useful companion drug to give in combination with these rather more expensive, more complex immune checkpoint inhibitors.

Aspirin wins again. There’s yet more plausible biological mechanism supporting its use.

 


Early Menarche, Menopause Tied to Higher CVD Risk


Several reproductive factors contributed to a higher risk of cardiovascular disease among women, including early periods and early menopause, researchers found.

A history of hysterectomy was also linked with increased risk of cardiovascular disease (CVD) and coronary heart disease, reported Sanne AE Peters, PhD, and Mark Woodward, PhD, both of the University of Oxford in England.

However, history of oophorectomy, as well as age at first birth, had either no association or only a weak inverse association with risk for cardiovascular disease, the authors wrote in Heart.

They pointed to “increasing evidence” that, in addition to traditional risk factors such as elevated blood pressure, smoking, and obesity, certain reproductive factors may be linked with later cardiovascular disease, though the evidence is “mixed and inconsistent.”

This cross-sectional analysis of UK Biobank data comprised 267,440 women and 215,088 men ages 40 to 69 without a history of cardiovascular disease. The authors found that during 7 years of follow-up, there were 9,054 cases of cardiovascular disease, 5,782 cases of coronary heart disease, and 3,489 cases of stroke. Women comprised about a third of cardiovascular disease cases, a little under 30% of coronary heart disease cases, and about 40% of stroke cases.

Among the women, the mean age was 56, about half were from a higher socioeconomic bracket in the U.K., and 60% said they had never smoked.

Results were mixed for certain reproductive factors and increased risk for cardiovascular disease. The mean age at menarche was 13 years, and women who had their first period before age 12 had a higher risk of cardiovascular disease (adjusted HR 1.10, 95% CI 1.01-1.30) than women who had menarche later. The risk trended in the same direction for coronary heart disease (adjusted HR 1.05, 95% CI 0.93-1.18), without reaching statistical significance, and was increased for stroke (adjusted HR 1.17, 95% CI 1.03-1.32).

Sixty-one percent of women in the study were postmenopausal, with a mean age at natural menopause of 50 years. Early menopause was also linked with increased risk of cardiovascular disease (adjusted HR 1.33, 95% CI 1.19-1.49), coronary heart disease (adjusted HR 1.29, 95% CI 1.10-1.51), and stroke (adjusted HR 1.42, 95% CI 1.21-1.66).
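Adjusted hazard ratios like these are typically estimated with a Cox proportional hazards model, where the coefficient lives on the log scale and the ratio and its 95% confidence interval are obtained by exponentiating. A minimal sketch of that arithmetic, using invented values of `beta` and `se` (not the study’s actual estimates):

```python
import math

def hazard_ratio_ci(beta, se, z=1.96):
    """Convert a log-hazard coefficient and its standard error
    into a hazard ratio with a 95% confidence interval."""
    hr = math.exp(beta)
    lower = math.exp(beta - z * se)
    upper = math.exp(beta + z * se)
    return hr, lower, upper

# Hypothetical values, chosen only to mimic an HR near 1.10
hr, lo, hi = hazard_ratio_ci(beta=0.095, se=0.06)
print(f"HR {hr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

A confidence interval whose lower bound stays above 1.0 (as for early menopause above) indicates a statistically significant increase in risk; an interval spanning 1.0 does not.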

Likewise, history of hysterectomy was linked with an increased risk of cardiovascular disease (adjusted HR 1.12, 95% CI 1.03-1.22) and coronary heart disease (adjusted HR 1.20, 95% CI 1.07-1.34).

Eighty-five percent of women had been pregnant, and 44% of women had two children, while 42% of men had fathered two children. Compared with women without children, women with children had a significantly higher risk of coronary heart disease (adjusted HR 1.21, 95% CI 1.05-1.40). But because a similar risk was seen among men with children (adjusted HR 1.13, 95% CI 1.04-1.23), the authors concluded that “this is unlikely to be due to a biological cause.”

The authors suggested that “more frequent cardiovascular screening would seem to be sensible among women who are early in their reproductive cycle, or who have a history of adverse reproductive events or a hysterectomy, as this might help to delay or prevent their onset of CVD.”

Kamal R Mahtani: Using systematic reviews to reduce research waste—who really cares?


The global spend on biomedical research and development is estimated to be about $250 billion (£203 bn; €233 bn) each year—a not insignificant figure. In fact, it roughly equates to the amount that the UK government spends each year on its combined education, defence, and welfare budgets.


But suppose you heard that the UK government’s budget for education, defence, and welfare was being wasted and provided no public benefit at all—what would your reaction be? Surprise? Indignation? Anger?

Chalmers and Glasziou’s revelation that about 85% of biomedical research may be wasted should evoke at least as great an outraged response. This estimate of wastage is derived from a number of factors—including failure to conduct, describe, and publish research to standards that make the outputs useful. This is particularly concerning, given that much of this research is expected to provide some form of patient benefit.

Among the proposed steps to reduce this waste is a recommendation that new research should not be conducted until a systematic assessment of existing research has been performed. There are at least two reasons why this makes total sense. Why would you want to conduct a new study if the answer was already available? And, more importantly, if the answer was available already, would it even be ethical to enter participants into a new study when some might fare worse?

Some public funders of research, such as the UK’s National Institute for Health Research (NIHR), which manages an annual budget of about £1 billion for applied research, recognise the importance of systematic reviews in reducing research waste. As one example of this, new applicants seeking funding from the NIHR to support primary research must now ensure that their request is informed by knowledge of the existing evidence.

But what about other funders? The aim of a recent survey of the websites of 11 international research funding organisations was “to indicate the extent to which each organisation adopted waste reducing policies and processes.” The authors extracted information on whether the funder involved members of the public in setting research priorities; made full protocols for funded research publicly accessible; and had specific funding streams for methods research. The survey also identified information about how the funders supported the conduct of systematic reviews.

Although the authors acknowledged using their own judgment in interpreting the findings, they also report that they checked the accuracy of those findings with each organisation. According to their survey results, only two of the funders, the NIHR and the Canadian Institutes of Health Research, had clearly dedicated streams to fund systematic reviews. The survey also concluded that only one funder, the NIHR, made it a prerequisite that all applications for support should be informed by a systematic review of the existing evidence.

Scientific, ethical, and economic reasons make it essential to conduct a systematic review of existing evidence before considering a new study. Failure to do this is simply poor practice. This recent survey shows that there is still work to be done to ensure that this principle is clearly and demonstrably supported. Given their large budgets and major influence on what does and does not get financial support, funding agencies have a particular responsibility to lead this initiative.

Kamal R Mahtani is an NHS GP and deputy director of the Centre for Evidence Based Medicine, Nuffield Department of Primary Care Health Sciences, University of Oxford.

Source: BMJ

Blind British man in world’s first operation to deliver modified DNA to his eyes


A 29-year-old man with an inherited form of blindness has become the first in the world to receive groundbreaking gene therapy  

Thousands of people born with a faulty gene which makes them go blind have been offered new hope after a British man underwent the world’s first operation to deliver new DNA to his eyes and restore his sight.

Around 15,000 people in Britain suffer from X-linked retinitis pigmentosa, a deteriorating condition which brings a slow and irreversible loss of vision, and which is the leading cause of blindness in young people.

Loss of sight occurs because a gene responsible for maintaining the light sensitive cells at the back of the eye is missing half of its DNA code.

But scientists can now replace the code using a groundbreaking technique which reprogrammes the gene in the lab, then delivers the healthy DNA into the eye via a harmless virus.

Last Thursday, a 29-year-old man became the first person in the world to undergo the procedure at Oxford Eye Hospital and is now recovering.

Robert MacLaren, Professor of Ophthalmology at the University of Oxford, who is leading the trial, said: “He is doing well and now at home, but we will have to wait a few years to know if it has stopped his retina from degenerating.

“The effect of disease on families with retinitis pigmentosa is devastating and we have spent many years working out how to develop this gene therapy.

“Changing the genetic code is always undertaken with great caution, but the new sequence we are using has proven to be highly effective in our laboratory studies.

“The genetic code for all life on Earth is made up of four letters – G, T, A and C. In retinitis pigmentosa, however, half of the RPGR gene comprises only two letters – A and G.

“This makes the gene very unstable and prone to mutations, making it a lead cause of blindness in patients with retinitis pigmentosa. RPGR is vital for the light sensitive cells at the back of the eye.”
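As a purely illustrative aside, the “two-letter” property Professor MacLaren describes can be quantified as purine content, since A and G are the two purine bases. A short sketch using invented sequences (not the real RPGR ORF15 sequence):

```python
def purine_fraction(seq):
    """Fraction of bases in a DNA sequence that are purines (A or G)."""
    seq = seq.upper()
    return sum(base in "AG" for base in seq) / len(seq)

# Invented example sequences, for illustration only
balanced = "GATCGATCGATC"       # all four letters represented
purine_rich = "AGGAGAAGGAGG"    # A/G only, mimicking a repetitive region
print(purine_fraction(balanced))     # 0.5
print(purine_fraction(purine_rich))  # 1.0
```

Low-complexity, purine-rich repeats of this kind are prone to replication errors, which is the instability the article describes.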

Robert MacLaren, Professor of Ophthalmology at the University of Oxford

Retinitis pigmentosa affects 1 in 4000 people, with symptoms that typically appear between age 10 and 30. Night vision and peripheral vision go first, as the photoreceptors active in low light – the ‘rods’ – start to degenerate.

Eventually the condition affects the ‘cones’ – the photoreceptors responsible for central, detailed, colour vision, causing complete sight loss.

Doctors want to enroll at least 24 more patients in the trial to find out if the technique is safe and effective.

The combined NHS, University of Oxford and Nightstar gene therapy team

Are IVF pregnancies more ‘precious’?



Do women with IVF pregnancies need special attention?

Women who have gone through fertility treatment often say it had a huge emotional and psychological impact on them and their partners.

In many cases, couples have spent years trying to conceive before going through several cycles of IVF, which can be expensive and traumatic, with no guarantee of success.

So are pregnancies achieved through assisted fertility treatments viewed as inherently more precious to everyone involved?


A study from Plymouth University published last month suggests they are. Dr Yaniv Hanoch asked 160 Israeli obstetricians and gynaecologists whether they would recommend a test for a serious medical condition during pregnancy.

He found that doctors were three times more likely to recommend the test, which carried a small risk, for a natural pregnancy than for an IVF pregnancy.

Dr Hanoch, associate professor in psychology, said: “When considering a procedure that may endanger a pregnancy, the value ascribed to loss of that pregnancy may seem greater if the pregnancy was achieved by tremendous effort.”

In 2005, Minkoff and Berkowitz published a study in the American journal Obstetrics and Gynecology entitled ‘The Myth of the Precious Baby’.

It said that because increasing numbers of pregnant women were aged over 40 and more were pregnant thanks to assisted reproductive technologies, this had resulted in more caesarean deliveries, reinforcing the idea among obstetricians that they were dealing with ‘precious babies’.


On the ground, there is less evidence of sensitivity and understanding from health professionals towards women with IVF pregnancies.

Susan Seenan, from the Infertility Network UK, says the system lets these women down.

“When these women finally go to their GP and say they are pregnant, they are referred for ante-natal care and that’s it.

“Even when they make it known they have had IVF, they are seen as just another pregnant lady.”

She says sometimes even when women reveal they have suffered miscarriages or have had fertility issues, there is a lack of sympathy.


She says fertility treatment is widely recognised to be a physically, psychologically and financially demanding process – and it can leave women feeling they have been on an ‘emotional rollercoaster’.

“A lot of women feel very anxious, because they have been through so much, and many women really do worry about whether everything will be OK.

“Until you go home with your baby in your arms, that anxiety is always there. People need to understand why they are feeling vulnerable and anxious.

“If they have been through the IVF system they will have had a lot of attention, appointments, blood tests and scans – and they expect that attention to continue.”

Instead, many women are left feeling isolated when they are most in need of reassurance.

Seenan says this could be remedied by providing support in the form of a phone line to call in times of anxiety or information leaflets to read.

Research does seem to confirm higher levels of anxiety in women with IVF pregnancies, says Julie Jomeen, professor of midwifery at Hull University, who adds that their feelings can mean they want a more medicalised approach to their pregnancy.

Older mums-to-be may request a caesarean section delivery, believing that it is safer, for example.


Or a woman who is scared of losing her baby throughout pregnancy may need reassurance that normal symptoms of pregnancy, such as backache, are not something more serious.

Mr Tim Child, medical director at the Oxford Fertility Unit at the University of Oxford, acknowledges that women who have conceived naturally can have anxieties too, but he says it would be understandable if IVF women felt they needed more support.

“Some women like to feel they have access to extra information as required, even if it’s just a phone number to speak to a midwife about any aches or pains.”

He says not all women want to disclose that they have been through IVF because there is still some stigma attached to it. Others may want to be treated the same as every other woman, so their IVF history may not always appear on their personal notes.

Medically, there are slightly higher risks of complications in IVF pregnancies, particularly if the woman is older, has underlying health problems or is having twins, so Mr Child says consultants should be vigilant.

A study is currently underway at Oxford into how midwives care for women who have had fertility treatment.

When women with IVF pregnancies are open about their anxieties, what they are looking for is not special treatment in the belief that their baby is more precious than anyone else’s, but reassurance and support during the final stages of a long and emotional journey.

Even when the baby is born, it doesn’t end, Susan Seenan says.

“Because the baby has been wanted for so long, you put pressure on yourself to be a perfect parent. So you’re not allowed to complain when it cries at night or doesn’t feed well. But in the end, we are just parents like anyone else.”

Treatment for Dormant Malaria Shows Promise.


The first new drug in half a century to target malaria parasites in one of their best hideouts is showing encouraging results. The researchers developing the drug, called tafenoquine, said today that data from a recently completed phase II trial were promising enough that they will soon start a phase III trial—the last step before asking drug regulators for approval.

Tafenoquine kills the malaria parasite when it is lurking in liver cells, in a form called the hypnozoite, or “sleeping parasite.” Hypnozoites don’t cause any symptoms and are impossible to detect with blood tests. But when triggered by signals that aren’t fully understood, they can reactivate to cause a new bout of malaria—which can then be picked up by mosquitoes and passed on to new victims. Five species of Plasmodium can cause malaria in humans. Two of them—Plasmodium vivax, which is widespread, and the relatively rare P. ovale—can form hypnozoites. This ability to hide is one of the things that makes P. vivax so difficult to eliminate from a region.

Now, the only treatment that can cure vivax malaria—hiding parasites and all—is a 14-day course of a drug called primaquine, which was developed in the 1940s. It works fairly well, but it is difficult for people who don’t feel ill to complete the whole 2 weeks. “The compliance with the current regimen is really a problem,” says JP Kleim, director of clinical development for the pharmaceutical company GlaxoSmithKline (GSK). “The acute malaria is gone after a few days [of treatment],” so patients’ motivation to continue taking drugs is low. That’s why GSK decided to develop tafenoquine, together with the Medicines for Malaria Venture, a Geneva-based nonprofit. The partners launched a trial in 2011 to test whether a single dose of tafenoquine could work as well as the 2-week course of primaquine.

Vivax malaria, shown here in the blood stream, can hide out—undetectable—in liver cells.

The data, presented today at the American Society of Tropical Medicine and Hygiene Annual Meeting in Washington, D.C., suggest that a single dose works very well. The trial involved 329 patients in Brazil, India, Thailand, and Peru. Among patients who received either a 300 mg or 600 mg dose of the drug, 90% had no relapse after 4 months. The partners will now go forward with a phase III trial, testing the safety and efficacy of the 300 mg dose in 600 patients, says Marcus Lacerda of the Fundação de Medicina Tropical Dr. Heitor Vieira Dourado in Manaus, Brazil, who helped coordinate the study and presented the results at the meeting.

A single-dose drug would be a huge advantage in the fight against vivax malaria, says Ric Price of the Menzies School of Health Research in Darwin, Australia, and the University of Oxford in the United Kingdom. “One of the biggest challenges we face is how can we adequately and reliably treat the hypnozoite stage.”

Toddler brain scan language insight.


Regions of the brain that show leftward asymmetry of myelin
The left hand side of the brain has more myelin

The brain has a critical window for language development between the ages of two and four, brain scans suggest.

Environmental influences have their biggest impact before the age of four, as the brain’s wiring develops to process new words, say UK and US scientists.

The research in The Journal of Neuroscience suggests disorders causing language delay should be tackled early.

It also explains why young children are good at learning two languages.

The scientists, based at King’s College London and Brown University, Rhode Island, studied 108 children with normal brain development between the ages of one and six.


They used brain scans to look at myelin – the insulation that develops from birth within the circuitry of the brain.

To their surprise, they found the distribution of myelin is fixed from the age of four, suggesting the brain is most plastic in very early life.

Any environmental influences on brain development will be strongest in infancy, they predict.

This explains why immersing children in a bilingual environment before the age of four gives them the best chance of becoming fluent in both languages, the research suggests.

It also suggests that there is a critical time during development when environmental influence on cognitive skills may be greatest.

Dr Jonathan O’Muircheartaigh, from King’s College London, led the study.

He told the BBC: “Since our work seems to indicate that brain circuits associated with language are more flexible before the age of four, early intervention for children with delayed language attainment should be initiated before this critical age.

“This may be relevant to many developmental disorders, such as autism, since delayed language is a common early trait.”

Growing vocabulary

Early childhood is a time when language skills develop very rapidly.

Babies have a vocabulary of up to 50 words at 12 months but by the age of six this has expanded to about 5,000 words.

Language skills are localised in the frontal areas of the left-hand side of the brain.

The researchers therefore expected more myelin to develop in the left-hand side of the brain, as the children learned more language.

In fact, they found the myelin distribution remained constant but had a stronger influence on language ability before the age of four, suggesting there is a crucial window for interventions in developmental disorders.

“This work is important as it is the first to investigate the relationship between brain structure and language across early childhood and demonstrate how this relationship changes with age,” said Dr Sean Deoni from Brown University, a co-researcher on the study.

“This is important since language is commonly altered or delayed in many developmental disorders, such as autism.”

Commenting on the study, Prof Dorothy Bishop of the department of Developmental Neuropsychology at the University of Oxford said the research added important new information about early development of connections in brain regions important for cognitive functions.

“There is suggestive evidence of links with language development but it is too early to be confident about functional implications of the findings,” she said.

“Ideally we would need a longitudinal study following children over time to track how structural brain changes relate to language function.”

The study was funded by the National Institute of Mental Health (US) and the Wellcome Trust (UK).

How to Regrow a Head.


A single gene switch makes worms regenerate their whole bodies from their tails

Knocking out a single gene can switch on a worm’s ability to regenerate parts of its body, even enabling it to grow a new head. The fact that such a simple manipulation can restore healing abilities provides new insight into how the stem cells involved in this process are marshaled in animals.


Some animals, such as salamanders and newts, can regenerate entire body parts, and mice can regrow toes if left with enough nail (see ‘How nails regenerate lost fingertips’). Yet other species, including humans, merely produce scar tissue after an amputation. A trio of studies published on Nature’s website today offers new clues as to what is behind these differences.

All three studies looked at Wnt genes, which code for a series of signalling proteins that relay information from outside the cell to the nucleus, eventually acting through proteins called β-catenins, which regulate gene expression. Wnt genes occur in all animals, but the studies looked at their roles in planarian flatworms. Some planarians can completely regenerate from small body parts such as their tails, whereas other flatworm species have more limited regenerative abilities.

Flatworm, heal thyself
Scientists already knew that the Wnt genes are expressed in a gradient along the worms’ bodies—from high at the tail to low at the head—and suspected that the genes were involved in directing stem cells during healing. In the latest studies, researchers wanted to find out if a lack of Wnt gene expression was responsible for the poorer regenerative abilities in particular worm species.

When these species are sliced apart at a point more than halfway to their tail ends, they can regenerate a tail from the head piece, but the tail section is unable to form a new head. However, if the wound is closer to the head—not more than about one-third of the way from it—then both parts will fully regenerate.

To explain the disparity, Jochen Rink, a molecular biologist at the Max Planck Institute of Molecular Cell Biology and Genetics in Dresden, Germany, sliced a worm called Dendrocoelum lacteum at different positions along its body. He and his team then sequenced RNA from the various wounds. The researchers found that, in wounds that did regrow heads, genes coding for components of the Wnt pathway had their expression turned up. But in the pieces that couldn’t regrow, the Wnt genes “didn’t even twitch”, Rink says.

In the second study, developmental biologists Phillip Newmark of the University of Illinois at Urbana-Champaign and James Sikes (now at the University of San Francisco in California) found similar roles for Wnt genes in a different species of worm, called Procotyla fluviatilis.

But perhaps most surprisingly, both teams found that by suppressing a gene that regulated Wnt function in their flatworms, they could get chunks of the normally non-regenerative tissue to grow fully functional heads.

“This is a fantastic advert for our field,” says Aziz Aboobaker, a biologist who studies planarian worms at the University of Oxford, UK, but was not involved in any of the studies. “Here’s a scenario where these animals don’t regenerate a brain, and then by knocking out just one gene, it’s possible to rescue that.”

Heady stuff
In the third study, Yoshihiko Umesono, now at the University of Tokushima in Japan, and colleagues found that in the flatworm Phagocata kawakatsui, another signaling cascade—the extracellular signal-regulated kinase (ERK) pathway—had a previously unsuspected role in regeneration.

In an e-mail to Nature, Umesono suggests that the effects of ERK proteins and Wnt proteins counteract each other. If the Wnt pathway dominates then it signals tail growth, but if ERK suppresses its influence then heads can form.

Because Wnt and ERK proteins are present in all animals, Rink suggests that regenerative capacity could exist in many species, but might be in a latent state because it is silenced. Once the silencing is removed, regeneration could reappear, he thinks.

“Sure, that’s a possibility,” says Aboobaker. But he thinks that the implications are broader than just worms regrowing heads.

“What’s happening here is that cells are reading their position in the body and then rebuilding the requisite structures,” Aboobaker says. “That’s also what happens when cells from your liver or kidney replace themselves—if we can understand those processes better, that’s useful.”

Source: http://www.scientificamerican.com

Computed Tomography for Adults with Right Lower Quadrant Pain.


In about a third of patients, diagnoses other than appendicitis were evident.

Most adults with acute right lower quadrant abdominal pain now undergo computed tomography (CT) when suspicion for appendicitis is at least moderate. In a study from one teaching hospital in Wisconsin, researchers reviewed the results of CT scans ordered explicitly to evaluate 1,571 consecutive adults for appendicitis or right lower quadrant pain. All patients were referred from the emergency department or urgent-care settings.

CT revealed appendicitis in 24% of patients; according to review of clinical records, sensitivity and specificity of CT for appendicitis were 99% and 98%, respectively. CT also demonstrated specific alternative diagnoses in 32% of patients, and no specific diagnoses in 45%. Adnexal abnormalities accounted for nearly one third of alternative diagnoses in women; otherwise, the spectrum of alternative diagnoses was fairly similar in men and women. The most common alternative diagnoses (as a proportion of the 496 patients with alternative diagnoses) were inflammatory enteritis or adenitis (17%), urolithiasis (12%), diverticulitis (8%), and constipation (7%). Small bowel obstruction, inflammatory bowel disease, and cholecystitis each accounted for 4% of alternative diagnoses.
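Sensitivity and specificity reduce to simple ratios over the diagnostic confusion matrix. A worked sketch with hypothetical counts, chosen only to be roughly consistent with the reported figures (24% prevalence among 1,571 patients, 99% sensitivity, 98% specificity); the study’s actual cell counts are not given here:

```python
def diagnostic_performance(tp, fn, tn, fp):
    """Sensitivity, specificity, and positive predictive value
    from confusion-matrix counts."""
    sensitivity = tp / (tp + fn)   # true positives / all with disease
    specificity = tn / (tn + fp)   # true negatives / all without disease
    ppv = tp / (tp + fp)           # true positives / all positive scans
    return sensitivity, specificity, ppv

# Hypothetical counts: 377 of 1,571 patients with appendicitis (~24%)
sens, spec, ppv = diagnostic_performance(tp=373, fn=4, tn=1170, fp=24)
print(f"sensitivity {sens:.1%}, specificity {spec:.1%}, PPV {ppv:.1%}")
```

Even with operating characteristics this good, at roughly 24% prevalence about 6% of positive scans in this sketch are false positives, one reason CT findings still need clinical correlation.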

Comment: This study confirms that CT is useful not only to improve accuracy of clinical examination for appendicitis diagnosis but also to identify alternative causes of right lower abdominal pain. However, because the threshold for performing CT has become quite low, CT results will be normal or nonspecific in many cases. Note that most patients in this study underwent full abdominal and pelvic CT with intravenous and oral contrast and not just focused appendix CT.

Source: Journal Watch General Medicine.