A spoon that steadies tremors


Just in time for the holidays, Google is throwing its money, brain power and technology at the humble spoon.

Anupam Pathak, a senior hardware engineer at Google, displays the prototype of the Liftware Spoon.

Of course these spoons (don’t call them spoogles) are a bit more than your basic utensil: using hundreds of algorithms, they allow people with essential tremor and Parkinson’s disease to eat without spilling.

The technology senses how a hand is shaking and makes instant adjustments to stay balanced. In clinical trials, the Liftware spoons reduced shaking of the spoon bowl by an average of 76 per cent.

“We want to help people in their daily lives today and hopefully increase understanding of disease in the long run,” said Google spokesperson Katelin Jabbari.

“It’s totally novel,” said UC San Francisco Medical Center neurologist Jill Ostrem, who specializes in movement disorders such as Parkinson’s disease and essential tremor.

She helped advise the inventors, and says the device has been a remarkable asset for some of her patients.

“I have some patients who couldn’t eat independently, they had to be fed, and now they can eat on their own,” she said. “It doesn’t cure the disease, they still have tremor, but it’s a very positive change.”

More than 10 million people worldwide have essential tremor or Parkinson’s disease.

Antarctic ice thicker than previously thought: study


Groundbreaking 3D mapping of previously inaccessible areas of the Antarctic has found that the sea ice fringing the vast continent is thicker than previously thought.

Two expeditions to Antarctica by scientists from the U.K., U.S. and Australia analysed an area of ice spanning 500,000 square metres, using a robot known as SeaBed.

The survey found that ice thickness averaged between 1.4m and 5.5m, with a maximum thickness of 16m. Scientists also discovered that 76 per cent of the mapped ice was ‘deformed’ — meaning that huge slabs of ice had crashed into each other to create larger, denser bodies of ice.

The team behind the research, published in Nature Geoscience, have hailed it as an important breakthrough in better understanding the vast icy wilderness. The findings will provide a starting point for further work to discover how ice thickness, as well as extent, is changing. Previously, measurements of Antarctic ice thickness were hindered by technological constraints.

SeaBed, an autonomous underwater vehicle (or AUV), was used by the research team to analyse ice thickness at an underwater depth of 20 to 30 metres. Driven in a “lawnmower” pattern, the two-metre long robot used upward-looking sonar to measure and map the underside of sea ice floes. Oceanography robots are usually focussed on the sea floor.

The mapping took place in 2010 and 2012. It took researchers to the coastal areas of the Weddell, Bellingshausen, and Wilkes Land regions of Antarctica. The teams came from the British Antarctic Survey, the Institute of Marine and Antarctic Studies in Tasmania and the Woods Hole Oceanographic Institution in the U.S.

Dr. Guy Williams, from IMAS, said the research is an important step in gauging changes to Antarctic ice. “Sea ice is an important indicator of the polar climate but measuring its thickness has been tricky,” he said. “Along with the satellite data, it was a bit like taking an X-ray of the ice although we haven’t X-rayed much of it, just a postage stamp.”

As well as tracking alterations due to climate change, the research will be of interest to marine biologists due to the creatures, such as krill, that inhabit the region.

Blu-ray disc can be used to improve solar cell performance


Who knew Blu-ray discs were so useful? Already one of the best ways to store high-definition movies and television shows because of their high-density data storage, Blu-ray discs also improve the performance of solar cells—suggesting a second use for unwanted discs—according to new research from Northwestern University.

An interdisciplinary research team has discovered that the pattern of information written on a Blu-ray disc—and it doesn’t matter if it’s Jackie Chan’s “Supercop” or the cartoon “Family Guy”—works very well for improving light absorption across the solar spectrum. And better yet, the researchers know why.

“We had a hunch that Blu-ray discs might work for improving solar cells, and, to our delight, we found the existing patterns are already very good,” said Jiaxing Huang, a materials chemist and an associate professor of materials science and engineering in the McCormick School of Engineering and Applied Science. “It’s as if electrical engineers and computer scientists developing the Blu-ray technology have been subconsciously doing our jobs, too.”

Blu-ray discs contain a higher density of data than DVDs or CDs, and it is this quasi-random pattern, perfected by engineers over decades for data storage, that, when transferred to the surface of solar cells, provides the right texture to improve the cells’ light absorption and performance.

Working with Cheng Sun, an associate professor of mechanical engineering at McCormick, Huang and his team tested a wide range of movies and television shows stored on Blu-ray discs, including action movies, dramas, documentaries, cartoons and black-and-white content, and found the video content did not matter. All worked equally well for enhancing light absorption in solar cells.

The findings will be published Nov. 25 in the journal Nature Communications.

In the field of solar cells, it is known that if texture is placed on the surface of a solar cell, light is scattered more effectively, increasing a cell’s efficiency. Scientists have long been searching for the most effective texture with a reasonable manufacturing cost.

The Northwestern researchers have demonstrated that a Blu-ray disc’s strings of binary code 0s and 1s, embedded as islands and pits to store video information, give solar cells the near-optimal surface texture to improve their absorption over the broad spectrum of sunlight.

In their study, the researchers first selected the Jackie Chan movie “Supercop.” They replicated the pattern on the active layer of a polymer solar cell and found the cell was more efficient than a control solar cell with a random pattern on its surface.

“We found a random pattern or texture does work better than no pattern, but a Blu-ray disc pattern is best of all,” Huang said. “Then I wondered, why did it work? If you don’t understand why, it’s not good science.”

Huang puzzled over the question of why for some time. One day, his wife, Shaorong Liu, a database engineer at IBM, suggested it likely had something to do with data compression. That was the insight Huang needed.

Huang and Sun then turned to McCormick colleague Dongning Guo, an expert in information theory, to investigate this idea. Guo is an associate professor of electrical engineering and computer science.

The researchers looked closely at the data processing algorithms in the Blu-ray standard and noted the algorithms serve two major purposes:

  • Achieving as high a degree of compression as possible by converting the video signals into a seemingly random sequence of 0s and 1s; and
  • Increasing error tolerance by adding controlled redundancy into the data sequence, which also limits the number of consecutive 0s and 1s.

These two purposes, the researchers said, have resulted in a quasi-random array of islands and pits (0s and 1s) with feature sizes between 150 and 525 nanometers. And this range, it turns out, works quite well for light-trapping applications over the entire solar spectrum.
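
To make the quasi-random structure concrete, here is a minimal Python sketch of the idea rather than the actual Blu-ray modulation code: it generates a random bit sequence whose runs of identical bits are bounded, then converts each run into a physical feature size. The run-length limits and the roughly 75 nm channel-bit pitch are illustrative assumptions, chosen so that the resulting features span approximately the 150–525 nm range reported in the study.

```python
import random

# Illustrative sketch only (not the real Blu-ray modulation code): build a
# quasi-random 0/1 sequence whose runs of identical bits are bounded, then map
# each run to a physical feature size. MIN_RUN, MAX_RUN and CHANNEL_BIT_NM are
# assumed values chosen to reproduce the ~150-525 nm feature-size range.
MIN_RUN, MAX_RUN = 2, 7      # assumed run-length limits, in channel bits
CHANNEL_BIT_NM = 75.0        # assumed physical length of one channel bit

def quasi_random_pattern(n_runs: int, seed: int = 0) -> list[int]:
    """Return a 0/1 sequence made of n_runs alternating runs of bounded length."""
    rng = random.Random(seed)
    bits, value = [], rng.randint(0, 1)
    for _ in range(n_runs):
        bits.extend([value] * rng.randint(MIN_RUN, MAX_RUN))
        value ^= 1           # alternate island / pit
    return bits

def feature_sizes_nm(bits: list[int]) -> list[float]:
    """Convert each run of identical bits into a feature size in nanometres."""
    sizes, run = [], 1
    for prev, cur in zip(bits, bits[1:]):
        if cur == prev:
            run += 1
        else:
            sizes.append(run * CHANNEL_BIT_NM)
            run = 1
    sizes.append(run * CHANNEL_BIT_NM)
    return sizes

if __name__ == "__main__":
    sizes = feature_sizes_nm(quasi_random_pattern(1000))
    print(f"features from {min(sizes):.0f} nm to {max(sizes):.0f} nm")  # ~150-525 nm
```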

The overall broadband absorption enhancement of a Blu-ray patterned solar cell was measured to be 21.8 percent, the researchers report.

“In addition to improving polymer solar cells, our simulation suggests the Blu-ray patterns could be broadly applied for light trapping in other kinds of solar cells,” Sun said.

“It has been quite unexpected and truly thrilling to see new science coming out of the intersection of information theory, nanophotonics and materials science,” Huang said.

Sweet-smelling breath to help diabetes diagnosis in children


The potential to quickly diagnose children with type 1 diabetes before the onset of serious illness could be achieved using a simple, non-invasive breath test, according to new research published today.

In one of the most comprehensive breath-based studies of children with type 1 diabetes performed to date, a team of researchers from Oxford, UK have linked a sweet-smelling chemical marker in the breath with a build-up of potentially harmful chemicals in the blood that accumulate when insulin levels are low.

It is hoped these results—linking an increased level of breath acetone with increased levels of ketones in the blood—could inspire the development of a diagnostic device to identify children with new diabetes before the onset of diabetic ketoacidosis (DKA).

The results of the study have been published today, 26 November, in IOP Publishing’s Journal of Breath Research.

DKA occurs when a severe lack of insulin means the body cannot use glucose for energy and starts to break down fat instead. Organic compounds called ketones are the by-product of the breakdown of fat and, if left unchecked, can build up and cause the body to become acidic.

About one in four children diagnosed with type 1 diabetes don’t know they have it until they develop DKA, which can cause severe illness.

Acetone, which is the simplest ketone, is one of the by-products produced in the development of DKA and is usually disposed of through the breath. Indeed, for over 200 years acetone has been known to produce a sweet smell on the breath of diabetes sufferers.

In their study, the researchers, from the University of Oxford, Oxford Medical Diagnostics and Oxford Children’s Hospital, collected breath samples from 113 children and adolescents between the ages of 7 and 18.

Isoprene and acetone were collected in breath bags and measurements were compared with capillary blood glucose and ketone levels, which were taken at the same time during a single visit to Oxford Children’s Hospital.

The researchers found a significant relationship between increased levels of acetone in the breath of the subjects and increased levels of blood ketones—specifically β-hydroxybutyrate.

They found no link between isoprene and acetone levels in breath and glucose levels in the blood.

Co-author of the study, Professor Gus Hancock, said: “While breath acetone has been measured in relatively large cohorts of healthy individuals, most measurements on people with type 1 diabetes have been carried out on relatively small cohorts, typically made up of less than 20 people, with relatively few measurements on children.

“Our results have shown that it is realistically possible to use measurements of breath acetone to estimate blood ketones.

“We are working on the development of a small hand held device that would allow the possibility of breath measurements for ketone levels and help to identify children with new diabetes before DKA supervenes. Currently testing for diabetes requires a blood test which can be traumatic for children.

“Also, if the relationship between breath acetone and blood ketone levels is true at higher levels of ketones, a simple breath test could assist with the management of sick days in children with type 1 diabetes, preventing hospital admissions by providing a warning of the possible development of DKA.”

Researchers find brain network link between development, aging and brain disease


A team of bio-researchers with members from across Europe has found evidence that suggests that grey matter development early in life tends to be the first to regress later in life—related findings also suggest a possible link between brain diseases such as Alzheimer’s and schizophrenia. In their paper published in Proceedings of the National Academy of Sciences, the team describes how they came to these conclusions after studying a large number of brain scans.

Many years ago, doctors often referred to schizophrenia as premature or early dementia. This was based on a theory first developed in the late 1800s called retrogenesis, which suggested that brain ability deteriorates in reverse order to how it develops. It was also believed to apply in an evolutionary sense, e.g. evolving from apes into humans. In this new effort, the researchers appear to have found evidence to back up this early theory.

The researchers analyzed MRI scans of 484 people who ranged in age from 8 to 85, looking for patterns, most specifically in grey matter, which is believed to coordinate higher-order processing, such as information related to sights and sounds. They did find a pattern—they noticed that the parts of the grey matter that develop last in young people were the first to deteriorate later in life due to natural aging. They also found that the same brain regions were impacted by both Alzheimer’s disease and schizophrenia, suggesting a possible link between the two and a link between brain disease and higher-order regions of the brain. The findings also suggest that it may soon be possible to offer an early diagnosis for such disorders by tracking late development of grey matter.

The researchers note that it’s even conceivable that some day in the future it might be possible to prevent these brain changes from occurring in the first place if abnormally late development can be prevented. They also note that their findings suggest that environmental factors that slow neural network development could be a contributing factor to life-long ailments, and eventually to certain types of dementia.

More information: A common brain network links development, aging, and vulnerability to disease, Gwenaëlle Douaud, PNAS, DOI: 10.1073/pnas.1410378111

Abstract
Several theories link processes of development and aging in humans. In neuroscience, one model posits for instance that healthy age-related brain degeneration mirrors development, with the areas of the brain thought to develop later also degenerating earlier. However, intrinsic evidence for such a link between healthy aging and development in brain structure remains elusive. Here, we show that a data-driven analysis of brain structural variation across 484 healthy participants (8–85 y) reveals a largely—but not only—transmodal network whose lifespan pattern of age-related change intrinsically supports this model of mirroring development and aging. We further demonstrate that this network of brain regions, which develops relatively late during adolescence and shows accelerated degeneration in old age compared with the rest of the brain, characterizes areas of heightened vulnerability to unhealthy developmental and aging processes, as exemplified by schizophrenia and Alzheimer’s disease, respectively. Specifically, this network, while derived solely from healthy subjects, spatially recapitulates the pattern of brain abnormalities observed in both schizophrenia and Alzheimer’s disease. This network is further associated in our large-scale healthy population with intellectual ability and episodic memory, whose impairment contributes to key symptoms of schizophrenia and Alzheimer’s disease. Taken together, our results suggest that the common spatial pattern of abnormalities observed in these two disorders, which emerge at opposite ends of the life spectrum, might be influenced by the timing of their separate and distinct pathological processes in disrupting healthy cerebral development and aging, respectively.

AVOID Oxygen? Evidence of Harm in MI


Results of a new trial suggest supplemental oxygen therapy in patients with ST-elevation MI (STEMI) may actually be harmful for patients who are not hypoxic[1].

The Air Versus Oxygen in ST-Elevation Myocardial Infarction (AVOID) trial compared supplemental oxygen with no oxygen unless oxygen saturation fell below 94%.

“The AVOID study found that in patients with ST-elevation myocardial infarction who were not hypoxic, there was this suggestion that, potentially, oxygen is increasing myocardial injury, recurrent myocardial infarction, and major cardiac arrhythmia and may be associated with greater infarct size at 6 months,” lead author Dr Dion Stub (St Paul’s Hospital, Vancouver, BC, and the Baker IDI Heart and Diabetes Institute, Melbourne, Australia) concluded.

“These findings certainly need to be confirmed in larger randomized trials that are powered for hard clinical end points, but the AVOID study investigators would really question the current practice of giving oxygen to all patients and certainly to those who have normal oxygen levels to begin with,” he concluded.

The results were presented here at the American Heart Association (AHA) 2014 Scientific Sessions.

Effects on Infarct

Following the first report in 1900 of supplemental oxygen relieving angina, pre- and in-hospital oxygen “has really been a fundamental component of first-aid management of patients with suspected acute myocardial infarction, and this is done the world over,” Stub said. International guidelines differ on who should be given oxygen, he noted, “but all guidelines recognize that this fundamental of practice has very limited evidence behind it in a randomized clinical-trial fashion.”

However, recent physiologic data suggest that even as little as 15 minutes of oxygen can cause hyperoxia, leading to a reduction in coronary blood flow, increased coronary vascular resistance, increased oxygen free radicals, and disturbed microcirculation, he said, “and this all may contribute to increased reperfusion injury, myocardial injury during acute coronary syndromes.”

AVOID was an investigator-initiated randomized, controlled, multicenter trial with the aim of comparing supplemental oxygen therapy with no oxygen in STEMI patients with oxygen saturation in the normal range.

It was a “pragmatic” trial coordinated by the research division of Ambulance Victoria, in conjunction with nine tertiary-care centers in Melbourne, Australia. “A key component of the trial is that all patients were randomized by the paramedics prehospital,” he noted.

Patients were included if they had symptoms suspicious of MI for less than 12 hours, normal oxygen levels, defined as O2sat>94% measured by pulse oximeter, and a diagnostic prehospital ECG with ST-elevation on two or more contiguous leads. Patients were excluded if oxygen saturation was below 94%, they were in an altered conscious state, they received oxygen prior to randomization, or there was planned transport to a nonstudy hospital.

Patients with confirmed STEMI randomized to the oxygen arm (n=218) received 8 L/min of O2 given prehospital right through to admission to the coronary cath lab for primary angioplasty and until they were stable on the ward. Patients in the no-oxygen arm (n=223) were given no oxygen unless they became hypoxic (O2sat<94%).

The co–primary end point was myocardial infarct size based on cardiac enzymes and other markers (mean peak creatine kinase, mean peak troponin I, area under the curve of creatine kinase and troponin I).

Patient characteristics in the groups were well-matched. The oxygen was appropriately administered, with 99.5% receiving oxygen via face mask in the prehospital and in-hospital setting, he said. In the no-oxygen group, 4.5% received oxygen prehospital for low oxygen saturations, 7.7% were treated during the procedure in the cath lab, and over 20% while on the ward.

This resulted in significant differences in oxygen saturation throughout the study, Stub noted. Cardiac arrest and cardiogenic shock occurred similarly between groups. Time from paramedic arrival on the scene to hospital arrival was approximately 55 minutes in both groups. Interestingly, they found no indication of symptomatic benefits of oxygen, with pain scores and administration of analgesics also similar in both groups, he pointed out. Details of the procedures were also not different between the study arms.

On the primary end point, they found a significant 25% increase in creatine kinase (CK)—”so the suggestion of increased myocardial injury in those delivered oxygen,” Stub noted.

For the troponin I co–primary end point, the curves were similar, but with wider confidence intervals. “We had one site that had issues early in the trial with regard to ascertaining the standardized troponin assay, so the confidence intervals were a little wider, and this was a nonsignificant result,” Stub said. “So on the one hand, you had a highly significant CK result, with a nonsignificant troponin.”

AVOID Primary End Point: Infarct Size on Cardiac Enzymes

Measure | Oxygen | No oxygen | Ratio of means (oxygen/no oxygen) | P
Creatine kinase (U/L), geometric mean peak (95% CI) | 1948 (1721–2205) | 1543 (1341–1776) | 1.26 (1.05–1.52) | 0.01
Creatine kinase (U/L), median peak (IQR) | 2073 (1065, 3753) | 1727 (737, 3598) | – | 0.04
Troponin I (µg/L), median peak (IQR) | 65.7 (30.1, 145.1) | 62.1 (19.2, 144.0) | – | 0.17
Troponin I (µg/L), geometric mean peak (95% CI) | 57.4 (48.0–68.6) | 48.0 (39.6–58.1) | 1.20 (0.92–1.55) | 0.18
IQR = interquartile range
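
The “ratio of means” column is the ratio of the geometric mean peaks, which is equivalent to exponentiating the difference in group means of the log-transformed enzyme values. A minimal Python sketch reproduces the reported point estimates from the geometric means in the table above; the confidence intervals and p-values depend on the patient-level data and are not reproduced here.

```python
import math

# The "ratio of means" in the AVOID enzyme table is the ratio of geometric mean
# peaks, i.e. exp(mean(log x_oxygen) - mean(log x_no_oxygen)). Here we only
# reproduce the point estimates from the published geometric means.
def ratio_of_geometric_means(gm_oxygen: float, gm_no_oxygen: float) -> float:
    return math.exp(math.log(gm_oxygen) - math.log(gm_no_oxygen))  # same as gm_oxygen / gm_no_oxygen

print(f"CK ratio:         {ratio_of_geometric_means(1948, 1543):.2f}")  # 1.26
print(f"Troponin I ratio: {ratio_of_geometric_means(57.4, 48.0):.2f}")  # 1.20
```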

A secondary end point was cardiac MRI (CMR), “the gold standard of final infarct size,” which was done in a subgroup of 135 patients at 6 months, he said.

The suggestion of increased infarct size and myocardial injury was again seen. “When we looked at late gadolinium enhancement, there was a significant difference between the oxygen and the no-oxygen group. When this was normalized for left ventricular mass, it was just a nonsignificant trend,” Stub added, but it still suggests the potential for increased myocardial injury.

AVOID: Infarct Size on CMR

Infarct size | Oxygen | No oxygen | Ratio of means (oxygen/no oxygen) | P
Median (IQR), g | 20.3 (9.6, 29.6) | 13.1 (5.2, 23.6) | – | 0.04
Geometric mean (95% CI), g | 14.6 (11.3–18.8) | 10.2 (7.7–13.4) | 1.43 (0.99–2.07) | 0.06
Median (IQR), proportion of LV mass | 12.6 (6.7, 19.2) | 9.0 (4.1, 16.3) | – | 0.08
Geometric mean (95% CI), proportion of LV mass | 10.0 (8.1–12.5) | 7.3 (5.7–9.3) | 1.38 (0.99–1.92) | 0.06
IQR = interquartile range

The study wasn’t powered to look at major adverse cardiac events, he noted. Mortality was similar between the groups, but significant increases were seen in recurrent MI and in significant arrhythmias in the oxygen group. No significant differences were seen at 6 months on the clinical end points, Stub said.

AVOID: Clinical End Points

End point | Oxygen (%) | No oxygen (%) | P
At hospital discharge
Mortality | 1.8 | 4.5 | 0.11
Recurrent MI | 5.5 | 0.9 | <0.01
Stroke | 1.4 | 0.4 | 0.30
Major bleeding | 4.1 | 2.7 | 0.41
Significant arrhythmia | 40.4 | 31.4 | 0.05
At 6 months
Mortality | 3.8 | 5.9 | 0.32
Recurrent MI | 7.6 | 3.6 | 0.07
Stroke | 2.4 | 1.4 | 0.43
Repeat revascularization | 11.0 | 7.2 | 0.17

“Hypothesis-generating” subgroup analysis showed that results in females, in those with the longest symptom-to-intervention times (>180 minutes), and in those with preintervention TIMI 2 or 3 blood flow all favored no oxygen therapy.

During discussion of the trial at a press conference, Stub stressed that these patients were normoxic. “We certainly are not proposing that all patients should not be given oxygen,” he said. “Clearly if you’re having an acute coronary syndrome or any condition in which you’re hypoxic, then oxygen clearly is a lifesaving drug.”

Stub noted that this issue is being further studied by researchers with the Swedish Coronary Angiography and Angioplasty Registry (SCAAR), a national registry of consecutive patients from 29 hospitals in Sweden where angiography and PCIs are performed. The registry was established in 1989 and is independent of funding from industry.

“I think this will be a fantastic study,” Stub told heartwire, and at about 5000 patients, it will have the power to look at clinical end points like mortality. Results are expected to be available in a couple of years.

Dr Robert Harrington (Stanford University, CA), chair of the program committee for this year’s AHA Scientific Sessions, who moderated the press conference, pointed out the SCAAR group conducted the recent TASTE trial of thrombectomy in MI.

“They’re now doing a series of trials, and one of them is a randomized trial of oxygen and mortality as the primary outcome, so kudos,” he said. “They have a system where they can do this very readily.”

Breaking Up With MONA?

The invited discussant for this trial was Dr Karl B Kern (University of Arizona, Tucson), who pointed out that all cardiologists are familiar with MONA, which stands for morphine, oxygen therapy, nitrates, and aspirin.

“We were all taught that MONA is our friend anytime we met a cardiac patient with ischemic chest pain,” Kern said. “But this may be, with the AVOID trial, the beginning of her demise.”

Every good trial has to start with a mnemonic, he said wryly, “and I congratulate the AVOID trial. Should we avoid oxygen after the AVOID trial? Maybe so.”

Results of a Cochrane review on oxygen therapy released last year[2] combined the small studies that have been done to date, he said, “and when combined, the data were clearly inconclusive. It actually suggested harm but was underpowered, so no real conclusion could be made.”

The current study took on that question, and in the prehospital setting, which makes it “even more remarkable,” Kern said. “They found, as they prespecified, that their primary end point of infarct size was significantly less without oxygen. That’s an astounding finding, and really one that I think will cause many cardiologists and physicians to take note and perhaps step back.”

He cautioned, though, that infarct size using biomarkers is still a surrogate end point, not hard clinical outcomes, and the use of biomarkers, “although admirable, is perhaps not today the most accurate.” What is accurate, on the other hand, is the use of CMR, “and the data held up at 6 months as well—with oxygen therapy there was a larger infarct.”

He raised a few issues, though, with the study. One is that they used 6 to 8 L/min of oxygen, while in the hospital phase “we would use less,” 2 to 4 L/min, particularly for this population of patients who were not hypoxic, Kern noted. “The effect of that extra oxygen is not known.”

Although the curves for oxygen saturation clearly separated in the trial, he said he would be interested to see blood gas measures, but these were not available since it was done out of hospital.

Finally, while they were secondary end points, there was an increase in significant arrhythmias and recurrent infarction in those given oxygen. “Certainly the arrhythmias could be explained by microvascular damage and ongoing ischemia, but perhaps not quite so with the recurrent infarctions, which are more typically plaque rupture during the hospital course before discharge,” he said.

“But I really congratulate the authors on this very provocative, if not in fact definitive, trial,” Kern said, and it will be of interest to see if there will ultimately be a mortality difference.

“So back to MONA,” he concluded. “Should we divorce her, or as Neil Sedaka said, at least break up? I guess I’m not quite ready to do that, but I’m certainly willing to date her less often.”

The study was funded by Alfred Hospital Foundation, FALCK Foundation, and Paramedics Australia. Stub reports research grants from the Royal Australian College of Physicians and Cardiac Society of Australia and other research support from St Jude Medical. The coauthors report no relevant financial relationships.

Obesity Tied to Brain Volume Loss


Being overweight or obese is associated with poorer brain health in cognitively healthy adults in their 60s, according to new data from the long-running Australian PATH Through Life Study.

After adjustment for multiple factors, participants who were overweight or obese had smaller hippocampal volume at baseline and experienced greater hippocampal atrophy over 8 years than their normal-weight peers.

“The results further underscore the importance of reducing the rate of obesity through education, population health interventions, and policy,” Nicolas Cherbuin, PhD, from the Australian National University in Canberra, Australia, said in a statement.

He reported the findings in Washington, DC, at the Society for Neuroscience 2014 Annual Meeting.

Increased Dementia

Obesity is a “major concern” and has been linked to an increased risk for dementia, Dr Cherbuin said during a media briefing. The hippocampus plays a key role in long-term memory, and hippocampal atrophy is a hallmark of cognitive decline.

Dr Cherbuin reported on 420 cognitively healthy adults aged 60 to 64 years participating in the PATH study on aging. As part of the study, body mass index (BMI) was recorded and high-resolution T1-weighted MRI was performed at study outset and then 4 and 8 years later.

At baseline, BMI was negatively correlated with left hippocampal volume (estimate per unit BMI above 25: –10.65 mm3; P = .027) and right hippocampal volume (estimate: –8.18 mm3; P = .097).

During follow-up, participants with higher BMI experienced greater atrophy in the left (P = .001) but not the right (P = .058) hippocampus, even after adjustment for age, sex, education, diabetes, hypertension, smoking, and depression.

Each 2-point increment in BMI at baseline was associated with a 7.2% decrease in left hippocampal volume during follow-up. “This is particularly significant in an aging population, and further research should be conducted to determine how obesity affects thinking abilities,” Dr Cherbuin said.
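
As a rough numerical illustration of how these reported estimates scale, the minimal Python sketch below combines the two coefficients from the study; the example BMI values, the use of BMI 25 as a reference point, and the linear extrapolation are assumptions made purely for illustration.

```python
# Illustrative arithmetic only. The two coefficients are taken from the reported
# results: -10.65 mm^3 of baseline left hippocampal volume per BMI unit above 25,
# and a 7.2% decrease in left hippocampal volume over follow-up per 2-point
# increment in baseline BMI. Treating both relationships as linear and using
# BMI 25 as the reference point are simplifying assumptions for this sketch.
BASELINE_MM3_PER_BMI_UNIT = -10.65
FOLLOWUP_PCT_PER_2_BMI = 7.2

def baseline_volume_difference_mm3(bmi: float) -> float:
    """Estimated baseline left hippocampal volume difference vs BMI 25 (mm^3)."""
    return BASELINE_MM3_PER_BMI_UNIT * max(bmi - 25.0, 0.0)

def followup_loss_pct(bmi: float, reference_bmi: float = 25.0) -> float:
    """Follow-up volume loss (%) implied by extrapolating the 2-point rule."""
    return FOLLOWUP_PCT_PER_2_BMI * (bmi - reference_bmi) / 2.0

for bmi in (27.0, 30.0, 35.0):  # hypothetical example BMIs
    print(f"BMI {bmi:.0f}: {baseline_volume_difference_mm3(bmi):+.1f} mm^3 at baseline, "
          f"~{followup_loss_pct(bmi):.1f}% volume loss over follow-up")
```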

“We did not investigate the relationship between shrinkage and function, but other studies in this research field have shown that greater shrinkage in the hippocampus is linked with a greater risk of cognitive decline and a greater risk of dementia as well,” he said.

In an interview with Medscape Medical News, Ralph DiLeone, PhD, from Yale University in New Haven, Connecticut, who moderated the media briefing, said more information on outcomes would be of interest.

“Because the hippocampus is so important for memory function, mood regulation and is implicated in cognitive aging and dementia, it will be very interesting to see if the researchers can correlate some of those brain changes with specific behavioral deficits or disease states,” he said.

Lung cancer prevalence on the rise


Ontario’s lung cancer prevalence, or the number of people with a previous lung cancer diagnosis who are still alive, is increasing and shows slightly different patterns for men and women. The 10-year prevalence of lung cancer cases in men increased from 1991 to 1998, remained stable from 1998 to 2004, and has shown modest increases since 2004. The 10-year lung cancer prevalence in women, however, has consistently increased since 1991. Lung cancer is one of the top four cancers diagnosed in both sexes.

Figure: Trends in 10-year prevalence* for lung cancer†, Ontario, by sex (line graph).

  • Prevalence depends on incidence and survival.
  • Lung cancer prevalence is increasing due to increasing incidence, improved survival, and population aging and growth.
  • Lung cancer is a common cancer with high fatality, resulting in relatively low prevalence.

Figure Description

This figure is a line chart titled ‘Trends in 10-year prevalence for lung cancer, Ontario, by sex’.

The horizontal axis is labeled ‘Year’ and lists the years from 1991 to 2009 in increments of two years.

The vertical axis is labeled ‘Number of prevalent cases’ and increases from 0 to 9,000 in increments of 1,000.

Two lines represent the number of prevalent cases for males and females respectively.

Trends in 10-year prevalence* for lung cancer†, Ontario, by sex

Year Males Females
1991 6312 3984
1992 6515 4295
1993 6640 4436
1994 6661 4566
1995 6714 4818
1996 6843 5038
1997 6905 5343
1998 7008 5573
1999 6944 5812
2000 7055 6004
2001 7138 6249
2002 7088 6401
2003 6989 6589
2004 7028 6925
2005 7195 7321
2006 7213 7520
2007 7257 7809
2008 7406 8031
2009 7558 8381

Data source: Cancer Care Ontario (Ontario Cancer Registry, 2012)
*Prevalence is the number of Ontarians diagnosed during the previous ten years who are still alive on January 1, each year.
✝ Lung cancer (ICD-O-3 C34).

Prevalence is impacted by the number of new people diagnosed (incidence), the number of people who survive the disease, and population aging and growth. Therefore, this steadily increasing trend in women is expected due to a combination of the rising number of new lung cancer cases and improved survival. Survival for lung cancer in both sexes has improved significantly, rising from 17% in 1996–2000 to 19% in 2006–2010 [1]. Almost 16,000 Ontarians alive on January 1, 2010 had been diagnosed with lung cancer in the previous 10 years.

Lung cancer incidence in men is more complex, with trends differing over time by age group. The incidence counts in men for all ages combined have only modestly increased since 1997, which is likely the result of population aging and growth, improvements in survival for lung cancer in general, and a reduction in the prevalence of certain risk factors, such as smoking and occupational exposures (e.g., asbestos and radon). Although smoking rates have declined in both sexes, the decline began earlier in males than in females, contributing to the more modest increase in both incidence and prevalence for males from 1991–2009. Despite these trends, the prevalence of lung cancer is relatively low because lung cancer is highly fatal.
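
As a quick arithmetic check on the data table above, the minimal Python sketch below compares the 1991 and 2009 prevalence counts by sex; the percent-change comparison is illustrative and not part of the original report.

```python
# Percent change in 10-year lung cancer prevalence, Ontario, 1991 vs 2009,
# using the male and female counts from the data table above.
prevalence = {
    "Males":   {1991: 6312, 2009: 7558},
    "Females": {1991: 3984, 2009: 8381},
}

for sex, counts in prevalence.items():
    change = 100.0 * (counts[2009] - counts[1991]) / counts[1991]
    print(f"{sex}: {counts[1991]} -> {counts[2009]} prevalent cases ({change:+.0f}%)")

# Males rose by roughly 20% while females more than doubled, consistent with the
# steadily rising female trend and the more modest male trend described above.
```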

Estimates of prevalence for a 10-year period represent a mixture of individuals at different stages of the cancer experience, from the newly diagnosed through to the long-term survivors. Healthcare needs vary across this 10-year period and include active treatment, follow-up and treatment for recurrences, and end-of-life or palliative care. Prevalence is one of many indicators of the burden of cancer for individuals, families and health services.

References

  1. Cancer Quality Council of Ontario: Cancer System Quality Index (CSQI) 2014 [Internet]. Cancer in Ontario [cited 2014 November 18]. Available from: http://www.csqi.on.ca/cancer_in_ontario.

First US bariatric embolization procedure performed


The first US bariatric embolization was recently performed by physicians at Dayton Interventional Radiology, according to a press release from the institution.

The procedure is minimally invasive. A catheter is placed into the patient’s groin or wrist and guided to the left gastric artery; blood flow to the branches of the artery is then blocked by particles smaller than a grain of sand. To date, this is the first catheter procedure used to treat obesity, according to the release.

“More than one-third of US adults are obese and approximately 5% to 7% are morbidly so,” Mubin I. Syed, MD, of Dayton Interventional Radiology, said in the release.

According to Syed, bariatric surgery is currently the only long-term procedural solution for substantial weight loss.

“You’ve got the ideal rationale to develop a minimally invasive procedure that could deliver long-term success,” Syed said. “To me that’s promising and why I practice medicine.” Despite the potential to help the obese, bariatric embolization is still in the experimental phase. Therefore, its safety and long-term benefit have yet to be proven.

Research on this procedure is expected to conclude in September 2015, and investigators will collect data on safety and efficacy for patients with obesity. Patients are aged 22 to 65 years, in relatively good health and <400 lb.

“We are combining years of scientific research on the hormone ghrelin, looking at its role with respect to appetite suppression and combining it with an everyday type of embolization procedure that interventional radiologists routinely perform,” study researcher Kamal Morar, MD, said in the release.

Optimal Duration of Low Molecular Weight Heparin for the Treatment of Cancer-Related Deep Vein Thrombosis: The Cancer-DACUS Study.


PURPOSE: We evaluated the role of residual vein thrombosis (RVT) to assess the optimal duration of anticoagulants in patients with cancer who have deep vein thrombosis (DVT) of the lower limbs.
PATIENTS AND METHODS: Patients with active cancer and a first episode of DVT treated with low molecular weight heparin (LMWH) for 6 months were eligible. Patients were managed according to RVT findings: those with RVT were randomly assigned to continue LMWH for an additional 6 months (group A1) or to discontinue it (group A2), and patients without RVT stopped LMWH (group B). The primary end point was recurrent venous thromboembolism (VTE) during the 1 year after discontinuation of LMWH, and the secondary end point was major bleeding. Analyses are from the time of random assignment.
RESULTS: Between October 2005 and April 2010, 347 patients were enrolled. RVT was detected in 242 patients (69.7%); recurrence occurred in 22 of the 119 patients in group A1 compared with 27 of 123 patients in group A2. The adjusted hazard ratio (HR) for group A2 versus A1 was 1.37 (95% CI, 0.7 to 2.5; P = .311). Three of the 105 patients in group B developed recurrent VTE; adjusted HR for group A1 versus B was 6.0 (95% CI, 1.7 to 21.2; P = .005). Three major bleeding events occurred in group A1, and two events each occurred in groups A2 and B. The HR for major bleeding in group A1 versus group A2 was 3.78 (95% CI, 0.77 to 18.58; P = .102). Overall, 42 patients (12.1%) died during follow-up as a result of cancer progression.
CONCLUSION: In patients with cancer with a first DVT, treated for 6 months with LMWH, absence of RVT identifies a population at low risk for recurrent thrombotic events. Continuation of LMWH in patients with RVT up to 1 year did not reduce recurrent VTE.