The Coming Amnesia


In a talk delivered in Amsterdam a few years ago, science fiction writer Alastair Reynolds outlined an unnerving future scenario for the universe, something he had also recently used as the premise of a short story (collected here).

As the universe expands over hundreds of billions of years, Reynolds explained, there will be a point, in the very far future, at which all galaxies will be so far apart that they will no longer be visible from one another.

Upon reaching that moment, it will no longer be possible to understand the universe’s history—or perhaps even that it had one—as all evidence of a broader cosmos outside of one’s own galaxy will have forever disappeared. Cosmology itself will be impossible.

In such a radically expanded future universe, Reynolds continued, some of the most basic insights offered by today’s astronomy will be unavailable. After all, he points out, “you can’t measure the redshift of galaxies if you can’t see galaxies. And if you can’t see galaxies, how do you even know that the universe is expanding? How would you ever determine that the universe had had an origin?”
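To make that logic concrete, here is the standard textbook relation involved (my gloss, not part of Reynolds's talk): expansion is inferred by measuring galaxies' redshifts and showing that recession velocity grows linearly with distance.

```latex
% Redshift of light from a receding galaxy:
z = \frac{\lambda_{\mathrm{observed}} - \lambda_{\mathrm{emitted}}}{\lambda_{\mathrm{emitted}}},
\qquad v \approx cz \quad (\text{for } z \ll 1)

% Hubble's law: the linear velocity-distance relation that signals expansion.
v = H_0 \, d
```

Both z and d require visible galaxies to measure; with none in view, the velocity-distance relation has no data points, and the inference of expansion never gets off the ground.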

There would be no reason to theorize that other galaxies had ever existed in the first place. The universe, in effect, will have disappeared over its own horizon, into a state of irreversible amnesia.

It was an interesting talk, one I had the pleasure of catching in person, and, for those interested, it also includes Reynolds’s explanation of how he shaped this idea into a short story.

More to the point, however, Reynolds was originally inspired by an article published in Scientific American back in 2008 called “The End of Cosmology?” by Lawrence M. Krauss and Robert J. Scherrer.

That article’s sub-head suggests what’s at stake: “An accelerating universe,” we read, “wipes out traces of its own origins.”

As Krauss and Scherrer point out in their provocative essay, “We may be living in the only epoch in the history of the universe when scientists can achieve an accurate understanding of the true nature of the universe.”

“What will the scientists of the future see as they peer into the skies 100 billion years from now?” they ask. “Without telescopes, they will see pretty much what we see today: the stars of our galaxy… The big difference will occur when these future scientists build telescopes capable of detecting galaxies outside our own. They won’t see any! The nearby galaxies will have merged with the Milky Way to form one large galaxy, and essentially all the other galaxies will be long gone, having escaped beyond the event horizon.”

This won’t only mean fewer luminous objects to see in space; it will mean that, “as a result, Hubble’s crucial discovery of the expanding universe will become irreproducible.”

The authors go on to explain that even the chemical composition of this future universe will no longer allow for its history to be deduced, including the Big Bang.

“Astronomers and physicists who develop an understanding of nuclear physics,” they write, “will correctly conclude that stars burn nuclear fuel. If they then conclude (incorrectly) that all the helium they observe was produced in earlier generations of stars, they will be able to place an upper limit on the age of the universe. These scientists will thus correctly infer that their galactic universe is not eternal but has a finite age. Yet the origin of the matter they observe will remain shrouded in mystery.”
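The shape of that argument can be written as a one-line bound (my notation; the authors give no formula): attributing all observed helium to stars caps the age of the galaxy.

```latex
% If the observed helium mass fraction Y_obs were entirely made in stars
% enriching the gas at an average rate \dot{Y}, the age t would satisfy:
t \;\lesssim\; \frac{Y_{\mathrm{obs}}}{\dot{Y}}
```

The bound is finite, so these future scientists correctly conclude their universe is not eternal; what they can no longer recover is that most of that helium was actually primordial, forged in the first minutes after the Big Bang.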

In other words, essentially no observational tool available to future astronomers will lead to an accurate understanding of the universe’s origins. The authors call this an “apocalypse of knowledge.”

[Image: “The Christianized constellation St. Sylvester (a.k.a. Bootes), from the 1627 edition of Schiller’s Coelum Stellatum Christianum.”]

There are many interesting things here, including the somewhat existentially horrifying possibility that any intelligent creatures alive in that distant era will have no way to know what is happening to them, where things came from, even where they currently are (an empty space? a dream?), or why.

Informed cosmology will, by necessity, be replaced with religious speculation—with myths, poetry, and folklore.

It is worth asking, however briefly and with multiple grains of salt, if something similar has perhaps already occurred in the universe we think we know today—if something has not already disappeared beyond the horizon of cosmic amnesia—making even our most well-structured, observation-based theories obsolete. For example, could even the widely accepted conclusion that there was a Big Bang be just an ironic side-effect of having lost some other form of cosmic evidence that long ago slipped eternally away from view?

Remember that these future astronomers will not know anything is missing. They will merrily forge ahead with their own complicated, internally convincing new theories and tests. It is not out of the question, then, to ask if we might be in a similarly ignorant situation.

In any case, what kinds of future devices and instruments might be invented to measure or explore a cosmic scenario such as this? What explanations and narratives would such devices be trying to prove?

[Image: “Woodcut illustration depicting the 7th day of Creation, from a page of the 1493 Latin edition of Schedel’s Nuremberg Chronicle. Note the Aristotelian cosmological system that was used in the Middle Ages, below, with God and His retinue of angels looking down on His creation from above.” Image (and caption) from Star Maps: History, Artistry, and Cartography.]

Science writer Sarah Scoles looked at this same dilemma last year for PBS, interviewing astronomer Avi Loeb.

Scoles was able to find a small glimmer of light in this infinite future darkness, however: Loeb believes that there might actually be a way out of this universal amnesia.

“The center of our galaxy keeps ejecting stars at high enough speeds that they can exit the galaxy,” Loeb says. The intense and dynamic gravity near the galaxy’s central supermassive black hole flings them into space, where they will glide away forever like radiating rocket ships. The same process should still be at work a trillion years from now.

“These stars that leave the galaxy will be carried away by the same cosmic acceleration,” Loeb says. Future astronomers can monitor them as they depart. They will see stars leave, find themselves alone in extragalactic space, and begin rushing faster and faster toward nothingness. It would look like magic. But if those future people dig into that strangeness, they will catch a glimpse of the true nature of the universe.
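To get a feel for why ejected stars can serve as beacons at all, here is a rough, illustrative calculation (my own numbers, not Loeb's): a star flung out at typical hypervelocity speeds comfortably beats the galaxy's escape velocity, so it really does end up alone in intergalactic space, where cosmic acceleration can take over.

```python
# A back-of-the-envelope check (illustrative values, not from the article):
# does a hypervelocity star actually escape the galaxy? We model the mass
# enclosed within the star's starting orbit as a point mass, which is crude
# but good enough for an order-of-magnitude comparison.
import math

G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30       # solar mass, kg
PARSEC = 3.086e16      # one parsec, m

M_ENCLOSED = 3.0e11 * M_SUN   # assumed mass within ~8 kpc (illustrative)
R_START = 8_000 * PARSEC      # starting radius, roughly the Sun's orbit

def escape_velocity_kms(mass_kg: float, radius_m: float) -> float:
    """Escape velocity from a point mass, v_esc = sqrt(2GM/r), in km/s."""
    return math.sqrt(2.0 * G * mass_kg / radius_m) / 1000.0

v_escape = escape_velocity_kms(M_ENCLOSED, R_START)
v_ejected = 1000.0  # typical hypervelocity-star ejection speed, km/s

print(f"escape velocity: ~{v_escape:.0f} km/s")   # ~570 km/s with these inputs
print(f"ejection speed:  ~{v_ejected:.0f} km/s")
print("unbound, leaves the galaxy" if v_ejected > v_escape else "stays bound")
```

Once such a star is well outside the galaxy's gravitational grip, its motion begins to carry the imprint of cosmic acceleration, which is precisely the anomaly Loeb suggests future astronomers could notice.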

There might yet be hope for cosmological discovery, in other words, encoded in the trajectories of these bizarre, fleeing stars.

[Images: (top) “An illustration of the Aristotelian/Ptolemaic cosmological system that was used in the Middle Ages, from the 1579 edition of Piccolomini’s De la Sfera del Mondo.” (bottom) “An illustration (influenced by Peurbach’s Theoricae Planetarum Novae) explaining the retrograde motion of an outer planet in the sky, from the 1647 Leiden edition of Sacrobosco’s De Sphaera.” Images and captions from Star Maps: History, Artistry, and Cartography.]

There are at least two reasons why I have been thinking about this today. One was the publication of an article by Dennis Overbye earlier this week about the rate of the universe’s expansion.

“There is a crisis brewing in the cosmos,” Overbye writes, “or perhaps in the community of cosmologists. The universe seems to be expanding too fast, some astronomers say.”

Indeed, the expansion might be even more dramatic than currently believed, he explains, with the universe caught up in the long process of simply tearing itself apart.

One implication of this finding, Overbye adds, “is that the most popular version of dark energy—known as the cosmological constant, invented by Einstein 100 years ago and then rejected as a blunder—might have to be replaced in the cosmological model by a more virulent and controversial form known as phantom energy, which could cause the universe to eventually expand so fast that even atoms would be torn apart in a Big Rip billions of years from now.”
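A brief technical gloss (standard cosmology, not spelled out in Overbye's piece): the candidates are distinguished by the equation-of-state parameter w, the ratio of dark energy's pressure to its energy density.

```latex
% Dark energy's equation of state relates its pressure p to its density \rho:
w = \frac{p}{\rho c^{2}}
\qquad
\begin{cases}
w = -1, & \text{cosmological constant: constant density, steady acceleration} \\
w < -1, & \text{phantom energy: density grows with time, ending in a Big Rip}
\end{cases}
```

Because a phantom component grows rather than dilutes as space expands, the expansion rate itself increases without bound, eventually overwhelming galaxies, solar systems and, finally, atoms.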

If so, perhaps the far-future dark ages envisioned by Krauss and Scherrer will arrive a billion years or two earlier than expected.

The second thing that made me think of this, however, was a short essay called “Dante in Orbit,” originally published in 1963, that a friend sent to me last night. It is about stars, constellations, and the possibility of determining astronomical time in The Divine Comedy.

In that paper, Frederick A. Stebbins writes that Dante “seems far removed from the space age; yet we find him concerned with problems of astronomy that had no practical importance until man went into orbit. He had occasion to deal with local time, elapsed time, and the International Date Line. His solutions appear to be correct.”

Stebbins goes on to describe “numerous astronomical references in [Dante’s] chief work, The Divine Comedy”—albeit doing so in a way that remains unconvincing. He suggests, for example, that Dante’s descriptions of constellations, sunrises, full moons, and more will allow an astute reader to measure exactly how much time was meant to have passed in his mythic story, and even that Dante himself had somehow been aware of differential, or relativistic, time differences between far-flung locations. (Recall, on the other hand, that Dante’s work has been discussed elsewhere for its possible insights into physics.)

But what’s interesting about this is not whether or not Stebbins was correct in his conclusions. What’s interesting is the very idea that a medieval cosmology might have been soft-wired, so to speak, into Dante’s poetic universe and that the stars and constellations he referred to would have had clear narrative significance for contemporary readers. It was part of their era’s shared understanding of how the world was structured.

Now, though, imagine some new Dante of a hundred billion years from now—some new Divine Comedy published in a trillion years—and how it might come to grips with the universal isolation and darkness of Krauss and Scherrer. What cycles of time might be perceived in the lonely, shining bulk of the Milky Way, a dying glow with no neighbor; what shared folklore about the growing darkness might be communicated to readers who don’t know, who cannot know, how incorrect their model of the cosmos truly is?

Could diabetes drug slow Alzheimer’s?


A trial has begun to see whether a drug used to treat diabetes can slow the progression of Alzheimer’s disease.

The study will involve 200 patients with memory problems due to early Alzheimer’s. Laboratory research suggests that the drug, liraglutide, reduces brain inflammation, improving the growth of brain cells and the connections between them.

Patients will be recruited in London – at Imperial College and King’s College – and sites in Oxford, Southampton and Swindon.

One of those on the trial is 65-year-old Geoff Payne. He became concerned about short-term memory loss three years ago and was eventually diagnosed with Alzheimer’s.

“My older brother died of Alzheimer’s at the age of 79,” he said.

“His disease was spotted quite late and I remember him being almost entirely silent and withdrawn at family gatherings.

[Image: Geoff and Sue Payne]

“I wish I’d tried to talk more to him about it. When I finally got my diagnosis it confirmed my own suspicions. I have had the disease for three years but fortunately I have not yet declined substantially.

Hope

“My wife and I know what to expect in the years ahead, so we take one day at a time. Hopefully this drug may help.”

Those on the trial will receive a daily injection of liraglutide or a placebo for 12 months. They will have scans and memory tests before and after the treatment.

It’s a decade since the last new treatment for Alzheimer’s was introduced and some major drug trials have failed in recent years.

“New drugs can take decades to filter through and cost billions,” said Dr Paul Edison of Imperial College London, who is leading the trial.

“Liraglutide is a tried and tested diabetes treatment, so we know it is safe. This trial will show within three years whether it can slow the progression of Alzheimer’s.”

Alzheimer’s Society is providing more than £300,000 towards the project. Dr Doug Brown, Director of Research and Development, said: “This exciting study suggests that one of these drugs can reverse the biological causes of Alzheimer’s even in the late stages and demonstrates we’re on the right track.

“We are now funding a major new trial to bring it closer to a position where it can be improving the lives of people with dementia.”

G8 summit

The need for more research and new treatments will be the key focus of the G8 dementia summit in London on Wednesday.

The Department of Health says health ministers will discuss how they can coordinate and accelerate efforts and try to break down barriers between companies, researchers and clinicians.

Dementia is already a significant global issue, and cases are predicted to rise from 44 million to 135 million by 2050 – a reflection of the growing and ageing global population.

It is thought to cost the global economy £370bn ($604bn) each year and there are concerns that future demands could overwhelm some health services.

Are Alzheimer’s and diabetes the same disease?


Having type 2 diabetes may mean you are already on the path to Alzheimer’s. This startling claim comes from a study linking the two diseases more intimately than ever before. There is some good news: the same research also offers a way to reverse memory problems associated with diabetes – albeit in rats – which may hint at a new treatment for Alzheimer’s.

“Perhaps you should use Alzheimer’s drugs at the diabetes stage to prevent cognitive impairment in the first place,” says Ewan McNay from the University at Albany in New York.

Alzheimer’s cost the US $130 billion in 2011 alone. One of the biggest risk factors is having type 2 diabetes. This kind of diabetes occurs when liver, muscle and fat cells stop responding efficiently to insulin, the hormone that tells them to absorb glucose from the blood. The illness is usually triggered by eating too many sugary and high-fat foods that cause insulin to spike, desensitising cells to its presence. As well as causing obesity, insulin resistance can also lead to cognitive problems such as memory loss and confusion.

[Image: Are brain changes associated with Alzheimer’s (green) reversible? Medical Body Scans/Jessica Wilson/Photo Researchers/SPL]

In 2005, a study by Suzanne de la Monte’s group at Brown University in Providence, Rhode Island, identified a reason why people with type 2 diabetes had a higher risk of developing Alzheimer’s. In this kind of dementia, the hippocampus, a part of the brain involved in learning and memory, seemed to be insensitive to insulin. Not only could your liver, muscle and fat cells be “diabetic” but, so it seemed, could your brain.

Feeding animals a diet designed to give them type 2 diabetes leaves their brains riddled with insoluble plaques of a protein called beta-amyloid – one of the calling cards of Alzheimer’s. We also know that insulin plays a key role in memory. Taken together, the findings suggest that Alzheimer’s might be caused by a type of brain diabetes.

If that is the case, the memory problems that often accompany type 2 diabetes may in fact be early-stage Alzheimer’s rather than mere cognitive decline.

Although there is no definitive consensus on the exact causes of Alzheimer’s, we do know that brains get clogged with beta-amyloid plaques. One idea gaining ground is that it is not the plaques themselves that cause the symptoms, but their precursors – small, soluble clumps of beta-amyloid called oligomers. The insoluble plaques could actually be the brain’s way of trying to isolate the toxic oligomers.

To investigate whether beta-amyloid might also be a cause of cognitive decline in type 2 diabetes, McNay, Danielle Osborne and their colleagues fed 20 rats a high-fat diet to give them type 2 diabetes. These rats, and another 20 on a healthy diet, were then trained to associate a dark cage with an electric shock. Whenever the rats were returned to this dark cage, they froze in fear – measuring how long they stayed still is a standard way of inferring how good their memory is.

Memory boost

As expected, the diabetic rats had weaker memories than the healthy ones – they froze in the dark for less than half the time of their healthy counterparts. To figure out whether this was due to the beta-amyloid plaques or the soluble precursors, Pete Tessier at the Rensselaer Polytechnic Institute in Troy, New York, engineered fragments of antibodies that disrupt the action of one or the other.

When the plaque-disrupting antibodies were injected into diabetic rats, no change was seen. However, after receiving antibodies specific for oligomers, they froze for just as long as the healthy rats. “The cognitive deficit brought on by their diabetes is entirely reversed,” says McNay.

Until now, the standard explanation for the cognitive decline associated with type 2 diabetes has been that it is a result of insulin signalling gone awry. One effect is to reduce the hippocampus’s ability to supply glucose, its energy source, to neurons during a cognitive task. The fact that amyloid builds up in the brains of diabetic animals – and also in people – was seen as an unhappy consequence of insulin imbalance.

These experiments suggest oligomers are actually to blame. Previous work from other groups has shown that the same enzymes break down both insulin and beta-amyloid oligomers – and that the oligomers prevent insulin binding to its receptors in the hippocampus. So when there is too much insulin around – as there is in someone with type 2 diabetes – those enzymes are working flat out to break it down. This preferential treatment of insulin leaves the oligomers to form clumps, which then keep insulin from its receptors, causing a vicious spiral of impaired brain insulin signalling coupled with cognitive decline.
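To make that proposed feedback loop easier to follow, here is a deliberately crude toy simulation (my own illustration in arbitrary units, not the researchers' model): one shared pool of enzyme clears insulin first, and whatever oligomers it fails to clear go on to block insulin receptors.

```python
# Toy model of the feedback loop described above (arbitrary units; this is an
# illustration of the prose, not the researchers' actual model). A shared
# enzyme degrades insulin preferentially; leftover capacity clears oligomers;
# accumulated oligomers block insulin receptors in the hippocampus.

def simulate(insulin_input: float, steps: int = 200) -> tuple[float, float]:
    insulin, oligomers = 0.0, 0.0
    capacity = 1.0   # total degradation the enzyme pool manages per step
    for _ in range(steps):
        insulin += insulin_input
        cleared = min(insulin, capacity)       # insulin gets first claim on the enzyme
        spare = capacity - cleared             # only the leftover clears oligomers
        oligomers += 0.05 - spare * oligomers  # 0.05 = constant oligomer production
        insulin -= cleared
    blocked = 5.0 * oligomers / (1.0 + 5.0 * oligomers)  # fraction of receptors blocked
    return round(oligomers, 2), round(blocked, 2)

# High insulin input (as in type 2 diabetes) monopolises the enzyme, so
# oligomers pile up and insulin signalling is throttled; low input does not.
print("diabetic-like:", simulate(insulin_input=0.9))  # -> (0.5, 0.71)
print("healthy-like: ", simulate(insulin_input=0.3))  # -> (0.07, 0.26)
```

Injecting the oligomer-specific antibody corresponds, in this cartoon, to emptying the oligomer pool: insulin signalling recovers at once, which is what the rats’ restored memory suggests.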

“We think that our treatment soaked up the amyloid oligomers, so that they could no longer block insulin from binding to its receptors,” says McNay, who presented the preliminary data at the Society for Neuroscience meeting in San Diego earlier this month. “Everyone thinks of amyloid build-up as a consequence of the events that cause cognitive impairment in diabetes, but we’re saying it’s actually a cause.” It means, he says, that the cognitive decline seen in type 2 diabetes may be thought of as early-stage Alzheimer’s.

It’s a bold claim, and if correct, one with big implications. Given that the number of people with type 2 diabetes is expected to jump from 382 million now to 592 million by 2035, we might expect to see a similar trajectory for associated Alzheimer’s (New Scientist, 1 September 2012). If beta-amyloid build-up can be stopped in people with type 2 diabetes and their cognitive impairment reversed – perhaps many of them will never progress to Alzheimer’s.

For the last few years, organisations like the UK’s Alzheimer’s Society have been backing clinical trials to look for diabetes drugs that may have an effect on Alzheimer’s patients. “We’re saying that this may be not the only way to think about it,” says McNay.

The next step is to repeat the work, and if the results are corroborated, start looking for a drug that would do the same thing as the group’s modified antibodies, without having to inject the drug directly into the hippocampus. It will also be necessary to work out just how much amyloid the brain can safely do without, since low levels are important for memory formation.

“The work opens the door to inoculating the most at-risk group, people with type 2 diabetes,” says Tres Thompson of the University of Texas at Dallas. There have been plenty of failed attempts to use antibodies to relieve Alzheimer’s in the past. “But these were all in people with advanced stages of the disease. Vaccinating people much earlier could give better results.”

Some researchers are still wary of focusing on beta-amyloid when 20 years of working on a treatment for that particular aspect of the disease has come to nothing. “I think it’s brilliant work – he’s using new techniques that seem to be working, but it’s still very beta-centric,” says Olivier Thibault at the University of Kentucky in Lexington. He cautiously agrees that McNay’s data do seem to suggest a causative link between beta-amyloid and impaired insulin signalling but says the group needs to factor in the effect of ageing – both diabetes and Alzheimer’s become more likely as we grow older.

Jessica Smith, spokeswoman for the UK Alzheimer’s Society in London, welcomes the work. “We need to tease out the difference between those with type 2 diabetes who develop Alzheimer’s and those who don’t. If people were developing the signs earlier than we thought, then perhaps we can intervene earlier, rather than waiting until they have full clinical Alzheimer’s.”

Of course, there is another solution to staving off type 2 diabetes and any consequential Alzheimer’s that requires no drugs at all. “Go to the gym and eat fewer twinkies,” says McNay.

Blood sugar levels could be linked to memory loss in people without diabetes – Mirror.co.uk


Journal study finds those with lower blood sugar levels achieved the highest scores in memory tests – those with high levels could suffer memory loss

People who have even slightly raised blood sugar levels may suffer memory loss, a study shows.

Researchers performed tests on 141 healthy people with an average age of 63.

None had diabetes or pre-diabetic symptoms.

But the study, published in the journal Neurology, found those with lower blood sugar levels achieved better scores in memory tests.

In a test to recall 15 words 30 minutes after hearing them, higher blood sugar levels were linked with poorer memory.

Lead researcher Dr Agnes Flöel, of Charité University Medicine in Berlin, Germany, said: “These results suggest that even for people within the normal range of blood sugar, lowering their blood sugar levels could be a promising strategy for preventing memory problems and cognitive decline as they age.

“Strategies such as lowering calorie intake and increasing physical activity should be tested.”

Dr Clare Walton, of the Alzheimer’s Society, said: “We already know that Type 2 diabetes is a risk factor for developing Alzheimer’s disease but this new study suggests that higher blood sugar levels may also be linked to poor memory in people without diabetes.

“The research suggests that regulating blood sugar levels might be a way to improve people’s memory, even if they don’t have diabetes.”

Dr Simon Ridley, of Alzheimer’s Research UK, added: “While we do not know whether the people in this study would have gone on to develop dementia, the findings serve as a warning that we should be conscious of the impact that subtle changes in our health could have on our brain.

“Current evidence suggests the best way to keep the brain healthy is to eat a balanced diet, take regular exercise, maintain a healthy weight, not smoke and keep blood pressure and cholesterol levels in check.”

Cardiac disease linked with faltering brain risk


http://m.timesofindia.com/life-style/health-fitness/health/Cardiac-disease-linked-with-faltering-brain-risk/articleshow/18240083.cms