Postmortem of Famous Patient’s Brain Explains Why “H.M.” Couldn’t Learn


 

Brain Cutting
By slicing up and reconstructing the brain of Henry Gustav Molaison, researchers have confirmed predictions about a patient who has already contributed more than most to neuroscience.
No big scientific surprises emerge from the anatomical analysis, which was carried out by Jacopo Annese of the Brain Observatory at the University of California, San Diego, and his colleagues, and published today in Nature Communications. But it has confirmed scientists’ deductions about the parts of the brain involved in learning and memory. “The confirmation is surely important,” says Richard Morris, who studies learning and memory at the University of Edinburgh, UK. “The patient is a classic case, and so the paper will be extensively cited.”

Molaison, known in the scientific literature as patient H.M., lost his ability to store new memories in 1953 after surgeon William Scoville removed part of his brain, including a large swathe of the hippocampus, to treat his epilepsy. That provided the first conclusive evidence that the hippocampus is fundamental for memory. H.M. was studied extensively by cognitive neuroscientists during his life.

After H.M. died in 2008, Annese set out to discover exactly what Scoville had excised. The surgeon had made sketches during the operation, and brain-imaging studies in the 1990s confirmed that the lesion corresponded to the sketches, although it was slightly smaller. But whereas brain imaging is relatively low-resolution, Annese and his colleagues were able to carry out an analysis at the micrometer scale.

Fine detail
Using the most modern neuropathological technologies, the researchers cut the brain into 2,401 razor-thin slices, and stained every thirtieth slice to reveal the details of each cell and its projections. They used the slices to create a three-dimensional computer model of the area of the brain around the excision.

The results confirmed that the residue of the posterior hippocampus that was suspected to have survived the operation did exist, and that it was a little larger than anticipated. Because it was disconnected from other anatomical structures in the circuitry involved in consolidating long-term memories — including the entorhinal cortex, which the results show to have been almost completely excised — it could not have rescued H.M.’s condition. The results also showed expected damage to the cerebellum caused by the anti-epileptic drug phenytoin, which he was required to take throughout his life, as well as other damage typical of ageing.

The slicing operation — all 53 hours of it — was shown live in a webcast. Directors of a UK theater company were among the viewers and went on to write and produce the play 2401 Objects, an exploration of H.M.’s life that was critically acclaimed at the 2011 Edinburgh Festival.

 


Stephen Hawking snared the world’s attention.


At the centre of the Milky Way galaxy, there is thought to be a supermassive black hole christened Sagittarius A*.

Stephen Hawking snared the world’s attention, as he often does, recently by rethinking the black hole, an object that has profoundly shaped the way physicists think about the universe.

Isaac Newton imagined the gravitational force as a purely attractive force acting on all massive bodies. The biggest problems of his time concerned the motion of the planets of the Solar System around the Sun. In 1684, Christopher Wren, Robert Hooke and Edmond Halley met at a coffee shop in England to discuss planetary physics. In their conversation, the question arose of what shape a planet’s path traced in its journey around the Sun.

Following this meeting, Halley paid a visit to Newton in an attempt to get the question answered. When asked, Newton didn’t hesitate to say the path would be an ellipse. When asked for the solution, Newton told Halley that he’d misplaced it and would recreate it for him soon. This incident is acknowledged to have prompted Newton to compose his Philosophiae Naturalis Principia Mathematica over the next 18 months. And in the Principia, Newton had this paragraph:

From the three last Propositions it follows, that if any body P goes from the place P with any velocity in the direction of any right line PR, and at the same time is urged by the action of a centripetal force that is inversely proportional to the square of the distance of the places from the centre, the body will move in one of the conic sections, having its focus in the centre of force; and conversely.

While Newton says the path is a conic section, such as an ellipse, he doesn’t spell out the proof in this passage, and so Halley’s request went unanswered. Nevertheless, the Principia became one of the most influential works in the history of science and provided the dominant foundation for at least two more centuries of physical study.
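
In modern notation, Newton’s claim can be restated compactly (a standard textbook result, not Newton’s own formulation): a body subject to an attractive central force $F(r) = -k/r^{2}$ traces a conic section,

$$
r(\theta) = \frac{L^{2}/(mk)}{1 + e\cos\theta},
$$

where $L$ is the body’s angular momentum about the centre of force, $m$ its mass, and the eccentricity $e$ fixes the type of conic: an ellipse for $e < 1$, a parabola for $e = 1$ and a hyperbola for $e > 1$.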

In 1783, the geologist John Michell wrote a letter to the chemist Henry Cavendish discussing the masses and motion of celestial bodies, in which he contemplated the idea of a star so dense that even light could not escape it. The idea found little favour because it disagreed with the Newtonian zeitgeist of the time. There were, it turned out, more things in heaven and earth than that philosophy dreamt of: it was not until Albert Einstein’s arrival that a theory of gravitation could accommodate all the paradigm-altering discoveries that had been made until then.

The continuum

And in his turn, Einstein birthed the beginnings of an outrageous legacy in 1915, when he published his first paper on general relativity. This theory presented a new way of looking at gravity: it was no longer simply a force acting between massive bodies, but an effect of the curvature of the space-time continuum that those bodies themselves produce.

The theory of general relativity (GR) works out well on paper. To date, several experimental confirmations of the theory have been acquired (gravitational lensing, for one). However, the outrageousness stems from a unique possibility that GR allows for: the black hole.

To employ the cliché, imagine space-time to be a fabric of smooth weave that you’re holding between your outstretched arms. Now, imagine how the weave would bend down if you dropped a heavy marble onto it. That’s how space-time would be bent, too, if the marble were Earth. And the bending would be attributed to Earth’s gravitational field.

In another instance, instead of a marble, drop a bowling ball onto the fabric. Assuming you’re unable to carry the weight, you let the ball and the fabric drop to the floor, where the fabric wraps completely around the ball.

This is what happens to space-time in the presence of a black hole. There is a distortion of the continuum, as if the ‘rest’ of space-time cannot reach into the black hole itself. Even though its tremendous gravitational strength is understood to be centred at its core, a black hole begins, to the world outside, at the event horizon: the surface within which it is impossible to escape its pull.
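
For a non-rotating black hole, the event horizon sits at the Schwarzschild radius, a standard result quoted here for concreteness:

$$
r_s = \frac{2GM}{c^{2}},
$$

where $M$ is the black hole’s mass, $G$ the gravitational constant and $c$ the speed of light. For a black hole of one solar mass, $r_s \approx 3$ km.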

Black holes are formed when the matter in a volume of space collapses inward toward a point. Such a collapse is thought to be triggered when a suitably heavy star runs out of fuel and blows away its outermost gaseous layers, while the rest of the star is unable to resist gravity and falls into the core.

If the collapsing core is heavier than the Chandrasekhar limit but lighter than the Tolman-Oppenheimer-Volkoff (TOV) limit, the inward collapse is halted by the formation of a neutron star, at which point there is a bounce-back followed by a titanic explosion called a supernova. If the core is heavier than the TOV limit, the inward collapse can continue until a black hole forms, one of the better understood processes of black-hole formation.
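
The rough mass thresholds involved (textbook values, quoted here as approximations; the TOV limit in particular is still not pinned down precisely) are

$$
M_{\mathrm{Ch}} \approx 1.4\,M_{\odot}, \qquad M_{\mathrm{TOV}} \approx 2\text{–}3\,M_{\odot},
$$

where $M_{\odot}$ is the mass of the Sun.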

Paradoxes

Thanks to the ‘Golden Age of general relativity’ in the mid-20th century, we’re in a position to better understand these enigmatic objects. Some of the notable physicists who made heady theoretical progress in this era include David Finkelstein, Werner Israel, Ezra Newman, James Bardeen, Roy Kerr, Evgeny Lifshitz, Brandon Carter, Jacob Bekenstein, Roger Penrose and Stephen Hawking.

Among them, Stephen Hawking is to this day considered to be the first-among-equals when it comes to expertise on black holes. Alongside Bardeen, Bekenstein, Carter, Penrose and others, Hawking was responsible for establishing many properties of black holes and how they could be determined. Around this time, an interesting paradox was discovered by Hawking and Bekenstein about black holes—the beginnings of a problem that Hawking attempted to resolve last week.

In 1974, they discovered, theoretically, that black holes emit some radiation, as if a black hole were running a fever on account of all that it had swallowed. This radiation is called Hawking radiation. The heavier a black hole is, the weaker its Hawking radiation.
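
The inverse relation between mass and radiation is captured by the Hawking temperature of a black hole of mass $M$, a standard result:

$$
T_H = \frac{\hbar c^{3}}{8\pi G M k_B},
$$

so a heavier black hole is colder and radiates more feebly, while a lighter one is hotter and gives up its energy faster.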

Conversely, smaller black holes should be letting off their energy faster through Hawking radiation and, hypothetically, completely evaporate over time. In fact, NASA’s Fermi telescope is out there orbiting Earth while looking out for this black-hole wheeze in the darker depths of space.

“Black holes have no hair”

In the meantime, in 1973, the physicists Charles Misner, Kip Thorne and John Wheeler had established the intriguingly titled no-hair theorem. They argued that no matter the process of formation of a black hole, all black holes could be understood in terms of three basic properties: mass, electric charge, and angular momentum.

So, no matter whether a black hole had formed by the inward collapse of a bucket, a heavy star or the Solar System, it would be describable only in terms of its mass, electric charge, and angular momentum.

Now, imagine a black hole has already formed and you lob a bucket, a screwdriver and a building into it. Despite the initial nature of these objects, the black hole would emit Hawking radiation that is compositionally identical—in its randomness—in all three cases. And thanks to the no-hair theorem, the black hole’s characteristics also wouldn’t change.

For physicists, this is unacceptable because where, then, is the information of the bucket, the screwdriver, the building? Is it lost?

The dying astronaut

There are many competing answers to this problem, known as the information paradox. Among the ten or so options presented over the years, each violates one of, or a combination of, the law of conservation of energy, existing theories of gravity, the laws of black-hole thermodynamics, the viewpoint that nature evolves with time, quantum theory, and GR.

One among these options was proposed by the American physicist Joseph Polchinski; it is notable because it was as an extension of this option that Hawking submitted a paper on January 22, drawing the attention of the world’s media. In 2012, Polchinski and two of his students at the Kavli Institute for Theoretical Physics, California, were wondering what would happen to an astronaut should he fall inside a black hole. Specifically, they were wondering how he would die.

As he started to sink toward the black hole’s centre, the gravitational force on him would continuously build up. Over time, he’d realize the part of his body closer to the centre was being pulled more strongly than the part farther away. Soon, because of this difference, he’d be ripped apart before all his parts would be compressed and, well, digested.
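
The difference in pull is a tidal force. In Newtonian terms (a back-of-the-envelope estimate, not the full general-relativistic treatment), the stretching across an astronaut of mass $m$ and height $\Delta r$ at distance $r$ from a mass $M$ is

$$
\Delta F \approx \frac{2GMm}{r^{3}}\,\Delta r,
$$

which grows as $1/r^{3}$ and so becomes violent as he nears the centre.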

… but there was an issue.

Their calculations showed that quantum mechanical effects at the black hole’s event horizon would raise a wall of high-energy radiation, a ‘firewall’, that would almost instantaneously burn the astronaut “to a crisp”. That outcome contradicted general relativity, which predicts that a freely falling astronaut should notice nothing special as he crosses the horizon. At the same time, the firewall idea was useful because it appeared to resolve the information paradox. According to Polchinski, the information of doomed objects would be stored in its imprint on all the energy radiated from the black hole.

However, the violation of GR still remained. So, when Polchinski attempted to craft a scenario in which such a firewall doesn’t form, he was startled: he couldn’t do it without violating quantum mechanics.

Here was another paradox.

As physicist Raphael Bousso told Nature, the firewall idea “essentially pits quantum mechanics against general relativity, without giving us any clues as to which direction to go next.” Seen another way, it seems a solution would be found only in the conciliation of quantum theory and GR, a feat proving unachievable to this day.

Hawking’s solution(s)

At the 17th International Conference on General Relativity and Gravitation in Dublin, 2004, Hawking suggested that the information was never lost. His solution required that a ‘true’ event horizon never formed around a black hole, simply an ‘apparent horizon’ that let information gradually leak out.

At the time, this apparent solution meant Hawking conceding a bet to another physicist, John Preskill, who had maintained that, of course, information wouldn’t be lost inside a black hole. On the other hand, Kip Thorne, who had sided with Hawking against Preskill, couldn’t agree with Hawking’s solution.

And now, 10 years later, Hawking has expanded on the idea of this ‘apparent horizon’. In a yet-to-be-published paper titled ‘Information Preservation and Weather Forecasting for Black Holes’, Hawking has postulated that information that has gone inside a black hole isn’t lost, but actually comes out in a wildly mangled form.

Moreover, he has argued that although the information is there in that form, attempting to reconstruct it into what it was would be almost impossible—like forecasting weather. Essentially, he’s trying to eke out a solution that minimally violates quantum theory and GR. Even though this would change the mathematics behind physicists’ calculations, astronomers wouldn’t see any difference—if they were looking at a black hole, they’d still be looking at what seems like the event horizon even if it is, in fact, the apparent horizon.

So, through the years, our understanding of gravity has evolved: first on the basis of what we could observe, then on what we could speculate from what we thought we knew, and then on to conceiving solutions based on what we think could be. At the same time, it seems fitting that, having resolved the mechanisms of the cosmos with GR, the dominant theory of physics outside a black hole, and the effects of a singularity with quantum theory, our ultimate solution should lie at the boundary between the two: the event horizon.

And if Hawking is to be believed, event horizons would no longer even be the defining property of black holes.



Height perception and paranoia


Views from differing heights in Prof Freeman’s virtual reality experiment.

Experiencing the world from lower down than usual can increase how mistrustful and paranoid people feel, according to research in a virtual world.

Scientists believe that feelings of persecution were triggered when people lost height and suffered a fall in self-esteem, which led them to see themselves as inferior and more vulnerable.

The findings could help researchers develop more effective psychological treatments for severe paranoia, through simulations that let patients confront and overcome their delusions.

Scientists at Oxford University recruited 60 adult women who had reported feelings of paranoia in the month beforehand. Each donned a virtual reality helmet and took two virtual tube rides, complete with computer-generated passengers that nattered around them.

While most of the women sensed there was something strange about one of the rides, few realised this was down to the scientists lowering their height, and so their point of view, by around 30cm.

To see how the change in perspective affected the women’s perceptions of the virtual passengers, the researchers asked them to fill out two questionnaires before and after the rides. One measured how well the women felt they compared to others, such as being more or less talented, or more or less attractive. The second questionnaire provided a paranoia score by asking them to rate statements like “someone had it in for me” on a scale from one to five.

The results, published in the journal Psychiatry Research, show that the women’s social comparison scores fell on average from 60 to 52 when they saw the world from lower down. At the same time, their paranoia scores rose from 12 to 14.
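
For readers curious how such a within-subject effect is typically tested, here is a minimal sketch of a paired comparison of the kind this design implies. The numbers below are simulated stand-ins, not the study’s data, and the actual analysis in Psychiatry Research may differ:

```python
import numpy as np
from scipy import stats

# Hypothetical stand-in data: 60 participants, each scored twice.
# These values are invented for illustration; they are NOT the study's data.
rng = np.random.default_rng(42)
score_normal = rng.normal(60, 10, size=60)                 # score at normal height
score_lowered = score_normal - rng.normal(8, 5, size=60)   # same people, viewpoint ~30cm lower

# A paired t-test asks whether the within-person change differs from zero.
t_stat, p_value = stats.ttest_rel(score_normal, score_lowered)
mean_change = np.mean(score_normal - score_lowered)
print(f"mean drop = {mean_change:.1f} points, t = {t_stat:.2f}, p = {p_value:.3g}")
```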

“When you are lower down than normal, it makes you feel more inferior to other people, and that I think makes you feel more vulnerable, and that’s what leads you to see hostility where there isn’t any,” said Daniel Freeman, a professor of clinical psychology who led the study.

“It’s not saying that short people have greater levels of paranoia, it’s about what happens when your own normal height is reduced in social situations,” he added.

Freeman said the work could help researchers to find better ways to treat paranoia by boosting people’s self-esteem. One way might be to artificially raise their height in a virtual world to give them more confidence than normal, and gradually reduce it. One unknown is how long the effects last for.

Willem-Paul Brinkman, who works on virtual reality therapy for people with mental health problems at Delft University in the Netherlands, said the work added to other studies that show that giving people taller avatars in a virtual world made them more confident negotiators, while giving them a more attractive avatar made them more intimate towards others.

Brinkman said the latest study showed the potential for VR to treat patients whose paranoid thoughts interfered with their daily life. “Giving therapists the ability to put patients in a VR environment and discuss these thoughts when they are actually experiencing them, could be very beneficial. At the moment, therapists have to rely on the patient’s recollections of these experiences.”

The challenge, he said, was to find triggers to provoke feelings of paranoia. The ethnicity of an avatar might be one trigger, but changes to a person’s height could be another. “These triggers might affect individuals differently, so it would be good to offer therapists a number of them so they can tailor it to the needs of the patient,” Brinkman said.

Genetically modified monkeys created with cut-and-paste DNA


Breakthrough could help battle diseases such as Alzheimer’s and Parkinson’s but ethical concerns remain over animal testing.


The twin cynomolgus monkeys, Ningning and Mingming, born at Nanjing Medical University in China. Photograph: Cell, Niu et al

Researchers have created genetically modified monkeys with a revolutionary new procedure that enables scientists to cut and paste DNA in living organisms.

The macaques are the first primates to have their genetic makeup altered with the powerful technology which many scientists believe will lead to a new era of genetic medicine.

The feat was applauded by some researchers who said it would help them to recreate devastating human diseases in monkeys, such as Alzheimer’s and Parkinson’s. The ability to alter DNA with such precision is already being investigated as a way to make people resistant to HIV.

But the breakthrough is controversial, with groups opposed to animal testing warning that it could drive a rise in the use of monkeys in research. One critic said that genetic engineering gave researchers “almost limitless power to create sick animals”.

The work was carried out in a lab in China, where scientists said they had used a genome editing procedure, called Crispr/Cas9, to manipulate two genes in fertilised monkey eggs before transferring them to surrogate mothers.

Writing in the journal, Cell, the team from Nanjing Medical University reported the delivery of twin female long-tailed macaques, called Ningning and Mingming. Five surrogates miscarried and four more pregnancies are ongoing.

The Crispr procedure has been welcomed by geneticists in labs around the world because of its enormous potential. Unlike standard gene therapy, Crispr allows scientists to remove faulty genes from cells, or replace them with healthy ones. It can even correct single letter spelling mistakes in the DNA code.

The Chinese team, led by Jiahao Sha, said their work demonstrates how Crispr could be used to create monkeys that carry genetic faults that lead to diseases in humans. But the same could be done to small pieces of human organs grown in the lab, and used to test drugs, or to monitor the progress of serious diseases.

Nelson Freimer, director of the centre for neurobehavioural genetics at the University of California in Los Angeles, said that while researchers often use mice to study human diseases, brain disorders are particularly hard to recreate in the animals because their brains are so different.

“People have been looking for primate models for a whole list of diseases, but in the past it’s been either completely unfeasible, or incredibly expensive. This is saying we can do this relatively inexpensively and quickly, and that is a major advance,” said Freimer.

But Freimer added that the use of monkeys was likely to remain a last resort. “It’s going to be really critical to define the problems for which this is used, just as you always do with animal research. You want to use all the alternatives before you propose animal research. This will be reserved for terrible diseases for which it offers hope that cannot be gotten any other way,” he said.

Tipu Aziz, who has used primates in his work on Parkinson’s disease at Oxford University, welcomed the new procedure. “If we can identify genes for neurological disorders in a clinical setting and transpose those into a monkey it would be of massive benefit. I don’t know that it’ll lead to a rise in the use of monkeys, but it will lead to more focused studies,” he said.

Robin Lovell-Badge, head of genetics at the MRC’s National Institute for Medical Research in London, said that genetically modified monkeys could be valuable to check new therapies before they are tried in humans. “Mice are fantastic models for some aspects of human physiology, but they are not always perfect, and it’s good to have alternatives,” he said. “If you are trying to develop a stem cell therapy and want to graft cells back into the brain, it’s difficult to know how it will work in a complex brain, and mice or rats are not suitable.” With Crispr, scientists could perform far more subtle genetic tweaks than is possible with other methods, he added.

George Church, professor of genetics at Harvard University, has co-founded a company, Editas Medicine, that aims to use Crispr to treat a number of human diseases. While monkeys had a role to play, he said another approach was to grow human “organoids” or small clumps of human organ tissue in the lab, and use Crispr to give them genetic faults that cause disease. “This is a really big moment, because if you think something has a genetic component, you can prove it with Crispr, and then improve it with Crispr, or other therapies,” he said.

One idea in trials already uses genome editing to remove a gene called CCR5 from human immune cells. Without the gene, the HIV virus cannot get into immune cells, so patients could be cured of the disease. In future, the same procedure could be used on healthy people at risk of the disease to make them resistant to infection.

Vicky Robinson, chief executive of National Centre for the Replacement, Refinement and Reduction of Animals in Research (NC3Rs), said: “This research could drive an increase in the use of non-human primates worldwide. Whether that would be justified in terms of the benefits to scientific and medical research, let alone the ethical considerations, is open to debate. Just because the monkey has greater similarity to man than other animal species does not guarantee that it will be a better surrogate for studying human disease, a point that decision makers – funders and regulators – should take seriously.”

Troy Seidle, director of research and toxicology at Humane Society International, called for an outright ban on the genetic manipulation of monkeys. “You can’t genetically manipulate a highly sentient non-human primate without compromising its welfare, perhaps significantly. GM primates will be just as intelligent, just as sensitive to physical and psychological suffering as their non-GM counterparts, and our moral responsibility toward them is no less. In fact, the scope for animal suffering is increased because genetic engineering gives researchers almost limitless power to create sick animals with potentially devastating and disabling symptoms, which can include entirely unexpected phenotypic mutations. It’s also worth noting that this research is being pioneered in China, where there are currently no laws or enforced ethical controls on animal experiments.”

Dr Andrew Bennett, a scientist with the Fund for the Replacement of Animals in Medical Experiments (Frame), added: “Whilst the technological advances in genetic engineering are to be both applauded and admired, their subsequent use to produce genetically modified monkeys is questionable at best. Frame would call for more funding to be used to produce model systems based on human tissues and cells rather than try to develop more sophisticated laboratory animal species. If you’re working on human disease, then it is necessary to use human-derived material to predict human responses.”

Hidden hierarchy in music revealed


String quartet
Studying tiny changes in timing can reveal if a string quartet has a leader or not

Scientists have come up with a way to reveal the pecking order within a string quartet.

A team from the Royal Academy of Music and the University of Birmingham found that analysing how individual musicians vary their timing to follow the rest of the group can indicate a hierarchy.

They say it shows some quartets have a clear leader to ensure perfect harmony.

However, in other “democratic” quartets the musicians all follow each other, playing an equal role.

Prof Alan Wing, from the University of Birmingham, UK, said of the study, published in the Journal of the Royal Society Interface: “In one quartet, it was as if there was an autocracy. In the other, it was more like a democracy.”

Making changes

The subtle interactions within a string quartet can bring a performance to life, but the team says it is this interplay that reveals the hidden hierarchy.

To investigate, the researchers asked two well-established groups of chamber musicians to play a composition by Joseph Haydn.


Prof Wing said: “We took them into a recording room and we fitted their instruments up with little microphones under the strings very close to the bridge, which would pick up the sound from each of the players individually.”

The team analysed each musician’s timing as they played, and noted any tiny changes to the tempo.

They then looked at how these variations, which were in the order of one hundredth of a second, affected the rest of the group.
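
The kind of analysis involved can be sketched in code. The snippet below is a minimal illustration, assuming a first-order phase-correction model of ensemble timing of the sort used in this research area; it is not the authors’ actual code, and the function name and data layout are invented for the example:

```python
import numpy as np

def correction_gains(onsets):
    """Estimate pairwise timing-correction gains in an ensemble.

    onsets: array of shape (n_notes, n_players), where onsets[n, i] is
    the time at which player i sounded the n-th note. A first-order
    phase-correction model assumes each player adjusts their next
    inter-onset interval in proportion to their current asynchrony
    with every other player. Row i of the returned matrix holds how
    strongly player i corrects toward each other player; a row of
    near-zero gains suggests a leader whom the others follow.
    """
    n_notes, n_players = onsets.shape
    gains = np.zeros((n_players, n_players))
    intervals = np.diff(onsets, axis=0)  # each player's inter-onset intervals
    for i in range(n_players):
        # asynchronies of player i against everyone at each note onset
        asyncs = onsets[:-1, [i]] - onsets[:-1, :]   # (n_notes - 1, n_players)
        predictors = np.delete(asyncs, i, axis=1)    # drop self-asynchrony (always zero)
        # deviation of player i's next interval from their mean tempo
        response = intervals[:, i] - intervals[:, i].mean()
        coefs, *_ = np.linalg.lstsq(predictors, response, rcond=None)
        gains[i, np.arange(n_players) != i] = -coefs  # positive gain = corrects toward that player
    return gains
```

Fed with onset times extracted from the per-instrument recordings, a matrix like this would show one row of near-zero gains (a first violin correcting to no one) in an “autocratic” quartet, and roughly equal gains everywhere in a “democratic” one.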

In one of the quartets, they found that three of the musicians were constantly having to speed up or slow down to stay in sync. However, the fourth player did not budge, letting the others adjust to her.

“The first violin was quite clearly providing a leadership,” explained Prof Wing.

“She wasn’t correcting to the timing of the other players – the other players were correcting much more to her.”

However, in the other quartet, all of the members altered their timing equally, suggesting a more democratic arrangement.

Prof Wing said: “There was no distinction between the first violin and the other players – they were all making equal corrections to each other.”

He added that the players were surprised to find that these kinds of hierarchies existed within their quartets. However, the musicians suspected that different pieces of music might alter the organisation within the group.

The scientists now want to find out if audiences notice a difference, and which performances they prefer.

They also want to discover whether similar hierarchies exist within other types of music.

Adrian Bradbury, a co-author from the Royal Academy of Music in London, said: “Live interaction between musicians on stage is often the most electrifying element of a performance, but remains one of the least well understood.

“I hope fellow musicians will agree that this method of ‘X-raying’ a performance to expose a group’s hierarchy will prove useful to us and fascinating to our audiences.”

Stem cell ‘major discovery’ claimed


Petri dishes filled with stem cells

Stem cell researchers are heralding a “major scientific discovery”, with the potential to start a new age of personalised medicine.

Scientists in Japan showed stem cells can now be made quickly just by dipping blood cells into acid.

Stem cells can transform into any tissue and are already being trialled for healing the eye, heart and brain.

The latest development, published in the journal Nature, could make the technology cheaper, faster and safer.

The human body is built of cells with a specific role – nerve cells, liver cells, muscle cells – and that role is fixed.

However, stem cells can become any other type of cell, and they have become a major field of research in medicine for their potential to regenerate the body.

Embryos are one, ethically charged, source of stem cells. Nobel prize winning research also showed that skin cells could be “genetically reprogrammed” to become stem cells (termed induced pluripotent stem cells).

Acid bath

Now a study shows that shocking blood cells with acid could also trigger the transformation into stem cells – this time termed STAP (stimulus-triggered acquisition of pluripotency) cells.

Dr Haruko Obokata, from the Riken Centre for Developmental Biology in Japan, said she was “really surprised” that cells could respond to their environment in this way.

She added: “It’s exciting to think about the new possibilities these findings offer us, not only in regenerative medicine, but cancer as well.”

The breakthrough was achieved in mouse blood cells, but research is now taking place to achieve the same results with human blood.

Chris Mason, professor of regenerative medicine at University College London, said if it also works in humans then “the age of personalised medicine would have finally arrived.”

He told the BBC: “I thought – ‘my God that’s a game changer!’ It’s a very exciting, but surprise, finding.

“It looks a bit too good to be true, but the number of experts who have reviewed and checked this, I’m sure that it is.

“If this works in people as well as it does in mice, it looks faster, cheaper and possibly safer than other cell reprogramming technologies – personalised reprogrammed cell therapies may now be viable.”

For age-related macular degeneration, which causes sight loss, it takes 10 months to go from a patient’s skin sample to a therapy that could be injected into their eye - and at huge cost.

Prof Mason said weeks could be knocked off that time which would save money, as would cheaper components.

Dr Haruko Obokata explains how she nearly gave up on the project when fellow researchers didn’t believe what she had found

‘Revolutionary’

The finding has been described as “remarkable” by the Medical Research Council’s Prof Robin Lovell-Badge and as “a major scientific discovery” by Dr Dusko Ilic, a reader in stem cell science at Kings College London.

Dr Ilic added: “The approach is indeed revolutionary.

“It will make a fundamental change in how scientists perceive the interplay of environment and genome.”

But he added: “It does not bring stem cell-based therapy closer. We will need to use the same precautions for the cells generated in this way as for the cells isolated from embryos or reprogrammed with a standard method.”

And Prof Lovell-Badge said: “It is going to be a while before the nature of these cells are understood, and whether they might prove to be useful for developing therapies, but the really intriguing thing to discover will be the mechanism underlying how a low pH shock triggers reprogramming – and why it does not happen when we eat lemon or vinegar or drink cola?”

Magnetic poles can ‘split in two’


Magnetic field demonstrated by iron filings

If you break a magnet in two, you don’t get a north half and a south half – you get two new magnets, each with two poles.

“Monopoles” were famously predicted to exist by physicist Paul Dirac in 1931 – but they have remained elusive.

Now scientists have engineered a synthetic monopole in a quantum system for the first time, allowing its mysterious properties to be explored.

They describe their breakthrough in Nature journal.

“Detecting a natural magnetic monopole would be a revolutionary event comparable to the discovery of the electron,” wrote the team from Aalto University, Finland, and Amherst College, US, in their paper.

“[Our work] provides conclusive and long-awaited experimental evidence of the existence of Dirac monopoles.

“It provides an unprecedented opportunity to observe and manipulate these quantum mechanical entities in a controlled environment.”

The discovery of magnetic monopoles has been long-awaited as they can help to explain various physical phenomena.

Researchers have hunted for them since Paul Dirac first theorised their quantum-mechanical characteristics in 1931.

He demonstrated that if even a single monopole exists, then all electrical charge must come in discrete packets – which has indeed been demonstrated.
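
Dirac’s argument yields a quantization condition (a standard result, written here in Gaussian units): for any electric charge $e$ and magnetic charge $g$,

$$
\frac{eg}{\hbar c} = \frac{n}{2}, \qquad n \in \mathbb{Z},
$$

so the existence of even one monopole anywhere would force electric charge to come only in discrete multiples of a basic unit.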

To observe and test them in the lab, scientists engineered a quantum system – the magnetic field of a cloud of rubidium atoms in an unusual state of matter known as a Bose-Einstein condensate.

Using direct imaging, they detected a distinct signature of the Dirac monopole – known as a “Dirac string”.

The researchers note that, while other teams have previously made analogues of monopoles, their demonstration is the first in a quantum system that can be tested by experiment.

“This creation of a Dirac monopole is a beautiful demonstration of quantum simulation,” said Lindsay LeBlanc, of the University of Alberta, a physicist not involved in the study.

“Although these results offer only an analogy to a magnetic monopole, their compatibility with theory reinforces the expectation that this particle will be detected experimentally.

“As Dirac said in 1931: ‘Under these circumstances one would be surprised if Nature had made no use of it’.”

Why city life may be bad for you


Man walking
People say they would be more active if there were safe and attractive green spaces near to where they live

When it comes to getting people to be more active, much of the attention is focused on improving sports facilities, encouraging people to join the gym or lambasting schools for not doing enough PE.

But could another crucial factor be the way neighbourhoods are designed?

The Royal Institute of British Architects (RIBA) thinks so.

The organisation has carried out an analysis of the nine major cities in England – Birmingham, Bristol, Leeds, Liverpool, London, Manchester, Newcastle, Nottingham and Sheffield – to explore this.

Its researchers looked at housing density and the availability of green spaces.

‘Healthier cities’


The least active areas – deprived parts of Birmingham, Newcastle and London – had twice the housing density and 20% less green space than the most active places.

This is important.

Nearly 60% of people living in these cities do not do the recommended levels of activity.

But, crucially, three quarters said they would be happy to walk more and get outside in the fresh air if their local environment was more suitable, according to a poll cited by RIBA.

People cited safer streets and more attractive green spaces as two key factors.

RIBA has published the findings as it wants councils to take note.

Under the shake-up of the NHS last year, local government was given responsibility for public health.

So RIBA president Stephen Hodder said he wanted councils to ensure public health becomes an important part of the planning process.

“It’s vital that planners and developers take the lead and ensure healthier cities,” he added.

Play area in Huthwaite, Nottinghamshire
The play area in Huthwaite was developed thanks to a lottery grant

To be fair, this is already happening in many places.

Health impact assessments have become a crucial part of the process.

But as always – for councils which have seen their funding cut dramatically in recent years – it comes down to money.

One of the examples of good practice cited by RIBA in its report was the re-development of the Brownfield Estate, an inner-London housing estate.

It underwent a major £7m building programme with money invested from a variety of public and private sources.

The project saw the walk-ways between flats become “green grids” lined with grass and trees, while play areas were created across the site.

Another scheme highlighted was the creation of a natural play area with climbing frames, a water fountain and wetland on a disused field in the former mining town of Huthwaite in north Nottinghamshire.

Once empty, the area is now packed with children (when the weather permits).

But this project was only possible because the area was given over £200,000 of lottery money.

Whatever happened to the term New Man?


A man holds a baby in the style of 1980s Athena poster

The New Man was once a radical way to describe a male who wholeheartedly accepted equality in domestic life. But 30 years on, what has happened to the term?

The New Man rose to prominence in the 1980s like an exotic new species, happy to do the washing up or change a nappy.

According to the Oxford English Dictionary, the New Man was someone “who rejects sexist attitudes and the traditional male role, esp. in the context of domestic responsibilities and childcare, and who is (or is held to be) caring, sensitive, and non-aggressive”.

The OED’s first citation is a 1982 Washington Post article about Dustin Hoffman’s cross-dressing comedy, Tootsie. “(It) has enough rowdy, inconsequential fun in it to take the curse off Hoffman’s sentimentalized notion of The New Man, but it’s also in the nature of a lucky tightwire act that comes close to tripping him up.”

It’s noticeable that the newspaper felt no need to elaborate. The term was presumably already in common circulation.

The dictionary also flags up as significant a reference in Kate Atkinson’s 1995 novel Behind the Scenes at the Museum, whose narration looked back at earlier decades. “Bunty and I were in the Co-op mobile shop… when Mr Roper bounded on board, looking for washing-powder – a new man ahead of his time.”

Man washing dishes

Martin Kelner, who contributes to BBC Radio 5 Live’s Fighting Talk, had children in the 1980s and saw himself as a New Man. “I suppose I became a New Man by being entirely different to my dad. I was really, really hands on.” He changed nappies “from day one”, mashed vegetables for his children’s tea, and got up in the night with a bottle.


Hardly radical stuff, a father today might respond. But it was a departure from previous generations, Kelner argues.

A conference at London’s Southbank Centre this weekend – Being a Man – aims to shed light on where masculinity has got to. One panel discussion features author Nick Hornby, singer Billy Bragg and designer Wayne Hemingway talking about “being a bloke”.

The Bloke, the Chap, the Alpha Male, the Metrosexual and the Ubersexual are terms that have followed, with varying degrees of popularity. Another term that has cropped up on and off is “sensitive new age guy”.

Today the New Man is like a relic of gender history. Times columnist Giles Coren says it became a cartoon-like figure of fun. “They were terrible limp men carrying babies around their chest. They ate vegetables and gave up drinking. They were around for a bit until they realised women didn’t want to sleep with them.”

When the New Man arrived in the 1980s gender roles and the labour market were in flux. Men were being laid off from industrial jobs in Western countries, while the service sector expanded rapidly.

Neil Morrissey as Tony and Martin Clunes as Gary in Men Behaving Badly
Tony and Gary in Men Behaving Badly – new lads, not new men

“To me it mostly meant Athena posters of ‘hunky’ young men holding babies,” says the writer Mark Simpson. The emphasis was on sensitivity as a reaction against traditional stoicism, although a lot of it was about marketing to women, he believes.

“If there was a real actual living ‘New Man’ in the 1980s it was probably Morrissey – and a big part of his appeal back then to a generation of young men and women was precisely his willingness to go against Dad-ish gender norms and express emotion and ‘feminine’ qualities.”

Trotsky’s man of the future

Leon Trotsky speculated about the man of the future: “Man will make it his purpose to master his own feelings, to raise his instincts to the heights of consciousness, to make them transparent, to extend the wires of his will into hidden recesses, and thereby to raise himself to a new plane, to create a higher social biologic type, or, if you please, a superman.” The New Soviet Man followed, someone who would work tirelessly for the common good. The most famous exponent was the miner Alexey Stakhanov.

Labels come and go. Soon the New Lad arrived. He was an amalgam of various cultural trends, evident from TV shows Men Behaving Badly and Fantasy Football, lad mags like Loaded and FHM, and celebrities like Chris Evans and Liam Gallagher.

In the UK at least, the New Man had gone as a media term, although presumably many dads were still changing nappies and mashing vegetables.

But by 2000 the Observer was predicting a return. “What is significant is that it appears that straight men are taking an interest in fashion, which is a sure sign that some New Man thing is about to kick in again.”

Victoria and David Beckham pictured in 2002

It was the era of David Beckham – a man comfortable with wearing a sarong, not drinking alcohol and looking after his pecs and six-pack. It was the beginnings of the metrosexual, a term that Simpson may have coined in a 1994 Independent piece: “One sharply dressed ‘metrosexual’ in his early 20s… has a perfect complexion and precisely gelled hair, and is inspecting a display of costly aftershaves.” Moisturiser was de rigueur.

Then the tide turned again. In 2007 Observer columnist Barbara Ellen wrote of metrosexual men: “Could it be that post-feminism has created its own Frankenstein’s monster? The man who is so like a woman he’s unfanciable?”

The term “man up” became common. It used to be a piece of staffing terminology. The OED cites a 1947 letter to the editor of The Times from Henry Strauss, a Conservative member of Parliament, complaining about man up as an insidious Americanism. “Must industries be fully ‘manned up’ rather than ‘manned’?” Strauss asked. “Must the strong, simple transitive verb, which is one of the main glories of our tongue, become as obsolete in England as it appears to be in America?”

The recent usage has its own controversy, with “man up” today being a slight, however jocular, like “pull yourself together” but in terms that feminists might regard as macho. Some even use “woman up”.

At the same time a small but committed band of men’s rights activists has attempted to create a male version of feminism, although exactly what male activists are fighting for, when men still dominate so many important areas of society, bemuses many others.

Guests in period costumes attend "The Chap Olympiad" in central London on July 13, 2013.
Participants in the Chap Olympiad, 2013

Many of today’s male movements seem to be about sartorial standards. The Chaps hold an Olympiad where they sport handlebar moustaches, pith helmets, tweeds and “immaculate trouser creases”.

In 2005 Marian Salzman introduced a new term “the ubersexual” in her book The Future of Men. “Ubersexuals are confident, masculine and stylish, and committed to uncompromising quality in all areas of life,” she said.

And since then advertisers and cosmetics firms have moved in with gusto.


Dirk Haeusermann, a creative director at advertising firm Draftfcb Deutschland, says the hipster movement has built on metrosexuality’s foundations. “What I’ve observed is how men have rediscovered their masculinity without losing that care in their appearance. There’s the return of the full beard as a hipster item.”

It’s all a far cry from the New Man. “I don’t really know what that is any more,” says Tim Samuels, presenter of Men’s Hour. A lot of the virgin territory the New Man was said to be exploring is now just expected as part of the job of being a colleague, boyfriend or husband.

Instead those who don’t come up to scratch in the US might earn the epithet “deadbeat dad”.

Times columnist David Aaronovitch says New Man was useful “journalistic shorthand” for a muddying of the gender waters. “New man was a term of art like ‘ladette’ that suggests the discussion about crossing roles.”

But today it’s redundant. “That hugely defined male role has disappeared and so the need for something called the New Man has gone,” he argues.

But just how much masculine and feminine have crossed over remains a source of debate. Channel Four newsreader Jon Snow, who is chairing a talk at this weekend’s conference, has raised eyebrows with his admission that sex would be on his mind whenever he met a woman for the first time.

Kelner thinks some of the talk of gender convergence is overblown. For him the New Man was about being considerate to your partner rather than a sign that men have fundamentally changed. “There’s always going to be that atavistic thing,” says Kelner. “Males given a chance will revert to being what they were years before which was a bit prehistoric.”

Men as a rule don’t like talking about being men. There’s a serious danger of metropolitan “navel gazing” at the South Bank, he says.

“If they were holding this conference in Mansfield they’d be lucky to sell a ticket.”

Integration brings quantum computer a step closer


An international research group led by the University of Bristol has made an important advance towards a quantum computer by shrinking down key components and integrating them onto a silicon microchip.

Scientists and engineers from an international collaboration led by Dr Mark Thompson from the University of Bristol have, for the first time, generated and manipulated single particles of light (photons) on a silicon chip – a major step forward in the race to build a quantum computer.

Quantum computers and quantum technologies in general are widely anticipated as the next major technological advance, and are poised to replace conventional information and computing devices in applications ranging from ultra-secure communications and high-precision sensing to immensely powerful computers. While many of the components for a quantum computer already exist, for a quantum computer to be realised, these components need to be integrated onto a single chip.

Featuring today on the front cover of Nature Photonics, this latest advancement is one of the important pieces in the jigsaw needed in order to realise a quantum computer. While previous attempts have required external light sources to generate the photons, this new chip integrates components that can generate photons inside the chip. “We were surprised by how well the integrated sources performed together,” admits Joshua Silverstone, lead author of the paper. “They produced high-quality identical photons in a reproducible way, confirming that we could one day manufacture a silicon chip with hundreds of similar sources on it, all working together. This could eventually lead to an optical quantum computer capable of performing enormously complex calculations.”
 


The group, which includes researchers from Toshiba Corporation (Japan), Stanford University (US), University of Glasgow (UK) and TU Delft (The Netherlands), now plans to integrate the remaining necessary components onto a chip, and show that large-scale quantum devices using photons are possible.

“Our group has been making steady progress towards a functioning quantum computer over the last five years,” said Thompson. “We hope to have a photon-based device which can rival modern computing hardware for highly-specialised tasks within the next couple of years.”