New research suggests pre-Homo human ancestral species, such as Australopithecus africanus, used human-like hand postures much earlier than was previously thought.

Anthropologists from the University of Kent, working with researchers from University College London, the Max Planck Institute for Evolutionary Anthropology in Leipzig (Germany) and the Vienna University of Technology (Austria), have produced the first research findings to support archaeological evidence for stone tool use among fossil australopiths 3-2 million years ago.

The distinctly human ability for forceful precision (e.g. when turning a key) and power “squeeze” gripping (e.g. when using a hammer) is linked to two key evolutionary transitions in hand use: a reduction in arboreal climbing and the manufacture and use of stone tools. However, it is unclear when these locomotory and manipulative transitions occurred.

Dr Matthew Skinner, Senior Lecturer in Biological Anthropology and Dr Tracy Kivell, Reader in Biological Anthropology, both of Kent’s School of Anthropology and Conservation, used new techniques to reveal how fossil species were using their hands by examining the internal spongy structure of bone, called trabeculae. Trabecular bone remodels quickly during life and can reflect the actual behaviour of individuals in their lifetime.

The researchers first examined the trabeculae of hand bones of humans and chimpanzees. They found clear differences between humans, who have a unique ability for forceful precision gripping between thumb and fingers, and chimpanzees, who cannot adopt human-like postures. This unique human pattern is present in known non-arboreal and stone tool-making fossil human species, such as Neanderthals.

The research, titled ‘Human-like hand use in Australopithecus africanus’, shows that Australopithecus africanus, a 3-2 million-year-old species from South Africa traditionally considered not to have engaged in habitual tool manufacture, has a human-like trabecular bone pattern in the bones of the thumb and palm (the metacarpals) consistent with the forceful opposition of the thumb and fingers typically adopted during tool use.

These results support previously published archaeological evidence for stone tool use in australopiths and provide skeletal evidence that our early ancestors used human-like hand postures much earlier and more frequently than previously considered.

Neuroscientists discover that cold is contagious

Just looking at someone else shiver from the cold can cause certain parts of our bodies to drop in temperature, scientists have discovered, providing the first evidence for a phenomenon known as ‘temperature contagion’.

Hot day? Cool yourself down by watching a video of someone being cold, researchers say, thanks to a new study showing that the feeling of being cold is contagious, whereas the feeling of being hot is not.

A team of neuroscientists led by Ella Cooper from the University of Sussex and John Garlick from University College London, both in the UK, gathered 36 volunteers who watched eight different three-minute videos depicting actors with either their right or left hands in visibly warm or cold water, or with their hands in front of the water as a control condition. The actors gave no emotional cues of pain or discomfort.

Before and after each video was watched, the temperatures of the volunteers’ right and left hands were measured and compared to see if any change had occurred. The temperature in the room was managed and the volunteers were asked to keep their hands as still as possible to minimise changes in temperature due to muscle movement. Even the tiny amount of heat that comes off a television screen was accounted for.


“While watching the warm and neutral videos did not produce any changes in subjects’ hand temperature, watching the cold videos caused a small, but unmistakable drop,” says Ross Pomeroy at Real Clear Science. “The temperature of subjects’ right hands fell by an average of 0.1 degrees Fahrenheit, and the temperature of their left hands fell [by] 0.4 degrees. There was no change in heart rate.”

So we’re talking about a tiny fraction of a degree Celsius – barely anything. But it is still a measurable change, produced just by watching a video, and that’s fascinating.
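The quoted figures are in Fahrenheit, so the Celsius framing follows from the standard conversion for temperature *differences*, where a change of 1°F equals 5/9 of a degree Celsius. A quick sketch of the arithmetic:

```python
# Convert the reported temperature drops (differences, not absolute
# temperatures) from Fahrenheit to Celsius: a change of 1 F = 5/9 C.
def delta_f_to_c(delta_f):
    return delta_f * 5.0 / 9.0

right_hand_drop_c = delta_f_to_c(0.1)  # right hands: 0.1 F drop
left_hand_drop_c = delta_f_to_c(0.4)   # left hands: 0.4 F drop

print(round(right_hand_drop_c, 3))  # ~0.056 C
print(round(left_hand_drop_c, 3))   # ~0.222 C
```

Even the larger drop works out to roughly a fifth of a degree Celsius.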

Publishing their results in the journal PLOS ONE, the team suggest the warm videos may have failed to provoke a physiological response because the steam coming off the warm water was a less visible cue than the ice cubes floating in the water of the cold videos. Previous research has also “highlighted that temperature decreases are typically easier to elicit and of greater magnitude than temperature increases,” they report.

Interestingly, the team got the volunteers to self-report their levels of empathy, and these measurements actually predicted their differences in sensitivity to the temperature contagion.

This, the researchers concluded, adds a new physical dimension to the phenomenon of emotional contagion – the tendency for two people to mimic each other’s expressions and emotional states. They explain further in the paper:

“Emotional contagion is thought to be mediated by mirror neurons, brain cells that fire both when an animal performs an action and when it observes that action performed.”

The study also broadly substantiates an extreme case of human temperature fluctuation documented in 1920 by the scientist J.A. Hadfield, who worked with a patient who was able to selectively adjust their right and left hand temperature by as much as 5 degrees Fahrenheit through suggestions of heat or cold.


‘Cartoons kill’: Kids’ movies show more death than adult ones, study finds


The number of onscreen deaths of main characters in children’s films is two-and-a-half times greater than in movies for adult audiences, a survey that examined 45 of the highest-grossing animated flicks in history revealed.

“Rather than being the innocuous form of entertainment they are assumed to be, children’s animated films are rife with on-screen death and murder,” a survey published by the BMJ (formerly British Medical Journal) this week said.

According to the researchers from University College London and the University of Ottawa, the death of an important character occurred in two thirds of children’s animated films, with the figure for adult flicks standing at around half.

The deaths in movies for kids are often violent, which “might be more traumatic for children”, the survey entitled ‘Cartoons kill’ said.

The surveyed movies included three gunshot deaths (Bambi, Peter Pan, Pocahontas), two stabbings (Sleeping Beauty, The Little Mermaid) and five animal attacks (A Bug’s Life, The Croods, How to Train Your Dragon, Finding Nemo and Tarzan).

Main characters were likely to die in the opening minutes of children’s animated films, the survey said – as in Finding Nemo, where Nemo’s mother is eaten by a barracuda at 04:03, and Tarzan, where a leopard kills Tarzan’s parents at 04:08.

In general, parents were five times more likely to perish in kids’ films than in movies for older audiences.

The survey examined 45 of the highest-grossing animated films in history – from 1937’s Snow White to last year’s blockbuster Frozen.

The adult films they were matched against were the two highest-grossing movies released in the same year as each animated film, including horror and thriller flicks such as The Exorcism of Emily Rose, Pulp Fiction, The Departed and Black Swan.

The researchers advised parents to “consider watching such movies alongside their children, in the event that the children need emotional support after witnessing the inevitable horrors that will unfold.”

However, onscreen death is not necessarily bad for kids: films that model appropriate grief responses “could help children to gain a deeper understanding of the meaning of death.”

As an example, the survey uses The Lion King, in which a lion cub forgives the murder of his father in order to cope with his loss.

“Films depicting death in this more nuanced way could provide a valuable resource for initiating discussions about death between children and adults. Indeed, cinematherapy is sometimes used to facilitate counseling with grieving adolescents, a therapeutic practice that might be extended to younger children,” the researchers explained.

Young teens’ weight terror ‘common’

About 10% of 13-year-old girls are “terrified” about putting on weight, the first UK study looking for warning signs of eating disorders suggests.

Doctors said they were “worried” by the high degree of weight fixation found in 13-year-old girls, years before eating disorders typically start.

Researchers say it might be possible to stop full eating disorders developing.

Their findings, in the Journal of Adolescent Health, came from interviews with the parents of 7,082 13-year-olds.

Eating disorders, such as anorexia, tend to start in the mid-teenage years, although they can begin before then.


The study, by University College London and the London School of Hygiene and Tropical Medicine, looked at the years before those disorders tend to start.

Interviews with the parents of 7,082 13-year-old schoolchildren showed:

  • Nearly 12% of girls and 5% of boys were “terrified” by the thought of getting fat
  • 52% of girls and a third of boys said they were “a little worried” about getting fat
  • One in three girls and one in five boys were “distressed” by their body shape
  • 26% of girls and 15% of boys had “eating disorder behaviours” such as fasting
  • Some habits, such as uncontrolled bingeing, were linked to higher weight two years later

One of the researchers, Dr Nadia Micali, told the BBC she was surprised children were so concerned about weight at such a young age.

“For me the results were particularly worrying, I wouldn’t have thought they’d be so common at this age.

“Part of me thinks it’s a shame we didn’t ask earlier, we don’t know when this behaviour starts.

“Quite a large proportion will develop full-blown eating disorders, maybe more than half.”

However, she said there might be an opportunity to help children before they develop an eating disorder if a reliable set of warning signs could be developed for parents and teachers.

In a statement the eating disorder charity Beat said it was an interesting and important study.

“This is the first time a study like this has been carried out so we have nothing to compare it to and therefore don’t know if the problem is increasing or getting worse.

“However it is striking and worrying how many young people had concerns about their weight from such a young age.

“It does not mean that they will all go on to develop eating disorders, but they could be tempted by unhealthy ways to control their weight and shape.”

The findings came through data collected from the Children of the 90s study.


A separate analysis of those children, by a team at the University of Warwick, suggested bullying was linked to an increased risk of psychotic experiences, such as hearing voices, and paranoia.

Lead researcher Prof Dieter Wolke said: “We want to eradicate the myth that bullying at a young age could be viewed as a harmless rite of passage that everyone goes through – it casts a long shadow over a person’s life and can have serious consequences for mental health.

“It strengthens the evidence base that reducing bullying in childhood could substantially reduce mental health problems.

“The benefit to society would be huge, but of course, the greatest benefit would be to the individual.”

The science of dread: anticipating pain makes it worse

For most people, a chocolate today is better than one tomorrow. Economists refer to this as “future discounting”: we prefer to have nice things now rather than later, and unpleasant things later rather than now.
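The standard way economists formalise this is exponential discounting, in which a reward’s present value shrinks by a fixed factor for each unit of delay. A minimal sketch, where the discount rate is purely an illustrative assumption and not a figure from the study:

```python
# Exponential "future discounting": the present value of a reward
# received after a delay shrinks geometrically with that delay.
def present_value(amount, delay, rate=0.1):
    # rate=0.1 per unit of delay is purely illustrative
    return amount / (1 + rate) ** delay

chocolate_now = present_value(1.0, delay=0)
chocolate_tomorrow = present_value(1.0, delay=1)

print(chocolate_now > chocolate_tomorrow)  # True: today beats tomorrow
```

For an unpleasant event the sign flips: a discounted future cost looms smaller than the same cost today, which is exactly why the findings below are surprising.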


But this isn’t always the case in reality. When it comes to a potentially painful experience, like having an operation, many people choose to get it over and done with rather than put it off.

In a new paper published in PLOS Computational Biology, researchers from Imperial College London and University College London explore how this reluctance to wait for pain – a feeling commonly known as dread – changes depending on the timing of the painful event. The researchers wanted to know if dread is worse when pain is more delayed.

The study involved a series of experiments in which pain took the form of brief electric shocks to the back of the hand of 35 participants. Over the course of the experiments, participants could choose to receive shocks soon, or delay them by a certain amount of time, which could range from a few seconds to a quarter of an hour.

A minority chose to wait and receive the shocks further into the future. But 71% of participants opted to receive the pain sooner, even when it meant the pain would be worse: in half of the experiments, choosing earlier pain resulted in more shocks.

The researchers also compared the size of the delay and the probability a participant would choose the later shock. They found the relationship between the two was best described by what they called “exponential dread”: the bigger the delay, the more likely it was the person would opt for the earlier shock.

The shock experiments were relatively brief, so researchers also looked at what happens when people can delay a painful experience much further into the future. The participants were given a hypothetical scenario, in which they had to schedule an appointment for a painful dental procedure. They were told they could have the procedure “today”, or at a fixed later time. This time varied between participants: it could be 1, 5, 13, 32, 89 or 237 days.

Once again, the participants’ choices suggested dread increases exponentially as people approach a painful event.

These results build on previous work, such as a 2006 study which assumed people experienced a constant amount of dread over time, rather than having dread increase with the size of the delay. There are, however, still some questions that remain unanswered.
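The contrast between the two assumptions can be sketched as follows. Under constant dread, waiting adds a fixed amount of dread per unit of time, so total dread grows linearly with the delay; under exponential dread, it compounds. Both functional forms below are illustrative assumptions chosen to show the qualitative difference, not the paper’s fitted equations:

```python
import math

# Two illustrative models of how total dread accumulates with delay.

def constant_dread(delay, per_second=1.0):
    # the 2006-style assumption: a fixed amount of dread per unit of
    # waiting, so total dread grows linearly with the delay
    return per_second * delay

def exponential_dread(delay, rate=0.5):
    # "exponential dread": dread compounds with the delay, so long
    # waits are disproportionately worse than short ones
    return math.exp(rate * delay) - 1.0

for delay in (1, 2, 4, 8):
    print(delay, constant_dread(delay), round(exponential_dread(delay), 1))
```

Under the exponential form, doubling the delay more than doubles the dread, which is what would push participants towards the earlier shock as delays grow.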

First, how does dread scale with time? The researchers looked at events that occur over minutes and weeks, but can the same patterns be found at other timescales?

Second, what causes us to experience dread in the first place? The researchers suggest a couple of potential explanations. It might be that the brain processes designed to prepare us for a painful experience overrule other types of behaviour, even if this other behaviour could be beneficial. Alternatively, dread could be a form of “stimulus substitution”, whereby the anticipation of pain triggers the same responses that we experience during an actual pain event.

Even if the causes of dread remain elusive, understanding how people deal with the anticipation of pain could help in a number of fields. In particular, it could be useful when assessing options about a potentially painful future event, whether that event is an electric shock, a medical procedure, or your girlfriend finding out you’ve eaten all the chocolates.

Why do we value gold?

Mankind’s attitude to gold is bizarre. Chemically, it is uninteresting – it barely reacts with any other element. Yet, of all the 118 elements in the periodic table, gold is the one we humans have always tended to choose to use as currency. Why?

Why not osmium or chromium, or helium, say – or maybe seaborgium?

I’m not the first to ask the question, but I like to think I’m asking it in one of the most compelling locations possible – the extraordinary exhibition of pre-Columbian gold artefacts at the British Museum.

That’s where I meet Andrea Sella, a professor of chemistry at University College London, beside an exquisite breastplate of pure beaten gold.

He pulls out a copy of the periodic table.


“Some elements are pretty easy to dismiss,” he tells me, gesturing to the right-hand side of the table.

“Here you’ve got the noble gases and the halogens. A gas is never going to be much good as a currency. It isn’t really going to be practical to carry around little phials of gas, is it?”

Gold – key facts

  • Symbol: Au (from Latin aurum)
  • Atomic number: 79
  • Weight: 196.97
  • One of the “noble” metals that do not oxidise under ordinary conditions
  • Used in jewellery, electronics, aerospace and medicine
  • Most gold in the earth’s crust is thought to derive from meteorites
  • Biggest producers: China, Australia, US, Russia
“And then there’s the fact that they are colourless. How on earth would you know what it is?”

The two liquid elements (at everyday temperature and pressure) – mercury and bromine – would be impractical too. Both are also poisonous – not a good quality in something you plan to use as money. Similarly, we can cross out arsenic and several others.

Sella now turns his attention to the left-hand side of the table.

“We can rule out most of the elements here as well,” he says confidently.

“The alkali metals and alkaline earths are just too reactive. Many people will remember from school dropping sodium or potassium into a dish of water. It fizzes around and goes pop – an explosive currency just isn’t a good idea.”

A similar argument applies to another whole class of elements, the radioactive ones: you don’t want your cash to give you cancer.

Out go thorium, uranium and plutonium, along with a whole bestiary of synthetically-created elements – rutherfordium, seaborgium, ununpentium, einsteinium – which only ever exist momentarily as part of a lab experiment, before radioactively decomposing.

Then there’s the group called “rare earths”, most of which are actually less rare than gold.

Unfortunately, they are chemically hard to distinguish from each other, so you would never know what you had in your pocket.

This leaves us with the middle area of the periodic table, the “transition” and “post-transition” metals.


This group of 49 elements includes some familiar names – iron, aluminium, copper, lead, silver.

But examine them in detail and you realise almost all have serious drawbacks.

We’ve got some very tough and durable elements on the left-hand side – titanium and zirconium, for example.

The problem is they are very hard to smelt. You need to get your furnace up into the region of 1,000C before you can begin to extract these metals from their ores. That kind of specialist equipment wasn’t available to ancient man.

Aluminium is also hard to extract, and it’s just too flimsy for coinage. Most of the others in the group aren’t stable – they corrode if exposed to water or oxidise in the air.

Take iron. In theory it looks quite a good prospect for currency. It is attractive and polishes up to a lovely sheen. The problem is rust: unless you keep it completely dry it is liable to corrode away.

“A self-debasing currency is clearly not a good idea,” says Sella.

We can rule out lead and copper on the same basis. Both are liable to corrosion. Societies have made both into money but the currencies did not last, literally.

So, what’s left?

Why is gold golden?

2,000-year-old golden funerary mask from Colombia

Gold’s golden colour has been a mystery until very recently, says Andrea Sella.

The secret lies in its atomic structure. “Quantum mechanics alone doesn’t explain it,” he says.

“When you get to gold you find the atom is so heavy and the electrons move so fast that you now have to include Einstein’s theory of relativity into the mathematics.

“It is only when you fold together quantum mechanics with relativity that suddenly you understand it.”

Unlike other metals, which in their pure form reflect light straight back, electrons in the gold “slosh around a little,” Sella says, with the result that gold “absorbs a bit of the blue spectrum light, giving the light that is reflected back its distinctive golden colour”.

Of the 118 elements we are now down to just eight contenders: platinum, palladium, rhodium, iridium, osmium and ruthenium, along with the old familiars, gold and silver.

These are known as the noble metals, “noble” because they stand apart, barely reacting with the other elements.

They are also all pretty rare, another important criterion for a currency.

Even if iron didn’t rust, it wouldn’t make a good basis for money because there’s just too much of it around. You would end up having to carry some very big coins about.

With all the noble metals except silver and gold, you have the opposite problem. They are so rare that you would have to cast some very tiny coins, which you might easily lose.

They are also very hard to extract. The melting point of platinum is 1,768C.

That leaves just two elements – silver and gold.

Both are scarce but not impossibly rare. Both also have a relatively low melting point, and are therefore easy to turn into coins, ingots or jewellery.

Silver tarnishes – it reacts with minute amounts of sulphur in the air. That’s why we place particular value on gold.

It turns out then, that the reason gold is precious is precisely that it is so chemically uninteresting.

Gold’s relative inertness means you can create an elaborate golden jaguar and be confident that 1,000 years later it can be found in a museum display case in central London, still in pristine condition.

So what does this process of elemental elimination tell us about what makes a good currency?

First off, it doesn’t have to have any intrinsic value. A currency only has value because we, as a society, decide that it does.


As we’ve seen, it also needs to be stable, portable and non-toxic. And it needs to be fairly rare – you might be surprised just how little gold there is in the world.

If you were to collect together every earring, every gold sovereign, the tiny traces of gold in every computer chip, every pre-Columbian statuette and every wedding ring, and melt it all down, it’s guesstimated that you’d be left with just one 20-metre cube, or thereabouts.
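That guesstimate is easy to sanity-check. Gold’s density is roughly 19,300 kg per cubic metre (a standard reference value, not a figure from the article), so a 20-metre cube corresponds to:

```python
# Sanity-check the "20-metre cube" guesstimate for all gold ever mined.
GOLD_DENSITY = 19_300   # kg per cubic metre (standard reference value)
side = 20.0             # metres, as stated in the article

volume = side ** 3                     # 8,000 cubic metres
mass_tonnes = volume * GOLD_DENSITY / 1_000

print(int(mass_tonnes))  # ~154,400 tonnes
```

Roughly 150,000 tonnes for the whole of human history – which is why the claim comes with an “or thereabouts”.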

But scarcity and stability aren’t the whole story. Gold has one other quality that makes it the stand-out contender for currency in the periodic table. Gold is… golden.

All the other metals in the periodic table are silvery-coloured except for copper – and as we’ve already seen, copper corrodes, turning green when exposed to moist air. That makes gold very distinctive.

“That’s the other secret of gold’s success as a currency,” says Sella. “Gold is unbelievably beautiful.”

But how come no-one actually uses gold as a currency any more?


The seminal moment came in 1971, when Richard Nixon decided to sever the US dollar’s tie to gold.

Since then, every major currency has been backed by no more than legal “fiat” – the law of the land says you must accept it as payment.

Nixon made his decision for the simple reason that the US was running out of the necessary gold to back all the dollars it had printed.

Find out more

In Elementary Business, BBC World Service’s Business Daily goes back to basics and examines key chemical elements – and asks what they mean for businesses and the global economy.

And here lies the problem with gold. Its supply bears no relation to the needs of the economy. The supply of gold depends on what can be mined.

In the 16th Century, the discovery of South America and its vast gold deposits led to an enormous fall in the value of gold – and therefore an enormous increase in the price of everything else.

Since then, the problem has typically been the opposite – the supply of gold has been too rigid. For example, many countries escaped the Great Depression in the 1930s by unhitching their currencies from the Gold Standard. Doing so freed them up to print more money and reflate their economies.

The demand for gold can vary wildly – and with a fixed supply, that can lead to equally wild swings in its price.

Most recently for example, the price has gone from $260 per troy ounce in 2001, to peak at $1,921.15 in September 2011, before falling back to $1,230 currently.
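The scale of those swings is clearer as ratios (the prices are the article’s figures, in US dollars per troy ounce):

```python
# Quantify the swings in the gold price quoted in the article.
low_2001 = 260.00      # USD per troy ounce, 2001
peak_2011 = 1_921.15   # September 2011 peak
current = 1_230.00     # price at the time of writing

rise_multiple = peak_2011 / low_2001     # ~7.4x increase over a decade
fall_fraction = 1 - current / peak_2011  # ~36% drop from the peak

print(round(rise_multiple, 1))     # 7.4
print(round(fall_fraction * 100))  # 36
```

A seven-fold rise followed by a one-third fall in under three years.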

That is hardly the behaviour of a stable store of value.

So, to paraphrase Churchill, out of all the elements, gold makes the worst possible currency.

Apart from all the others.

Why it’s time for brain science to ditch the ‘Venus and Mars’ cliche.

Reports trumpeting basic differences between male and female brains are biological determinism at its most trivial, says the science writer of the year
There is little evidence to suggest differences between male and female brains are caused by anything other than cultural factors.

As hardy perennials go, there is little to beat that science hacks’ favourite: the hard-wiring of male and female brains. For more than 30 years, I have seen a stream of tales about gender differences in brain structure under headlines that assure me that from birth men are innately more rational and better at map-reading than women, who are emotional, empathetic multi-taskers, useless at telling jokes. I am from Mars, apparently, while the ladies in my life are from Venus.

And there are no signs that this flow is drying up, with last week witnessing publication of a particularly lurid example of the genre. Writing in the US journal Proceedings of the National Academy of Sciences, researchers at the University of Pennsylvania in Philadelphia revealed they had used a technique called diffusion tensor imaging to show that the neurons in men’s brains are connected to each other in a very different way from neurons in women’s brains.

This point was even illustrated by the team, led by Professor Ragini Verma, with a helpful diagram. A male brain was depicted with its main connections – coloured blue, needless to say – running from the front to the back. Connections within cerebral hemispheres were strong, but connections between the two hemispheres were weak. By contrast, the female brain had thick connections running from side to side, with strong links between the two hemispheres.

A photo issued by University of Pennsylvania researchers showed intra-hemispheric connections (blue) and inter-hemispheric connections (orange) in men’s and women’s brains, male top row, female bottom row.

“These maps show us a stark difference in the architecture of the human brain that helps provide a potential neural basis as to why men excel at certain tasks and women at others,” said Verma.

The response of the press was predictable. Once again scientists had “proved” that from birth men have brains which are hardwired to give us better spatial skills, to leave us bereft of empathy for others, and to make us run, like mascara, at the first hint of emotion. Equally, the team had provided an explanation for the “fact” that women cannot use corkscrews or park cars but can remember names and faces better than males. It is all written in our neurons at birth.

As I have said, I have read this sort of thing before. I didn’t believe it then and I don’t believe it now. It is biological determinism at its silly, trivial worst. Yes, men and women probably do have differently wired brains, but there is little convincing evidence to suggest these variations are caused by anything other than cultural factors. Males develop improved spatial skills not because of an innate superiority but because they are expected and encouraged to be strong at sport, which requires expertise at catching and throwing. Similarly, it is anticipated that girls will be more emotional and talkative, and so their verbal skills are emphasised by teachers and parents. As the years pass, these different lifestyles produce variations in brain wiring – which is a lot more plastic than most biological determinists realise. This possibility was simply not addressed by Verma and her team.

Equally, when gender differences are uncovered by researchers they are frequently found to be trivial, a point made by Robert Plomin, a professor of behavioural genetics at London’s Institute of Psychiatry, whose studies have found that a mere 3% of the variation in young children’s verbal development is due to their gender. “If you map the distribution of scores for verbal skills of boys and of girls, you get two graphs that overlap so much you would need a very fine pencil indeed to show the difference between them. Yet people ignore this huge similarity between boys and girls and instead exaggerate wildly the tiny difference between them. It drives me wild.”

I should make it clear that Plomin made that remark three years ago when I last wrote about the issue of gender and brain wiring. It was not my first incursion, I should stress. Indeed, I have returned to the subject – which is an intriguing, important one – on a number of occasions over the years as neurological studies have been hyped in the media, often by the scientists who carried them out. It has taken a great deal of effort by other researchers to put the issue in proper perspective.

A major problem is the lack of consistent work in the field, a point stressed to me in 2005 – during an earlier outbreak of brain-gender difference stories – by Professor Steve Jones, a geneticist at University College London, and author of Y: The Descent of Men. “Researching my book, I discovered there was no consensus at all about the science [of gender and brain structure],” he told me. “There were studies that said completely contradictory things about male and female brains. That means you can pick whatever study you like and build a thesis around it. The whole field is like that. It is very subjective. That doesn’t mean there are no differences between the brains of the sexes, but we should take care not to exaggerate them.”

Needless to say that is not what has happened over the years. Indeed, this has become a topic whose coverage has been typified mainly by flaky claims, wild hyperbole and sexism. It is all very depressing. The question is: why has this happened? Why is there such divergence in explanations for the differences in mental abilities that we observe in men and women? And why do so many people want to exaggerate them so badly?

The first issue is the easier to answer. The field suffers because it is bedevilled by its extraordinary complexity. The human brain is a vast, convoluted edifice and scientists are only now beginning to develop adequate tools to explore it. The use of diffusion tensor imaging by Verma’s team was an important breakthrough, it should be noted. The trouble is, once more, those involved were rash in their interpretations of their own work.

“This study contains some important data but it has been badly overhyped and the authors must take some of the blame,” says Professor Dorothy Bishop, of Oxford University. “They talk as if there is a typical male and a typical female brain – they even provide a diagram – but they ignore the fact that there is a great deal of variation within the sexes in terms of brain structure. You simply cannot say there is a male brain and a female brain.”

Even more critical is Marco Catani, of London’s Institute of Psychiatry. “The study’s main conclusions about possible cognitive differences between males and females are not supported by the findings of the study. A link between anatomical differences and cognitive functions should be demonstrated and the authors have not done so. They simply have no idea of how these differences in anatomy translate into cognitive attitudes. So the main conclusion of the study is purely speculative.”

The study also leaves unclear how differences in brain architecture between the sexes arose in the first place, a point raised by Michael Bloomfield of the MRC's Clinical Sciences Centre. "An obvious possibility is that male hormones like testosterone and female hormones like oestrogen have different effects on the brain. A more subtle possibility is that bringing a child up in a particular gender could affect how our brains are wired."

In fact, Verma’s results showed that the neuronal connectivity differences between the sexes increased with the age of her subjects. Such a finding is entirely consistent with the idea that cultural factors are driving changes in the brain’s wiring. The longer we live, the more our intellectual biases are exaggerated and intensified by our culture, with cumulative effects on our neurons. In other words, the intellectual differences we observe between the sexes are not the result of different genetic birthrights but are a consequence of what we expect a boy or a girl to be.

Why so many people should be so desperate to ignore or obscure this fact is a very different issue. In the end, I suspect it depends on whether you believe our fates are sealed at birth or if you think that it is a key part of human nature to be able to display a plasticity in behaviour and in ways of thinking in the face of altered circumstance. My money is very much on the latter.


In their study, Verma and her colleagues investigated gender differences in brain connectivity in 949 individuals – 521 females and 428 males – aged between eight and 22 years. The technique they used is known as diffusion tensor imaging (DTI), an imaging technology that traces the diffusion of water to highlight the fibre pathways connecting the different regions of the brain, laying the foundation for a structural connectome, or network map, of the whole brain. These scans revealed a typical pattern, claim Verma and her team: men had stronger links between neurons within each cerebral hemisphere, while women had stronger links between the two hemispheres – a difference that the scientists claimed was crucial in explaining differences in the behaviour of men and women.

But the technique has been criticised. "DTI provides only indirect measures of structural connectivity and is, therefore, different from the well-validated microscopic techniques that show the real anatomy of axonal connections," says Marco Catani, of London's Institute of Psychiatry. "Images of the brain derived from diffusion tensor MRI should not be equated to real connections, and results should always be interpreted with extreme caution."

This point is backed by Prof Heidi Johansen-Berg, of Oxford University, who attacked the idea that brain connections should be considered hard-wired. "Connections can change throughout life, in response to experience and learning. As far as I can tell, the authors have not directly related these differences in brain connections to differences in behaviour. It is a huge leap to extrapolate from anatomical differences to try to explain behavioural variation between the sexes. The brain regions that have been highlighted are involved in many different functions."


‘Memories’ pass between generations


Behaviour can be affected by events in previous generations which have been passed on through a form of genetic memory, animal studies suggest.

Experiments showed that a traumatic event could affect the DNA in sperm and alter the brains and behaviour of subsequent generations.

A Nature Neuroscience study shows mice trained to avoid a smell passed their aversion on to their “grandchildren”.

Experts said the results were important for phobia and anxiety research.

The animals were trained to fear a smell similar to cherry blossom.

The team at the Emory University School of Medicine, in the US, then looked at what was happening inside the sperm.

They showed a section of DNA responsible for sensitivity to the cherry blossom scent was made more active in the mice’s sperm.

Both the mice's offspring and their offspring's offspring were "extremely sensitive" to cherry blossom and would avoid the scent, despite never having experienced it in their lives.

Changes in brain structure were also found.

“The experiences of a parent, even before conceiving, markedly influence both structure and function in the nervous system of subsequent generations,” the report concluded.

Family affair

The findings provide evidence of “transgenerational epigenetic inheritance” – that the environment can affect an individual’s genetics, which can in turn be passed on.

One of the researchers, Dr Brian Dias, told the BBC: "This might be one mechanism by which descendants show imprints of their ancestor.

“There is absolutely no doubt that what happens to the sperm and egg will affect subsequent generations.”

Prof Marcus Pembrey, from University College London, said the findings were “highly relevant to phobias, anxiety and post-traumatic stress disorders” and provided “compelling evidence” that a form of memory could be passed between generations.

He commented: “It is high time public health researchers took human transgenerational responses seriously.

“I suspect we will not understand the rise in neuropsychiatric disorders or obesity, diabetes and metabolic disruptions generally without taking a multigenerational approach.”

In the smell-aversion study, it is thought that either some of the odour ends up in the bloodstream and affects sperm production, or that a signal from the brain is sent to the sperm to alter the DNA.

Is it right to waste helium on party balloons?

The US has been selling off its helium reserve, established in the 1920s to provide gas for airships – but even so, shortages have been occurring.

Some scientists believe a finite resource that could one day run out should not be used for party balloons.


In the universe as a whole, it is one of the commonest elements, second only to hydrogen in its abundance. On Earth it is relatively rare, and it is so light that it escapes Earth's gravity and leaks away into space.

“All of the other elements we’ve scattered around the globe, maybe we can go digging in garbage dumps to get them back,” says chemist Andrea Sella, of University College London (UCL).

“But helium is unique. When it’s gone it is lost to us forever.”

Helium has the lowest boiling point of any element, at -269C, just a few degrees above absolute zero (-273C).

“We’re going to be looking back and thinking, I can’t believe people just used to fill up their balloons with it, when it’s so precious and unique,” says Cambridge University chemist Peter Wothers, who has called for the end to helium-filled party balloons.

“It is something we need to think about.”

That would mean an end to the old party favourite of breathing in helium from a balloon and then talking in a high-pitched voice – a result of sound travelling faster through helium's light, fast-moving atoms. But maybe this would be no bad thing, as the practice can cause dizziness, headaches and even death.

The gas, which is formed by the decay of radioactive rocks in the earth’s crust, accumulates in natural gas deposits and is collected as a by-product of the gas industry.

The United States is currently the world’s biggest supplier, with the bulk of it stored near Amarillo, Texas, in the national helium reserve – which alone accounts for 35% of the world’s current supply.

This was set up in 1925 as a strategic store for supplying gas to US airships; after World War Two it provided coolant for missiles and rockets for the military and Nasa.

USS Shenandoah, the world's first helium-filled rigid airship, 1923

But since the mid-1990s, with growing civilian demand for helium in the manufacture of semi-conductors and for MRI scanners, among other things, the US has been clawing back the cost of storing the gas by gradually selling it off on the open market.

Despite this, the price of helium has doubled over the past 10 years.

Scare stories about this or that resource running out are a commonplace of doomsayers – but this autumn, the world got a taste of what a helium shortage could mean.

US semiconductor manufacturers knew that under the terms of a 1996 law, the US helium reserve was legally obliged to turn off the tap last month.

Massive DNA volunteer hunt begins


Scientists are looking for 100,000 volunteers prepared to have their DNA sequenced and published online for anyone to look at.

The UK Personal Genome Project could provide a massive free tool for scientists to further understanding of disease and human genetics.

Participants will get an analysis of their DNA, but so will the rest of the world, and anonymity is not guaranteed.

They are warned there could be unknown consequences for them and relatives.

Unlocking the secrets of DNA could transform the understanding of disease.


A deeper understanding of Alzheimer’s disease is emerging by looking for differences in the DNA of people with and without the disease.

Prof George Church, who runs the US version of the project, said analysing 100,000 genomes could lead to advances in common diseases such as diabetes.

He said: “We’re finding more and more of these common diseases are a collection of rare diseases.

“Cancer used to be a disease, then it broke up into lots of different diseases by tissue, then lots of sub-categories based on the genes that are impacted, so now it’s thousands of diseases.”

Before taking part, participants will have to pass tests to prove they fully understand the risks of making their genetic identities freely available for the world to use.

There will be immediate risks, as well as risks that emerge as genetic technology advances, including:

  • finding out about a genetic disease
  • a partner being put off by a higher risk of Alzheimer’s or other illnesses
  • targeted advertising or insurance premiums based on genomes
  • cloning without permission
  • copies of DNA being used to implicate people in a crime

Family factor

Dr Peter Mills, who is investigating the ethical issues around biological and health data with the Nuffield Council on Bioethics, told the BBC: “The difference with genetic data is you’re not just committing yourself to something you might not fully envisage, but you’re also implicating biological relatives.


“There is potentially huge public benefit, but there is the potential for it to rebound, but how that rebounds on the person, families and those yet to be born is very difficult to know.”

Cian Murphy, a 24-year-old PhD student at University College London, wants to take part. He said: "Very few people live their whole lives not affected by some genetic illness; your sample could be the difference between a cure being discovered or not."

As part of the study, participants will find out intimate details about their genome, such as the presence of any high-risk breast-cancer genes.

They will be given a list of doctors they can go to if they need further medical advice.

While people will not have their name published, studies have shown it is possible to work out someone’s identity from genetic databases and other public records.

Dr David King, from the group Human Genetics Alert, said: “Human Genetics Alert would strongly advise people not to give their genetic information to a project which will share it with the world.

“Once your data is online, you will never be able to recall it. The project’s informed-consent procedures are not valid, because they do not tell you all of the risks. That is not informed consent.”

He warned the data could be used for any purpose, including ones to which people objected ethically, and said there was "no reason" for it to be public.

More genomes

Meanwhile, the government's Genomics England project is trying to sequence the DNA of 100,000 NHS patients; that data is private, and identifying patients from it carries a threat of legal action.

Sir John Chisholm, executive chairman of Genomics England, said: “We would want anyone consenting to their DNA being used for sequencing to have a clear understanding of what they are contributing to, and to do so on a voluntary basis which we understand will be the case with Personal Genome Project.

“Anyone who takes part in any initiative that involves giving a DNA sample for sequencing should be as clear at the time of giving their consent as they can be of how that sample will be used, and who will have access to it, and what future purposes it can be put to.”