Supercomputer Faster Than Humans Uses Less Power Than a Hearing Aid



Lawrence Livermore National Laboratory, one of the country’s top scientific research facilities, announced the new mega machine yesterday, a product of IBM and the US government. The new supercomputer, intended for national security missions, makes decisions like a human brain and uses less power than a hearing aid.

Built on a platform called TrueNorth, a brain-inspired array of chips that mimics a network of 16 million neurons connected by 4 billion synapses, the supercomputer can recognize patterns and solve problems much as a human does.

AIs that think the way humans do have been a struggle to create for quite some time, but this machine learns from its mistakes and adapts quickly to changing, complex situations. The system uses just 2.5 watts of power, about as much as an LED lightbulb!
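The building block of neuromorphic chips like TrueNorth is the spiking neuron, which accumulates input, leaks charge over time, and fires when a threshold is crossed. The sketch below is a minimal leaky integrate-and-fire simulation for illustration only; TrueNorth's actual digital neuron model, parameters, and event-driven hardware differ.

```python
# Minimal leaky integrate-and-fire (LIF) neuron sketch.
# Illustrative only: TrueNorth's real neuron model and parameters differ.

def simulate_lif(inputs, leak=0.9, threshold=1.0):
    """Integrate input current with leak; emit a spike and reset on threshold."""
    potential = 0.0
    spikes = []
    for current in inputs:
        potential = potential * leak + current  # integrate with leak
        if potential >= threshold:              # fire and reset
            spikes.append(1)
            potential = 0.0
        else:
            spikes.append(0)
    return spikes

# A steady input drives periodic firing; silence lets the potential decay.
print(simulate_lif([0.4, 0.4, 0.4, 0.4, 0.0, 0.0, 0.4, 0.4, 0.4]))
# -> [0, 0, 1, 0, 0, 0, 0, 1, 0]
```

Because such neurons only do work when spikes arrive, large arrays of them can sit mostly idle, which is one reason event-driven chips achieve such low power draw.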

SF Gate points out that in five years or so, the TrueNorth chips could even bring facial recognition to smartphones, or power smart glasses that help blind people navigate their surroundings.

The array of TrueNorth chips, which set the lab back a relatively cheap-sounding $1 million, will be used in government cybersecurity missions. It will also help “ensure the safety, security and reliability of the nation’s nuclear deterrent without underground testing,” according to the press release.

 


Researchers develop efficient method to produce nanoporous metals


Nanoporous metals—foam-like materials whose structure is riddled with tiny empty pores—have a wide range of applications because of their superior qualities.

They possess a high surface area that allows for better electron transfer, which can lead to improved performance of an electrode in an electric double-layer capacitor or battery. Nanoporous metals also offer an increased number of available sites for the adsorption of analytes, a highly desirable feature for sensors.

Researchers at Lawrence Livermore National Laboratory (LLNL) and the Swiss Federal Institute of Technology (ETH) have developed a cost-effective and more efficient way to manufacture nanoporous metals across many scales, from the nanoscale up to the macroscale, which is visible to the naked eye.

The process begins with a four-inch silicon wafer. A metal coating is added and sputtered across the wafer. Gold, silver, and aluminum were used for this research project, but the manufacturing process is not limited to these metals.

Next, a mixture of two polymers is added to the metal substrate to create patterns, a process known as diblock copolymer (BCP) lithography. The pattern is transformed into a single polymer mask with nanometer-size features. Last, a technique known as anisotropic ion beam milling (IBM) is used to etch through the mask, creating the array of holes that makes the metal nanoporous.

During the fabrication process, the roughness of the metal is continuously examined to ensure that the finished product has good porosity, which is key to the unique properties that make nanoporous metals work. The rougher the metal is, the less evenly porous it becomes.


“During fabrication, our team achieved 92 percent pore coverage with 99 percent uniformity over a 4-in silicon wafer, which means the metal was smooth and evenly porous,” said Tiziana Bond, an LLNL engineer who is a member of the joint research team.

The team has defined a metric – based on a parametrized correlation between BCP pore coverage and metal surface roughness – by which the fabrication of nanoporous metals should be stopped when uneven porosity is the known outcome, saving processing time and costs.
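The team's stop criterion can be imagined as a simple threshold rule: halt processing once the measured roughness predicts unacceptably uneven porosity. The sketch below is hypothetical; the linear model, its coefficients, and the coverage target are illustrative stand-ins, not the team's actual parametrized correlation.

```python
# Hypothetical sketch of a fabrication stop rule: halt processing when
# measured surface roughness predicts pore coverage below the target.
# The linear model and all numbers below are illustrative assumptions,
# not the published BCP coverage/roughness correlation.

def predicted_pore_coverage(roughness_nm, slope=-8.0, intercept=96.0):
    """Toy linear model: rougher metal -> lower expected pore coverage (%)."""
    return slope * roughness_nm + intercept

def should_stop(roughness_nm, min_coverage_pct=90.0):
    """Stop fabrication early when predicted coverage misses the target."""
    return predicted_pore_coverage(roughness_nm) < min_coverage_pct

print(should_stop(0.5))  # smooth wafer -> keep going (False)
print(should_stop(1.2))  # rough wafer  -> stop early (True)
```

The value of such a rule is economic as much as scientific: abandoning a doomed wafer early saves the remaining processing time and cost.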

“The real breakthrough is that we created a new technique to manufacture nanoporous metals that is cheap and can be done over many scales avoiding the lift-off technique to remove metals, with real-time quality control,” Bond said. “These metals open the application space to areas such as energy harvesting, sensing and electrochemical studies.”

The lift-off technique is a method of patterning target materials on the surface of a substrate by using a sacrificial material. One of the biggest problems with this technique is that the metal layer cannot be peeled off uniformly (or at all) at the nanoscale.

The research team’s findings were reported in an article titled “Manufacturing over many scales: High fidelity macroscale coverage of nanoporous metal arrays via lift-off-free nanofabrication.” It was the cover story in a recent issue of Advanced Materials Interfaces.

Other applications of nanoporous metals include supporting the development of new metamaterials (engineered materials) for radiation-enhanced filtering and manipulation, including deep ultraviolet light. These applications are possible because nanoporous materials facilitate anomalous enhancement of transmitted (or reflected) light through the tunneling of surface plasmons, a feature widely usable by light-emitting devices, plasmonic lithography, refractive-index-based sensing and all-optical switching.

Researchers find tie between global precipitation and global warming.


The rain in Spain may lie mainly on the plain, but the location and intensity of that rain is changing not only in Spain but around the globe.

A new study by Lawrence Livermore National Laboratory scientists shows that observed changes in global (ocean and land) precipitation are directly affected by human activities and cannot be explained by natural variability alone. The research appears in the Nov. 11 online edition of the Proceedings of the National Academy of Sciences.

Emissions of heat-trapping and ozone-depleting gases affect the distribution of precipitation through two mechanisms. Increasing temperatures are expected to make wet regions wetter and dry regions drier (thermodynamic changes), and changes in atmospheric circulation will push storm tracks and subtropical dry zones toward the poles (dynamic changes).

“Both these changes are occurring simultaneously in global precipitation and this behavior cannot be explained by natural variability alone,” said LLNL’s lead author Kate Marvel. “External influences such as the increase in greenhouse gases are responsible for the changes.”

The team compared climate model predictions with the Global Precipitation Climatology Project’s global observations, which span 1979 to 2012, and found that natural variability (such as El Niños and La Niñas) does not account for the changes in global precipitation patterns. While natural fluctuations in climate can lead to either intensification or poleward shifts in precipitation, it is very rare for the two effects to occur together naturally.

“In combination, manmade increases in greenhouse gases and stratospheric ozone depletion are expected to lead to both an intensification and redistribution of global precipitation,” said Céline Bonfils, the other LLNL author. “The fact that we see both of these effects simultaneously in the observations is strong evidence that humans are affecting global precipitation.”


Marvel and Bonfils identified a fingerprint pattern that characterizes the simultaneous response of precipitation location and intensity to external forcing.
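Fingerprint detection of this kind amounts to projecting the observed change onto a model-derived pattern and asking whether natural variability alone could produce as strong a match. The sketch below is a simplified illustration with synthetic data; the study's actual statistical method, and the GPCP observations it used, are far more involved.

```python
import numpy as np

# Simplified fingerprint-detection sketch with synthetic data.
# Real analyses use model-derived spatial patterns and GPCP observations;
# everything here is an illustrative assumption.

rng = np.random.default_rng(0)

fingerprint = np.array([0.5, -0.3, 0.8, -0.6, 0.4])   # expected forced pattern
observed    = np.array([0.6, -0.2, 0.7, -0.5, 0.5])   # synthetic "observations"

unit_fp = fingerprint / np.linalg.norm(fingerprint)

# Signal: projection of the observed change onto the fingerprint direction.
signal = observed @ unit_fp

# Null distribution: projections of noise-only fields standing in for
# natural variability (El Nino / La Nina fluctuations, etc.).
noise_projs = rng.normal(0.0, 0.2, size=(10000, 5)) @ unit_fp

# A large signal-to-noise ratio means natural variability alone is an
# unlikely explanation for the observed pattern match.
snr = signal / noise_projs.std()
print(snr > 3)   # detection at roughly the 3-sigma level
```

The key point mirrored here is the one in the article: either intensification or a poleward shift can arise naturally, but a strong match to the *combined* pattern is rare without external forcing.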

“Most previous work has focused on either thermodynamic or dynamic changes in isolation. By looking at both, we were able to identify a pattern of precipitation change that fits with what is expected from human-caused climate change,” Marvel said.

By focusing on the underlying mechanisms that drive changes in global precipitation and by restricting the analysis to the large scales where there is confidence in the models’ ability to reproduce the current climate, “we have shown that the changes observed in the satellite era are externally forced and likely to be from man,” Bonfils said.

Fusion milestone passed at US lab.


Target alignment at NIF

Researchers at a US lab have passed a crucial milestone on the way to their ultimate goal of achieving self-sustaining nuclear fusion.

Harnessing fusion – the process that powers the Sun – could provide an unlimited and cheap source of energy.

But to be viable, fusion power plants would have to produce more energy than they consume, which has proven elusive.

Now, a breakthrough by scientists at the National Ignition Facility (NIF) could boost hopes of scaling up fusion.

NIF, based at Livermore in California, uses 192 beams from the world’s most powerful laser to heat and compress a small pellet of hydrogen fuel to the point where nuclear fusion reactions take place.

The BBC understands that during an experiment in late September, the amount of energy released through the fusion reaction exceeded the amount of energy being absorbed by the fuel – the first time this had been achieved at any fusion facility in the world.

This is a step short of the lab’s stated goal of “ignition”, where nuclear fusion generates as much energy as the lasers supply. This is because known “inefficiencies” in different parts of the system mean not all the energy supplied through the laser is delivered to the fuel.
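The distinction between this milestone and full ignition comes down to energy accounting: yield versus energy absorbed by the fuel, as opposed to yield versus total laser energy. The figures below are hypothetical round numbers chosen only to illustrate the two ratios, not the actual values from the September shot.

```python
# Illustrative energy accounting for laser fusion.
# All numbers are hypothetical round figures, not actual NIF shot values.

laser_energy_mj  = 1.8     # energy supplied by the 192 laser beams (MJ)
fuel_absorbed_mj = 0.010   # small fraction actually coupled into the fuel (MJ)
fusion_yield_mj  = 0.014   # energy released by fusion reactions (MJ)

# The September milestone: fusion yield exceeded energy absorbed by the fuel.
fuel_gain = fusion_yield_mj / fuel_absorbed_mj
print(fuel_gain > 1)   # True: the fuel gave back more than it took in

# Ignition requires yield to exceed the *total* laser energy, which is far
# harder because inefficiencies mean most laser energy never reaches the fuel.
total_gain = fusion_yield_mj / laser_energy_mj
print(total_gain > 1)  # False: still well short of ignition
```

Under these assumed numbers the fuel gain exceeds 1 while the total gain is below 1 percent, which is why a shot can be a genuine first while still falling short of ignition.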

Nuclear fusion at NIF.

Hohlraum
  • 192 laser beams are focused through holes in a target container called a hohlraum
  • Inside the hohlraum is a tiny pellet containing an extremely cold, solid mixture of hydrogen isotopes
  • Lasers strike the hohlraum’s walls, which in turn radiate X-rays
  • X-rays strip material from the outer shell of the fuel pellet, heating it up to millions of degrees
  • If the compression of the fuel is high enough and uniform enough, nuclear fusion can result

But the latest achievement has been described as the single most meaningful step for fusion in recent years, and demonstrates NIF is well on its way towards the coveted target of ignition and self-sustaining fusion.

For half a century, researchers have strived for controlled nuclear fusion and been disappointed. It was hoped that NIF would provide the breakthrough fusion research needed.

In 2009, NIF officials announced an aim to demonstrate nuclear fusion producing net energy by 30 September 2012. But unexpected technical problems ensured the deadline came and went; the fusion output was less than had originally been predicted by mathematical models.

Soon after, the $3.5bn facility shifted focus, cutting the amount of time spent on fusion versus nuclear weapons research – which was part of the lab’s original mission.

However, the latest experiments agree well with predictions of energy output, which will provide a welcome boost to ignition research at NIF, as well as encouragement to advocates of fusion energy in general.

Fusion is markedly different from current nuclear power, which operates by splitting atoms – fission – rather than squashing them together.

NIF, based at the Lawrence Livermore National Laboratory, is one of several projects around the world aimed at harnessing fusion. They include the multi-billion-euro ITER facility, currently under construction in Cadarache, France.

However, ITER will take a different approach to the laser-driven fusion at NIF; the Cadarache facility will use magnetic fields to contain the hot fusion fuel – a concept known as magnetic confinement.